
Controlling the Storage Blob: Six Key Areas That Can Improve Storage Efficiency and Affordability

by | Jan 21, 2015 | Technical Advocacy

In the 1958 film “The Blob,” the blob begins as a small alien creature brought to earth via a meteorite, but it quickly starts consuming every human it contacts, growing ever larger and more voracious with each victim.

There are many parallels between “The Blob” and modern business IT data storage. Data storage requirements, for example, just keep growing and growing, consuming ever more hardware, staff time and budget, and there often seems to be no way to keep them from expanding out of control.

Unlike “The Blob,” however, we humans have a firm understanding of the growing problem of expansive IT data storage requirements, owing primarily to the fact that we created the problem in the first place.

Armed with that understanding, IT administrators can take steps to rein in the complexity and expense associated with ever-expanding data storage requirements. According to a report issued by the storage technology analyst firm Taneja Group, based in Hopkinton, Mass., a storage monitoring and management solution should deliver efficiency and affordability in six key areas.

  1. Improving decision-making and cost-control – Deciding which data is most important and requires the highest performance storage solutions is key to reducing costs. An efficient storage monitoring solution can streamline such decisions, reducing time and freeing up application resources.
  2. Capacity planning and IT budgeting – A storage monitoring and management solution should be easy to set up so if storage or performance thresholds are exceeded, alarms are automatically triggered. By knowing thresholds ahead of time, IT administrators can budget accordingly.
  3. Exploit virtualization technology – Virtualization is, essentially, the creation of multiple virtual partitions on a single server, storage device or computing network. It allows a single hardware device to act as several, so IT administrators can more efficiently leverage the physical footprint and energy utilization of one device. Virtualization can be leveraged on-site or as Software-as-a-Service (SaaS), and it allows for more convenient and secure data monitoring.
  4. IT function management should support business objectives – While it’s one thing to monitor and optimize storage resource management (SRM) to relieve some of the burden on IT administrators, storage monitoring should always be optimized to serve overall business goals and provide executives with timely and relevant performance analysis when necessary.
  5. Avoid unplanned downtime and slowdowns – Unplanned hardware or software outages can be disastrous for any business. A good storage monitoring and management solution should be able to detect potential downtime events and alert administrators accordingly. More than that, storage monitoring should be equipped to prevent system slowdowns in addition to outright outages. A slowdown can be just as costly as an outage, if not more so, because while outages typically trigger failover redundancies, a slowdown can be more difficult and time-consuming to isolate. As the Taneja report stated, “the real problem isn’t failure, it’s slowdowns.”
  6. Minimize additional IT purchases and optimize hardware performance – Computing resources and storage capacity are often underutilized. IT hardware is expensive, as is the power required to operate it. Virtualization solutions and similar applications can dramatically improve overall IT utilization.
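The threshold alarms described in point 2 can be sketched in a few lines of Python. This is a minimal illustration, not part of the Taneja report: the 80%/90% cutoffs and the monitored path are assumptions an administrator would tune, and a real monitoring product would alert over email, SNMP or a dashboard rather than print to the console.

```python
import shutil

# Illustrative thresholds (assumptions, not values from the report);
# administrators would tune these to their own capacity-planning targets.
CAPACITY_WARN = 0.80   # warn when a volume is 80% full
CAPACITY_CRIT = 0.90   # escalate when a volume is 90% full

def check_capacity(path="/"):
    """Return (used_fraction, severity) for the volume containing `path`."""
    usage = shutil.disk_usage(path)
    used = usage.used / usage.total
    if used >= CAPACITY_CRIT:
        return used, "CRITICAL"
    if used >= CAPACITY_WARN:
        return used, "WARNING"
    return used, "OK"

if __name__ == "__main__":
    used, severity = check_capacity("/")
    # A production tool would raise an alert here instead of printing.
    print(f"{severity}: volume is {used:.0%} full")
```

Because the thresholds are known ahead of time, the same numbers that drive the alarms can feed directly into capacity planning and budget forecasts, as the report suggests.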

Without a strong storage management strategy in place, data can quickly overwhelm any organization, just like “The Blob.” Vigilant storage monitoring and management is crucial to staying one step ahead of the data blob.
