
Five Major Obstacles to Optimal Data Management

Data-driven organizations hold large volumes of digital assets that are critical to their businesses and growing exponentially, because organizations keep finding new ways to use the data they collect to further their goals. In 2018 alone, 97.2 percent of organizations reportedly invested in big data and artificial intelligence.

Why, then, do many still stress over the growth of data? Given that data is now widely accepted as the single greatest asset of any organization, its growth ought to be welcomed. The main sources of concern are the cost of storage solutions, the hardware footprint and the management of the physical storage itself. So how can organizations leverage data management to maximize efficiency, create further value and bring down storage costs? And what barriers stand in their way today?

We drew from our 40 years of experience in the storage industry to identify five major obstacles to optimal data management.

1 – Identifying Inactive Data

Organizations often store all data, active and inactive, on an expensive Primary Tier of storage intended for active data. An overly cumbersome Primary Tier generally means larger and longer backups, costly new storage purchases, extra storage administration, and a greater need for power, cooling and floor space. Upwards of 80 percent of data is typically inactive, meaning the bulk of an organization’s data sits on the wrong tier, costing millions of dollars a year.

By removing inactive data from the Primary Tier of storage, administrators can prevent their primary storage from filling up and reduce the need to purchase additional primary storage. A smaller primary storage tier also shrinks backup windows, cuts costs and increases overall performance, freeing capital that could be used to purchase a faster tier of primary storage built on NVMe, Flash or other solid-state disk (SSD) storage. Lower-cost storage solutions are readily available, but when users are not sure which data has gone cold, ensuring that data lands on the right tier is virtually impossible. To make informed decisions about what to offload from primary storage, users first need visibility into the data they already have.
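To make this concrete, a cold-data scan can be as simple as walking the filesystem and flagging anything untouched past a threshold. The Python sketch below is illustrative only: the mount point and 180-day threshold are assumptions, and access-time (atime) detection is unreliable on volumes mounted with noatime or relatime. Commercial tools use richer metadata, but the principle is the same.

```python
# Minimal sketch: flag files not accessed within a given window as "cold".
# All paths and thresholds are illustrative assumptions, and atime may be
# stale on filesystems mounted with noatime/relatime, so treat the output
# as a starting point for review, not a final migration list.
import os
import time

COLD_AFTER_DAYS = 180          # illustrative threshold
ROOT = "/mnt/primary"          # hypothetical Primary Tier mount point

def find_cold_files(root, cold_after_days):
    cutoff = time.time() - cold_after_days * 86400
    cold_files, cold_bytes = [], 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if st.st_atime < cutoff:
                cold_files.append(path)
                cold_bytes += st.st_size
    return cold_files, cold_bytes

if __name__ == "__main__":
    files, total = find_cold_files(ROOT, COLD_AFTER_DAYS)
    print(f"{len(files)} cold files, {total / 1e12:.2f} TB of offload candidates")
```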

2 – Complex & Costly Solutions

When it comes to offloading primary storage, existing solutions for migrating data to lower-cost storage repositories often introduce further complexity, cost and even isolation of information. While solutions from primary storage providers may offer excellent means for staging data between various targets within the Primary Tier, they tend to encourage vendor lock-in and prevent a “best of breed” approach to implementing more affordable forms of storage for cold data. Hierarchical Storage Management (HSM) software is another approach, designed specifically to stage data on the appropriate level of storage hardware; however, for a number of reasons, these solutions are only justified in extreme data environments. Even newer packages in the “Storage Management Software” category tend to fall victim to one of the two pitfalls of HSM: they are either extremely complex or enormously expensive. Some use capacity-based licensing that charges on the amount of data scanned, regardless of whether any of that data is ever moved.

As the cost per gigabyte of hardware storage has dropped significantly for NAS and tape, the cost per gigabyte of the software required to move data to lower-cost storage has risen sharply. In many cases, the total cost of implementing a data management solution can negate any savings from moving data to a lower-cost tier, or create more management complications than it solves.
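A quick back-of-the-envelope calculation shows how licensing can swallow hardware savings. Every figure below is invented for illustration, not vendor pricing:

```python
# Illustrative arithmetic only -- all figures are assumptions, not quotes.
# The point: a per-TB license charged on *scanned* capacity can erase the
# savings of moving inactive data to cheaper hardware.
scanned_tb      = 1000    # capacity the software scans (assumed)
moved_fraction  = 0.80    # share of data that is actually inactive (assumed)
primary_cost_tb = 500.0   # $/TB/year on the Primary Tier (assumed)
cold_cost_tb    = 50.0    # $/TB/year on NAS or tape (assumed)
license_cost_tb = 300.0   # $/TB/year, licensed on scanned capacity (assumed)

hardware_savings = scanned_tb * moved_fraction * (primary_cost_tb - cold_cost_tb)
license_cost = scanned_tb * license_cost_tb   # charged whether data moves or not

print(f"hardware savings: ${hardware_savings:,.0f}/yr")                 # $360,000/yr
print(f"license cost:     ${license_cost:,.0f}/yr")                     # $300,000/yr
print(f"net benefit:      ${hardware_savings - license_cost:,.0f}/yr")  # $60,000/yr
```

Under these assumed numbers, five-sixths of the hardware savings disappear into the license before a single byte moves.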

[Figure: Rising Cost of Software Versus Falling Cost of Hardware]

3 – Inadequate Protection

A cyberattack, ransomware, a natural disaster or simple human error can destroy an organization’s data in a matter of seconds. A successful data management strategy requires assurance of both data reliability and data protection. Because data offloaded from the Primary Tier of storage is not part of the typical backup process, keeping multiple copies in multiple locations is critical to reducing the likelihood that the data will be lost.
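In practice, that assurance means independently verifying every copy rather than trusting that a migration succeeded. Below is a minimal Python sketch of such a check; the locations are hypothetical, and a real deployment would place the second copy on different media or at a different site, not merely on another mount point.

```python
# Minimal sketch: confirm each offloaded file has an intact second copy by
# comparing SHA-256 digests. Both roots are hypothetical placeholders.
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB at a time
            h.update(chunk)
    return h.hexdigest()

def verify_copies(primary_root: Path, replica_root: Path) -> list:
    """Return files whose replica is missing or does not match."""
    bad = []
    for src in primary_root.rglob("*"):
        if not src.is_file():
            continue
        dst = replica_root / src.relative_to(primary_root)
        if not dst.is_file() or sha256(src) != sha256(dst):
            bad.append(src)
    return bad

# Hypothetical usage:
# damaged = verify_copies(Path("/mnt/perpetual/site-a"), Path("/mnt/perpetual/site-b"))
```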

The increasing value of data over the last decade means that data is held for longer periods of time – often forever. If your storage management software is simply migrating data to a lower-cost tier of storage, then it is not doing enough to protect your organization’s most valuable asset.

4 – Addressing All Workflow Needs

Modern storage management must offer complete data management. Automatically identifying inactive data and moving it to lower-cost storage is obviously a key component, but an optimal data management strategy should also let users define policies that direct their data to any combination of storage targets based on how that data will be used.
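What might such policies look like? The sketch below assumes a deliberately simple model in which each policy matches files by age and name pattern and routes them to one or more targets; the policy names and target labels are made up for illustration and do not reflect any particular product’s API.

```python
# Minimal sketch of user-defined placement policies. Policy names, patterns
# and target labels are all illustrative assumptions.
from dataclasses import dataclass, field
from fnmatch import fnmatch

@dataclass
class Policy:
    name: str
    min_age_days: int                  # only files at least this old qualify
    pattern: str = "*"                 # glob applied to the file name
    targets: list = field(default_factory=list)  # where copies should land

POLICIES = [
    Policy("raw-instrument-data", min_age_days=90,
           pattern="*.raw", targets=["nas-archive", "tape-dr"]),
    Policy("finished-renders", min_age_days=30,
           pattern="*.mov", targets=["object-store", "tape-dr"]),
]

def route(filename, age_days):
    """Return the storage targets of the first policy the file matches."""
    for p in POLICIES:
        if age_days >= p.min_age_days and fnmatch(filename, p.pattern):
            return p.targets
    return []  # no match: the file stays on the Primary Tier

# route("shot042.mov", 45) -> ["object-store", "tape-dr"]
```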

While data on this lower-cost tier (also known as the Perpetual Tier) is not considered “active,” quite a bit still happens at this level. The Perpetual Tier is used for secondary storage; distribution; multiple copies (a responsive copy and a DR copy); backup; archive; project archive; and traditional disaster recovery. Finally, administrators or users often need to move data manually, migrating logically grouped files associated with a given project or data collection together as an archive. Requiring multiple software solutions to manage all of these workflows places a disproportionate burden on data-driven organizations, hampering their efforts to configure their most economical storage tier to be as responsive as their workflows demand.

A modern storage approach should take all of these issues into consideration. By doing so, IT administrators, content creators and data curators can create a better experience for external clients and internal data users alike, whether the goal is to share research, monetize data, gain a competitive edge or satisfy whatever data-access mandates the organization operates under.

Continue reading about how to implement optimal data management in our white paper, “Data Archiving Best Practices Using a Perpetual Storage Tier”.
