Late last year, a Cohesity survey found that 47% of IT executives are worried about blowing their organization’s IT budget on unnecessary storage. This week, the vendor released a second 900-respondent survey, “Mass Data Fragmentation in the Cloud,” which indicates that the cost of moving large volumes of data is a significant concern. Among organizations that have yet to complete a mandate to move to the public cloud, 42% of IT executives worry about the expense of migrating data from on-premises systems to the public cloud.

Data migration, rightsizing storage, and eliminating redundancy are the focus of conversations about saving money on public cloud storage.

Migration concerns have heightened in recent years because multicloud has become more prevalent and because of data gravity. Data gravity describes the tendency of data to be stored where it is created, with applications then running in nearby environments to gain better latency and throughput. Setting aside security requirements and costs, it is better to store data in the same public cloud where the applications that use it run. However, the cost of migrating data out of corporate data centers is significant. As Datamation notes, cloud providers offer free data migrations, but this may not be the best option for companies whose multicloud approach does not involve a massive one-time lift to a new cloud environment.

Source: “Cohesity’s Secondary Data Market Study.” 87% of IT executives surveyed said their organization’s secondary data is fragmented. This group is concerned that too much is being spent on storage.

Besides migration, data is often also moved as part of the day-to-day operations of an application. A public cloud application utilizing data stored in another cloud (private or public) incurs charges based on either the bandwidth used or the amount of data transferred.
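As a rough illustration of how such transfer charges accumulate, here is a minimal sketch that estimates a monthly egress bill from a per-gigabyte rate. The $0.09/GB figure is an assumption for illustration only, not a quote of any provider’s actual pricing, which varies by provider, region, and volume tier.

```python
# Hypothetical illustration of cross-cloud data transfer costs.
# The rate below is an assumption for illustration only; actual
# egress pricing varies by provider, region, and volume tier.

EGRESS_RATE_PER_GB = 0.09  # assumed $/GB, not a quoted price

def monthly_egress_cost(gb_per_day: float, days: int = 30) -> float:
    """Estimate the monthly cost of moving data between clouds."""
    return gb_per_day * days * EGRESS_RATE_PER_GB

# An application reading 500 GB/day from another cloud:
# 500 * 30 * 0.09 = $1,350 per month in transfer fees alone.
print(f"${monthly_egress_cost(500):,.2f}")
```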

Source: “Cohesity’s Mass Data Fragmentation in the Cloud.” This chart shows the IT concerns of the 50% of respondents at organizations with an uncompleted mandate from a senior executive to move to the public cloud. Note that this question did not ask about security or other commonly cited concerns about the public cloud.

The new Cohesity study also found that respondents: 1) keep on average three copies of the same data created in public cloud environments and 2) utilize an average of three separate vendor solutions or point products to manage “secondary” storage and data across all public clouds. Secondary data is usually non-mission-critical, static data held in backups, archives, file shares, object stores, test and development systems, and analytics data sets, across private and public clouds. Best practices dictate that, since it does not have to be accessed as often, secondary data should be stored on cheap hardware. Storage lifecycle management systems tag data based on age and importance, and software then decides where the data should be stored or, if necessary, deleted. This approach is also how most companies are addressing GDPR requirements regarding personal data. Without storage lifecycle management, individuals end up choosing which types of storage to use for different use cases.
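A minimal sketch of the kind of age-based tiering rule such a system might apply follows. The tier names and thresholds here are hypothetical assumptions for illustration, not any specific vendor’s policy.

```python
from datetime import date, timedelta

# Hypothetical age-based tiering rules; the tiers and thresholds
# are illustrative assumptions, not any specific vendor's policy.
TIER_RULES = [
    (timedelta(days=30), "hot"),        # recently created: fast storage
    (timedelta(days=180), "warm"),      # infrequently accessed: cheaper tier
    (timedelta(days=365 * 7), "cold"),  # archival: cheapest tier
]

def place_object(created: date, today=None) -> str:
    """Decide where a piece of secondary data belongs based on age.
    Anything older than the last rule is flagged for deletion,
    e.g. to satisfy a retention or GDPR erasure policy."""
    today = today or date.today()
    age = today - created
    for max_age, tier in TIER_RULES:
        if age <= max_age:
            return tier
    return "delete"

print(place_object(date(2019, 1, 15), today=date(2019, 6, 1)))  # "warm"
```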


Concern about overspending on public cloud is significant, with many organizations struggling to find the best approach to cost optimization. Even after choosing the most cost-effective execution venue and type of compute, the price of moving and storing data is keeping IT executives up at night. It remains to be seen whether vendor solutions offer a simple way to get a good night’s sleep.

Reducing the Number of Backups

Although storing secondary data is cheap, the costs add up when IT doesn’t throw away old backups. Nutanix and Veeam Software are partnering to provide software that helps companies reduce the number of snapshots and other backups needed. To help explain the need for their products, Nutanix released the Cloud Usage Report 2019 and Veeam published its 2019 Cloud Data Management Report. Looked at holistically, these studies give context to decisions about how and when backups are created.

Source: “Nutanix’s Cloud Usage Report 2019”

The Nutanix report provides detailed data about its Xi Beam customers’ cloud usage. We don’t think the data should be used to draw broad conclusions because it represents only a small subset of Nutanix’s more than 13,000 customers. That being said, it illustrates how companies are optimizing their use of cloud resources. The chart below shows the actions taken to eliminate redundant resources, with over 92% being the deletion of unused snapshots. This does not mean that reducing snapshot storage is saving the most money, but rather that deletion is the most commonly observed event in these cloud environments. It also shows that companies with less than $1 billion in annual revenue commonly take actions to rightsize VMs. Whether or not these actions are automated would have a significant impact on the relevance of this data.
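As a rough illustration of the kind of cleanup these reports describe, here is a minimal sketch of age-based snapshot pruning. The retention window and the snapshot record format are assumptions for illustration; in practice the records would come from a cloud provider’s API.

```python
from datetime import datetime, timedelta

# Hypothetical snapshot records; in practice these would come from
# a cloud provider's API. The format is an assumption for illustration.
snapshots = [
    {"id": "snap-001", "created": datetime(2019, 1, 3), "in_use": False},
    {"id": "snap-002", "created": datetime(2019, 5, 20), "in_use": True},
    {"id": "snap-003", "created": datetime(2019, 6, 1), "in_use": False},
]

RETENTION = timedelta(days=90)  # assumed retention window

def prune_candidates(snaps, now):
    """Return snapshots that are unused and older than the window."""
    return [s for s in snaps
            if not s["in_use"] and now - s["created"] > RETENTION]

for snap in prune_candidates(snapshots, datetime(2019, 6, 15)):
    print(f"delete {snap['id']}")  # would call the provider's delete API
```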

Source: “Nutanix’s Cloud Usage Report 2019.” Defining the different types of storage services can be tricky, which makes an accurate comparison of Azure and AWS customer spending patterns difficult.

Source: Veeam’s “2019 Cloud Data Management Report.” Fifty-four percent of “high priority” applications are either backed up or replicated at least once an hour, while “normal” applications are protected that often 37% of the time.

Readers may also be interested in Best Practices for Effective Cloud Migration, which was published on our site earlier this year.


Feature image via Pixabay.

InApps is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: Veeam Software.