Blog

How Tape Technology Delivers Value in Modern Data-driven Businesses…in the Age of Zettabyte Storage

Reading Time: 3 minutes

The newly released whitepaper from IT analyst firm Enterprise Strategy Group (ESG), sponsored by IBM and Fujifilm and entitled “How Tape Technology Delivers Value in Modern Data-driven Businesses,” focuses on exciting new advances in tape technology that now position tape for a critical role in effective data protection and retention in the age of zettabyte (ZB) storage. That’s right: zettabyte storage!

The whitepaper cites the need to store 17 ZB of persistent data by 2025. This includes “cold data” that is stored long-term, rarely accessed, and estimated to account for 80% of all data stored today. Just one ZB is a tremendous amount of data: equal to one million petabytes, it would take roughly 55 million 18 TB hard drives or 55 million 18 TB LTO-9 tapes to store. Just like the crew in the movie Jaws needed a bigger boat, the IT industry is going to need higher-capacity SSDs, HDDs, and higher-density tape cartridges! On the tape front, help is on the way, as demonstrated by IBM and Fujifilm in the form of a potential 580 TB capacity tape cartridge. Additional highlights from ESG’s whitepaper are below.
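The device counts above can be sanity-checked with simple arithmetic. The sketch below assumes native (uncompressed) 18 TB capacities for both HDDs and LTO-9 cartridges, as in the text:

```python
import math

# Back-of-the-envelope check of the zettabyte arithmetic above.
# Assumes native (uncompressed) device capacities of 18 TB.
ZB_IN_TB = 1_000_000_000  # 1 ZB = 10^6 PB = 10^9 TB

def devices_needed(total_tb: int, device_tb: int) -> int:
    """Number of fixed-capacity devices needed to hold total_tb of data."""
    return math.ceil(total_tb / device_tb)

# One ZB on 18 TB hard drives or LTO-9 tapes:
print(devices_needed(ZB_IN_TB, 18))       # ~55.6 million devices
# The 17 ZB of persistent data projected for 2025:
print(devices_needed(17 * ZB_IN_TB, 18))  # ~944 million devices
```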

New Tape Technology
IBM and Fujifilm set a new areal density record of 317 Gb/sq. inch on linear magnetic tape, translating to a potential native cartridge capacity of 580 TB. The demonstration features a new magnetic particle, Strontium Ferrite (SrFe), with the ability to deliver capacities that extend well beyond current disk, LTO, and enterprise tape roadmaps. SrFe magnetic particles are 60% smaller than the current de facto standard Barium Ferrite particles, yet exhibit even better magnetic signal strength and archival life. On the hardware front, the IBM team has developed tape head enhancements and servo technologies that enable even narrower data tracks, further contributing to the increase in capacity.

The Case for Tape at Hyperscalers and Others
Hyperscale data centers are major new consumers of tape technologies due to their need to manage massive data volumes while controlling costs. Tape allows hyperscalers, including cloud service providers, to achieve business objectives by providing data protection for critical assets, archival capabilities, easy capacity scaling, the lowest TCO, high reliability, fast throughput, low power consumption, and air gap protection. But tape also makes sense for small to large enterprise data centers facing the same data growth challenges, including the need to scale their environments while keeping costs down.

Data Protection, Archive, Resiliency, Intelligent Data Management
According to an ESG survey revealed in the whitepaper, tape users identified reliability, cybersecurity, long archival life, low cost, efficiency, flexibility, and capacity as the top attributes of tape usage today, and they favor tape for its long-term value. Data is growing relentlessly, with longer retention periods, as the value of data increases thanks to the ability to apply advanced analytics to derive a competitive advantage. Data is often kept for longer periods to meet compliance, regulatory, and corporate governance requirements. Tape is also playing a role in cybercrime prevention with WORM, encryption, and air gap capabilities. Intelligent data management software, typical in today’s active archive environments, automatically moves data from expensive, energy-intensive tiers of storage to more economical and energy-efficient tiers based on user-defined policies.

ESG concludes that tape is the strategic answer to the many challenges facing data storage managers, including the growing amount of data as well as TCO, cybersecurity, scalability, reliability, energy efficiency, and more. IBM and Fujifilm’s technology demonstration ensures the continuing role of tape as data requirements grow and higher-capacity media is required for cost control, with the added benefit of CO2 reductions. Tape is a powerful solution for organizations that adopt it now!

To read the full ESG whitepaper, click here.

Read More

Understanding Your True Cost of IT and Your Carbon Footprint

Reading Time: 4 minutes

I recently attended a webinar about why IT folks have a love/hate relationship with the cloud. They love the cloud because of its on-demand flexibility, unlimited compute and storage capacity, elimination of CAPEX costs, etc. They hate it, according to the webinar presenter, because of the cost that often produces “sticker shock.” Other irritants might include regulatory compliance issues and cyber security concerns.

To be completely fair to the cloud, the presenter explained that discipline and accountability could be brought to bear to help control costs and that organizations need to establish “a cloud center of excellence.” But at the same time, the presenter showed data from a study that suggested that 58% of respondents were moving some cloud-based workloads back to on-premises, private cloud environments. Finally, the presenter advised the audience to “understand your true cost of IT, TCO tools are out there!”

Getting Back to Hybrid Storage Strategies

I think the overall message of the webinar was that the cloud is great when used for the right applications and that a hybrid approach including a healthy mix of public cloud plus private cloud makes a lot of sense. In fact, the trend prior to COVID-19 appeared to be clearly hybrid. Cloud repatriation was happening as IT managers realized that the cloud is not a panacea for everything. During the COVID period, private cloud data centers were understaffed and under-supported by vendors, so the path of least resistance was to over-leverage the public cloud once again. As we begin to emerge from COVID lockdowns and IT staff returns to the data center, attention is being paid once again to finding a healthy mix of public cloud and on-premises private cloud.

This approach makes sense and reinforces that it is not an either-or scenario. In the case of storage, the cloud complements on-premises storage, including today’s highly advanced and automated tape systems. The cloud comes in handy, for example, when multiple clients frequently need on-demand access, while tape systems can manage less frequently accessed, large data sets needing long-term retention, including sensitive and mission-critical data that can be air-gapped as a cybersecurity best practice. Tape is particularly well suited for these applications thanks to tape’s:

  • High capacity
  • Ease of scalability
  • Ease of removability
  • Long archival life and reliability
  • Low TCO
  • Low energy consumption and low carbon footprint

TCO Tools are Out There

Getting back to the webinar story and the advice to “understand your true cost of IT,” indeed TCO tools are out there, and Fujifilm is pleased to offer a free, web-based interactive TCO tool developed by IT economics expert Brad Johns Consulting, LLC. This tool compares the 5-year and 10-year TCO of automated tape systems to economy disk systems and cloud-based cold storage. The tool allows users to input the volume of data to be stored, the annual growth rate, and the percentage of cloud data retrieval, as well as other variables such as the local cost per kWh, the expense of full-time storage management staff, the number of copies of data, etc. The tool has been available for many years and has evolved over time to be as comprehensive as possible. It includes the following CAPEX and OPEX cost variables:

  • Media and hardware for disk and tape
  • Maintenance for disk and tape
  • Energy for disk and tape
  • Offsite vaulting for tape
  • Storage management for disk, tape, and cloud
  • Storage and retrieval fees for cloud
  • Data transfer fees for cloud
  • Business level support for cloud
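As a rough sketch of the kind of comparison such a tool performs, the functions below total a few of the CAPEX and OPEX categories listed above. Every unit cost in the example is an illustrative placeholder, not a figure from the Fujifilm tool:

```python
# Minimal TCO sketch covering a few of the cost categories above.
# All unit costs are illustrative assumptions, not real vendor quotes.
def tape_tco(pb, years, media_per_pb, maint_yr, energy_yr, vault_yr, mgmt_yr):
    """CAPEX (media/hardware) plus annual OPEX over the retention period."""
    return pb * media_per_pb + years * (maint_yr + energy_yr + vault_yr + mgmt_yr)

def cloud_tco(pb, years, store_fee_pb_yr, retrieval_pct, retrieval_fee_pb, mgmt_yr):
    """Storage fees plus retrieval fees plus management labor."""
    storage = years * pb * store_fee_pb_yr
    # retrieval_pct is a whole-number percentage of stored data retrieved per year
    retrieval = years * pb * retrieval_pct * retrieval_fee_pb / 100
    return storage + retrieval + years * mgmt_yr

# 5 PB kept for 10 years, 12% annual cloud retrieval (hypothetical costs):
print(tape_tco(5, 10, 10_000, 2_000, 500, 300, 5_000))
print(cloud_tco(5, 10, 4_000, 12, 1_000, 5_000))
```

A real TCO model would add the remaining categories (offsite vaulting schedules, data transfer fees, business-level support) in the same additive way.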

Reducing Energy Consumption and CO2 with Tape

Regarding the cost of energy for disk and tape, this expense can be significant over time, especially for disk systems that are constantly spinning 24/7, generating heat, and therefore requiring cooling. Given the heightened awareness of global warming and climate change, organizations are looking for ways to reduce energy consumption and their carbon footprint. Data center operations are no exception and have been spotlighted for their energy-intensive applications. Making greater use of renewable energy is part of the answer, but renewable energy can’t come online fast enough, or cheaply enough, to keep up with exponential data growth. Conservation has an even bigger potential to make a difference, and that is where tape systems really shine.

Studies show that under certain scenarios, inclusive of data management servers and network infrastructure, tape consumes 87% less energy than equivalent amounts of disk storage and therefore produces 87% less CO2, all while reducing TCO by 86%. Given that data quickly becomes static and frequency of access drops dramatically after just 30 to 90 days, it makes sense to move that data from energy-intensive, higher-cost tiers of storage like flash, performance disk, or even economy disk to lower-cost, energy-efficient tape systems. A good active archive architecture with intelligent data management software is a great way to achieve such storage optimization (getting the right data, in the right place, at the right time, and at the right cost).
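To make the 87% figure concrete, the sketch below applies it to a hypothetical disk tier. The workload size and the 0.4 kg CO2/kWh emission factor are assumptions (a rough grid average), not figures from the studies cited:

```python
# Applying the cited 87% energy-reduction figure to a hypothetical workload.
# The 0.4 kg CO2/kWh emission factor is a rough grid-average assumption.
def tape_savings(disk_kwh_per_year, co2_kg_per_kwh=0.4):
    saved_kwh = disk_kwh_per_year * 0.87  # tape uses 87% less energy
    return saved_kwh, saved_kwh * co2_kg_per_kwh

# e.g. a disk tier drawing an assumed 100,000 kWh per year:
kwh, co2 = tape_savings(100_000)
print(f"{kwh:,.0f} kWh and {co2:,.0f} kg CO2 saved per year")
```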

To help highlight the energy advantage of tape and its reduction in CO2, the Fujifilm TCO tool now includes a calculation purely focused on the storage hardware layer that shows the reduction in CO2 compared to disk systems, with an example shown below based on storing 5.0 PB for 10 years with 30% annual growth and 12% data retrieval from the cloud.

So not only is TCO reduced with automated tape systems compared to disk and cloud storage, but a meaningful reduction in CO2 can be achieved and that is exactly what we all need to be doing to help slow down the negative impacts of global warming and climate change.

Read More

New Video Surveillance TCO Tool Makes the Case for LTO Tape Tier in Video Surveillance Operations

Reading Time: 4 minutes

Recently my neighborhood had a rash of car break-ins by what turned out to be just a band of mischievous teenagers. But what struck me about this occurrence was the flood of homeowner video surveillance clips that appeared on social media and that were sent to the local police department to help identify the wrongdoers. It seems like everyone in the neighborhood has a home video surveillance system, perhaps to catch a doorstep package thief, or if nothing else, to catch the guilty dog walkers!

A Booming Market for Video Surveillance Solutions

Indeed, the video surveillance market is booming, not just in the relatively nascent consumer market but also, in a much bigger way, in the commercial market, where it has been strong for a long time. The reasons include more affordable cameras with better resolutions, soaring from 720p up to 4K and even 8K. Meanwhile, video surveillance systems are finding more and more applications. Retail shopping malls, banks, hotels, city streets, transportation and highways, manufacturing and distribution operations, airport security, college dorm and campus security, corporate security, police body and dash cams, to name just a few: all need good quality video surveillance.

Video Retention Costs Soar

However, these higher-resolution cameras have sent the cost of video retention soaring. High-resolution raw footage quickly fills up the hard disk drives commonly used to retain video surveillance content. According to a Seagate video surveillance calculator, an installation of 100 cameras recording eight hours a day at 30 frames per second and 1080p resolution, with a retention period of 90 days, would require 2,006 terabytes of storage. That’s 2.0 petabytes of expensive, energy-intensive hardware. Those with unlimited budgets can simply add more disks. But everyone else faces tough choices: shorten retention periods? Lower video resolution? Reduce the number of cameras or frames per second? None of these support the goals for which the video surveillance system was installed in the first place.
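A retention estimate in the spirit of the Seagate calculator can be sketched from camera count, recording hours, and bitrate. The 8 Mbps per-camera bitrate below is an assumption; actual figures depend heavily on codec, compression settings, and scene complexity, which is why the calculator cited above arrives at a much larger total:

```python
# Rough video-retention sizing. The per-camera bitrate is an assumption;
# real figures vary widely with codec and compression settings.
def retention_tb(cameras, hours_per_day, days, mbps_per_camera):
    seconds = hours_per_day * 3600 * days
    total_bits = cameras * seconds * mbps_per_camera * 1_000_000
    return total_bits / 8 / 1e12  # decimal terabytes

# 100 cameras, 8 h/day, 90-day retention at an assumed 8 Mbps each:
print(round(retention_tb(100, 8, 90, 8), 1))  # 259.2 TB
```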


Read More

Tape Storage vs. Disk Storage: Getting the Facts Straight about Total Cost of Ownership Calculations

Reading Time: 3 minutes

Modern tape storage has long been recognized for its low cost. Several analyst white papers have been published that demonstrate the low cost of storing data on tape. For example, “Quantifying the Economic Benefits of LTO-8 Technology” is a white paper that can be found on the LTO.org website. However, occasionally a storage solution provider publishes a white paper that claims to show that their solution is less expensive than tape storage for a particular use case. A good example is a recent white paper published by a disk-based backup-as-a-service provider who will remain unidentified out of respect for what they do. For the purpose of this blog, let’s call them “BaaS.” So let’s dig into their analysis, which makes several assumptions that result in higher costs for tape storage than most users would experience.

Total Cost of Ownership (TCO) Process

The first step in developing a Total Cost of Ownership (TCO) estimate is determining the amount of data to be stored. The BaaS whitepaper separates the amount of primary data, which we wish to protect, from backup data, which is the data physically stored on the backup media. They estimate the amount of backup data residing in the tape library to be two to four times the primary data, due to their use of the old daily/weekly/monthly full backup methodology for estimating the amount of backup data. The result is that two to four times the amount of primary data ends up being stored on tape, raising the tape hardware and media costs by the same factor.
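The effect of that multiplier on media counts is easy to illustrate. The sketch below assumes LTO-8 native cartridge capacity of 12 TB with no compression; the 1 PB primary-data figure is an arbitrary example:

```python
import math

# How a backup-data multiplier inflates tape media requirements.
# Assumes 12 TB native LTO-8 cartridges and no compression.
def cartridges_needed(primary_tb, backup_multiplier, cartridge_tb=12.0):
    return math.ceil(primary_tb * backup_multiplier / cartridge_tb)

print(cartridges_needed(1_000, 1))  # 84 cartridges for 1 PB, no multiplier
print(cartridges_needed(1_000, 4))  # 334 cartridges at the 4x assumption
```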


Read More

Tiered Storage: Building the Optimal Storage Infrastructure

Reading Time: < 1 minute


Fortunately, as data continues to grow exponentially, the selection of data storage technologies has never been more robust. Choosing which storage device to use for which application at a given point in time is a balancing act, trading off frequency of access (performance), cost, and capacity. Storage tiering has become a key strategy that lets you optimize the use of storage resources, save costs, and make the best use of storage technology for each data classification. The foundations of tiered storage were laid over 30 years ago, when disk, automated tape libraries, and advanced policy-based data management software such as hierarchical storage management (HSM) combined to effectively migrate less-active data to less expensive storage devices.

Tiered storage integrates hardware and storage management software to provide seamless operation and let customers realize the huge TCO and ROI benefits available from optimized storage implementations. The business case for implementing tiered storage is compelling and becomes increasingly so as storage pools get larger. Today’s storage tiers offer several technologies, ranging from ultra-high-capacity, low-cost storage at one end of the hierarchy to very high levels of performance and functionality at the other. The non-stop growth of data will require the continual evolution of new, more advanced approaches to tiered storage and management capabilities.

For more information, check out this Horison Information Strategies White Paper “Tiered Storage: Building the Optimal Storage Infrastructure.”

Read More

THE ASCENT TO HYPERSCALE – Part 3

Reading Time: 2 minutes

Part 3: THE VALUE OF TAPE RISES RAPIDLY AS HYPERSCALE DATA CENTERS GROW

In Part 2 of this series, we looked at some of the key characteristics of hyperscale data centers. Now, we’ll explore how tape plays a role.

Today, HSDCs are leveraging the many advantages of tape technology to manage massive data growth and long-term retention challenges. Keep in mind that most digital data doesn’t need to be immediately accessible and can optimally, and indefinitely, reside on tape subsystems. Some data requires secure, long-term storage for regulatory reasons or for the potential value it can provide through content analysis at a later date. Advanced tape architectures allow HSDCs to achieve business objectives by providing data protection for critical assets, backup, recovery, archive, easy capacity scaling, the lowest TCO, the highest reliability, the fastest throughput, and cybersecurity protection via the air gap. These benefits are expected to increase for tape in the future.

Fighting the cybercrime epidemic has become a major problem for most data centers, and HSDCs are no exception. Tape can play a key role in prevention, offering WORM (Write-Once-Read-Many) and encryption capabilities that provide a secure storage medium for compliance, legal, and other valuable files. Tape as an “air gap” solution has gained momentum, providing an electronically disconnected copy of data that keeps it out of reach of a cyberattack. Disk systems that remain online 24×7 are the primary target, as they are always vulnerable to attack.

HSDCs are taking advantage of tiered storage by integrating high-performance SSDs, HDD arrays, and automated tape libraries. Even though HSDCs are struggling with the exploding growth of disk farms, which devour IT budgets and overcrowd data centers, many continue to maintain expensive disks that are often half full of data with little or no activity for several years. Obviously, few data centers can afford to sustain this degree of inefficiency. The greatest benefits of tiered storage are achieved when tape is used, as its scalability, lower price, and lower TCO play an increasing role as the size of the storage environment increases. For the hyperscale world, “adding disk is tactical – adding tape is strategic.”

For more information on this topic, check out our white paper: The Ascent to Hyperscale.


Read More

LTO-8 Delivers!

Reading Time: 2 minutes

By Rich Gadomski

March 19, 2020

As LTO-8 drives and media are increasingly deployed and widely available, the value proposition of LTO-8 is being confirmed by customers and it’s a pretty impressive story.

In the case of a major high-performance computing (HPC) customer who had been using LTO-6 for their archive, the jump to LTO-8 has done wonders for their available capacity. With approximately 7,000 slots in their library, fully loading with LTO-6 media at 2.5 TB each yielded a total native storage capacity of 17.5 PB. Migrating to LTO-8 drives and eventually converting those slots to LTO-8 media at 12.0 TB each gives them up to a massive 84 PB, almost a 5x increase. That’s lots of room to scale as needed!
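The capacity jump works out as follows, using the native (uncompressed) cartridge capacities from the text:

```python
# Library capacity at native cartridge capacities: 2.5 TB (LTO-6)
# vs. 12 TB (LTO-8), for the ~7,000-slot library described above.
SLOTS = 7_000

def library_pb(slots, cartridge_tb):
    return slots * cartridge_tb / 1_000  # native petabytes

lto6 = library_pb(SLOTS, 2.5)   # 17.5 PB
lto8 = library_pb(SLOTS, 12.0)  # 84.0 PB
print(f"{lto8 / lto6:.1f}x increase")  # 4.8x
```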

Performance also gets a big boost, as LTO-6 drives are rated at a 160 MB per second native transfer rate compared to 360 MB per second for LTO-8 drives. This means fewer drives are required to meet the same performance objectives. As a result, TCO also improves, as fewer drives, fewer pieces of media, and no additional floor space or library frames are required to manage the same amount of data.
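The drive-count saving follows directly from the rated transfer speeds. In the sketch below, the 1,440 MB/s aggregate target is an arbitrary example workload, not a figure from the customer story:

```python
import math

# Drives needed to sustain a target aggregate throughput, using the
# native rates above (LTO-6: 160 MB/s, LTO-8: 360 MB/s). The 1,440 MB/s
# target is an assumed example workload.
def drives_needed(target_mb_s, drive_mb_s):
    return math.ceil(target_mb_s / drive_mb_s)

print(drives_needed(1_440, 160))  # 9 LTO-6 drives
print(drives_needed(1_440, 360))  # 4 LTO-8 drives
```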


Read More

When it Comes to Data Storage Spending, Knowing Your Total Cost of Ownership is Key

Reading Time: < 1 minute

Ever wonder if you are getting the best deal on your data storage? Understanding the total cost of ownership (TCO) is critically important to any data storage purchase decision.

Today we introduced our new TCO Calculator, an updated version of our online tool that helps IT professionals assess and compare TCO for automated tape storage, disk-based storage, and cloud-based archive storage. The new TCO Calculator raises the maximum user storage baseline from 10PB to 100PB, integrating the IBM TS4500 enterprise library using LTO-8 drives and media for initial capacities over 10PB. Amazon S3 Glacier Deep Archive and bulk retrieval service is now also included in cloud storage cost comparisons.

After entering data into the TCO Calculator, users can download a customizable results report which includes an executive summary, key cost assumptions, and TCO by cost category and type (e.g., energy costs, offsite costs, service fees, labor, bandwidth, etc.).

Find out how you can start saving on your data storage costs now. Access the free TCO Calculator here.
Read More

Storage Switzerland Video: Considering the Total, Rather than Upfront, Cost of Backup Storage Infrastructure

Reading Time: < 1 minute

In a recent Storage Switzerland blog, Lead Analyst George Crump talks about how, because IT is perpetually working to lower both capital and operating expenses associated with backup storage infrastructure, backup workloads are common targets for migration to the cloud. However, this is not necessarily the most effective strategy for optimizing cost efficiencies.

In this video, he talks with IT consultant Brad Johns about why IT organizations should holistically evaluate the total cost of ownership (TCO) of their backup storage infrastructure, as opposed to focusing solely on immediate costs such as upfront infrastructure acquisition.

Check out George’s blog for more details:

Considering the Total, Rather than Upfront, Cost of Backup Storage Infrastructure

Read More

LET’S DISCUSS YOUR NEEDS

We can help you reduce cost, decrease vendor lock-in, and increase productivity of storage staff while ensuring accessibility and longevity of data.

Contact Us >