The Tape Storage Council (TSC) released a new report, "Tape to Play Critical Roles as the Zettabyte Era Takes Off," which highlights the current trends, usage, and technology innovations occurring within the tape storage industry. The zettabyte era is in full swing, generating unprecedented capacity demand as many businesses move closer to exascale storage requirements.
According to the LTO Program, 148 exabytes (EB) of total tape capacity (compressed) shipped in 2021, marking an impressive record year. With a growth rate of roughly 40% over 2020, this strong shipment performance follows the previous record-breaking 110 EB shipped in 2019 and the 105 EB shipped in the pandemic-affected year of 2020.
The ever-increasing thirst for IT services has pushed energy usage, carbon emissions, and the storage industry's growing impact on global climate change to center stage. In addition, ransomware and cybercrime protection requirements are driving increased focus on air gap protection measures.
As a result of these trends, among others, the TSC expects tape to play an even broader role in the IT ecosystem going forward as the number of exabyte-sized environments grows. Key trends include:
Data-intensive applications and workflows fuel new tape growth.
Data accessibility: improved tape performance delivers better access times and throughput.
Tape should be included in every green data center strategy.
Storage optimization receives a big boost from an active archive which provides dynamic optimization and fast data access for archival storage systems.
Organizations continue to invest in LTO tape technology thanks to its high capacity, reliability, low cost, low power consumption and strong data protection features, especially as threats to cybersecurity soar.
The LTO Technology Provider Companies (IBM, HPE, and Quantum) issued a press release earlier this week announcing record capacity shipments for LTO in 2021 of 148 exabytes (148,000 petabytes) compressed, up 40% from 105 EB compressed in 2020. More and more of the world's data is being stored on LTO data tape. That's good news for the IT industry, is it not? After all, end users and service providers need:
A strategic way to store and protect massive amounts of increasingly valuable data, especially data that’s gone cool or cold
A cost-effective and reliable long term storage solution
An air gap defense against cybercrime
An eco-friendly form of storage!
Industry Pundits React
Some industry pundits, biased toward the HDD industry, took the opportunity to downplay the news. They said the data is inaccurate or insignificant compared to the capacity shipments for HDD last year. Really? Does tape technology threaten the market for HDD? Is it still about tape vs. disk in their minds? Have trains, trucks, and ships put air freight out of business? Or does a more strategic thought process say: "These technologies complement each other. We need both to meet the needs of end-users, storage service providers, and society itself…"
Analysts Predict Huge Zettabyte Demand
Indeed, if the big industry analyst firms are right, we will need to store more than 11 zettabytes of data in 2025. Just one zettabyte would require 55 million 18 TB HDDs or 55 million LTO-9 tape cartridges. Should we store all of that data on HDD, whether it is hot, warm, cool, or cold? Of course, we can't just delete excess data. Now that we can analyze data and derive a competitive advantage from it, the value of data has increased, and we need to store more of it for longer periods of time. As a result, projections for the amount of persistent data to be stored are growing exponentially. We will need huge amounts of flash, HDD, tape, and even future storage solutions like DNA to address the data storage challenge.
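The capacity arithmetic behind these figures is easy to check. A minimal sketch using decimal units (1 ZB = 10^9 TB):

```python
# Back-of-envelope check of the article's figures: how many 18 TB
# devices (HDDs or LTO-9 cartridges, both 18 TB native) hold 1 ZB?
ZB_IN_TB = 1_000_000_000        # 1 zettabyte = 10^9 terabytes (decimal units)
DEVICE_TB = 18                  # 18 TB per HDD or LTO-9 cartridge (native)

devices_per_zb = ZB_IN_TB / DEVICE_TB
print(f"{devices_per_zb / 1e6:.1f} million devices per zettabyte")
# ~55.6 million, matching the article's "55 million" figure

# Scaling to the 11 ZB the analysts project for 2025:
print(f"{11 * devices_per_zb / 1e6:.0f} million devices for 11 ZB")
```

Even at 18 TB per device, a single zettabyte is tens of millions of drives or cartridges, which is why a mix of media types is unavoidable.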
A Strategic Approach to Data Storage
The key to success will be a strategic approach that leverages intelligent data management software to automate data movement to the right tier of storage at the right time, at the right cost, and with the right energy profile. Employing such an approach to reduce costs and energy consumption while maintaining service level agreements simply makes sense. Take a good look at an active archive solution, for example. Yet again, there are industry pundits who say the amount of energy saved by moving static, inactive, and infrequently accessed data to a tape tier is insignificant in the big picture of the IT industry. The real problem, they say, is the amount of energy consumed by a single Google search. But isn't that like saying, "Don't bother turning the lights out before leaving the office for the night. It's just a drop in the ocean of energy consumption," or "Why bother turning off the engine of your car when filling up on gas? It's just a few minutes of idle time and won't really impact CO2 emissions at all." Right?
Change of Attitude Needed
But this is the wrong attitude, and it is exactly what has to change to make a difference. If we all switch off a light and all turn off the car's engine, collectively we will make a difference. We might even get motivated for more change! How about installing LED light bulbs or investing in an electric vehicle? Or maybe making the commitment to, and taking the lead on, a renewable energy installation? Attitudes have to change; we have to believe we can make a difference collectively. If data is inactive, why keep it on energy-intensive, constantly spinning disk? Are we all doing whatever it takes to make a difference?
New Flagship UN Report Is a Wake-up Call
If we believe the latest studies on climate change coming out of the United Nations, we need to start taking whatever action we can, quickly. A new UN report on climate change from earlier this month indicated that harmful carbon emissions in the last decade were the highest in human history, proof that the world is on a "fast track" to climate disaster. UN Secretary-General Antonio Guterres has warned that it's "now or never" to limit global warming to 1.5 degrees C. Climate change is the result of more than a century of unsustainable energy and land use, lifestyles, and patterns of consumption and production. Guterres adds, "This is not fiction or exaggeration. It is what science tells us will result from our current energy policies. We are on a pathway to global warming of more than double the 1.5-degree C limit" agreed in Paris in 2015. To limit warming to around 1.5 C (2.7 F), the IPCC report insists that global greenhouse gas emissions must peak "before 2025 at the latest, and be reduced by 43% by 2030."
Reducing Energy Consumption and CO2 Emissions with Tape
To help increase awareness and understanding of energy consumption in data storage, a number of whitepapers have been published highlighting alternative storage options, including LTO data tape. A recent IDC whitepaper studied migrating cold data from HDDs to LTO tape. The opportunity to positively impact the environment by shifting to tape is staggering: this strategic approach can reduce storage-related CO2 emissions by, coincidentally, 43.7% by 2030, avoiding 664 million metric tons of CO2 cumulatively. That's the equivalent amount of CO2 produced by 144 million passenger cars driven over the course of a year!
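The cars equivalence can be sanity-checked against the U.S. EPA's commonly cited estimate of roughly 4.6 metric tons of CO2 per typical passenger car per year (the per-car figure is an outside reference point, not taken from the IDC paper):

```python
# Sanity-check of the "144 million cars" equivalence using the EPA's
# widely cited ~4.6 t CO2 per passenger car per year (assumed here).
CO2_AVOIDED_T = 664_000_000       # 664 million metric tons, cumulative (per IDC)
CO2_PER_CAR_T = 4.6               # EPA estimate, metric tons CO2 per car-year

car_years = CO2_AVOIDED_T / CO2_PER_CAR_T
print(f"{car_years / 1e6:.0f} million car-years of emissions")  # ~144 million
```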
Other research shows that tape consumes 87% less energy than equivalent amounts of HDD storage. When CO2 emissions are analyzed over the entire product lifecycle (from raw materials to production to distribution, usage, and disposal) of HDD and tape, studies show a 95% reduction in CO2 in favor of tape compared to HDD. The same study shows Total Cost of Ownership for long-term data storage can be reduced by more than 70% by using tape instead of HDD. At the same time, tape can provide an effective defense against cybercrime via a physical air gap. All of this is possible by taking a strategic storage approach, where cool or cold data that has aged and is infrequently accessed gets moved from expensive primary storage to economical and environmentally friendly tape systems, online or offline.
Data Center World Attendees Get It
In my last blog, about my visit to and presentation at Data Center World in Austin last month, I mentioned that I was encouraged by the DCW attendees I met and listened to in my session and others. They are genuinely concerned about the environment and worried about what kind of planet we will leave behind for our kids and grandchildren. They recognize the opportunity to improve sustainability in data center operations and are committed to it. But since then, it has occurred to me that sustainability may be more of a focus for facility teams; perhaps the top-down pressure from the C-suite has yet to be widely applied to data storage management teams. However, in the quest to achieve the needed sustainability goals, no stone can remain unturned.
Observing Earth Day for Future Generations
With Earth Day being observed today, let's commit to strategically taking action in response to global warming and climate change. Let's start changing attitudes from "It won't make a difference" to "Collectively, we can make a difference." Let's look at the bright side of increasing LTO capacity shipments instead of the dark, self-serving side. Let's think about the planet that's home for us and the future generations of our families to come.
The Arrival of the Zettabyte Era
The data storage market has clearly entered the "zettabyte era," with new capacity shipments exceeding a massive one zettabyte for a couple of years now. Storage requirements are being driven by the phenomenon of "digital transformation" and the rising value of data, which needs to be stored for longer periods of time and, in some cases, indefinitely. Further accelerating the zettabyte era is the other era we are all in, the "pandemic era," which brought the unanticipated need to support a remote workforce and the ever-expanding internet with its proliferation of online apps.
Pandemic-Related Supply Shortages
The pandemic has brought with it disruptions to the global supply chain, including shortages of semiconductor chips. It's been tough to get modern goods, from toys to notebooks to refrigerators to automobiles. The combination of the zettabyte and pandemic eras has even strained supply chains and the availability of the SSDs and HDDs needed to support digital transformation, causing prices to fluctuate with quarterly supply and demand swings.
Supply Chain Challenges Persist
While pandemic-related labor shortages have delayed the production and distribution of goods, other factors are making matters worse. How about global warming, climate change, and the ensuing natural disasters that have had negative impacts on the supply chain? How about international rivalries and tensions impacting the availability of key components? Or cybercriminals shutting down vital infrastructure? Bottom line: industry pundits say we can expect supply chain hassles to continue throughout 2022.
Supply Chain Contingency Planning in Data Storage
Faced with supply chain risks in any industry, it's always good to have contingency plans to mitigate risk and ensure ongoing operations. The IT industry is no exception, where the availability of commodities that we may take for granted can be interrupted by any of the factors listed above, from unforeseen demand to pandemic-related shortages to global warming, trade wars, and cybercrime.
A great way to avoid supply chain disruptions in the availability of primary storage devices like SSDs and HDDs is to employ intelligent data management software, typical of active archive solutions, that automates the migration of data from these potentially supply-chain-affected devices to a modern, automated tape library. Since 60 to 80 percent of data goes cold after a short period of time, why keep it stored on higher-performing, expensive, and energy-intensive devices? Given the global supply chain uncertainty, three good reasons to migrate data from primary storage devices to tape storage are:
Free up capacity on expensive Tier 1 and Tier 2 storage devices like SSDs and HDDs in favor of TCO friendly tape systems
Reduce energy consumption and related CO2 emissions by leveraging the low power profile of automated tape systems
Take advantage of tape’s natural air gap security in the never-ending war against ransomware
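The kind of policy such data management software automates can be sketched in a few lines. Everything below is illustrative: the file paths, the 90-day threshold, and the selection function are hypothetical, not any vendor's API.

```python
# Illustrative sketch (not a real product API) of an age-based tiering
# policy of the kind active archive software automates.
from datetime import datetime, timedelta

COLD_AFTER = timedelta(days=90)   # access typically drops off after 30/60/90 days

def select_for_tape(files, now=None):
    """Return files whose last access is older than the cold threshold."""
    now = now or datetime.now()
    return [f for f in files if now - f["last_access"] > COLD_AFTER]

# Hypothetical inventory: two cold files, one actively used
inventory = [
    {"path": "/archive/scan001.tif", "last_access": datetime(2021, 1, 5)},
    {"path": "/archive/scan002.tif", "last_access": datetime(2021, 2, 9)},
    {"path": "/projects/active.db",  "last_access": datetime.now()},
]
for f in select_for_tape(inventory):
    print("migrate to tape tier:", f["path"])
```

Real solutions add policies per data class, verify copies before freeing disk capacity, and keep a catalog so migrated objects remain addressable online.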
The above makes sense even in the absence of supply chain concerns. With data to be stored growing at a CAGR of around 30% while IT budgets grow in the low single digits, the IT industry needs a more cost-effective storage solution. Given the increasing value of data and indefinite retention periods, the long-term archival profile of tape, coupled with best-in-class reliability, makes it a compelling choice.
Fighting Climate Change and Cybercrime
Finally, we all have to engage in the battle against global warming and climate change if we are to preserve the planet that we inhabit. Studies show that tape systems consume 87% less energy than equivalent amounts of disk storage and produce 95% less CO2 emissions than disk over the product lifecycle. Other studies show that collectively, the global IT industry could avoid as much as 664 million metric tons of CO2 emissions by strategically moving more data to tape systems. As data cools off or goes cold, it should migrate to less expensive, less energy-intensive, and more secure tiers of storage.
Once the pandemic era finally subsides, it will be environmental calamities brought on by climate change and the relentless threat of cybercriminals that will have long-term impacts on supply chains.
The newly released whitepaper from IT analyst firm ESG (Enterprise Strategy Group), sponsored by IBM and Fujifilm, entitled "How Tape Technology Delivers Value in Modern Data-driven Businesses," focuses on exciting new advances in tape technology that now position tape for a critical role in effective data protection and retention in the age of zettabyte (ZB) storage. That's right: "zettabyte storage!"
The whitepaper cites the need to store 17 ZB of persistent data by 2025. This includes “cold data” stored long-term and rarely accessed that is estimated to account for 80% of all data stored today. Just one ZB is a tremendous amount of data equal to one million petabytes that would need 55 million 18 TB hard drives or 55 million 18 TB LTO-9 tapes to store. Just like the crew in the movie Jaws needed a bigger boat, the IT industry is going to need higher capacity SSDs, HDDs, and higher density tape cartridges! On the tape front, help is on the way as demonstrated by IBM and Fujifilm in the form of a potential 580 TB capacity tape cartridge. Additional highlights from ESG’s whitepaper are below.
New Tape Technology
IBM and Fujifilm set a new areal density record of 317 Gb/sq. inch on linear magnetic tape, translating to a potential cartridge capacity of 580 TB native. The media features a new magnetic particle called Strontium Ferrite (SrFe), capable of delivering capacities that extend well beyond disk, LTO, and enterprise tape roadmaps. SrFe magnetic particles are 60% smaller than the current de facto standard Barium Ferrite particles yet exhibit even better magnetic signal strength and archival life. On the hardware front, the IBM team has developed tape head enhancements and servo technologies that leverage even narrower data tracks to contribute to the increase in capacity.
The Case for Tape at Hyperscalers and Others
Hyperscale data centers are major new consumers of tape technologies due to their need to manage massive data volumes while controlling costs. Tape is allowing hyperscalers including cloud service providers to achieve business objectives by providing data protection for critical assets, archival capabilities, easy capacity scaling, the lowest TCO, high reliability, fast throughput, low power consumption, and air gap protection. But tape also makes sense for small to large enterprise data centers facing the same data growth challenges including the need to scale their environments while keeping their costs down.
Data Protection, Archive, Resiliency, Intelligent Data Management
According to an ESG survey revealed in the whitepaper, tape users identified reliability, cybersecurity, long archival life, low cost, efficiency, flexibility, and capacity as top attributes of tape usage today and favor tape for its long-term value. Data is growing relentlessly, with longer retention periods, as the value of data increases thanks to the ability to apply advanced analytics and derive a competitive advantage. Data is often kept longer to meet compliance, regulatory, and corporate governance requirements. Tape is also playing a role in cybercrime prevention with WORM, encryption, and air gap capabilities. Intelligent data management software, typical of today's active archive environments, automatically moves data from expensive, energy-intensive tiers of storage to more economical and energy-efficient tiers based on user-defined policies.
ESG concludes that tape is the strategic answer to the many challenges facing data storage managers including the growing amount of data as well as TCO, cybersecurity, scalability, reliability, energy efficiency, and more. IBM and Fujifilm’s technology demonstration ensures the continuing role of tape as data requirements grow in the future and higher capacity media is required for cost control with the benefit of CO2 reductions among others. Tape is a powerful solution for organizations that adopt it now!
As recently announced by Fujifilm, LTO-9 has arrived and is available for immediate delivery. It comes at a time when the IT industry is challenged to manage rampant data growth, control costs, reduce carbon footprint, and fight off cyber-attacks. LTO-9 is coming to market just in time to meet these challenges with the right features: capacity, low cost, energy efficiency, and cybersecurity.
What a Great Run for LTO
First of all, it is remarkable to look at how far LTO Ultrium technology has come since its introduction. LTO made its market debut in 2000 with the first-generation LTO-1 at 100/200 GB native/compressed capacity on 384 data tracks. Transfer rate was just 20 MB/sec native and 40 MB/sec compressed. Fast forward 21 years to the availability of LTO-9, now with 18/45 TB native/compressed capacity on 8,960 data tracks and a transfer rate of 400 MB/sec native, 1,000 MB/sec compressed. In terms of compressed capacity, that's a 225X increase compared to LTO-1. Since 2000, Fujifilm alone has manufactured and sold over 170 million LTO tape cartridges, a pretty good run indeed.
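Those generation-over-generation gains can be expressed as simple ratios, using only the figures quoted above:

```python
# LTO-1 (2000) vs. LTO-9 (2021) specs as quoted in the text
lto1 = {"native_gb": 100,    "compressed_gb": 200,    "tracks": 384,   "mb_s": 20}
lto9 = {"native_gb": 18_000, "compressed_gb": 45_000, "tracks": 8_960, "mb_s": 400}

print("compressed capacity:", lto9["compressed_gb"] / lto1["compressed_gb"], "x")  # 225x
print("native capacity:   ", lto9["native_gb"] / lto1["native_gb"], "x")           # 180x
print("data tracks:       ", round(lto9["tracks"] / lto1["tracks"], 1), "x")       # ~23.3x
print("native throughput: ", lto9["mb_s"] / lto1["mb_s"], "x")                     # 20x
```

The 225X figure in the text refers to compressed capacity; native capacity grew 180X over the same nine generations.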
Capacity to Absorb Bloated Data Sets
We are firmly in the zettabyte age now, and it's no secret that data is growing faster than most organizations can handle. With compound annual data growth rates of 30 to 60% for most organizations, keeping data protected for the long term is increasingly challenging. Just delete it, you say? That's not an option, as the value of data is increasing rapidly thanks to the many analytics tools we now have to derive value from it. If we can derive value from data, even older data sets, then we want to keep it indefinitely. But this data can't economically reside on Tier 1 or Tier 2 storage. Ideally, it will move to a Tier 3 tape archive or active archive, where online access can be maintained. LTO-9 is perfect for this application thanks to its large capacity (18 TB native, 45 TB compressed) and high data transfer rate (400 MB/sec native, 1,000 MB/sec compressed).
Lowest TCO to Help Control Costs
Understanding your true total cost of ownership is of vital importance today as exponential data growth continues unabated. The days of just throwing more disk at storage capacity issues without any concern for cost are long gone. In fact, studies show that IT budgets on average are growing at less than 2.0% annually yet data growth is in the range of 30% to 60%. That's a major disconnect! When compared to disk or cloud options, automated tape systems have the lowest TCO profile even for relatively low volumes of data less than one petabyte. And for larger workloads, the TCO is even more compelling. Thanks to LTO-9's higher capacity and fast transfer rate, the efficiency of automated tape systems will improve keeping the TCO advantage firmly on tape's side.
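The disconnect compounds quickly. A small sketch, assuming 2% annual budget growth against the conservative 30% end of the data-growth range:

```python
# How far apart do budget and data get after five years of compounding?
years = 5
budget_growth = 0.02       # < 2% annual IT budget growth (per the cited studies)
data_growth = 0.30         # conservative end of the 30-60% data CAGR range

budget_factor = (1 + budget_growth) ** years
data_factor = (1 + data_growth) ** years
print(f"budget grows {budget_factor:.2f}x, data grows {data_factor:.2f}x")
print(f"cost per stored TB must fall ~{(1 - budget_factor / data_factor) * 100:.0f}%"
      " just to keep spending flat relative to data")
```

After five years the budget is up about 10% while the data has nearly quadrupled, which is the arithmetic behind the push toward lower-cost tiers.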
Lowest Energy Profile to Reduce Carbon Footprint
Perhaps of even greater concern these days are the environmental impacts of energy-intensive IT operations and their negative effect on global warming and climate change. You may have thought 2020 was a pretty bad year, tied with 2016 for the hottest year on record. Remember the raging forest fires out West, or the frequency of hurricanes and tropical storms? Well, it turns out 2021 is just as bad, if not worse, with the Caldor Fire and Hurricane Ida fresh in our memory.
Tape technology has a major advantage in energy consumption: tape systems require no energy unless tapes are being read or written in a tape drive. Tapes idle in a library slot or vaulted offsite consume no energy at all. As a result, tape's CO2 footprint is significantly lower than that of always-on disk systems, constantly spinning and generating heat that must be cooled. Studies show that tape systems consume 87% less energy, and therefore produce 87% less CO2, than equivalent amounts of disk storage in the usage phase. More recent studies show that over the total life cycle, from raw materials and manufacturing to distribution, usage, and disposal, tape produces 95% less CO2 than disk. When you consider that 60% to 80% of data quickly goes cold, with access frequency dropping off after just 30, 60, or 90 days, it only makes sense to move that data from expensive, energy-intensive tiers of storage to inexpensive, energy-efficient tiers like tape. The energy profile of tape only improves with higher-capacity generations such as LTO-9.
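As a rough illustration of what the 87% figure can mean in practice, here is a back-of-envelope estimate for a hypothetical 10 PB disk estate; the per-TB wattage is an assumed round number for nearline HDD storage, not a measured value:

```python
# Rough, illustrative estimate: annual energy saved by moving the cold
# share of a disk estate to tape, using the article's "87% less energy".
estate_tb = 10_000            # hypothetical 10 PB disk estate
cold_fraction = 0.70          # 60-80% of data goes cold; take the midpoint
hdd_watts_per_tb = 0.8        # assumed average draw for nearline HDD (not measured)
tape_saving = 0.87            # tape uses 87% less energy than disk

cold_tb = estate_tb * cold_fraction
kwh_saved = cold_tb * hdd_watts_per_tb * tape_saving * 8760 / 1000  # 8,760 h/year
print(f"~{kwh_saved:,.0f} kWh saved per year")
```

Under these assumptions the savings land in the tens of thousands of kWh per year for a single 10 PB estate; the exact number depends entirely on the real power draw of the disk systems being displaced.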
A Last Line of Defense Against Cybercrime
Once again, 2021 is just as bad if not worse than 2020 when it comes to cybercrime and ransomware attacks. Every webinar you attend on this subject will say something to the effect of: "it's not a question of if; it's a question of when you will become the next ransomware victim." The advice from the FBI is pretty clear: "Backup your data, system images, and configurations, test your backups, and keep backups offline."
This is where the tape air gap plays an increasingly important role. Tape cartridges have always been designed to be easily removable and portable in support of any disaster recovery scenario. Thanks to the low total cost of ownership of today’s high-capacity automated tape systems, keeping a copy of mission-critical data offline, and preferably offsite, is economically feasible – especially considering the prevalence of ransomware attacks and the associated costs of recovery, ransom payments, lost revenue, profit, and fines.
In the event of a breach, organizations can retrieve a backup copy from tape systems, verify that it is free from ransomware and effectively recover. The high capacity of LTO-9 makes this process even more efficient, with fewer pieces of media moving to and from secure offsite locations.
The Strategic Choice for a Transforming World
LTO-9 is the "strategic" choice for organizations because using tape to address long-term data growth and volume is strategic, while adding disk is simply a short-term tactical measure. It's easy to just throw more disks at the problem of data growth, but if you are being strategic about it, you invest in a long-term tape solution.
The world is "transforming" amid the COVID pandemic: everyone has to do more with less, budgets are tight, digital transformation has accelerated, and we are now firmly in the zettabyte age, which means more data to manage efficiently, cost-effectively, and in an environmentally friendly way. The world is also transforming as new threats like cybercrime become a fact of life, not just a rare occurrence that happens to someone else. In this respect, LTO-9 indeed comes to market at the right time with the right features to meet all of these challenges.
There is increasing pressure around the world to reduce emissions and lower mankind’s carbon footprint. It is up to the IT sector to do its part, and that means considerably lowering power usage. But that is easier said than done when you consider the statistics.
IDC predicts we will arrive at the mind-boggling figure of 175 zettabytes of data in the digital universe within 4 years. 175 ZB? Consider how long it takes most users to fill a one TB drive. Well, 175 ZB equates to approximately 175 billion TB drives.
The problem is this: how do you reduce IT’s overall power draw in the face of a massive and continual upsurge in data storage? Once 175 ZB of data exists, there is no possibility of containing electrical usage if the vast majority of storage is sitting on hard disk drives (HDDs). The only solution is to cure the industry’s addiction to disk.
Here are the numbers. Data centers alone account for close to 2% of all power consumed in the U.S., about 73 billion kilowatt-hours (kWh) in 2020. That is enough to set off alarm bells. Yet tremendous progress has been made over the past two decades in data center efficiency. When power consumption in data centers soared by 90% between 2000 and 2005, the industry acted forcefully. The rate of growth slowed to 24% between 2005 and 2010 and then fell to less than 5% for the entire decade between 2010 and 2020. That's miraculous when you consider it was achieved during the largest surge in storage growth in history. Smartphones, streaming video, texting, multi-core processors, analytics, the Internet of Things (IoT), cloud storage, big data, and other IT innovations demanded the retention of more and more data.
Big strides were made in Power Usage Effectiveness (PUE, the ratio of total facility power consumption to the power consumed by IT equipment). Data centers have largely done a good job of improving the efficiency of their operations. But the one area lagging badly behind is storage efficiency.
In mid-December 2020, Fujifilm issued a press release announcing that, together with IBM Research, it had achieved a record areal density of 317 Gbpsi (billion bits per square inch) on next-generation magnetic tape coated with next-generation Strontium Ferrite (SrFe) magnetic particles. This areal density achievement would yield an amazing native storage capacity of 580 TB on a standard-sized data cartridge. That's almost 50 times more capacity than what we have now with an LTO-8 tape based on Barium Ferrite (BaFe) at 12 TB native.
Shortly after the news came out, I was on a call with a member of our sales team discussing the announcement. He asked me when the 580 TB cartridge would be available and whether there was any pricing information yet, and he was also curious about transfer speed performance. I had to admit that those details are still TBD, so he asked me, "What are the three big takeaways from the release?" So let's dive into what those takeaways are.
Tape has no fundamental technology roadblocks
To understand the magnitude of tape areal density reaching 317 Gbpsi, we have to understand just how small that is in comparison to HDD technology. Current HDD areal density is already at or above 1,000 Gbpsi while achieving 16 TB to 20 TB per drive on as many as nine disk platters. This level of areal density is approaching what is known as the "superparamagnetic limit," where the magnetic particle is so small that its magnetization starts to flip spontaneously between orientations. Not ideal for long-term data preservation.
To address this, HDD manufacturers have employed technologies like helium-filled drives, which permit closer spacing between disk platters and thus make room for more platters and more capacity. HDD manufacturers are also increasing capacity with new recording techniques involving heat (HAMR), microwaves (MAMR), and others. As a result, HDD capacities are expected to reach up to 50 TB within the next five years or so. The reason tape can potentially reach dramatically higher capacities is that a tape cartridge contains over 1,000 meters of half-inch-wide tape and therefore has far greater surface area than a stack of even eight or nine 3.5-inch disk platters.
But let's also look at track density in addition to areal density. Think about the diameter of a single strand of human hair, which is typically 100 microns wide. If a single data track on HDD is 50 nanometers wide, you are looking at 2,000 data tracks for HDD on the equivalent width of a single strand of human hair! For tape, with a track width of approximately 1,200 nanometers, you are looking at just 83 data tracks. But this is actually a positive for tape technology because it shows that tape has a lot of headroom in both areal density and track density, and that will lead to higher capacities and help maintain tape's low TCO.
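A back-of-envelope comparison makes both points concrete. The tape length and width come from the text; the platter recording-band diameters below are nominal assumptions for a 3.5-inch drive, not manufacturer figures:

```python
# Why tape has capacity headroom despite far lower areal density: surface area.
import math

# Tape: ~1,000 m of half-inch (12.7 mm) wide media per cartridge
tape_area_m2 = 1000 * 0.0127                      # ~12.7 m^2

# HDD: nine 3.5-inch platters, two recordable sides each.
# Assumed recording band: ~95 mm outer, ~25 mm inner diameter (nominal).
side_m2 = math.pi * ((0.095 / 2) ** 2 - (0.025 / 2) ** 2)
hdd_area_m2 = side_m2 * 2 * 9                     # ~0.12 m^2

print(f"tape/HDD surface area ratio: {tape_area_m2 / hdd_area_m2:.0f}x")

# Track density on the width of a 100-micron human hair:
hair_nm = 100_000
print("HDD tracks per hair width:", hair_nm // 50)     # 2,000
print("tape tracks per hair width:", hair_nm // 1200)  # 83
```

Under these assumptions a cartridge offers on the order of a hundred times the recording surface of a fully loaded 3.5-inch drive, which is why tape can trail HDD in areal density by 3x and still lead on cartridge capacity.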
But let me make it clear: this is not about HDD vs. tape. We are now in the zettabyte age, having shipped just over an impressive one zettabyte (1,000 exabytes) of new storage capacity, across all media types, into the global market in 2019. According to IDC, that number will balloon to a staggering 7.5 ZB by 2025. We will need a lot of HDDs and a lot of tape (and flash, for that matter) to store 7.5 ZB!
By Rich Gadomski, Head of Tape Evangelism at FUJIFILM Recording Media, U.S.A., Inc.
The past decade saw the renaissance of data tape technology with dramatic improvements to capacity, reliability, performance, and TCO giving rise to new industry adoptions and functionality. This trend will only continue in 2021 as data storage and archival needs in the post-COVID digital economy demand exactly what tape has to offer. Below are 5 key contributions tape will make to the storage industry in 2021.
Containing the Growing Cost of Storage
One lingering effect of the pandemic will be the need for more cost containment in already budget-strapped IT operations. We are well into the "zettabyte age," and storing more data with tighter budgets will be more important than ever. Businesses will need to take an intelligent and data-centric approach to storage to make sure the right data is in the right place at the right time. This will mean storage optimization and tiering where high capacity, low-cost tape plays a critical role — especially in active archive environments.
A Best Practice in Fighting Ransomware
One of many negative side effects of COVID-19 has been the increasing activity of ransomware attacks, not only in the healthcare industry which is most vulnerable at this time, but across many industries, everywhere. Backup and DR vendors are no doubt adding sophisticated new anti-ransomware features to their software that can help mitigate the impact and expedite recovery. But as a last line of defense, removable tape media will increasingly provide air-gap protection in 2021, just in case the bad actors are one step ahead of the good guys.
Compatibility with Object Storage
Object storage is rapidly growing thanks to its S3 compatibility, scalability, relatively low cost and ease of search and access. But even object storage content eventually goes cold, so why keep that content on more expensive, energy-intensive HDD systems? This is where tape will play an increasing role in 2021, freeing up capacity on object storage systems by moving that content to a less expensive tape tier all while maintaining the native object format on tape.
By Rich Gadomski, Head of Tape Evangelism at FUJIFILM Recording Media, U.S.A., Inc.
The past 10 years have been marked by explosive data growth and demand for storage. Meanwhile, the tape industry has experienced a renaissance thanks to significant advancements in capacity, reliability, performance, and functionality that have led to new applications and key industry adoption. Here’s a look at some of the key milestones.
In terms of capacity, the decade started for LTO with LTO-5 at 1.5 TB native capacity and culminated most recently with LTO-8 at 12.0 TB and LTO-9 soon to be delivered at 18.0 TB.
Enterprise tape formats started the decade at 1.0 TB native and are currently at 20.0 TB native.
Barium Ferrite magnetic particles became a key enabler for multi-terabyte tapes and were demonstrated by IBM and Fujifilm in 2015 to have the potential to achieve 220 TB on a single tape cartridge. This signaled that tape technology had no fundamental areal density limitations for the foreseeable future.
By the end of the decade, IBM and Fujifilm demonstrated the ability to achieve a record areal density of 317 Gbpsi using the next generation of magnetic particles, Strontium Ferrite, with a potential cartridge capacity of 580 TB.
Reliability and Performance
During the decade, tape achieved the highest reliability rating as measured by bit error rate, at 1 error in 10^19 bits, even better than enterprise HDD at 1 in 10^16.
Data transfer rates for tape also improved from 140 MB/sec. in 2010 to an impressive 400 MB/sec.
LTFS provided an open tape file system with media partitions for faster “disk-like” access and ease of interchangeability, making LTO a de facto standard in the Media & Entertainment industry.
New Applications and Key Industry Adoption
Storing objects on tape became a reality with object archive software solutions offering S3 compatibility; objects can now move to and from tape libraries in their native object format.
The concept of active archiving grew in popularity with tape as a key component complementing flash, HDD and cloud for cost-effectively maintaining online archives.
Tape was recognized for its ease of removability and portability, providing air gap protection in the escalating war against cybercrime.
Major U.S. hyperscalers began to rely on tape during the decade for both back-up and deep archive applications. In one well-publicized example, Google restored data lost in a February 2011 Gmail outage from its tape backups. Microsoft adopted tape for Azure later in the decade. Tape became firmly established as a competitive advantage for these and other hyperscalers based on its scalability, long archival life, lowest TCO, low energy consumption, and air gap security.
With this steady technological advancement over the last decade, tape has been recognized for its complementary value to flash, HDD and cloud in tiered storage strategies for managing data in the zettabyte age.
According to Aaron Ogus, partner development manager for Microsoft Azure Storage, storing a zettabyte will be financially feasible in 2020. Data growth will always exceed expectations, and tape has a more credible roadmap, one that is easier to follow without as much investment. Learn more in this video blog.