By Guest Blogger Peter Faulhaber, former president and CEO, FUJIFILM Recording Media U.S.A., Inc.
The Hyperscale Data Center (HSDC) secondary storage market is quickly emerging, requiring advanced solutions for petascale and exascale storage systems that are not currently available. According to HORISON Information Strategies, HSDCs currently use around 3% of the world’s electrical energy. Due to the massive energy footprint of HSDCs, climate protection measures have become increasingly important in recent years, with cloud computing offering the greatest advantages for sustainable operation by reducing the energy and carbon footprint over the entire data life cycle.
The slowing rate of HDD and tape technology development roadmaps in recent years, along with HDD and tape storage supplier consolidations, are particularly concerning trends for HSDCs. Neither HDD nor tape technology is currently positioned by itself to effectively meet the enormous storage requirements that future HSDC performance and capacity demands will impose. High technical asset specificity requires significant R&D investment, yet offers limited ROI potential outside of hyperscalers.
HSDCs manage over 60% of the world’s data today with a CAGR of 35–40%, along with a growing need for cost-effective secondary storage that still meets certain performance thresholds.
Vendors and manufacturers are disincentivized to invest in novel technology: the risk-reward is not high enough, while HSDCs leverage their buying and bargaining power. Manufacturers need to invest hundreds of millions of dollars over a long development cycle to bring innovative solutions to market, without a commitment from the HSDC market.
As a result, the secondary storage market is left with incremental investments in existing technologies and moves slowly.
The conditions are set for a widening gap between customer demands and product solutions in the secondary storage market.
The current “vendor-driven” strategy will not keep pace with HSDC requirements for secondary storage, as such offerings fall far behind HSDC demand curves. Photonics, DNA, glass, and holographic storage experiments attempting to address the market have been in labs for decades, but most have drawbacks, and none are on the near-term horizon for customer deployment. These initiatives show that a change is needed to get ahead of the demand curve.
However, the opportunity also exists to mitigate this risk by bringing the interested parties together to share the risk-reward paradigm. HSDCs need a quantum leap, which only comes with significant investment, best shared by the interested parties.
The Semiconductor Research Corporation (SRC) addressed the concept of vertical market failure in September 2021 in its published article “New Trajectories for Memory and Storage,” stating, “The prospect of vertical market failure can be mitigated by private sector market participants through risk-share agreements between customers and suppliers, as well as increased vertical integration.”
Without change, current technologies will fall far behind HSDC demand curves, and the current vendor-driven trajectory increases the likelihood of un-met demand and stagnation of growth for all involved.
The Tape Storage Council (TSC) released a new report, “Tape to Play Critical Roles as the Zettabyte Era Takes Off,” which highlights the current trends, usages, and technology innovations occurring within the tape storage industry. The zettabyte era is in full swing, generating unprecedented capacity demand as many businesses move closer to exascale storage requirements.
According to the LTO Program, 148 exabytes (EB) of total tape capacity (compressed) shipped in 2021, marking an impressive record year. With a growth rate of 40%, this strong performance in shipments follows the previous record-breaking 110 EB of capacity shipped in 2019 and the 105 EB shipped in the pandemic-affected year of 2020.
The ever-increasing thirst for IT services has pushed energy usage, carbon emissions, and the storage industry’s growing impact on global climate change to center stage. Plus, ransomware and cybercrime protection requirements are driving increased focus on air gap protection measures.
As a result of these trends, among others, the TSC expects tape to play an even broader role in the IT ecosystem going forward as the number of exabyte-sized environments grows. Key trends include:
Data-intensive applications and workflows fuel new tape growth.
Data accessibility: tape performance improvements boost access times and throughput.
Tape should be included in every green data center strategy.
Storage optimization receives a big boost from an active archive which provides dynamic optimization and fast data access for archival storage systems.
Organizations continue to invest in LTO tape technology thanks to its high capacity, reliability, low cost, low power consumption and strong data protection features, especially as threats to cybersecurity soar.
The LTO Technology Provider Companies (IBM, HPE, and Quantum) issued a press release earlier this week announcing record capacity shipments for LTO in 2021 of 148 Exabytes (148,000 Petabytes) compressed (up from 105 EB compressed in 2020, +40%). More and more of the world’s data is being stored on LTO data tape. That’s good news for the IT industry! Is it not? After all, end users and service providers need:
A strategic way to store and protect massive amounts of increasingly valuable data, especially data that’s gone cool or cold
A cost-effective and reliable long term storage solution
An air gap defense against cybercrime
An eco-friendly form of storage!
Industry Pundits React
Some industry pundits, biased toward the HDD industry, took the opportunity to downplay the news. They said the data is inaccurate or insignificant compared to the capacity shipments for HDD last year. Really? Does tape technology threaten the market for HDD? Is it still about tape vs. disk in their minds? Have trains, trucks, and ships put air freight out of business? Or does a more strategic thought process say: “These technologies complement each other. We need both to meet the needs of end users, storage service providers, and society itself…”
Analysts Predict Huge Zettabyte Demand
Indeed, if the big industry analyst firms are right, we will need to be storing more than 11 zettabytes of data in 2025. Just one zettabyte would require 55 million 18 TB HDDs or 55 million LTO-9 tape cartridges. Should we store all of that data on HDD, whether it is hot, warm, cool, or cold? Of course, we can’t just delete excess data. Now that we can analyze data and derive a competitive advantage from it, the value of data has increased and we need to store more of it for longer periods of time. As a result, projections for the amount of persistent data to be stored are growing exponentially. We will need huge amounts of flash, HDD, tape, and even future storage solutions like DNA to address the data storage challenge.
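The drive-count arithmetic above is easy to sanity-check, assuming decimal units (1 ZB = 10^21 bytes, 18 TB = 18 x 10^12 bytes per device):

```python
# Rough check: how many 18 TB devices does one zettabyte require?
ZB_BYTES = 10**21                 # one zettabyte (decimal)
DEVICE_BYTES = 18 * 10**12        # native capacity of an 18 TB HDD or LTO-9 cartridge

devices_per_zb = ZB_BYTES / DEVICE_BYTES
print(f"Devices per ZB: {devices_per_zb / 1e6:.1f} million")          # ~55.6 million

# And for the 11+ ZB projected to be stored in 2025:
print(f"Devices for 11 ZB: {11 * devices_per_zb / 1e6:.0f} million")
```

At roughly 55.6 million devices per zettabyte, the 11 ZB projection implies over 600 million drives or cartridges, which is why tiering by data temperature matters.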
A Strategic Approach to Data Storage
The key to success will be a strategic approach that leverages intelligent data management software to automate data movement to the right tier of storage at the right time, at the right cost, and with the right energy profile. Employing a strategic approach to data storage to reduce costs and energy consumption while maintaining service level agreements makes sense. Take a good look at an active archive solution, for example. Yet again, there are those industry pundits who say the amount of energy saved by moving static, inactive, and infrequently accessed data to a tape tier is not significant in the big picture of the IT industry. The real problem, they say, is the amount of energy consumed by a single Google search. But isn’t that like saying: “Don’t bother turning the lights out before leaving the office for the night. It’s just a drop in the ocean of energy consumption,” or “Why bother turning off the engine of your car when filling up on gas? It’s just a few minutes of idle time and won’t really impact CO2 emissions at all.” Right?
Change of Attitude Needed
But this is the wrong attitude, and exactly what has to change to make a difference. Collectively, if we all switch off a light and turn off the car’s engine, we will make a difference. We might even get motivated for more change! How about installing LED light bulbs or investing in an electric vehicle? Or maybe making the commitment and taking the leadership on a renewable energy installation? Attitudes have to change; we have to believe we can make a difference collectively. If data is inactive, why keep it on energy-intensive, constantly spinning disk? Are we all doing whatever it takes to make a difference?
New Flagship UN Report Is a Wake-up Call
If we believe the latest studies on climate change coming out of the United Nations, we need to start taking any action we can, quickly. A new UN report on climate change from earlier this month indicated that harmful carbon emissions in the last decade have never been higher in human history, proof that the world is on a “fast track” to climate disaster. UN Secretary-General Antonio Guterres has warned that it’s “now or never” to limit global warming to 1.5 degrees C. Climate change is the result of more than a century of unsustainable energy and land use, lifestyles, and patterns of consumption and production. Guterres adds, “This is not fiction or exaggeration. It is what science tells us will result from our current energy policies. We are on a pathway to global warming of more than double the 1.5-degree C limit” that was agreed in Paris in 2015. To limit global warming to around 1.5 C (2.7 F), the IPCC report insists that global greenhouse gas emissions will have to peak “before 2025 at the latest, and be reduced by 43% by 2030.”
Reducing Energy Consumption and CO2 Emissions with Tape
To help increase awareness and understanding of energy consumption in data storage, a number of whitepapers have been published highlighting alternative options for storage, including LTO data tape. A recent IDC whitepaper studied migrating cold data from HDDs to LTO tape. The opportunity to positively impact the environment by shifting to tape is staggering: this strategic approach can reduce storage-related CO2 emissions by, coincidentally, 43.7% by 2030, avoiding 664 million metric tons of CO2 cumulatively. That’s the equivalent amount of CO2 produced by 144 million passenger cars driven in the course of a year!
Other research shows that tape consumes 87% less energy than equivalent amounts of HDD storage. When CO2 emissions are analyzed over the entire product lifecycle (from raw materials to production to distribution, usage, and disposal) of HDD and tape, studies show a 95% reduction in CO2 in favor of tape compared to HDD. The same study shows Total Cost of Ownership for long-term data storage can be reduced by more than 70% by using tape instead of HDD. At the same time, tape can provide an effective defense against cybercrime via a physical air gap. All of this is possible by taking a strategic storage approach, where cool or cold data that has aged and is infrequently accessed gets moved from expensive primary storage to economical and environmentally friendly tape systems, online or offline.
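To see how those percentages compound into real savings, here is a minimal sketch; the 10 PB archive size, HDD energy baseline, and TCO baseline below are assumptions for illustration only, while the 87% and 70% reduction figures come from the studies cited above:

```python
# Hypothetical 10 PB cold archive: HDD tier vs. tape tier.
# Baselines below are assumed figures for illustration only;
# the 87% energy and 70%+ TCO reductions are from the cited studies.
hdd_kwh_per_year = 500_000            # assumed annual energy for the HDD tier
hdd_tco = 1_000_000                   # assumed long-term TCO for the HDD tier, USD

tape_kwh_per_year = hdd_kwh_per_year * (1 - 0.87)   # tape uses 87% less energy
tape_tco = hdd_tco * (1 - 0.70)                     # TCO reduced by more than 70%

print(f"Tape energy: {tape_kwh_per_year:,.0f} kWh/yr "
      f"(saves {hdd_kwh_per_year - tape_kwh_per_year:,.0f})")
print(f"Tape TCO:    ${tape_tco:,.0f} (saves ${hdd_tco - tape_tco:,.0f})")
```

Whatever the actual baseline, the percentages scale linearly: the larger the cold-data footprint left on disk, the larger the absolute savings from moving it to tape.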
Data Center World Attendees Get It
In my last blog, on my visit and presentation at Data Center World in Austin last month, I mentioned that I was encouraged by the DCW attendees I met and listened to in my session and other sessions. They are genuinely concerned about the environment and worried about what kind of planet we will be leaving behind for our kids and grandchildren. They recognize the opportunity to improve sustainability in data center operations and are committed to it. But since then it has occurred to me that maybe sustainability is more of a focus for facility teams; perhaps the top-down pressure from the C-suite has yet to be widely applied to the data storage management teams. However, in the quest to achieve the needed sustainability goals, no stone can remain unturned.
Observing Earth Day for Future Generations
With Earth Day being observed today, let’s commit to strategically taking action in response to global warming and climate change. Let’s start changing attitudes from “It won’t make a difference” to “Collectively, we can make a difference.” Let’s look at the bright side of increasing LTO capacity shipments instead of the dark, self-serving side. Let’s think about the planet that’s home for us and the future generations of our families to come.
As I started to write this blog on recent ransomware observations, an email message popped up on my PC from our IT department advising of additional and more stringent security enhancements taking place almost immediately to toughen my company’s cybersecurity and increase our protection against current and emerging threats. A sign of these cybercrime times, indeed!
Ransomware Trending
According to a February 2022 Alert from CISA (the Cybersecurity & Infrastructure Security Agency), 2021 trends showed an increasing threat of ransomware to organizations globally, with tactics and techniques continuing to evolve in technological sophistication. So-called “big game” organizations like Colonial Pipeline, Kronos, JBS, Kaseya, and SolarWinds made the ransomware headlines over the past year or so. But according to the CISA Alert, by mid-2021, many ransomware threat actors, under pressure from U.S. authorities, turned their attention toward mid-sized victims to reduce the scrutiny and disruption those authorities caused.
In a recent Enterprise Strategy Group (ESG) study, 64% of respondents said their organization had paid a ransom to regain access to data, applications, or systems. These findings are supported by the latest Threat Landscape report from the European Union Agency for Cybersecurity. It highlighted a 150% rise in ransomware in 2021 compared to 2020. The agency expects that trend to continue, and even accelerate in 2022.
But these numbers hide the stark reality of the ransomware scourge. Gangs like DarkSide, REvil, and BlackMatter are terrorizing organizations with ransomware – and they are getting smarter and more organized. They have moved beyond the basic ploy of infecting files, locking users out of their data, and demanding a fee. They still want money. But they also endanger reputations by exposing attacks, blackmailing companies by threatening to reveal corporate or personal dirty laundry, and selling intellectual property (IP) to competitors.
As a result, cybersecurity spending has become a priority in most organizations. According to ESG, 69% of organizations plan to spend more on cybersecurity in 2022 than in the previous year, while 68% of senior IT decision-makers identify ransomware as one of their organization’s top 5 business priorities. Such is the fear factor that organizations are now prioritizing cybersecurity ahead of other organizational imperatives such as the cloud, artificial intelligence (AI), digital transformation, and application development.
New Federal Mandate and the SEC Takes Action
On March 15th, in an effort to thwart cyberattacks from foreign spies and criminal hacking groups, President Biden signed into law a requirement for many critical-infrastructure companies to report to the government when they have been hacked. This way, authorities can better understand the scope of the problem and take appropriate action.
It’s also no wonder that the Securities and Exchange Commission (SEC) is taking action. On March 9th, the SEC voted 3 to 1 to propose reporting and disclosure requirements related to cybercrime incidents and preparedness. In a nutshell, the SEC will be asking publicly traded companies:
To disclose material cybersecurity incidents
To disclose their policies and procedures to identify and manage cybersecurity risks
To disclose management’s role and expertise in managing cybersecurity risks
To disclose the board of directors’ oversight role
Specifically, the SEC will want to know:
Whether a company undertakes activities to prevent, detect and minimize the effects of cybersecurity incidents
Whether it has business continuity, contingency, and recovery plans in the event of a cybersecurity incident
Whether the entire board, certain board members, or a board committee is responsible for the oversight of cybersecurity risks
Whether and how the board or board committee considers cybersecurity risks as part of its business strategy, risk management, and financial oversight
Holding publicly traded companies and their boards accountable for best practices in combating ransomware is a big step in the right direction and will no doubt free up the required budgets and resources.
Lowering the Fear Factor
Cybersecurity is already a top spending priority for 2022 and, with SEC regulations looming, will likely continue to be a priority for quite some time. Companies are busy beefing up the tools and resources needed to thwart ransomware. They are buying intrusion response tools and services, extended or managed detection and response suites, security information and event management platforms, antivirus, anti-malware, next-generation firewalls, and more, including cybercrime insurance policies.
What may be missing in the spending frenzy, however, are some fundamental basics that can certainly lower the fear factor. Backup tools are an essential ingredient in being able to swiftly recover from ransomware or other attacks. Similarly, thorough and timely patch management greatly lowers the risk of hackers finding a way into the enterprise via an unpatched vulnerability.
Another smart purchase is software that scans data and backups to ensure that no ransomware or malware is hidden inside. It is not uncommon for a ransomware victim to conduct a restore and find that its backup files have also been corrupted by malware. Cleansing data that is ready to be backed up has become critical. These are some of the fundamental basics that need to be in place in the fight against ransomware. Organizations that neglect them suffer far more from breaches than those that take care of them efficiently.
Adding an Air Gap
Another fundamental basic is the elegantly simple air gap. When data is stored in the cloud, on disk, or in a backup appliance, it remains connected to the network. This leaves it vulnerable to unauthorized access and infection from bad actors. An air gap is essentially a physical gap between data and the network: it disconnects backed-up or archived data from the Internet.
Such a gap commonly exists by partitioning in, or removing tapes from, an automated tape library and either storing them on a shelf or sending them to a secure external service provider. If that data is properly scanned prior to being backed up or archived to ensure it is free of infection, it offers certainty that a corruption-free copy of data exists. If a ransomware attack occurs, the organization can confidently fall back on a reliable copy of its data – and avoid any ransom demands.
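The scan-then-vault sequence described above can be sketched as a toy workflow; the function names and the in-memory "vault" are purely illustrative stand-ins, not any real scanner or tape library API:

```python
# Toy sketch of an air-gapped backup flow: scan data first, then write only
# clean copies to the vault (a stand-in for a tape written and then ejected).
def scan_clean(dataset: bytes) -> bool:
    """Stand-in for a real malware/ransomware scan."""
    return b"MALWARE" not in dataset       # toy heuristic, illustration only

def air_gap_backup(dataset: bytes, vault: list) -> bool:
    if not scan_clean(dataset):
        return False                       # never vault an infected copy
    vault.append(dataset)                  # write to tape, eject, store offline
    return True

vault = []
air_gap_backup(b"payroll records", vault)      # clean: vaulted
air_gap_backup(b"MALWARE payload", vault)      # infected: rejected
print(f"Clean copies vaulted: {len(vault)}")   # prints 1
```

The ordering is the point: scanning before the copy leaves the network is what guarantees the offline copy is worth falling back on.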
Effectively Combating Ransomware
There is no silver security bullet that will 100% guarantee freedom from ransomware; effective defense is truly a multi-faceted strategy. Implementation of best-of-breed security tools is certainly necessary, but they must be supported by the steadfast application of backup and patching best practices and the addition of a tape-based air gap.
CISA, the FBI, and cybersecurity insurance companies all recommend offline, offsite, air-gapped copies of data. This can be achieved cost-effectively with today’s removable, and highly portable modern tape technology. The boards of publicly traded companies will likely want to do whatever it takes to demonstrate compliance with best practices to meet the SEC requirements. This should include air-gapped tape as part of a prudent and comprehensive strategy. A best practice in these cybercrime times, indeed!
In this executive Q&A, Chris Kehoe, Director of Sales & Marketing, discusses his role at FUJIFILM Recording Media U.S.A. and how the company’s Object Archive software helps solve a major customer pain point as data continues to grow yet resources and budgets do not.
Q: Tell us about your role as Director of Sales and Marketing for Fujifilm’s Data Management Solutions?
As the Director of Sales and Marketing for Fujifilm’s Data Management Solutions, I’m tasked with bringing Fujifilm’s Object Archive software product to the North American market. This includes implementing a sales and marketing strategy for specific target markets. My team provides full support for demand generation, sales, and post-sales activities such as installation and support. There are two major focal points in these roles: the first is building and implementing a focused, market-based approach ensuring our product’s value intersects the market’s and customers’ needs. The second is ensuring the best customer experience while working with Fujifilm products, people, and resellers. This includes on-the-street sales and engineering readiness and customer support capabilities to ensure delivery of exceptional customer satisfaction.
“Object Archive delivers low-cost storage and high reliability for long term data archiving and preservation,” – Chris Kehoe
Q: What are the key features and benefits of Object Archive software?
Object Archive software operates like an on-premises cloud archive service through its simple-to-use S3 API and cross-organization and multi-tenant capabilities. By leveraging today’s highly advanced data tape, automated tape libraries, and state-of-the-art software, Object Archive delivers low-cost storage and high reliability for long-term data archiving and preservation. This solves a major customer pain point as data continues to grow yet resources and budgets do not.
Q: What is your basic go-to-market strategy and what are your key target markets?
Our basic go-to-market strategy is to sell Object Archive into the North American market through Fujifilm’s VAR channel. One of our primary targets is computational science and digital preservation departments inside non-profit research organizations, research universities, and government labs. These customers have a critical need to properly classify data and to move that data, as it ages and cools, to the right storage at the right time and cost. Object Archive supports that strategy very effectively.
Q: What’s your perspective on tape technology and its future?
Tape is uniquely positioned as the only technology that can scale to the capacity required to meet the long-term retention needs resulting from the significant projected growth of data. There is no other solution that can achieve similar cost, performance, and retention metrics. Tape has a significant advantage when it comes to TCO, plenty of performance for the profile of data it stores and protects, and an archival life beyond what is probably needed. Add to that best-in-class reliability and the benefit of being the lowest energy-consuming data storage solution available. That’s important at a time when sustainability and climate change are becoming a priority for just about everyone.
Q: What is your perspective on cybercrime and the benefits of air gap?
Air gap is a no-brainer for tape systems. Since the beginning of its development, tape has been designed and used to manage and protect data against online and physical threats and disasters. Moving a copy of your data to offline tape means that this data is no longer connected to the network; it’s removed from the threat matrix of online attacks. Moving a copy of your data to a secure offsite vault will protect it from numerous threats and disasters. It has always been a best practice across all organizations to have a fully protected copy of data offline. This is even more critical today since the threat of cybercrime and ransomware is not going away anytime soon. In fact, it will only continue to increase, and we’re glad to help our customers protect themselves.
I think it’s safe to say people like surveys; probably not everyone, but most people do. Why? Experts in the field suggest that people are willing to take surveys because respondents feel their opinions are valued and that their answers will be used, perhaps even resulting in a benefit to society. They feel their participation will impact something they care about, and they want to share their opinion with those who will listen and act on the information.
Surveying the C-Suite on Sustainability
So it’s not surprising that Fujifilm got a great response rate to a recently launched survey entitled Awareness Survey on Environmental Issues in the Digital Domain. As many as 1,200 C-suite professionals responded, including CEOs, CFOs, CSOs, CTOs, and CIOs from companies of 100 or more employees in the United States, Germany, Japan, and China.
The survey revealed that there is a growing awareness around broader environmental issues among corporate leaders, and that’s great news as the negative impacts of global warming and climate change keep piling up, flood after flood, wildfire after wildfire, and storm after storm.
When it comes to IT infrastructure specifically, the majority of U.S. respondents believe sustainability improvements in IT services and equipment can positively impact climate change, but 40% indicated that they did not know or were unsure if data storage can have a negative environmental impact and increase the cost of doing business.
Increasing Data Storage Requirements
Data storage can certainly be energy-intensive, and the problem is only getting worse as the value of data rises with the ability to analyze and derive competitive advantage from it. As a result, demand for long-term data retention is increasing. In fact, according to a recent IDC whitepaper, data to be stored grew from just 2.0 zettabytes in 2016 to 4.1 ZB in 2020 and is expected to reach 11.1 ZB in 2025. Just one ZB is a vast amount of data, equal to one million petabytes, that would need 55 million 18 TB hard disk drives (HDDs) or 55 million 18 TB LTO-9 tapes to store. The environmental impact of the energy required to support this volume of storage is greatly underestimated, as are the associated carbon emissions. When asked in the survey what barriers exist for those who have not considered more eco-friendly data storage options, 31% in the U.S. cited a lack of awareness or understanding of the issue.
Hot vs. Cold Data
There was also a lack of awareness pertaining to frequently accessed “hot” data and less frequently accessed “cold” data, with 36% of respondents saying they either don’t differentiate between the two or are unsure whether they do. And 35% don’t realize that differentiating between hot and cold data can impact sustainability, affordability, and security. An interesting fact about data is that it quickly goes cold: access frequency drops off significantly after just 30, 60, or 90 days. In fact, industry analysts estimate that 60% to 80% of all data stored is cold and qualifies as “archival.” Yet through inertia, that data often remains on energy-intensive, constantly spinning, heat-producing tiers of storage like hard disk drives.
Reducing Energy Consumption and CO2 Emissions with Tape
To help increase awareness and understanding of this issue, a number of whitepapers have been published highlighting alternative options for storage, including LTO data tape. A recent IDC whitepaper shows how migrating cold data from HDDs to LTO tape can reduce data centers’ CO2 emissions by 43.7% by 2030, avoiding 664 million metric tons of CO2 cumulatively. Other research shows that tape consumes 87% less energy than equivalent amounts of HDD storage. When CO2 emissions are analyzed over the entire product lifecycle (from raw materials to production, distribution, usage, and disposal), studies show a 95% reduction in CO2 in favor of tape compared to HDD. The same study shows Total Cost of Ownership for long-term data storage can be reduced by more than 70% using tape instead of HDD. All of this is possible by taking a storage optimization approach, where data that has aged and is infrequently accessed, otherwise known as cold data, gets moved from expensive primary storage like solid-state flash drives and HDDs to economical and environmentally friendly tape systems.
As far as security is concerned, tape is also playing a role in cybercrime prevention with air gap capabilities, WORM, and encryption. Intelligent data management software, typical in today’s active archive environments, can automatically move data from expensive, energy-intensive tiers of storage to more economical and energy-efficient tiers based on user-defined policies. By moving inactive data out of primary storage, the ransomware attack surface can also be reduced.
Renewable Energy Plus Conservation
Another interesting point from the survey: 51% of participants said that their companies are using renewable energy to reduce carbon emissions, while 22% said they are doing so via climate protection projects and 13% through carbon offsets. Renewable energy is a key factor in reducing CO2 emissions, and Fujifilm is a fan, as seen at our LTO plant in Bedford, MA. But renewables alone likely can’t come online fast enough or cheaply enough to keep up with data growth rates of 30–60% annually in major data centers today. That’s why conservation has to be part of the equation. The very first metric to analyze in data center energy efficiency is simply the amount of energy being consumed.
Alternative Data Storage Options
Finally, 81% of respondents noted that they would consider an alternative data storage option that is more sustainable and affordable. That option exists in the form of today’s modern, highly advanced data tape systems, which offer the lowest energy consumption and cost profile of any storage medium, plus best-in-class reliability and the longest archival life. So, for the benefit of society, let’s act on the information the survey reveals. It’s really just a question of getting the right data, in the right place, at the right time.
The Arrival of the Zettabyte Era
The data storage market has clearly entered the “zettabyte era,” with new capacity shipments exceeding a massive one zettabyte for a couple of years now. Storage requirements are being driven by the phenomenon of “digital transformation” and the rising value of data, which needs to be stored for longer periods of time and, in some cases, indefinitely. Further accelerating the zettabyte era is the other era we are all in: the “pandemic era,” which brought the unanticipated need to support a remote workforce and an ever-expanding internet with its proliferation of online apps.
Pandemic-Related Supply Shortages
The pandemic has brought with it disruptions to the global supply chain, including shortages of semiconductor chips. It’s been tough to get modern goods, from toys to notebooks to refrigerators to automobiles. The combination of the zettabyte and pandemic eras has even strained supply chains and the availability of the SSDs and HDDs needed to support digital transformation, causing prices to fluctuate with quarterly supply and demand swings.
Supply Chain Challenges Persist
While pandemic-related labor shortages have delayed the production and distribution of goods, other factors are making matters worse. How about global warming, climate change, and the ensuing natural disasters that have had negative impacts on the supply chain? How about international rivalries and tensions impacting the availability of key components? Or cybercriminals shutting down vital infrastructure? Bottom line: industry pundits say we can expect supply chain hassles to continue throughout 2022.
Supply Chain Contingency Planning in Data Storage

Faced with supply chain risks in any industry, it's always good to have contingency plans to mitigate risk and ensure ongoing operations. The IT industry is no exception: the availability of commodities we may take for granted can be interrupted by any of the factors listed above, from unforeseen demand to pandemic-related shortages to global warming, trade wars, and cybercrime.
A great way to avoid supply chain disruptions in the availability of primary storage devices like SSDs and HDDs is to employ intelligent data management software, typical of active archive solutions, that automates the migration of data from these supply-chain-affected devices to a modern, automated tape library. Since 60 to 80 percent of data goes cold after a short period of time, why keep it stored on higher-performing, expensive, and energy-intensive devices? Given the global supply chain uncertainty, three good reasons to migrate data from primary storage devices to tape storage are:
Free up capacity on expensive Tier 1 and Tier 2 storage devices like SSDs and HDDs in favor of TCO friendly tape systems
Reduce energy consumption and related CO2 emissions by leveraging the low power profile of automated tape systems
Take advantage of tape’s natural air gap security in the never-ending war against ransomware
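As a rough illustration of the kind of policy such data management software applies, here is a minimal sketch. The 90-day threshold, tier names, and function are hypothetical, not taken from any particular product:

```python
from datetime import datetime, timedelta

# Hypothetical policy: data untouched for 90+ days is "cold" and
# becomes a candidate for migration to the tape tier.
COLD_AFTER = timedelta(days=90)

def tier_for(last_access: datetime, now: datetime) -> str:
    """Pick a storage tier based on the file's last access time."""
    return "tape_archive" if now - last_access >= COLD_AFTER else "primary"

now = datetime(2022, 1, 1)
files = {
    "q3_report.pdf": datetime(2021, 12, 28),  # recently used, stays on primary
    "raw_footage.mov": datetime(2021, 6, 1),  # long cold, moves to tape
}
placement = {name: tier_for(ts, now) for name, ts in files.items()}
```

Real active archive software applies far richer rules (access frequency, file type, compliance holds), but the age-based cutoff above captures the basic idea.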
The above actually makes sense even in the absence of supply chain concerns. Since data to be stored is growing at a CAGR of around 30% while IT budgets grow in the low single digits, the IT industry needs a more cost-effective storage solution. With the increasing value of data and indefinite retention periods, the long-term archival profile of tape, coupled with best-in-class reliability, is a natural fit.
Fighting Climate Change and Cybercrime

Finally, we all have to engage in the battle against global warming and climate change if we are to preserve the planet we inhabit. Studies show that tape systems consume 87% less energy than equivalent amounts of disk storage and produce 95% less CO2 emissions than disk over the product lifecycle. Other studies show that, collectively, the global IT industry could avoid as much as 664 million metric tons of CO2 emissions by strategically moving more data to tape systems. As data cools off or goes cold, it should migrate to less expensive, less energy-intensive, and more secure tiers of storage.
Once the pandemic era finally subsides, it will be environmental calamities brought on by climate change and the relentless threat of cybercriminals that will have long-term impacts on supply chains.
By Rich Gadomski, Head of Tape Evangelism, FUJIFILM Recording Media U.S.A., Inc.
It seems like 2020 and 2021 have blended to combine into one long, tough time for all of us. Let’s hope 2022 emerges on the brighter side! In the meantime, here are 5 big predictions we see coming up in this New Year and beyond:
1. Increasing Focus on IT Energy Consumption
Severe weather was once again a hallmark of 2021, from the Texas deep freeze right up to the bitter end of the year, as unusual tornadoes and wildfires reminded us of the negative impact of global warming and climate change.
According to a report from the United Nations released in August of 2021, irreversible damage has already been done to the environment as a result of greenhouse gas emissions. The world showed renewed interest in the COP 26 conference in Glasgow where countries from around the globe gathered to pledge their commitments to combat climate change.
Wall Street got in on the act too and will increasingly demand that companies disclose their sustainability initiatives and results. Accordingly, more and more companies will be appointing Chief Sustainability Officers, who will put pressure on their organizations' energy usage, including energy-intensive IT operations. The use of renewables, as well as energy conservation measures, will be mandated.
Curbing CO2 emissions is quickly becoming a C-suite imperative and storage will not escape the scrutiny. Research shows that 81% of CIOs would consider alternative data storage options that are more cost-effective and sustainable. This will set the stage for new tape system deployments that not only can reduce TCO by more than 70%, but can reduce CO2 emissions by 95% compared to traditional HDD storage.
2. Return to Hybrid Cloud Strategies
Prior to COVID-19, the term "cloud repatriation" appeared often in the press, as it turned out that cloud was not a panacea for everything. But COVID-19 understandably created short-term storage strategies, often resulting in a flight to the cloud.
However, long-term thinking will favor hybrid cloud strategies where the best of public cloud plus on-prem private cloud provides maximum flexibility and value. This will especially apply to data accessibility, regulatory requirements, data governance, and cybercrime risks including ransomware.
Today’s modern automated tape solutions will provide the advantages of cost, scalability, reliability, and data protection to support the hybrid cloud model.
3. Storage Optimization Will Be Key to Data Growth Management
With the continuing digital transformation comes the zettabyte age of storage where data to be stored globally will approach 6.0 zettabytes (ZB) in 2022, according to a leading IT industry analyst. Just one ZB would require 55 million 18.0 TB HDDs or 55 million 18.0 TB LTO-9 cartridges!
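The cartridge count behind that figure is easy to verify with a quick back-of-the-envelope check, using decimal units (1 ZB = 10^21 bytes, 1 TB = 10^12 bytes):

```python
ZB = 10**21          # one zettabyte, decimal
TB = 10**12          # one terabyte, decimal
cartridge = 18 * TB  # native capacity of one LTO-9 cartridge (or one 18 TB HDD)

units_needed = ZB / cartridge
print(f"{units_needed / 1e6:.1f} million units per zettabyte")  # ~55.6 million
```

So roughly 55 to 56 million 18 TB devices per zettabyte, in line with the figure quoted above.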
Storage optimization, that is to say, getting the right data, in the right place, at the right time, and at the right cost will be critical to maintaining competitive advantage.
Intelligent data management will be required, leveraging multiple tiers of storage, active archives, and innovative S3-compatible archive solutions for object storage. Nowhere will this be more apparent than in digital preservation and high-performance computing environments with a simple need to offload expensive object storage to cost-effective tape systems using an S3-compatible API.
4. Continuing Rise of Ransomware
It has been said that ransomware is only in its infancy, and it has been said many more times that an attack is not a matter of "if" but "when." The FBI and CISA have weighed in with this advice:
“Backup your data, system images, and configurations, test your backups, and keep backups offline.”
As ransomware hackers mature in sophistication (and profits), online backups are increasingly being targeted to hamper recovery efforts, including cloud-based backups connected to a network. As a result, the value of affordable, removable, and highly-portable tape will only increase, providing true air gap protection (meaning offline, offsite backups in a secure location).
5. Video Surveillance Content Management
As we predicted last year, data tape has increasingly become a strategic option in managing the ballooning volume of video content associated with video surveillance applications.
For security reasons, regulatory compliance, or future analytics, retention volumes and periods will only increase, making legacy HDD solutions cost-prohibitive and unsustainable in terms of energy consumption. Look for increasing adoption of cost-effective tier 2 tape in video retention workflows in 2022.
Successfully emerging from the combined years of 2020 and 2021 will require getting back to strategic, long-term planning. Given the relentless growth of data, environmental concerns, and limited resources and budgets, today’s highly advanced tape storage will play an increasingly vital role in 2022 and beyond.
The recent release of LTO-9 makes it clear that LTO tape serves the needs of enterprise environments. The cloud hyperscalers and many large organizations are firm believers in LTO tape as the best medium for large-scale storage and archiving.
Despite the format only being released in early September, LTO-9 tape drives and systems are now available from the likes of IBM, Quantum, and Spectra Logic. On the media side, companies such as Fujifilm and Sony have launched LTO-9 tape cartridges.
LTO-9 Delivers Capacity and Performance
Fujifilm’s LTO Ultrium 9 data cartridge, for example, offers up to 45 TB of storage capacity (18 TB for non-compressed data), a 50% increase from the previous generation of LTO-8. The boost in storage capacity is achieved using Barium Ferrite (BaFe) magnetic particles, formulated into fine particles with Fujifilm’s Nanocubic technology that evenly distributes the magnetic particles on the tape surface, forming a smooth and thin magnetic layer for improved read/write performance. LTO-9 also delivers high-speed data transfer reaching up to 1,000MB/sec. for compressed data (400MB/sec. native), a 25% increase over LTO-8.
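Both compressed figures follow from the LTO-9 nominal 2.5:1 compression ratio applied to the native numbers, as a quick check shows:

```python
native_tb = 18     # native LTO-9 cartridge capacity, TB
native_mb_s = 400  # native LTO-9 transfer rate, MB/s
ratio = 2.5        # LTO-9 nominal compression ratio

compressed_tb = native_tb * ratio      # 45.0 TB compressed capacity
compressed_mb_s = native_mb_s * ratio  # 1000.0 MB/s compressed transfer rate
print(compressed_tb, compressed_mb_s)
```

As with any compression-based figure, real-world results depend on how compressible the data actually is; already-compressed media files will land closer to the native numbers.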
LTO-9 Deployment at CERN
I recently had the opportunity to interview Vladimir Bahyl, a beta LTO-9 user at world-renowned CERN. Vladimir, who is in charge of data archiving at CERN, is now rolling out LTO-9 within his storage environment, where data from the Large Hadron Collider (LHC) is stored. Massive amounts of data have been generated to date in experiments using a particle collider that measures 17 miles in circumference and sits roughly 100 meters below the France–Switzerland border. Tape has been in use at CERN for about five decades, and the organization currently stores around 400 PB on tape, enabling it to keep pace with the data explosion.
After a three-year break for upgrades, the collider is about to recommence operation. The IT department expects up to 180 PB of data to be added in 2022. CERN can cope with that quantity of information courtesy of a sophisticated tape-disk-SSD architecture. All results and all raw data from all CERN experiments are stored on tape and archived. When anything needs to be analyzed, it is transferred to disk and SSD.
Leveraging oRAO for Enhanced Access Time
A feature known as open Recommended Access Order (oRAO) was included in LTO media for the first time with LTO-9. oRAO enables the retrieval of tape content in a more efficient way: rather than recalling files in the order they were requested, it computes a read order that accounts for the tape's serpentine track layout, so a batch of files can be retrieved with far less back-and-forth seeking. According to testing at CERN, oRAO can position tape for data access anywhere from 30% to 70% faster than a traditional sequential recall. What this adds up to is the need for fewer tape drives for the data recall workflow. If more backup vendors adopt this technology, it could seriously reduce the time needed for a restore.
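The core scheduling idea can be shown with a toy model. The positions below are invented for illustration and are not the real LTO tape geometry, which the drive itself knows and exposes through the oRAO interface:

```python
# Toy model of recommended-access-order scheduling.
# Each recall request carries a physical position on tape, modeled here
# as (wrap index, longitudinal offset) — hypothetical values.
requests = [
    ("file_c", (3, 120)),
    ("file_a", (0, 40)),
    ("file_b", (0, 310)),
    ("file_d", (1, 15)),
]

# A naive recall reads files in arrival order, seeking back and forth.
# A RAO-style recall sorts the batch by physical position so the head
# sweeps the tape in one efficient pass.
recommended = [name for name, pos in sorted(requests, key=lambda r: r[1])]
print(recommended)  # ['file_a', 'file_b', 'file_d', 'file_c']
```

The real drive-computed order also weighs direction of travel per wrap and seek timing, but the principle is the same: order the batch by where the data physically sits, not by when it was asked for.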
Benefits of Tape at CERN
Users such as CERN gravitate toward tape for many reasons. In terms of long-term stability, data can still be recovered from tape after 30 years, whereas hard drives struggle to retain data beyond five years. Tape reliability is higher too. CERN has dozens of hard drives failing every week (out of tens of thousands it has on-site) compared to a negligible failure rate for tape.
But economics certainly factor in. Tape brings big savings in terms of CAPEX and OPEX. From an operating expense standpoint, tape consumes no power while cartridges sit idle, so it's cost-effective and eco-friendly. Tape is the most cost-effective technology for large-scale, long-term storage. And because it offers an air gap to thwart online hackers, it also raises the level of security.
In terms of reliability, the Fujifilm tape used at CERN performed well during a recent tape repacking project to switch older cartridges to the latest tape generation format. Out of 100 PB of tape that was read and repacked (6,300 miles of physical media), only 5 GB (3.5 feet) of data was corrupted – but even those files were recovered soon after.
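That works out to a vanishingly small corrupted fraction, as a quick computation shows:

```python
read_bytes = 100 * 10**15  # 100 PB repacked, decimal units
bad_bytes = 5 * 10**9      # 5 GB corrupted (and later recovered)

fraction = bad_bytes / read_bytes
print(f"{fraction:.0e}")  # 5e-08
```

In other words, about 0.000005% of the data read during the repack needed recovery at all.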
By Rich Gadomski, Head of Tape Evangelism, FUJIFILM Recording Media U.S.A., Inc.
Climate change and the effects of global warming have increasingly been in the spotlight as we emerge from the all-consuming COVID pandemic. Indeed, sustainability has become a strategic imperative for organizations across the globe.
Recognizing the magnitude of this issue in the energy-intensive IT industry and in data storage operations specifically, Fujifilm has endeavored to help raise awareness of the energy advantage of today’s modern and highly advanced tape solutions.
In recent whitepapers by Brad Johns Consulting, IDC, Horison Information Strategies, and others, you can read about the energy advantage of tape compared to alternative storage technologies like HDD. But does it actually help end-users meet their sustainability goals in real-world applications?
To answer this question, I recently hosted a virtual roundtable discussion entitled "Is Tape Really Eco-Friendly?" The panelists included two end-users, Jason Adrian from Microsoft Azure and Vladimir Bahyl from CERN. To review his whitepaper findings, I invited Brad Johns, a TCO and energy consumption expert. And to provide feedback from the broader market of end-users, I invited Shawn Brume from IBM to share his observations.
The roundtable kicked off with a brief recap of Brad Johns' analysis, where he finds that for long-term storage of inactive or cold data, tape consumes 87% less energy than equivalent amounts of hard disk drives, produces 87% less carbon emissions, and reduces TCO by 86%. When looking at the total product lifecycle, from procurement of raw materials to production, distribution, usage, and disposal, tape produces 95% less CO2 equivalents and 80% less electronic waste than hard disk drives.
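Those "percent less" figures can be restated as multipliers, which is a simple unit conversion rather than additional measured data:

```python
def ratio_vs_baseline(percent_less: float) -> float:
    """If the alternative uses X% less, how many times more does the baseline use?"""
    return 1 / (1 - percent_less / 100)

print(round(ratio_vs_baseline(87), 1))  # HDD uses ~7.7x the energy of tape
print(round(ratio_vs_baseline(95), 1))  # ~20x the lifecycle CO2 equivalents
```

Seen this way, "87% less energy" means the disk-based baseline draws nearly eight times what the tape system does for the same cold data.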
Those are pretty compelling numbers! But are the end-users seeing that benefit?
Jason Adrian from Microsoft Azure weighed in with the following comment: “When you take the material savings and power savings, tape actually does offer quite a bit of advantage compared to other technologies that are on the market today.”
Vladimir Bahyl from CERN offered: "We have been using tape for over 50 years at CERN. We are fully aware of the possibility to have hard drives that spin down and this saves some power when not in use. However, this completely changes the workflow that we have in-house…and adds complexity. Our archive is not a super cold archive, it is actually an active archive and tape is a natural building block in this system."
Shawn Brume from IBM observed: "You can bring the total CO2 down to 0.42 metric tons per year per petabyte with tape, which for most customers is 2 to 4X better in the overall lifecycle than HDD and, believe it or not, 2 to 4X better than flash/SSDs. Customers are seeing that tape represents significant sustainability value."
As organizations and IT operations specifically seek to achieve their sustainability goals, strategically moving inactive, infrequently accessed, cool or cold data to tape can have substantial environmental benefits.