By Guest Blogger, Dr. Shawn O. Brume Sc. D., IBM Tape Evangelist and Strategist
According to a study by McKinsey, the average lifespan of companies listed in the Standard & Poor’s 500 is less than 18 years. That means tape technology has already been in business almost four times longer than the average S&P 500 company will survive. Tape celebrated its 70th birthday on May 21st, and it has been, and continues to be, the most transformative data storage technology in history.
In the 1950s it was the only viable technology for storing data generated by the few computers in existence. In the 1960s tape took the world to the moon and preserved the data for use nearly 40 years later, when it was retrieved to assist modern space exploration. By the 1970s tape dominated storage, transforming the financial industry by providing access to account data with minimal human intervention. In the 1980s and 1990s tape continued to transform data availability by storing transactional data for ATMs, and it was also key in the investigation of the space shuttle Challenger disaster, an investigation aided by tape’s durability even after submersion in saltwater.
Today tape lives in the data center, preserving zettabytes of data that are utilized across nearly every industry. For example:
Healthcare – Data preserved on tape is being utilized to develop new predictive health services. Digital medical records can be retained for the life of patients and shared across organizations.
Financial – Online transaction retention ensures customers’ valuable financial data is protected in the event of a cyber-attack. Mortgage loans are preserved without fear of tampering.
Cloud – Data stored in public clouds is growing 30% faster than data on traditional storage. Cloud providers rely on tape to provide data durability and low-cost storage subscriptions.
Tape’s popularity has often been driven by its low cost, but modern data storage requires much more, including cyber-resiliency, data durability, and the low carbon footprint that enables sustainable IT.
Cyber Resiliency – Tape is the only true air-gap data storage solution available.
Data Durability – Tape has a native single-copy durability of eleven nines, meaning the likelihood of a single bit failure is roughly 1 in 100 petabytes.
Sustainability – At scale, tape technology has a 96% lower carbon footprint than highly dense HDD storage (comparing OCP Bryce Canyon and IBM tape technology, each with 27 PB of data).
If preserving data in a cyber-resilient solution, at low cost and with relatively low carbon impact, meets your business outcomes, then why wait? Clearly tape is here to stay, and its usage is surging across nearly every business use case.
Happy 70th birthday to an amazing technology!
For more information about technology since tape’s introduction, check out this post from my colleague Mike Doran.
As I started to write this blog on recent ransomware observations, an email message popped up on my PC from our IT department advising of additional and more stringent security enhancements taking place almost immediately to toughen my company’s cybersecurity and increase our protection against current and emerging threats. A sign of these cybercrime times, indeed!
Ransomware Trending
According to a February 2022 alert from CISA (the Cybersecurity & Infrastructure Security Agency), 2021 trends showed an increasing threat of ransomware to organizations globally, with tactics and techniques continuing to evolve in technological sophistication. So-called “big game” organizations like Colonial Pipeline, Kronos, JBS, Kaseya, and SolarWinds made the ransomware headlines over the past year or so. But according to the CISA alert, by mid-2021, many ransomware threat actors, under pressure from U.S. authorities, turned their attention toward mid-sized victims to reduce the scrutiny and disruption caused by those authorities.
In a recent Enterprise Strategy Group (ESG) study, 64% of respondents said their organization had paid a ransom to regain access to data, applications, or systems. These findings are supported by the latest Threat Landscape report from the European Union Agency for Cybersecurity. It highlighted a 150% rise in ransomware in 2021 compared to 2020. The agency expects that trend to continue, and even accelerate in 2022.
But these numbers hide the stark reality of the ransomware scourge. Gangs like DarkSide, REvil, and BlackMatter are terrorizing organizations with ransomware – and they are getting smarter and more organized. They have moved beyond the basic ploy of infecting files, locking users out of their data, and demanding a fee. They still want money. But they also endanger reputations by exposing attacks, blackmailing companies by threatening to reveal corporate or personal dirty laundry, and selling intellectual property (IP) to competitors.
As a result, cybersecurity spending has become a priority in most organizations. According to ESG, 69% of organizations plan to spend more on cybersecurity in 2022 than in the previous year, while 68% of senior IT decision-makers identify ransomware as one of their organization’s top 5 business priorities. Such is the fear factor that organizations are now treating cybersecurity ahead of other organizational imperatives such as the cloud, artificial intelligence (AI), digital transformation, and application development.
New Federal Mandate and the SEC Takes Action
On March 15th, in an effort to thwart cyberattacks from foreign spies and criminal hacking groups, President Biden signed into law a requirement for many critical-infrastructure companies to report to the government when they have been hacked. This way, authorities can better understand the scope of the problem and take appropriate action.
It’s also no wonder that the Securities and Exchange Commission (SEC) is taking action. On March 9th, the SEC voted 3 to 1 to propose reporting and disclosure requirements related to cybercrime incidents and preparedness. In a nutshell, the SEC will be asking publicly traded companies:
To disclose material cybersecurity incidents
To disclose their policies and procedures to identify and manage cybersecurity risks
To disclose management’s role and expertise in managing cybersecurity risks
To disclose the board of directors’ oversight role
Specifically, the SEC will want to know:
Whether a company undertakes activities to prevent, detect and minimize the effects of cybersecurity incidents
Whether it has business continuity, contingency, and recovery plans in the event of a cybersecurity incident
Whether the entire board, certain board members, or a board committee is responsible for the oversight of cybersecurity risks
Whether and how the board or board committee considers cybersecurity risks as part of its business strategy, risk management, and financial oversight
Holding publicly traded companies and their boards accountable for best practices in combating ransomware is a big step in the right direction and will no doubt free up the required budgets and resources.
Lowering the Fear Factor
Cybersecurity is already a top spending priority for 2022 and, with SEC regulations looming, will likely continue to be a priority for quite some time. Companies are busy beefing up the tools and resources needed to thwart ransomware. They are buying intrusion response tools and services, extended or managed detection and response suites, security information and event management platforms, antivirus, anti-malware, next-generation firewalls, and more, including cybercrime insurance policies.
What may be missing in the spending frenzy, however, are some fundamental basics that can certainly lower the fear factor. Backup tools are an essential ingredient in being able to swiftly recover from ransomware or other attacks. Similarly, thorough and timely patch management greatly lowers the risk of hackers finding a way into the enterprise via an unpatched vulnerability.
Another smart purchase is software that scans data and backups to ensure that no ransomware or malware is hidden inside. It is not uncommon for a ransomware victim to conduct a restore and find that its backup files have also been corrupted by malware. Cleansing data that is ready to be backed up has become critical. These are some of the fundamental basics that need to be in place in the fight against ransomware. Organizations that neglect them suffer far more from breaches than those that take care of them efficiently.
Adding an Air Gap
Another fundamental basic is the elegantly simple air gap. When data is stored in the cloud, on disk, or in a backup appliance, it remains connected to the network. This leaves it vulnerable to unauthorized access and infection from bad actors. An air gap is essentially a physical gap between data and the network. It disconnects backed-up or archived data from the Internet.
Such a gap commonly exists by partitioning in, or removing tapes from, an automated tape library and either storing them on a shelf or sending them to a secure external service provider. If that data is properly scanned prior to being backed up or archived to ensure it is free of infection, it offers certainty that a corruption-free copy of data exists. If a ransomware attack occurs, the organization can confidently fall back on a reliable copy of its data – and avoid any ransom demands.
Effectively Combatting Ransomware
There is no silver security bullet that will 100% guarantee freedom from ransomware; effective defense requires a multi-faceted strategy. Implementation of best-of-breed security tools is certainly necessary. But they must be supported by the steadfast application of backup and patching best practices and the addition of a tape-based air gap.
CISA, the FBI, and cybersecurity insurance companies all recommend offline, offsite, air-gapped copies of data. This can be achieved cost-effectively with today’s removable, and highly portable modern tape technology. The boards of publicly traded companies will likely want to do whatever it takes to demonstrate compliance with best practices to meet the SEC requirements. This should include air-gapped tape as part of a prudent and comprehensive strategy. A best practice in these cybercrime times, indeed!
I think it’s safe to say that people like surveys; probably not everyone, but most people do. Why? Experts in the field suggest that people are willing to take surveys because respondents feel their opinions are valued and that their answers will be used, and may even result in a benefit to society. They feel their participation will impact something they care about, and they want to share their opinion with those who will listen and act on the information.
Surveying the C-Suite on Sustainability
So it’s not surprising that Fujifilm got a great response rate to a recently launched survey entitled Awareness Survey on Environmental Issues in the Digital Domain. As many as 1,200 C-suite professionals responded, including CEOs, CFOs, CSOs, CTOs, and CIOs from companies of 100 or more employees in the United States, Germany, Japan, and China.
The survey revealed that there is a growing awareness around broader environmental issues among corporate leaders, and that’s great news as the negative impacts of global warming and climate change keep piling up, flood after flood, wildfire after wildfire, and storm after storm.
When it comes to IT infrastructure specifically, the majority of U.S. respondents believe sustainability improvements in IT services and equipment can positively impact climate change, but 40% did not know, or were unsure, whether data storage can have a negative environmental impact and increase the cost of doing business.
Increasing Data Storage Requirements
Data storage can certainly be energy-intensive. This is a problem that is only getting worse as the value of data rises with the ability to analyze and derive competitive advantage from it. As a result, demand for long-term data retention is increasing. In fact, according to a recent IDC whitepaper, data to be stored grew from just 2.0 zettabytes in 2016 to 4.1 ZB in 2020 and is expected to reach 11.1 ZB in 2025. Just one ZB is a vast amount of data, equal to one million petabytes, which would require 55 million 18 TB hard disk drives (HDDs) or 55 million 18 TB LTO-9 tapes to store. The environmental impact of the energy required to support this volume of storage is greatly underestimated, as are the associated carbon emissions. When asked in the survey what barriers exist for those who have not considered more eco-friendly data storage options, 31% in the U.S. cited a lack of awareness or understanding of the issue.
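The arithmetic behind that drive count is easy to verify; here is a quick back-of-the-envelope check in Python, using the 18 TB capacities cited above:

```python
# One zettabyte expressed in smaller units.
ZB_IN_PB = 1_000_000           # 1 ZB = 1 million petabytes
PB_IN_TB = 1_000               # 1 PB = 1,000 terabytes

zb_in_tb = ZB_IN_PB * PB_IN_TB  # 1 ZB = 1 billion TB

# Number of 18 TB devices (HDDs or LTO-9 cartridges) needed for 1 ZB.
devices_per_zb = zb_in_tb / 18
print(f"{devices_per_zb / 1e6:.1f} million 18 TB devices per ZB")  # ~55.6 million
```

The 55 million figure in the text is this result rounded down; at the projected 11.1 ZB in 2025, multiply accordingly.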
Hot vs. Cold Data
There was also a lack of awareness pertaining to frequently accessed “hot” data and less frequently accessed “cold” data, with 36% of respondents saying they either don’t differentiate between the two or are unsure whether they do. And 35% don’t realize that differentiating between hot and cold data can impact sustainability, affordability, and security. An interesting fact about data is that it quickly goes cold: access frequency drops off significantly after just 30, 60, or 90 days. In fact, industry analysts estimate that 60% to 80% of all data stored is cold and qualifies as “archival.” Yet through inertia, that data often remains on energy-intensive, constantly spinning, heat-producing tiers of storage like hard disk drives.
Reducing Energy Consumption and CO2 Emissions with Tape
To help increase awareness and understanding of this issue, a number of whitepapers have been published highlighting alternative storage options, including LTO data tape. A recent IDC whitepaper shows how migrating cold data from HDDs to LTO tape can reduce data centers’ CO2 emissions by 43.7% by 2030, avoiding 664 million metric tons of CO2 cumulatively. Other research shows that tape consumes 87% less energy than equivalent amounts of HDD storage. When CO2 emissions are analyzed over the entire product lifecycle (from raw materials to production, distribution, usage, and disposal) of HDD and tape, studies show a 95% reduction in CO2 in favor of tape. The same study shows that the total cost of ownership for long-term data storage can be reduced by more than 70% using tape instead of HDD. All of this is possible by taking a storage optimization approach, where data that has aged and is infrequently accessed, otherwise known as cold data, gets moved from expensive primary storage like solid-state flash drives and HDDs to economical and environmentally friendly tape systems.
As far as security is concerned, tape is also playing a role in cybercrime prevention with air gap capabilities, WORM, and encryption. Intelligent data management software, typical in today’s active archive environments, can automatically move data from expensive, energy-intensive tiers of storage to more economical and energy-efficient tiers based on user-defined policies. By moving inactive data out of primary storage, the ransomware attack surface can also be reduced.
Renewable Energy Plus Conservation
Another interesting point from the survey: 51% of participants said their companies are using renewable energy to reduce carbon emissions, while 22% said they are doing so via climate protection projects and 13% through carbon offsets. Renewable energy is a key factor in reducing CO2 emissions, and Fujifilm is a fan, including at our LTO plant in Bedford, MA. But renewables alone likely can’t come online fast enough or cheaply enough to keep up with data growth rates of 30% to 60% annually in major data centers today. That’s why conservation has to be part of the equation. The very first metric to analyze in data center energy efficiency is simply the amount of energy being consumed.
Alternative Data Storage Options
Finally, 81% of respondents noted that they would consider an alternative data storage option that is more sustainable and affordable. That option exists in the form of today’s modern and highly advanced data tape systems, which offer the lowest energy consumption and cost profile, the best-in-class reliability rating of any storage media, and the longest archival life. So for the benefit of society, let’s act on the information that the survey reveals. It’s really just a question of getting the right data, in the right place, at the right time.
The newly released whitepaper from IT analyst firm ESG (Enterprise Strategy Group), sponsored by IBM and Fujifilm, entitled “How Tape Technology Delivers Value in Modern Data-driven Businesses,” focuses on exciting new advances in tape technology that now position tape for a critical role in effective data protection and retention in the age of zettabyte (ZB) storage. That’s right, zettabyte storage!
The whitepaper cites the need to store 17 ZB of persistent data by 2025. This includes “cold data” stored long-term and rarely accessed that is estimated to account for 80% of all data stored today. Just one ZB is a tremendous amount of data equal to one million petabytes that would need 55 million 18 TB hard drives or 55 million 18 TB LTO-9 tapes to store. Just like the crew in the movie Jaws needed a bigger boat, the IT industry is going to need higher capacity SSDs, HDDs, and higher density tape cartridges! On the tape front, help is on the way as demonstrated by IBM and Fujifilm in the form of a potential 580 TB capacity tape cartridge. Additional highlights from ESG’s whitepaper are below.
New Tape Technology
IBM and Fujifilm set a new areal density record of 317 Gb/sq. inch on linear magnetic tape, translating to a potential native cartridge capacity of 580 TB. The record was achieved with a new magnetic particle called Strontium Ferrite (SrFe), which can deliver capacities extending well beyond disk, LTO, and enterprise tape roadmaps. SrFe magnetic particles are 60% smaller than the current de facto standard Barium Ferrite particles, yet exhibit even better magnetic signal strength and archival life. On the hardware front, the IBM team has developed tape head enhancements and servo technologies that exploit even narrower data tracks, contributing to the increase in capacity.
The Case for Tape at Hyperscalers and Others
Hyperscale data centers are major new consumers of tape technologies due to their need to manage massive data volumes while controlling costs. Tape is allowing hyperscalers including cloud service providers to achieve business objectives by providing data protection for critical assets, archival capabilities, easy capacity scaling, the lowest TCO, high reliability, fast throughput, low power consumption, and air gap protection. But tape also makes sense for small to large enterprise data centers facing the same data growth challenges including the need to scale their environments while keeping their costs down.
Data Protection, Archive, Resiliency, Intelligent Data Management
According to an ESG survey revealed in the whitepaper, tape users identified reliability, cybersecurity, long archival life, low cost, efficiency, flexibility, and capacity as top attributes of tape today and favor tape for its long-term value. Data is growing relentlessly, with longer retention periods, as the value of data increases thanks to the ability to apply advanced analytics to derive a competitive advantage. Data is often kept longer to meet compliance, regulatory, and corporate governance requirements. Tape is also playing a role in cybercrime prevention with WORM, encryption, and air gap capabilities. Intelligent data management software, typical in today’s active archive environments, automatically moves data from expensive, energy-intensive tiers of storage to more economical and energy-efficient tiers based on user-defined policies.
ESG concludes that tape is the strategic answer to the many challenges facing data storage managers including the growing amount of data as well as TCO, cybersecurity, scalability, reliability, energy efficiency, and more. IBM and Fujifilm’s technology demonstration ensures the continuing role of tape as data requirements grow in the future and higher capacity media is required for cost control with the benefit of CO2 reductions among others. Tape is a powerful solution for organizations that adopt it now!
As recently announced by Fujifilm, LTO-9 has arrived and is available for immediate delivery. It certainly comes at a time when the IT industry is so challenged to manage rampant data growth, control costs, reduce carbon footprint and fight off cyber-attacks. LTO-9 is coming to market just in time to meet all of these challenges with the right features like capacity, low cost, energy efficiency, and cyber security.
What a Great Run for LTO
First of all, it is remarkable to look at how far LTO Ultrium technology has come since its introduction. LTO made its market debut in 2000 with the first-generation LTO-1 at 100/200 GB native/compressed capacity and 384 data tracks. The transfer rate was just 20 MB/s native and 40 MB/s compressed. Fast forward 21 years to the availability of LTO-9, now with 18/45 TB native/compressed capacity on 8,960 data tracks and a transfer rate of 400 MB/s native, 1,000 MB/s compressed! In terms of compressed capacity, that’s a 225X increase over LTO-1. Since 2000, Fujifilm alone has manufactured and sold over 170 million LTO tape cartridges, a pretty good run indeed.
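Those generation-over-generation gains are simple to check; a quick calculation using the figures above:

```python
# LTO-1 (2000) vs. LTO-9 (2021) capacities, in GB.
lto1_native, lto1_compressed = 100, 200
lto9_native, lto9_compressed = 18_000, 45_000   # 18 TB / 45 TB

print(lto9_compressed / lto1_compressed)  # 225.0x compressed capacity growth
print(lto9_native / lto1_native)          # 180.0x native capacity growth
print(8_960 / 384)                        # ~23.3x more data tracks
```

The 225X headline figure is the compressed-capacity ratio; the native ratio is 180X, since LTO-9 assumes a 2.5:1 compression ratio versus LTO-1’s 2:1.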
Capacity to Absorb Bloated Data Sets
We are firmly in the zettabyte age now, and it’s no secret that data is growing faster than most organizations can handle. With compound annual data growth rates of 30% to 60% for most organizations, keeping data protected for the long term is increasingly challenging. Just delete it, you say? That’s not an option, as the value of data is increasing rapidly thanks to the many analytics tools we now have to derive value from it. If we can derive value from that data, even older data sets, then we want to keep it indefinitely. But this data can’t economically reside on Tier 1 or Tier 2 storage. Ideally, it will move to Tier 3 tape as an archive or active archive where online access can be maintained. LTO-9 is perfect for this application thanks to its large capacity (18 TB native, 45 TB compressed) and high data transfer rate (400 MB/s native, 1,000 MB/s compressed).
Lowest TCO to Help Control Costs
Understanding your true total cost of ownership is vitally important today as exponential data growth continues unabated. The days of just throwing more disk at storage capacity issues without any concern for cost are long gone. In fact, studies show that IT budgets on average are growing at less than 2.0% annually, yet data growth is in the range of 30% to 60%. That’s a major disconnect! When compared to disk or cloud options, automated tape systems have the lowest TCO profile, even for relatively low data volumes of less than one petabyte. And for larger workloads, the TCO is even more compelling. Thanks to LTO-9’s higher capacity and fast transfer rate, the efficiency of automated tape systems will improve, keeping the TCO advantage firmly on tape’s side.
Lowest Energy Profile to Reduce Carbon Footprint
Perhaps of even greater concern these days are the environmental impacts of energy-intensive IT operations and their negative effect on global warming and climate change. You may have thought 2020 was a pretty bad year, being tied with 2016 for the hottest year on record. Remember the raging forest fires out West or the frequency of hurricanes and tropical storms? Well, it turns out 2021 was just as bad if not worse, with the Caldor Fire and Hurricane Ida fresh in our memory.
Tape technology has a major advantage in terms of energy consumption, as tape systems require no energy unless tapes are being read or written in a tape drive. Tapes that are idle in a library slot or vaulted offsite consume no energy. As a result, the CO2 footprint is significantly lower than that of always-on disk systems, which spin constantly and generate heat that must be cooled. Studies show that tape systems consume 87% less energy, and therefore produce 87% less CO2, than equivalent amounts of disk storage in the actual usage phase. More recent studies show that when you look at the total life cycle, from raw materials and manufacturing to distribution, usage, and disposal, tape actually produces 95% less CO2 than disk. When you consider that 60% to 80% of data quickly goes cold, with the frequency of access dropping off after just 30, 60, or 90 days, it only makes sense to move that data from expensive, energy-intensive tiers of storage to inexpensive, energy-efficient tiers like tape. The energy profile of tape only improves with higher-capacity generations such as LTO-9.
A Last Line of Defense Against Cybercrime
Once again, 2021 was just as bad if not worse than 2020 when it comes to cybercrime and ransomware attacks. Every webinar you attend on this subject will say something to the effect of: “it’s not a question of if; it’s a question of when you will become the next ransomware victim.” The advice from the FBI is pretty clear: “Backup your data, system images, and configurations, test your backups, and keep backups offline.”
This is where the tape air gap plays an increasingly important role. Tape cartridges have always been designed to be easily removable and portable in support of any disaster recovery scenario. Thanks to the low total cost of ownership of today’s high-capacity automated tape systems, keeping a copy of mission-critical data offline, and preferably offsite, is economically feasible – especially considering the prevalence of ransomware attacks and the associated costs of recovery, ransom payments, lost revenue, profit, and fines.
In the event of a breach, organizations can retrieve a backup copy from tape systems, verify that it is free from ransomware and effectively recover. The high capacity of LTO-9 makes this process even more efficient, with fewer pieces of media moving to and from secure offsite locations.
The Strategic Choice for a Transforming World
LTO-9 is the “strategic” choice for organizations because using tape to address long-term data growth and volume is strategic; adding disk is simply a short-term tactical measure. It’s easy to just throw more disks at the problem of data growth, but if you are being strategic about it, you invest in a long-term tape solution.
The world is “transforming” amid the COVID pandemic: everyone has to do more with less, budgets are tight, digital transformation has accelerated, and we are now firmly in the zettabyte age, which means we have more data to manage efficiently, cost-effectively, and in an environmentally friendly way. The world is also transforming as new threats like cybercrime become a fact of life, not just a rare occurrence that happens to someone else. In this respect, LTO-9 indeed comes to market at the right time with the right features to meet all of these challenges.
I recently attended a webinar about why IT folks have a love/hate relationship with the cloud. They love the cloud because of its on-demand flexibility, unlimited compute and storage capacity, elimination of CAPEX costs, etc. They hate it, according to the webinar presenter, because of the cost that often produces “sticker shock.” Other irritants might include regulatory compliance issues and cyber security concerns.
To be completely fair to the cloud, the presenter explained that discipline and accountability could be brought to bear to help control costs and that organizations need to establish “a cloud center of excellence.” But at the same time, the presenter showed data from a study that suggested that 58% of respondents were moving some cloud-based workloads back to on-premises, private cloud environments. Finally, the presenter advised the audience to “understand your true cost of IT, TCO tools are out there!”
Getting Back to Hybrid Storage Strategies
I think the overall message of the webinar was that the cloud is great when used for the right applications and that a hybrid approach including a healthy mix of public cloud plus private cloud makes a lot of sense. In fact, the trend prior to COVID-19 appeared to be clearly hybrid. Cloud repatriation was happening as IT managers realized that the cloud is not a panacea for everything. During the COVID period, private cloud data centers were understaffed and under-supported by vendors, so the path of least resistance was to over-leverage the public cloud once again. As we begin to emerge from COVID lockdowns and IT staff returns to the data center, attention is being paid once again to finding a healthy mix of public cloud and on-premises private cloud.
This approach only makes sense and clearly reinforces that it is not an either-or scenario. In the case of storage, the cloud complements on-premises storage, including today’s highly advanced and automated tape systems. Cloud comes in handy, for example, when on-demand access is frequently needed by multiple clients, while tape systems can manage less frequently accessed, large data sets needing long-term retention, including sensitive and mission-critical data that can be air-gapped as a cyber security best practice. Tape is particularly well suited for these applications thanks to tape’s:
Ease of scalability
Ease of removability
Long archival life and reliability
Low energy consumption and low carbon footprint
TCO Tools are Out There
Getting back to the webinar story and the advice to “understand your true cost of IT”: indeed, TCO tools are out there, and Fujifilm is pleased to offer a free, web-based interactive TCO tool developed by IT economics expert Brad Johns Consulting, LLC. This tool compares the 5-year and 10-year TCO of automated tape systems against economy disk systems and cloud-based cold storage. The tool allows users to input the volume of data to be stored, the annual growth rate, and the percentage of data retrieved from the cloud, as well as other variables such as the local cost per kWh, the expense of full-time storage management staff, the number of copies of data, etc. The tool has been available for many years and has evolved over time to be as comprehensive as possible; it includes the following CAPEX and OPEX cost variables:
Media and hardware for disk and tape
Maintenance for disk and tape
Energy for disk and tape
Offsite vaulting for tape
Storage management for disk, tape, and cloud
Storage and retrieval fees for cloud
Data transfer fees for cloud
Business level support for cloud
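The Fujifilm tool itself is web-based and far more detailed, but to illustrate how CAPEX and OPEX variables like those above combine into a TCO comparison, here is a deliberately simplified toy model. Every dollar figure in it is a made-up placeholder for illustration only, not a number from the actual tool:

```python
def simple_tco(capacity_tb, years, cost_per_tb, annual_opex_per_tb):
    """Toy TCO model: upfront media/hardware CAPEX plus recurring OPEX.

    All inputs are hypothetical placeholders; the real tool also models
    growth rate, retrieval fees, staffing, vaulting, and more.
    """
    capex = capacity_tb * cost_per_tb                 # media and hardware
    opex = capacity_tb * annual_opex_per_tb * years   # energy, maintenance, etc.
    return capex + opex

# Illustrative-only unit costs (NOT vendor figures): 5 PB held for 10 years.
tape_tco = simple_tco(5_000, 10, cost_per_tb=10, annual_opex_per_tb=1)
disk_tco = simple_tco(5_000, 10, cost_per_tb=25, annual_opex_per_tb=6)
print(f"tape: ${tape_tco:,.0f}  disk: ${disk_tco:,.0f}")
```

Even this crude sketch shows why recurring OPEX, especially energy, dominates long-retention comparisons; the real tool refines each of the line items listed above.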
Reducing Energy Consumption and CO2 with Tape
Regarding the cost of energy for disk and tape, this expense can be significant over time, especially for disk systems that spin constantly, 24/7, generating heat and therefore requiring cooling. Given the heightened awareness of global warming and climate change, organizations are looking for ways to reduce energy consumption and their carbon footprint. Data center operations are no exception and have been spotlighted for their energy-intensive applications. Making greater use of renewable energy is part of the answer, but renewable energy can’t come online fast enough, or cheaply enough, to keep up with exponential data growth. Conservation has an even bigger potential to make a difference, and that is where tape systems really shine.
Studies show that under certain scenarios, inclusive of data management servers and network infrastructure, tape consumes 87% less energy than equivalent amounts of disk storage and therefore produces 87% less CO2, while also reducing TCO by 86%. Given that data quickly becomes static, with frequency of access dropping dramatically after just 30 to 90 days, it makes sense to move that data from energy-intensive, higher-cost tiers of storage like flash, performance disk, or even economy disk to lower-cost, energy-efficient tape systems. A good active archive architecture with intelligent data management software is a great way to achieve such storage optimization (getting the right data, in the right place, at the right time, and at the right cost).
To help highlight tape’s energy advantage and the corresponding reduction in CO2, the Fujifilm TCO tool now includes a calculation focused purely on the storage hardware layer that shows the reduction in CO2 compared to disk systems, with an example shown below based on storing 5.0 PB for 10 years with 30% annual growth and 12% data retrieval from the cloud.
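As a rough sketch of that scenario's growth math (not the tool's actual CO2 model), compound growth and the 87% storage-layer reduction can be combined like this; the per-PB emission factor is purely an assumption:

```python
# Hypothetical sketch: 5.0 PB growing 30% per year over a 10-year period,
# applying the ~87% CO2 reduction at the storage hardware layer only.
# The tonnes-per-PB-year figure is an assumed placeholder.

def capacity_pb(year, start_pb=5.0, growth=0.30):
    """Capacity in PB during a given year (year 1 = start_pb)."""
    return start_pb * (1 + growth) ** (year - 1)

# Total PB-years stored over the decade drives both energy use and CO2.
total_pb_years = sum(capacity_pb(y) for y in range(1, 11))

disk_co2 = total_pb_years * 10.0      # assumed tonnes CO2 per PB-year on disk
tape_co2 = disk_co2 * (1 - 0.87)      # tape: ~87% less, per the studies cited
print(f"CO2 avoided by storing on tape: {disk_co2 - tape_co2:.0f} tonnes")
```

The point of the sketch is the shape of the curve: because capacity compounds at 30% per year, most of the PB-years (and hence most of the avoidable CO2) accrue in the later years of the retention period.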
So not only is TCO reduced with automated tape systems compared to disk and cloud storage, but a meaningful reduction in CO2 can be achieved as well, which is exactly what we all need to be doing to help slow the negative impacts of global warming and climate change.
With the recent high-profile cases of ransomware hitting the news cycle, like Colonial Pipeline, JBS and others, it appears ransomware is not going away anytime soon and may just be in its infancy. Ransomware is a lucrative business model for cybercriminals, with ransom demands that can reach into the millions of dollars, as was the case with Colonial ($4.4 M) and JBS ($11.0 M). Ransomware-as-a-Service (RaaS) is making the barriers to entry extremely low, so we can expect to see more bad actors entering the business and more attacks across every industry.
The sense of urgency is ratcheting up as the C-suite is clearly focused on cybersecurity. I was speaking to one customer about deploying offsite/offline backup tapes as an air gap who said, “Cybersecurity is the top focus for us in the next six weeks. We need to act fast.” In addition to shoring up cybersecurity plans, or putting key components in place, the notion of acquiring cyber insurance is cropping up and is no doubt also on the C-suite agenda.
So what is Cyber Insurance?
Cyber insurance, also referred to as cyber-liability insurance, seeks to help companies recover from and mitigate the damage of cyberattacks such as ransomware, data destruction or theft, extortion demands, denial-of-service attacks, etc. This class of insurance has been around since the early 1990s and is rapidly evolving and growing in terms of revenue for insurance companies. One report I came across pegged the market for this type of insurance at $3.15 B in 2019, with expectations that it will exceed $20 B by 2025. According to another report, about a third of all large U.S. companies carry cyber insurance.
Typical corporate insurance policies for general liability and property damage most likely don’t cover cybercrime, so cyber insurance has become a stand-alone offering specifically suited to cybercrime protection. Depending on the policy, below are just a handful of items that may typically be covered:
Incident response costs related to restoring systems to pre-existing conditions
Recovery cost of data or software that has been deleted or corrupted
The cost of cyber extortion including the negotiation and execution of ransom payments
Lost profits due to IT system downtime
Financial theft or fraud arising from the cyber attack
There is increasing pressure around the world to reduce emissions and lower mankind’s carbon footprint. It is up to the IT sector to do its part, and that means considerably lowering power usage. But that is easier said than done when you consider the statistics.
IDC predicts we will arrive at the mind-boggling figure of 175 zettabytes of data in the digital universe within four years. 175 ZB? Consider how long it takes most users to fill a one TB drive. Well, 175 ZB equates to approximately 175 billion of those 1 TB drives.
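The arithmetic behind that comparison is easy to verify, using decimal storage units (1 ZB = 10^21 bytes, 1 TB = 10^12 bytes):

```python
# Sanity check of the figure above: 175 ZB expressed as 1 TB drives,
# using decimal (SI) storage units.
ZB = 10 ** 21   # bytes in a zettabyte
TB = 10 ** 12   # bytes in a terabyte

drives = 175 * ZB // TB
print(drives)   # 175000000000 -> 175 billion 1 TB drives
```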
The problem is this: how do you reduce IT’s overall power draw in the face of a massive and continual upsurge in data storage? Once 175 ZB of data exists, there is no possibility of containing electrical usage if the vast majority of storage is sitting on hard disk drives (HDDs). The only solution is to cure the industry’s addiction to disk.
Here are the numbers. Data centers alone account for close to 2% of all power consumed in the U.S., about 73 billion kilowatt-hours (kWh) in 2020. That is enough to set off the alarm bells. Yet tremendous progress has been made over the past two decades in terms of data center efficiency. When power consumption in data centers soared by 90% between 2000 and 2005, the industry acted forcefully. The rate of growth slowed to 24% between 2005 and 2010 and then fell to less than 5% for the entire decade between 2010 and 2020. That’s miraculous when you consider that it was achieved during the largest surge in storage growth in history. Smartphones, streaming video, texting, multi-core processors, analytics, the Internet of Things (IoT), cloud storage, big data, and other IT innovations demanded the retention of more and more data.
Big strides were made in Power Usage Effectiveness (PUE – the ratio of total data center power consumption to the power consumed by IT equipment alone). Data centers have largely done a good job of improving the efficiency of their operations. But the one area lagging badly behind is storage efficiency.
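As a quick illustration of the metric (the wattages below are made up), PUE is simply total facility power divided by IT equipment power, so an ideal data center approaches a PUE of 1.0:

```python
# PUE = total facility power / IT equipment power.
# Sample figures are assumptions for illustration only.
def pue(total_facility_kw, it_equipment_kw):
    return total_facility_kw / it_equipment_kw

print(pue(1500.0, 1000.0))   # 1.5 -> 0.5 kW of cooling/overhead per kW of IT load
```

Note that PUE measures facility overhead, not how efficiently the IT equipment itself stores data, which is why storage efficiency can lag even as PUE improves.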
By Chris Kehoe, Head of Infrastructure Engineering, FUJIFILM Recording Media U.S.A., Inc.
Object storage has many benefits. Near infinite capacity combined with good metadata capabilities and low cost have propelled it beyond its initial use cases of archiving and backup. More recently, it is being deployed as an aid to compute processing at the edge, in analytics, machine learning, disaster recovery, and regulatory compliance. However, one recent paper perhaps got a little over-enthusiastic in claiming that disk-based object storage provided an adequate safeguard against the threat of ransomware.
The basic idea proposed is that ransomware protection is achieved by keeping multiple copies of object data. If the object store suffers a ransomware incursion, the backup is there for recovery purposes. The flaw in this logic, however, is that no technology that is online can be considered immune to ransomware. Unless it is the work of an insider, any attempt at hacking must enter via online resources. Any digital file or asset that is online – whether it is stored in a NAS filer, a SAN array, or on object storage – is open to attack.
Keeping multiple copies of object storage is certainly a wise strategy and does offer a certain level of protection. But if those objects are online on disk, a persistent connection exists that can be compromised. Even in cases where spin-down disk is deployed, there still remains an automated electronic connection. As soon as a data request is made, therefore, the data is online and potentially exposed to the nefarious actions of cybercriminals.
By Rich Gadomski, Head of Tape Evangelism, FUJIFILM Recording Media U.S.A., Inc.
In case you were not aware of it, March 31st is World Backup Day. To be sure, a quick visit to the official website confirms that this day is just a reminder for consumers to back up their PCs and cell phones. According to the website, only 25% of consumers are protecting their precious memories. Surely the helpful recommendations for routine backup don’t apply to the storage professionals who keep our enterprise data safe and our websites up and running. Or do they?
When Disaster Strikes a Data Center
On Wednesday, March 10th, 2021, a fire broke out at the OVHCloud data center in Strasbourg, France. The fire quickly spread out of control and completely destroyed compute, network and storage infrastructure. According to some accounts, as many as 3.6 million websites including government agencies, financial institutions and gaming sites went dark. Others complained that years’ worth of data was permanently lost.
We know that the statistics regarding the cost of downtime and the number of companies that never recover from catastrophic data loss are alarming. The often-cited University of Texas study shows that 94% of companies suffering catastrophic data loss do not survive: 43% never reopen and 51% close within two years. That’s why the cardinal sin in data protection is not being able to recover data.
OVH reminds us that, however unlikely, data center disasters like an all-consuming fire can still happen. Although these days a more sinister threat continues to loom and tends to grab the headlines and our attention, namely: ransomware.