FUJIFILM INSIGHTS BLOG

Data Storage

Digital Preservation: What’s the Most Cost-Effective Way to Preserve Data?

Reading Time: 5 minutes


Businesses of every size and type generate and collect enormous amounts of data. And there's no getting around the fact that storing all that data is expensive. But many of the costs associated with data storage are higher than they need to be, particularly when it comes to cold data.

In business terms, choosing the best data storage method amounts to determining which method offers the best return on investment. The problem is the confusion surrounding the various data types and how to manage and store them. Put simply, many organizations haven’t established a data management strategy.

In this article, we’ll examine the data preservation area of data management. While it’s only one part of your business’s larger data strategy, as you’ll see, the wrong approach has a significant impact on a business’s bottom line.

From Hot to Cold


In data storage, one of the most important distinctions between data types is what's referred to as hot versus cold data. Data storage services often use this temperature analogy to describe their service tiers in terms of performance. The analogy hearkens back to the early days of computing, when internal hard drives sat near the "hot" computer circuitry while external drives were located further away.

While hot and cold data aren’t standardized technical terms, the theme is common across the storage industry, and understanding it is key to developing a cost-effective strategy for data preservation.

Hot data storage refers to high-performance storage that’s readily accessible. This is the storage that people read and write to on a regular basis. Some examples of hot data storage include:

  • The solid-state drive in your personal computer
  • The drives in a data center that hold information for a website
  • The servers your company uses to share and collaborate on files

On the other hand, cold storage refers to storage media with much slower performance and minimal access requirements. Data kept in cold storage is typically written once, rarely accessed, and often archival in nature. It can be large quantities of research data, critical business information that requires backup and redundancy, or records an organization needs to retain for legal or regulatory reasons.

Given these characteristics, the technologies powering hot and cold storage have significant cost differences. Prices vary, but hot storage can cost four times as much as cold storage. If your objective is data preservation, high-performance storage is not the solution.

Where and How to Preserve Data


With an understanding of hot and cold data in mind, the next questions are where and how to store data. While myriad options exist, they fall into two categories: on-premises storage and cloud storage services. Each approach has its benefits and drawbacks, and deciding which one is right for an organization is a complex, if not daunting, task.

Nearly every organization uses cloud services in some form or another. Cloud computing has brought new and surprising innovations, with many still to come. But the promises of the affordability of cloud solutions might have been a little inflated. More than a third of all enterprises spend more than $12 million every year implementing cloud solutions.

In most applications where cloud storage excels, performance is the determining factor. And while there are more affordable cloud storage services for archival purposes, these services charge retrieval fees whenever you need your data back, which can drive up costs.
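
To see how retrieval fees change the picture, here is a minimal cost sketch. The per-GB prices below are hypothetical placeholders for illustration only, not any provider's actual rates:

```python
# Illustrative annual cost model for cloud storage tiers.
# Prices are hypothetical placeholders, not any provider's actual rates.

def annual_cost_usd(stored_tb, retrieved_tb_per_year,
                    storage_price_per_gb_month, retrieval_price_per_gb):
    """Annual cost = 12 months of storage plus per-GB retrieval charges."""
    storage = stored_tb * 1000 * storage_price_per_gb_month * 12
    retrieval = retrieved_tb_per_year * 1000 * retrieval_price_per_gb
    return storage + retrieval

# 500 TB archived, 50 TB read back per year: compare a "hot" tier with no
# retrieval fee against an archival tier that charges for every GB read.
hot = annual_cost_usd(500, 50, 0.020, 0.00)
cold = annual_cost_usd(500, 50, 0.004, 0.09)  # cheap to hold, costly to read

print(f"hot tier:  ${hot:,.0f}/year")
print(f"cold tier: ${cold:,.0f}/year")
```

With these assumed prices the archival tier is still far cheaper to hold data in, but every terabyte read back narrows the gap, which is exactly why access patterns must drive tier selection.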

Finally, cloud services raise concerns where security is involved. When an organization's data is stored on infrastructure outside its control, securing that information is far more complex. And even though cloud security has improved considerably, data breaches are still common. For this reason alone, many businesses dealing with sensitive data choose to preserve data on their own infrastructure.

When an organization has robust security policies in place, there’s simply no contest: Data is more secure on-premises.

A Hybrid Approach to Data Storage and Preservation


For most businesses, a hybrid cloud approach for data management is the most practical solution in terms of cost and performance. Cloud storage isn’t cheap, but the low barrier to entry, sheer scalability and immense integration capabilities ensure it will play an integral role in most businesses’ IT architecture.

Meanwhile, for organizations with vast amounts of archival data for which preservation and security are paramount, on-premises infrastructure is the smart choice. Of the many options available, tape storage has significant benefits compared to other storage media for cold data preservation.

In terms of affordability, tape storage has the lowest cost per gigabyte and is far more reliable than disk-based drives, meaning fewer bit errors. Over 5- and 10-year periods, an automated tape-based archival system can save organizations 49% to 86% over a disk-based system. Cloud-based solutions don't compare, either: even with minimal retrieval from a cloud cold storage service, tape still delivers savings of more than 40%.

If you'd like to see the numbers for yourself, this free online tool, developed by storage economics expert Brad Johns Consulting, can show you exactly how much tape can save on your business's data management.

Affordability aside, an automated tape system improves security significantly. One of its biggest benefits is that it can be leveraged for off-site cold storage, creating an air gap between an organization's data and the rest of the world. Tape's inherent mobility means the chances of a business's mission-critical data being compromised are almost zero.

Preserve Your Data With Object Archive


Recognizing a hybrid future where organizations rely on a combination of cloud services and private infrastructure, FUJIFILM developed Object Archive to bridge the gap between secure cold storage and cloud data. Compatibility with Amazon's S3 API means Object Archive fits neatly into existing workflows and provides a way to migrate large quantities of cold but valuable data to tape seamlessly, avoiding the unnecessary expense of disk-based object storage.

Automated tape solutions coupled with FUJIFILM's Object Archive provide businesses with a powerful, environmentally friendly solution that's infinitely scalable. Helping customers preserve and protect their data: this is FUJIFILM's mission.

Get started with a free trial of Object Archive today.


Flash Memory Summit a Big Hit with Tape and DNA Included

Reading Time: 4 minutes

I had the opportunity to attend in person and present on the latest in tape technology at the 16th Annual Flash Memory Summit (FMS) held in Santa Clara last week. That’s right, tape technology at a flash conference. My friends from the DNA Data Storage Alliance were there presenting too. So what gives?

Well, first of all, let me say that, as the organizer of the Fujifilm Global IT Executive Summit, I was really impressed with the quality and scale of the show. It was my first FMS, and I wasn't quite prepared for a General Session ballroom set for 1,500 people or so (plus overflow space and monitors in the outside hallway like a good evangelical church). Compliments to Tom Coughlin, Chuck Sobey and so many others for putting on a well-organized and content-rich program.

Cold Storage Supports Hot Storage

So back to the question, what gives in terms of cold storage having a seat at a hot storage show?  I was kind of wondering about this myself and honestly never thought that my abstract to talk about tape would be accepted. But it was.

Coming from the East Coast, I arrived the day before the official opening so I went to get my badge and do a little recon. I had the luck to run into Chuck Sobey as he was making the rounds and checking on everything. While we chatted, Chuck mentioned that he and the FMS committee were considering a specific archive agenda for FMS 2023 and would I be interested in that? Of course!

The next day my break-out panel on data storage was bright and early but I had a good time working with our panel moderator, Jean Bozman of Cloud Architects Advisors, and fellow panelists Wim De Wispelaere of Western Digital and Javier Gonzalez of Samsung.

The title of my presentation was “Leveraging Tape to Support Primary Flash Storage.” My very basic agenda covered the following points:

  • While flash technology has been around for 35 years, tape is celebrating 70 years. Why is that?
  • Healthy capacity shipment forecast for tape with CAGR of 19%, according to TrendFocus
  • Price relationships between SSD, HDD and tape (see chart below)
  • CO2 relationships between SSD, HDD and tape (see chart below)
  • Reliability relationships between SSD, HDD and tape (see chart below)
  • The evolving storage pyramid with hot, warm and cold tiers of data
  • Fujifilm/IBM tape tech demos featuring Strontium Ferrite at 580 TB native capacity
  • The LTO roadmap featuring LTO-12 at 144 TB native capacity
  • And finally my conclusion: It’s not about flash vs. disk vs. tape, it’s about storage optimization. Getting the right data in the right place at the right time, and the right energy profile and cost!

So that's how you leverage tape to support primary flash storage. You move cold data that's gone static and inactive but can't be deleted, from expensive, energy-intensive primary storage tiers to low-cost, eco-friendly tiers of storage like tape. An IBM executive once said, "The best way to afford more flash is to deploy tape systems."

Explosion of Data That Needs to Be Stored

After our data storage panel, I attended the larger general session keynote presentations from the big flash companies like Kioxia (formerly Toshiba), Western Digital, Samsung, SK Hynix, Marvell, Intel and others.

What struck me was the amazing amount of innovation happening in CPUs, GPUs, DPUs and memory at the very top of the “pyramid” such as SRAM, DRAM and SCM (storage class memory). New data intensive workloads and applications are exploding including real-time analytics, AI/ML, VR/AR, IoT, HPC, and cybersecurity just to name a few. In support of these workloads and applications, significant advances are happening in speed, cost performance, power consumption, and scalability.

This, of course, is increasing not only the amount of data being generated but also its value. Increasing amounts of valuable data will need to be stored for longer periods of time. The more data we can save, the better for analytics. But to do this cost-effectively, reliably and in an energy-conscious manner, the industry is going to need increasing amounts of archival storage like tape (think active archive), DNA (think deep archive) and perhaps hybrid or yet-to-be-developed storage solutions.

So that’s why tape and DNA were at Flash Memory Summit and why we hope to see more dedicated archival content at FMS 2023.


Avoiding Potential Risk of Stagnation in the Secondary Storage Market

Reading Time: 2 minutes

By Guest Blogger Peter Faulhaber, former president and CEO, FUJIFILM Recording Media U.S.A., Inc.

The Hyperscale Data Center (HSDC) secondary storage market is quickly emerging, requiring advanced petascale and exascale storage solutions that are not currently available. According to HORISON Information Strategies, HSDCs currently use around 3% of the world's electrical energy. Due to the massive energy footprint of HSDCs, climate protection measures have become increasingly important in recent years, with cloud computing offering the greatest advantages for sustainable operation by reducing the energy and carbon footprint over the entire data life cycle.

The slowing rate of HDD and tape technology development roadmaps in recent years, along with HDD and tape storage supplier consolidations, are particularly concerning trends for HSDCs. Neither HDD nor tape technology is currently positioned by itself to effectively meet the enormous storage performance and capacity that HSDCs will demand. High technical asset specificity requires significant R&D investment, yet it has limited ROI potential outside of hyperscalers.

HSDCs manage over 60% of the world's data today with a CAGR of 35-40%, and a growing need for cost-effective secondary storage that still meets certain performance thresholds.

Vendors and manufacturers are disincentivized from investing in novel technology; the risk/reward is not high enough, while HSDCs leverage their buying and bargaining power. Manufacturers need to invest hundreds of millions of dollars over a long development cycle to bring innovative solutions to market, without a commitment from the HSDC market.

As a result, the secondary storage market is left with incremental investments in existing technologies and moves slowly.

The conditions are set for a widening gap between customer demands and product solutions in the secondary storage market.

The current “vendor-driven” strategy will not keep pace with HSDC requirements for secondary storage as such offerings fall far behind HSDC curves. Photonics, DNA, glass, and holographic experiments are attempting to address the market, and have been in labs for decades, but most have drawbacks, and none are on the near-term horizon for customer deployment. These initiatives show that a change is needed to get ahead of the demand curve.

However, the opportunity also exists to mitigate this risk by bringing the interested parties together to share the risk/reward paradigm. HSDCs need a quantum leap, which only comes with significant investment, best shared by the interested parties.

The Semiconductor Research Corporation (SRC) addressed the concept of vertical market failure in September 2021 in its published article "New Trajectories for Memory and Storage," stating, "The prospect of vertical market failure can be mitigated by private sector market participants through risk-share agreements between customers and suppliers, as well as increased vertical integration."

Without change, current technologies will fall far behind HSDC demand curves, and the current vendor-driven trajectory increases the likelihood of unmet demand and stagnation of growth for all involved.

 


WOW! 70 Years of Tape Technology

Reading Time: 2 minutes

By Guest Blogger, Dr. Shawn O. Brume Sc. D., IBM Tape Evangelist and Strategist

According to a study by McKinsey, the average lifespan of companies listed in the Standard & Poor's index is less than 18 years! That means tape technology has already been in business almost four times longer than the average S&P company will survive. Tape technology celebrated 70 years young on May 21st. Tape has been, and continues to be, the most transformative data storage technology in history.

In the '50s it was the only viable technology for storing data generated by the few computers in existence. In the '60s tape took the world to the moon and preserved the data for use nearly 40 years later, when it was retrieved to assist modern space exploration. By the '70s tape was dominating storage, transforming the financial industry by providing the ability to access account data with minimal human intervention. The '80s and '90s continued the transformation of data availability with transactional data storage for ATMs; tape was also key in the investigation of the space shuttle Challenger disaster, an investigation aided by tape's durability even when submerged in saltwater.

Today tape lives in the data center, preserving zettabytes of data. That data is preserved and utilized across nearly every industry. Examples:

Healthcare – Data preserved on tape is being utilized to develop new predictive health services. Digital medical records can be retained for the life of patients and shared across organizations.

Financial – Online transaction retention ensures customers' valuable financial data is protected in the event of a cyber-attack. Mortgage loans are preserved without fear of tampering.

Cloud – Data stored in public clouds is growing at a 30% faster rate than traditional storage. Cloud providers rely on tape to provide data durability and low-cost storage subscriptions.

Tape's popularity has often been driven by the low cost of storage, but modern data storage requires much more, including cyber-resiliency, data durability and the low carbon footprints that enable sustainable IT.

Cyber Resiliency – Tape is the only true air-gap data storage solution available.

Data Durability – Tape has a native single-copy durability of 11 nines. This means the likelihood of a single bit failure is 1 in 100 petabytes.

Sustainability – At scale, tape technology has a 96% lower carbon footprint than highly dense HDD storage (comparing OCP Bryce Canyon and IBM tape technology with 27 PB of data).
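
The durability figure above, one bit failure per 100 petabytes, can be converted into a per-bit error rate with a quick calculation (a sketch assuming decimal petabytes):

```python
# Convert "one expected bit error per N bytes" into a per-bit error rate.

def implied_bit_error_rate(bytes_per_error):
    """Per-bit error rate implied by one expected error in 'bytes_per_error' bytes."""
    return 1 / (bytes_per_error * 8)  # 8 bits per byte

petabyte = 10**15
ber = implied_bit_error_rate(100 * petabyte)
print(f"implied bit error rate: {ber:.2e}")  # on the order of 1e-18
```

That order of magnitude, roughly 1.25e-18, is consistent with the very low uncorrectable bit error rates commonly quoted for modern tape versus disk.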

If preserving data in a cyber-resilient solution, at low cost and with relatively low carbon impact, meets your business outcomes, then why wait? Clearly tape is here to stay, and it is surging in usage across nearly every business use case.

Happy 70-years to an amazing technology!

For more information about technology since tape’s introduction, check out this post from my colleague Mike Doran.

For more information on current tape products see the IBM product page.

 


Tape Advancements Push Storage and Sustainability Benefits to New Levels

Reading Time: 2 minutes

The Tape Storage Council (TSC) released a new report, "Tape to Play Critical Roles as the Zettabyte Era Takes Off," which highlights the current trends, usages and technology innovations occurring within the tape storage industry. The zettabyte era is in full swing, generating unprecedented capacity demand as many businesses move closer to exascale storage requirements.

According to the LTO Program, 148 exabytes (EB) of total tape capacity (compressed) shipped in 2021, marking an impressive record year. With a growth rate of 40%, this strong shipment performance follows the previous record-breaking 110 EB shipped in 2019 and the 105 EB shipped in the pandemic-affected year of 2020.

The ever-increasing thirst for IT services has pushed energy usage, carbon emissions, and the storage industry's growing impact on global climate change to center stage. Plus, ransomware and cybercrime protection requirements are driving increased focus on air gap protection measures.

As a result of these trends, among others, the TSC expects tape to play an even broader role in the IT ecosystem going forward as the number of exabyte-sized environments grows. Key trends include:

  • Data-intensive applications and workflows fuel new tape growth.
  • Data accessibility: tape performance improvements deliver faster access times and higher throughput.
  • Tape should be included in every green data center strategy.
  • Storage optimization receives a big boost from an active archive, which provides dynamic optimization and fast data access for archival storage systems.

Organizations continue to invest in LTO tape technology thanks to its high capacity, reliability, low cost, low power consumption and strong data protection features, especially as threats to cybersecurity soar.

To access the full report, visit: Tape to Play Critical Roles as the Zettabyte Era Takes Off.

 


Observing Earth Day 2022 In Light of Record LTO Data Tape Capacity Shipments

Reading Time: 5 minutes

The LTO Technology Provider Companies (IBM, HPE, and Quantum) issued a press release earlier this week announcing record capacity shipments for LTO in 2021 of 148 Exabytes (148,000 Petabytes) compressed (up from 105 EB compressed in 2020, +40%). More and more of the world’s data is being stored on LTO data tape. That’s good news for the IT industry! Is it not? After all, end users and service providers need:

  • A strategic way to store and protect massive amounts of increasingly valuable data, especially data that’s gone cool or cold
  • A cost-effective and reliable long term storage solution
  • An air gap defense against cybercrime
  • An eco-friendly form of storage!

 

Industry Pundits React
Some industry pundits, biased toward the HDD industry, took the opportunity to downplay the news. They said the data is inaccurate or insignificant compared to the capacity shipments for HDD last year. Really? Does tape technology threaten the market for HDD? Is it still about tape vs. disk in their minds? Have trains, trucks, and ships put air freight out of business? Or does a more strategic thought process say: “These technologies complement each other. We need both to meet the needs of end-users, storage service providers, and society itself…”

Analysts Predict Huge Zettabyte Demand
Indeed, if the big industry analyst firms are right, we will need to store more than 11.0 zettabytes of data in 2025. Just one zettabyte would require 55 million 18.0 TB HDDs or 55 million LTO-9 tape cartridges. Should we store all of that data on HDD, whether it is hot, warm, cool, or cold? Of course, we can't just delete the excess data. Now that we can analyze data and derive competitive advantage from it, its value has increased, and we need to store more and more of it for longer periods of time. As a result, projections for the amount of persistent data to be stored are growing exponentially. We will need huge amounts of flash, HDD, tape, and even future storage solutions like DNA to address the data storage challenge.
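
The drive-count arithmetic above is easy to check (a sketch assuming decimal units and the 18 TB native capacity shared by the HDD and LTO-9 examples):

```python
# How many fixed-capacity devices are needed to hold a given volume of data?

def drives_needed(total_bytes, drive_capacity_tb):
    """Device count to hold 'total_bytes' at 'drive_capacity_tb' terabytes each."""
    return total_bytes / (drive_capacity_tb * 10**12)

zettabyte = 10**21
print(f"{drives_needed(zettabyte, 18.0) / 1e6:.1f} million units per zettabyte")
```

So one zettabyte maps to roughly 55.6 million 18 TB devices, matching the 55 million figure above; multiply by 11 zettabytes and the case for a tiered mix of media makes itself.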

A Strategic Approach to Data Storage
The key to success will be a strategic approach that leverages intelligent data management software to automate data movement to the right tiers of storage at the right time, the right cost, and the right energy profile. Employing a strategic approach to data storage to reduce costs and energy consumption, all while maintaining service level agreements, makes sense. Take a good look at an active archive solution, for example. Yet again, there are those industry pundits who say the amount of energy saved by moving static, inactive, and infrequently accessed data to a tape tier is not significant in the big picture of the IT industry. The real problem, they say, is the amount of energy consumed by a single Google search. But isn't that like saying, "Don't bother turning the lights out before leaving the office for the night. It's just a drop in the ocean of energy consumption," or "Why bother turning off the engine of your car when filling up on gas? It's just a few minutes of idle time and won't really impact CO2 emissions at all"? Right?

Change of Attitude Needed
But this is the wrong attitude and exactly what has to change to make a difference. Collectively, if we all switch off a light and all turn the car’s engine off, we will make a difference. We might even get motivated for more change! How about installing LED light bulbs or investing in an electric vehicle? Or maybe make the commitment and take the leadership on a renewable energy installation. Attitudes have to change, believing we can make a difference collectively. If data is inactive, why keep it on energy-intensive, constantly spinning disk? Are we all doing whatever it takes to make a difference?

New Flagship UN Report Is a Wake-up Call
If we believe the latest studies on climate change coming out of the United Nations, we need to start taking any action we can, quickly. A new UN report on climate change from earlier this month indicated that harmful carbon emissions in the last decade have never been higher in Earth's history, proof that the world is on a "fast track" to climate disaster. UN Secretary-General Antonio Guterres has warned that it's "now or never" to limit global warming to 1.5 degrees C. Climate change is the result of more than a century of unsustainable energy and land use, lifestyles, and patterns of consumption and production. Guterres adds, "This is not fiction or exaggeration. It is what science tells us will result from our current energy policies. We are on a pathway to global warming of more than double the 1.5-degree C limit" agreed in Paris in 2015. To limit global warming to around 1.5 C (2.7 F), the IPCC report insists that global greenhouse gas emissions will have to peak "before 2025 at the latest, and be reduced by 43% by 2030."

Reducing Energy Consumption and CO2 Emissions with Tape
To help increase awareness and understanding of energy consumption in data storage, a number of whitepapers have been published highlighting alternative storage options, including LTO data tape. A recent IDC whitepaper studied migrating cold data from HDDs to LTO tape. The opportunity to positively impact the environment by shifting to tape is staggering: this strategic approach can reduce storage-related CO2 emissions by, coincidentally, 43.7% by 2030, avoiding 664 million metric tons of CO2 cumulatively. That's the equivalent amount of CO2 produced by 144 million passenger cars driven in the course of a year!
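
The car-equivalence conversion above can be reproduced with the EPA's commonly used estimate of about 4.6 metric tons of CO2 per typical passenger vehicle per year:

```python
# Translate a CO2 quantity into the familiar "cars driven for a year" unit,
# using the EPA's ~4.6 metric tons CO2 per typical passenger vehicle per year.

def cars_equivalent(co2_metric_tons, tons_per_car_per_year=4.6):
    """Number of typical passenger cars emitting 'co2_metric_tons' in one year."""
    return co2_metric_tons / tons_per_car_per_year

avoided_tons = 664e6  # cumulative CO2 avoided, per the IDC whitepaper figure
print(f"{cars_equivalent(avoided_tons) / 1e6:.0f} million cars for one year")
```

664 million metric tons divided by 4.6 tons per car lands on roughly 144 million cars, matching the figure quoted above.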

Other research shows that tape consumes 87% less energy than equivalent amounts of HDD storage. When CO2 emissions are analyzed over the entire product lifecycle (from raw materials to production to distribution, usage, and disposal) of HDD and tape, studies show a 95% reduction in CO2 in favor of tape compared to HDD. The same study shows Total Cost of Ownership for long-term data storage can be reduced by more than 70% by using tape instead of HDD. At the same time, tape can provide an effective defense against cybercrime via a physical air gap. All of this is possible by taking a strategic storage approach, where cool or cold data that has aged and is infrequently accessed gets moved from expensive primary storage to economical and environmentally friendly tape systems, online or offline.

Data Center World Attendees Get It
In my last blog on my visit and presentation at Data Center World in Austin last month, I mentioned that I was encouraged by the DCW attendees that I met and listened to in my session and other sessions. They are genuinely concerned about the environment and worried about what kind of planet we will be leaving behind for our kids and grandchildren. They recognize the opportunity to improve sustainability in data center operations and are committed to it. But since then it has occurred to me that maybe sustainability is more of a focus for facility teams. Perhaps the top-down pressure from the C-suite has yet to be widely applied to the data storage management teams. However, in the quest to achieve the needed sustainability goals, no stone can remain unturned.

Observing Earth Day for Future Generations
With Earth Day being observed today, let’s commit to strategically taking action in response to global warming and climate change. Let’s start changing attitudes from “It won’t make a difference” to “Collectively, we can make a difference.” Let’s look at the bright side of increasing LTO capacity shipments instead of the dark, self-serving side. Let’s think about the planet that’s home for us and the future generations of our families to come.

 


New Federal Cybersecurity Mandates Enacted and SEC Rules Proposed, Amidst Never-Ending Ransomware Attacks

Reading Time: 5 minutes

As I started to write this blog on recent ransomware observations, an email message popped up on my PC from our IT department advising of additional and more stringent security enhancements taking place almost immediately to toughen my company’s cybersecurity and increase our protection against current and emerging threats. A sign of these cybercrime times, indeed!

Ransomware Trending
According to a February 2022 Alert from CISA (Cybersecurity & Infrastructure Security Agency), 2021 trends showed an increasing threat of ransomware to organizations globally with tactics and techniques continuing to evolve in technological sophistication. So-called “big game” organizations like Colonial Pipeline, Kronos, JBS, Kaseya, and SolarWinds made the ransomware headlines over the past year or so. But according to the CISA Alert, by mid-2021, many ransomware threat actors, under pressure from U.S. authorities, turned their attention toward mid-sized victims to reduce the scrutiny and disruption caused by said authorities.

In a recent Enterprise Strategy Group (ESG) study, 64% of respondents said their organization had paid a ransom to regain access to data, applications, or systems. These findings are supported by the latest Threat Landscape report from the European Union Agency for Cybersecurity. It highlighted a 150% rise in ransomware in 2021 compared to 2020. The agency expects that trend to continue, and even accelerate in 2022.

But these numbers hide the stark reality of the ransomware scourge. Gangs like DarkSide, REvil, and BlackMatter are terrorizing organizations with ransomware – and they are getting smarter and more organized. They have moved beyond the basic ploy of infecting files, locking users out of their data, and demanding a fee. They still want money. But they also endanger reputations by exposing attacks, blackmailing companies by threatening to reveal corporate or personal dirty laundry, and selling intellectual property (IP) to competitors.

As a result, cybersecurity spending has become a priority in most organizations. According to ESG, 69% of organizations plan to spend more on cybersecurity in 2022 than in the previous year, while 68% of senior IT decision-makers identify ransomware as one of their organization's top 5 business priorities. Such is the fear factor that organizations are now prioritizing cybersecurity ahead of other organizational imperatives such as the cloud, artificial intelligence (AI), digital transformation, and application development.

New Federal Mandate and the SEC Takes Action
On March 15th, in an effort to thwart cyberattacks from foreign spies and criminal hacking groups, President Biden signed into law a requirement for many critical-infrastructure companies to report to the government when they have been hacked. This way, authorities can better understand the scope of the problem and take appropriate action.

It's also no wonder that the Securities and Exchange Commission (SEC) is taking action. On March 9th, the SEC voted 3 to 1 to propose reporting and disclosure requirements related to cybercrime incidents and preparedness. In a nutshell, the SEC will be asking publicly traded companies:

  • To disclose material cybersecurity incidents
  • To disclose their policies and procedures to identify and manage cybersecurity risks
  • To disclose management's role and expertise in managing cybersecurity risks
  • To disclose the board of directors' oversight role

Specifically, the SEC will want to know:

  • Whether a company undertakes activities to prevent, detect and minimize the effects of cybersecurity incidents
  • Whether it has business continuity, contingency, and recovery plans in the event of a cybersecurity incident
  • Whether the entire board, certain board members, or a board committee is responsible for the oversight of cybersecurity risks
  • Whether and how the board or board committee considers cybersecurity risks as part of its business strategy, risk management, and financial oversight

Holding publicly traded companies and their boards accountable for best practices in combating ransomware is a big step in the right direction and will no doubt free up the required budgets and resources.

Lowering the Fear Factor
Cybersecurity is already a top spending priority for 2022 and with SEC regulations looming, will likely continue to be a priority for quite some time. Companies are busy beefing up the tools and resources needed to thwart ransomware. They are buying intrusion response tools and services, extended or managed detection and response suites, security information and event management platforms, antivirus, anti-malware, next-generation firewalls, and more, including cybercrime insurance policies.

What may be missing in the spending frenzy, however, are some fundamental basics that can certainly lower the fear factor. Backup tools are an essential ingredient in being able to swiftly recover from ransomware or other attacks. Similarly, thorough and timely patch management greatly lowers the risk of hackers finding a way into the enterprise via an unpatched vulnerability.

Another smart purchase is software that scans data and backups to ensure that no ransomware or malware is hidden inside. It is not uncommon for a ransomware victim to conduct a restore and find that its backup files have also been corrupted by malware. Cleansing data that is ready to be backed up has become critical. These are some of the fundamental basics that need to be in place in the fight against ransomware. Organizations that neglect them suffer far more from breaches than those that take care of them efficiently.

Adding an Air Gap
Another fundamental basic is the elegantly simple air gap. When data is stored in the cloud, on disk, or in a backup appliance, it remains connected to the network. This leaves it vulnerable to unauthorized access and infection from bad actors. An air gap is essentially a physical gap between data and the network. It disconnects backed up or archived data from the Internet.

Such a gap is commonly created by partitioning tapes off within an automated tape library, or by removing them and either storing them on a shelf or sending them to a secure external service provider. If that data is properly scanned before being backed up or archived to ensure it is free of infection, the organization has certainty that a corruption-free copy exists. If a ransomware attack occurs, it can confidently fall back on a reliable copy of its data and avoid any ransom demands.
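One piece of this pre-archive hygiene can be sketched in code: recording a checksum manifest of a known-clean dataset before it goes to tape, so a later restore can be verified against it. This is an illustrative sketch, not a substitute for malware scanning, and the paths in the usage comments are hypothetical:

```python
import hashlib
import pathlib


def build_manifest(root: str) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    base = pathlib.Path(root)
    return {
        str(p.relative_to(base)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(base.rglob("*"))
        if p.is_file()
    }


def verify_restore(restored_root: str, manifest: dict) -> list:
    """Return paths whose digests no longer match the pre-archive manifest."""
    current = build_manifest(restored_root)
    return [path for path, digest in manifest.items()
            if current.get(path) != digest]


# Before writing to tape:  manifest = build_manifest("/data/archive-set")
# After a restore:         bad = verify_restore("/restore/archive-set", manifest)
```

Storing the manifest alongside (and apart from) the air-gapped copy gives a cheap way to confirm that what comes back from the shelf is byte-for-byte what was scanned and archived.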

Effectively Combatting Ransomware
There is no silver security bullet that will 100% guarantee freedom from ransomware; combating it requires a truly multi-faceted strategy. Implementation of best-of-breed security tools is certainly necessary, but they must be supported by the steadfast application of backup and patching best practices and the addition of a tape-based air gap.

CISA, the FBI, and cybersecurity insurance companies all recommend offline, offsite, air-gapped copies of data. This can be achieved cost-effectively with today’s removable and highly portable modern tape technology. The boards of publicly traded companies will likely want to do whatever it takes to demonstrate compliance with best practices to meet the SEC requirements. This should include air-gapped tape as part of a prudent and comprehensive strategy. A best practice in these cybercrime times, indeed!


Majority of C-Suite Respondents Would Consider Alternative Data Storage Option that is More Sustainable and Affordable, Survey Confirms

Reading Time: 4 minutes

I think it’s safe to say that most people like surveys. Why? Experts in the field suggest that people are willing to take surveys because they feel their opinions are valued and that their answers will be used, perhaps even to the benefit of society. They feel their participation will impact something they care about, and they want to share their opinion with those who will listen and act on the information.

Surveying the C-Suite on Sustainability
So it’s not surprising that Fujifilm got a great response rate to a recently launched survey entitled Awareness Survey on Environmental Issues in the Digital Domain. As many as 1,200 C-suite professionals responded, including CEOs, CFOs, CSOs, CTOs, and CIOs from companies of 100 or more employees in the United States, Germany, Japan, and China.

The survey revealed that there is a growing awareness around broader environmental issues among corporate leaders, and that’s great news as the negative impacts of global warming and climate change keep piling up, flood after flood, wildfire after wildfire, and storm after storm.

When it comes to IT infrastructure specifically, the majority of U.S. respondents believe sustainability improvements in IT services and equipment can positively impact climate change, but 40% indicated that they did not know or were unsure if data storage can have a negative environmental impact and increase the cost of doing business.

Increasing Data Storage Requirements
Data storage can certainly be energy-intensive. This is a problem that is only getting worse as the value of data rises with the ability to analyze and derive competitive advantage from it. As a result, demand for long-term data retention is increasing. In fact, according to a recent IDC whitepaper, data to be stored grew from just 2.0 zettabytes in 2016 to 4.1 ZB in 2020 and is expected to reach 11.1 ZB in 2025. Just one ZB is a vast amount of data: one million petabytes, which would need 55 million 18 TB hard disk drives (HDDs) or 55 million 18 TB LTO-9 tapes to store. The environmental impact of the energy required to support this volume of storage is greatly underestimated, as are the associated carbon emissions. When asked in the survey what barriers exist for those who have not considered more eco-friendly data storage options, 31% in the U.S. cited a lack of awareness or understanding of the issue.
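The scale of these figures is easy to sanity-check with simple arithmetic, assuming the decimal units storage vendors use (1 ZB = 10^21 bytes, 1 PB = 10^15 bytes, 1 TB = 10^12 bytes):

```python
# Back-of-the-envelope check of the zettabyte figures cited above.
ZB = 10**21
PB = 10**15
TB = 10**12

petabytes_per_zb = ZB / PB          # petabytes in one zettabyte
drives_per_zb = ZB / (18 * TB)      # 18 TB HDDs or LTO-9 tapes per ZB

print(f"{petabytes_per_zb:,.0f} PB per ZB")        # 1,000,000
print(f"{drives_per_zb:,.0f} drives or tapes")     # ~55,555,556
```

Roughly 55.6 million 18 TB devices per zettabyte, consistent with the 55 million figure quoted in the whitepaper.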

Hot vs. Cold Data
There was also a lack of awareness pertaining to frequently accessed “hot” data and less frequently accessed “cold” data: 36% of respondents said they either don’t differentiate between the two or are unsure whether they do, and 35% don’t realize that differentiating between hot and cold data can impact sustainability, affordability, and security. An interesting fact about data is that it goes cold quickly; access frequency drops off significantly after just 30, 60, or 90 days. In fact, industry analysts estimate that 60% to 80% of all data stored is cold and qualifies as “archival.” Yet through inertia, that data often remains on energy-intensive, constantly spinning, heat-producing tiers of storage like hard disk drives.
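A policy of the kind analysts describe, flagging data untouched for some number of days as cold, can be sketched in a few lines. This is a simplified illustration, not a real tiering product, and the 90-day threshold is just one of the common policy choices mentioned above:

```python
import pathlib
import time

COLD_AFTER_DAYS = 90  # policy choice; 30, 60, or 90 days are typical


def find_cold_files(root: str) -> list:
    """Return files not accessed within the cold-data threshold.

    Note: many filesystems mount with relatime/noatime, so real tiering
    tools often fall back to modification time (st_mtime) instead of
    access time (st_atime).
    """
    cutoff = time.time() - COLD_AFTER_DAYS * 86400
    return [p for p in pathlib.Path(root).rglob("*")
            if p.is_file() and p.stat().st_atime < cutoff]
```

Intelligent data management software in active archive environments applies essentially this kind of rule continuously, then moves the flagged files to a cheaper, cooler tier such as tape.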

Reducing Energy Consumption and CO2 Emissions with Tape
To help increase awareness and understanding of this issue, a number of whitepapers have been published highlighting alternative options for storage including LTO data tape. A recent IDC whitepaper shows how migrating cold data from HDDs to LTO tape can reduce data centers’ CO2 emissions by 43.7% by 2030, avoiding 664 million metric tons of CO2 cumulatively. Other research shows that tape consumes 87% less energy than equivalent amounts of HDD storage. When CO2 emissions are analyzed over the entire product lifecycle (from raw materials to production to distribution, usage, and disposal) of HDD and tape, studies show a 95% reduction in CO2 in favor of tape compared to HDD. The same study shows Total Cost of Ownership for long-term data storage can be reduced by more than 70% using tape instead of HDD. All of this is possible with a storage optimization approach: data that has aged and is infrequently accessed, otherwise known as cold data, is moved from expensive primary storage like solid-state flash drives and HDDs to economical and environmentally friendly tape systems.

As far as security is concerned, tape is also playing a role in cybercrime prevention with air gap capabilities, WORM, and encryption. Intelligent data management software, typical in today’s active archive environments, can automatically move data from expensive, energy-intensive tiers of storage to more economical and energy-efficient tiers based on user-defined policies. By moving inactive data out of primary storage, the ransomware attack surface can also be reduced.

Renewable Energy Plus Conservation
Another interesting point from the survey: 51% of participants said that their companies are using renewable energy to reduce carbon emissions, while 22% said they are doing so via climate protection projects and 13% through carbon offsets. Renewable energy is a key factor in reducing CO2 emissions, and Fujifilm is a fan, including at our LTO plant in Bedford, MA. But renewables alone likely can’t come online fast enough or cheaply enough to keep up with data growth rates of 30% to 60% annually in major data centers today. That’s why conservation has to be part of the equation. The very first metric to analyze in data center energy efficiency is simply the amount of energy being consumed.

Alternative Data Storage Options
Finally, 81% of respondents noted that they would consider an alternative data storage option that is more sustainable and affordable. That option exists in the form of today’s modern, highly advanced data tape systems, which offer the lowest energy consumption and cost profile, best-in-class reliability among storage media, and the longest archival life. So for the benefit of society, let’s act on the information the survey reveals. It’s really just a question of getting the right data, in the right place, at the right time.


3 Reasons Why Migrating Data to Tape Systems Makes Sense in Light of SSD and HDD Supply Chain Concerns

Reading Time: 3 minutes

The Arrival of the Zettabyte Era
The data storage market has clearly entered the “zettabyte era,” with new capacity shipments having exceeded a massive one zettabyte for a couple of years now. Storage requirements are being driven by the phenomenon of “digital transformation” and the rising value of data, which needs to be stored for longer periods of time and, in some cases, indefinitely. Further accelerating the zettabyte era is the other era we are all in: the “pandemic era,” with its sudden need to support a remote workforce and the ever-expanding internet with its proliferation of online apps.

Pandemic Related Supply Shortages
The pandemic has brought related disruptions to the global supply chain, including shortages of semiconductor chips, making it tough to get modern goods from toys to notebooks to refrigerators to automobiles. The combination of the zettabyte and pandemic eras has even put a strain on supply chains and the availability of the SSDs and HDDs needed to support digital transformation, causing prices to fluctuate with quarterly supply and demand swings.

Supply Chain Challenges Persist
While pandemic-related labor shortages have delayed the production and distribution of goods, other factors are making matters worse. How about global warming, climate change, and the ensuing natural disasters that have had negative impacts on the supply chain? How about international rivalries and tensions impacting the availability of key components? Or cybercriminals shutting down vital infrastructure? Bottom line: industry pundits say we can expect supply chain hassles to continue throughout 2022.

Supply Chain Contingency Planning in Data Storage
Faced with supply chain risks in any industry, it’s always good to have contingency plans to mitigate risk and ensure ongoing operations. The IT industry is no exception: the availability of commodities we may take for granted can be interrupted by any of the factors listed above, from unforeseen demand to pandemic-related shortages to global warming, trade wars, and cybercrime.

A great way to avoid supply chain disruptions in the availability of primary storage devices like SSDs and HDDs is to employ intelligent data management software, typical of active archive solutions, to automate the migration of data from these potentially supply-chain-affected devices to a modern, automated tape library. Since 60 to 80 percent of data goes cold after a short period of time, why keep it on higher-performing, expensive, and energy-intensive devices? Given the global supply chain uncertainty, three good reasons to migrate data from primary storage devices to tape storage are:

  • Free up capacity on expensive Tier 1 and Tier 2 storage devices like SSDs and HDDs in favor of TCO-friendly tape systems
  • Reduce energy consumption and related CO2 emissions by leveraging the low power profile of automated tape systems
  • Take advantage of tape’s natural air gap security in the never-ending war against ransomware

The above makes sense even in the absence of supply chain concerns. Since data to be stored is growing at a CAGR of around 30% while IT budgets grow somewhere in the low single digits, the IT industry needs to find a more cost-effective storage solution. With the increasing value of data and indefinite retention periods, the long-term archival profile of tape, coupled with best-in-class reliability, is a natural fit.
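The budget squeeze described here is just compound growth working against you. A quick sketch (using illustrative index values of 100 for both data volume and budget today) shows how far the two diverge in only five years:

```python
def compound(value: float, rate: float, years: int) -> float:
    """Grow a starting value at a fixed annual rate for a number of years."""
    return value * (1 + rate) ** years


# Illustrative: data growing ~30%/yr vs. an IT budget growing ~3%/yr
data = compound(100, 0.30, 5)    # -> ~371 (data has nearly quadrupled)
budget = compound(100, 0.03, 5)  # -> ~116 (budget up only ~16%)

print(f"after 5 years: data index {data:.0f}, budget index {budget:.0f}")
```

At those rates, data grows about 3.7x while the budget grows about 1.16x, which is exactly why cheaper cold-storage tiers become unavoidable.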

Fighting Climate Change and Cybercrime
Finally, we all have to engage in the battle against global warming and climate change if we are to preserve the planet that we inhabit. Studies show that tape systems consume 87% less energy than equivalent amounts of disk storage and produce 95% less CO2 emissions than disk over the product lifecycle. Other studies show that collectively, the global IT industry could avoid as much as 664 million metric tons of CO2 emissions by strategically moving more data to tape systems. As data cools off or goes cold, it should migrate to less expensive, less energy-intensive, and more secure tiers of storage.
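For a concrete sense of scale, the cited reductions can be applied to a hypothetical disk-tier footprint. Only the 87% and 95% figures come from the studies above; the input numbers are made up purely for illustration:

```python
def tape_savings(disk_kwh: float, disk_co2_tons: float) -> tuple:
    """Apply the cited reductions: 87% less energy, 95% less lifecycle CO2."""
    return disk_kwh * (1 - 0.87), disk_co2_tons * (1 - 0.95)


# Hypothetical disk tier: 1,000,000 kWh/yr and 500 t lifecycle CO2
kwh, co2 = tape_savings(1_000_000, 500)
print(f"equivalent tape tier: {kwh:,.0f} kWh/yr, {co2:.0f} t lifecycle CO2")
```

Under those assumptions, the same capacity on tape would draw about 130,000 kWh per year and account for about 25 tons of lifecycle CO2.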

Once the pandemic era finally subsides, it will be environmental calamities brought on by climate change and the relentless threat of cybercriminals that will have long-term impacts on supply chains.


How Tape Technology Delivers Value in Modern Data-driven Businesses…in the Age of Zettabyte Storage

Reading Time: 3 minutes

The newly released whitepaper from IT analyst firm ESG (Enterprise Strategy Group), sponsored by IBM and Fujifilm and entitled “How Tape Technology Delivers Value in Modern Data-driven Businesses,” focuses on exciting new advances in tape technology that now position tape for a critical role in effective data protection and retention in the age of zettabyte (ZB) storage. That’s right: zettabyte storage!

The whitepaper cites the need to store 17 ZB of persistent data by 2025. This includes “cold data” stored long-term and rarely accessed, estimated to account for 80% of all data stored today. Just one ZB is a tremendous amount of data, equal to one million petabytes, and would need 55 million 18 TB hard drives or 55 million 18 TB LTO-9 tapes to store. Just like the crew in the movie Jaws needed a bigger boat, the IT industry is going to need higher-capacity SSDs and HDDs and higher-density tape cartridges! On the tape front, help is on the way, as demonstrated by IBM and Fujifilm in the form of a potential 580 TB capacity tape cartridge. Additional highlights from ESG’s whitepaper are below.

New Tape Technology
IBM and Fujifilm set a new areal density record of 317 Gb/sq. inch on linear magnetic tape, translating to a potential native cartridge capacity of 580 TB. The demonstration features a new magnetic particle called Strontium Ferrite (SrFe), able to deliver capacities that extend well beyond disk, LTO, and enterprise tape roadmaps. SrFe magnetic particles are 60% smaller than the current de facto standard Barium Ferrite particles, yet exhibit even better magnetic signal strength and archival life. On the hardware front, the IBM team has developed tape head enhancements and servo technologies that leverage even narrower data tracks to contribute to the increase in capacity.

The Case for Tape at Hyperscalers and Others
Hyperscale data centers are major new consumers of tape technologies due to their need to manage massive data volumes while controlling costs. Tape is allowing hyperscalers including cloud service providers to achieve business objectives by providing data protection for critical assets, archival capabilities, easy capacity scaling, the lowest TCO, high reliability, fast throughput, low power consumption, and air gap protection. But tape also makes sense for small to large enterprise data centers facing the same data growth challenges including the need to scale their environments while keeping their costs down.

Data Protection, Archive, Resiliency, Intelligent Data Management
According to an ESG survey revealed in the whitepaper, tape users identified reliability, cybersecurity, long archival life, low cost, efficiency, flexibility, and capacity as top attributes in tape usage today and favor tape for its long-term value. Data is growing relentlessly with longer retention periods, as the value of data is increasing thanks to the ability to apply advanced analytics to derive a competitive advantage. Data is often kept for longer periods for compliance, regulatory, and corporate governance reasons. Tape is also playing a role in cybercrime prevention with WORM, encryption, and air gap capabilities. Intelligent data management software, typical in today’s active archive environments, automatically moves data from expensive, energy-intensive tiers of storage to more economical and energy-efficient tiers based on user-defined policies.

ESG concludes that tape is the strategic answer to the many challenges facing data storage managers including the growing amount of data as well as TCO, cybersecurity, scalability, reliability, energy efficiency, and more. IBM and Fujifilm’s technology demonstration ensures the continuing role of tape as data requirements grow in the future and higher capacity media is required for cost control with the benefit of CO2 reductions among others. Tape is a powerful solution for organizations that adopt it now!

To read the full ESG whitepaper, click here.

