FUJIFILM INSIGHTS BLOG

Data Storage

3 Big Takeaways from the Fujifilm and IBM 580TB Tape Demonstration

Reading Time: 5 minutes

In mid-December 2020, Fujifilm issued a press release to announce that, together with IBM Research, they had successfully achieved a record areal density of 317 Gbpsi (billion bits per square inch) on next-generation magnetic tape coated with next-generation Strontium Ferrite (SrFe) magnetic particles. This areal density achievement would yield an amazing native storage capacity of 580TB on a standard-sized data cartridge. That’s almost 50 times more capacity than what we have now with an LTO-8 tape based on Barium Ferrite (BaFe) at 12TB native.
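That "almost 50 times" multiple can be checked with a quick calculation (a sketch using only the native capacities quoted above):

```python
# Native capacities quoted in the announcement (terabytes)
srfe_demo_capacity_tb = 580  # potential Strontium Ferrite cartridge
lto8_capacity_tb = 12        # current LTO-8 (Barium Ferrite)

multiple = srfe_demo_capacity_tb / lto8_capacity_tb
print(f"{multiple:.1f}x")  # -> 48.3x, i.e. almost 50 times
```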

Shortly after the news came out, I was on a call with a member of our sales team discussing the announcement. He asked me when the 580TB cartridge would be available and whether there was any pricing information yet, and he was also curious about transfer speed performance. I had to admit that those details are still TBD, so he asked me, “What are the 3 big takeaways from the release?” Let’s dive into what those takeaways are.

Tape has no fundamental technology roadblocks

To understand the significance of tape areal density reaching 317 Gbpsi, we have to compare it with HDD technology. Current HDD areal density is already at or above 1,000 Gbpsi while achieving 16TB to 20TB per drive on as many as nine disk platters. This level of areal density is approaching what is known as the “superparamagnetic limitation,” where the magnetic particle is so small that its magnetization starts to flip back and forth spontaneously. Not ideal for long-term data preservation.

To address this, HDD manufacturers have employed helium-filled drives, which allow closer spacing between disk platters and therefore room for more platters and more capacity. HDD manufacturers are also increasing capacity with new recording techniques involving heat (HAMR), microwaves (MAMR), and others. As a result, HDD capacities are expected to reach up to 50TB within the next five years or so. The reason tape can potentially reach dramatically higher capacities is that a tape cartridge contains over 1,000 meters of half-inch-wide tape and therefore has far greater surface area than a stack of even eight or nine 3.5-inch disk platters.

But let’s also look at track density in addition to areal density. Consider the diameter of a single strand of human hair, typically about 100 microns (100,000 nanometers). If a single HDD data track is 50 nanometers wide, you could fit 2,000 HDD data tracks in the width of that one strand of hair! For tape, with a track width of approximately 1,200 nanometers, the same width holds only about 83 data tracks. But this is actually a positive for tape technology because it shows that tape has a lot of headroom in both areal density and track density, which will lead to higher capacities and help maintain a low TCO for tape.
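The hair-width comparison works out like this (a sketch using the approximate track widths quoted in the text):

```python
# Approximate figures from the comparison above
hair_width_nm = 100_000  # ~100 microns, diameter of a human hair
hdd_track_nm = 50        # width of a single HDD data track
tape_track_nm = 1_200    # width of a single tape data track

hdd_tracks = hair_width_nm // hdd_track_nm    # tracks per hair width on HDD
tape_tracks = hair_width_nm // tape_track_nm  # tracks per hair width on tape
print(hdd_tracks, tape_tracks)  # -> 2000 83
```

The roughly 24-to-1 gap in track width is exactly the headroom the text describes: tape can keep shrinking tracks for many generations before approaching the densities HDD already operates at.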

But let me make it clear that this is not about HDD vs. tape. We are now in the zettabyte age, having shipped an impressive one zettabyte (1,000 exabytes) of new storage capacity, across all media types, into the global market in 2019. According to IDC, that number will balloon to a staggering 7.5 ZB by 2025. We will need a lot of HDDs and a lot of tape (and flash, for that matter) to store 7.5 ZB!

Read More

3 Reasons Why 2010 – 2020 was the Decade of Renaissance for Data Tape

Reading Time: 2 minutes

The past 10 years have been marked by explosive data growth and demand for storage. Meanwhile, the tape industry has experienced a renaissance thanks to significant advancements in capacity, reliability, performance, and functionality that have led to new applications and key industry adoption. Here’s a look at some of the key milestones.

Capacity

  • In terms of capacity, the decade started for LTO with LTO-5 at 1.5 TB native capacity and culminated most recently with LTO-8 at 12.0 TB and LTO-9 soon to be delivered at 18.0 TB.
  • Enterprise tape formats started the decade at 1.0 TB native and are currently at 20.0 TB native.
  • Barium Ferrite magnetic particles became a key enabler for multi-terabyte tapes and were demonstrated by IBM and Fujifilm in 2015 to have the potential to achieve 220 TB on a single tape cartridge. This signaled that tape technology had no fundamental areal density limitations for the foreseeable future.
  • By the end of the decade, IBM and Fujifilm demonstrated the ability to achieve a record areal density of 317 Gbpsi using the next generation of magnetic particles, Strontium Ferrite, with a potential cartridge capacity of 580 TB.


Reliability and Performance

  • During the decade, tape achieved the highest reliability rating as measured by Bit Error Rate, at 1 x 10^19, even better than enterprise HDD at 1 x 10^16.
  • Data transfer rates for tape also improved from 140 MB/sec. in 2010 to an impressive 400 MB/sec.
  • LTFS provided an open tape file system with media partitions for faster “disk-like” access and ease of interchangeability, making LTO a de facto standard in the Media & Entertainment industry.


New Applications and Key Industry Adoption

  • Storing objects on tape became a reality with object archive software solutions offering S3 compatibility; objects can now move to and from tape libraries in their native object format.
  • The concept of active archiving grew in popularity with tape as a key component complementing flash, HDD and cloud for cost-effectively maintaining online archives.
  • Tape was recognized for its ease of removability and portability, providing air gap protection in the escalating war against cybercrime.
  • Major U.S. hyperscalers began to rely on tape during the decade for both back-up and deep archive applications. In one well-publicized example, Google restored lost Gmail data from its tape backups after a February 2011 outage. Microsoft adopted tape for Azure later in the decade. Tape became firmly established as a competitive advantage for these and other hyperscalers based on its scalability, long archival life, lowest TCO, low energy consumption, and air gap security.


With this steady technological advancement over the last decade, tape has been recognized for its complementary value to flash, HDD and cloud in tiered storage strategies for managing data in the zettabyte age.

Read More

Fujifilm’s Tape Evangelist on The Future of Tape Storage

Reading Time: 4 minutes

In this Executive Q&A, Rich Gadomski, Fujifilm’s Head of Tape Evangelism, discusses the evolution of tape storage and how recent innovations and advancements in tape technology are putting it at center stage.


Explain your role as Tape Evangelist for FUJIFILM Recording Media, U.S.A.

I believe tape technology has a compelling value proposition and a great story to tell. I want to be focused on telling that story and that is my primary role. Of course, it takes a team and we all do our part, but this is what I’m passionate about and want to focus on.

How has the tape storage industry changed over the years?

Tape technology has come a long way since its introduction in the 1950s, when reel-to-reel tape could hold just a few megabytes of data. Today we have LTO tape in the market at 12TB native and enterprise tape at 20TB. Just looking at LTO, which came out 20 years ago: LTO Gen 1 arrived in 2000 with 100GB of capacity, and we expect LTO-9 soon at 18TB, a 180x increase. And just recently we announced a joint demonstration with IBM showing that we can reach up to 580TB on a single cartridge with our next generation of magnetic particle, called Strontium Ferrite. But it’s not just about capacity increases. Tape has also made advancements in transfer rate, now faster than HDD, and in reliability rating, now three or four orders of magnitude better than HDD. It’s also the greenest form of storage, consuming far less energy than constantly spinning HDDs, which contributes to its lowest total cost of ownership. But today it’s not about tape vs. HDD; in fact, the two technologies complement each other to help customers optimize their data storage.

Why is tape storage still relevant today? What is driving demand for this technology?

I would say first and foremost the incredible amount of data that’s being created in this digital economy. Add to that the fact that the value of data is increasing, so we want to store more of it for longer periods of time. Yet IT budgets are flat or barely increasing, so you need a high-capacity, highly reliable storage medium that is cost-effective for long-term storage, and that’s what tape gives you. So tape plays a role in cold archiving, active archiving and, yes, backup as well. In fact, tape has seen renewed interest from customers in the fight against cybercrime and ransomware, since customers can easily back up to tape and remove those backups from the network to an isolated and secure location that hackers can’t reach. That’s what we refer to as the tape air gap. Last but not least is energy consumption. Society is rightfully concerned about climate change. Tape consumes 87% less energy than disk, and that leads to 87% less CO2 emissions.

Read More

2019 State of Active Archive Report Outlines Modern Strategies for Data Management

Reading Time: < 1 minute

Archival data is piling up faster than ever as organizations are quickly learning the value of analyzing vast amounts of previously untapped digital data. Industry studies consistently find that the vast majority of all digital data is rarely, if ever, accessed again after it is stored. However, this is changing now with the emergence of big data analytics made possible by Machine Learning (ML) and Artificial Intelligence (AI) tools that bring data back to life and tap its enormous value for improved efficiency and competitive advantage.

The need to securely store, search for, retrieve and analyze massive volumes of archival content is fueling new and more effective advancements in archive solutions. These trends are further compounded as an increasing number of businesses are approaching hyperscale levels with significant archival capacity requirements.

Access and management of data are getting more complex, requiring modern strategies built on intelligent data management techniques. An active archive resolves this complexity by leveraging the benefits of an intelligent data management layer, and an increasing number of effective software products that address this are now available. These strategies and techniques are now being enhanced by AI to further improve and automate data management.

Download the Active Archive Alliance 2019 State of Active Archive Report to learn more about the state of the archive market and the expanding role the active archive plays.

Read More

LET’S DISCUSS YOUR NEEDS

We can help you reduce cost, decrease vendor lock-in, and increase productivity of storage staff while ensuring accessibility and longevity of data.

Contact Us >