
Avoiding Potential Risk of Stagnation in the Secondary Storage Market

Reading Time: 2 minutes

June 15, 2022

By Guest Blogger Peter Faulhaber, former president and CEO, FUJIFILM Recording Media U.S.A., Inc.

The Hyperscale Data Center (HSDC) secondary storage market is quickly emerging, requiring advanced petascale and exascale storage solutions that are not currently available. According to Horison Information Strategies, HSDCs currently consume around 3% of the world’s electrical energy. Because of that massive energy footprint, climate protection measures have become increasingly important in recent years, with cloud computing offering the greatest advantages for sustainable operation by reducing the energy and carbon footprint over the entire data life cycle.

The slowing HDD and tape technology development roadmaps of recent years, along with HDD and tape storage supplier consolidations, are particularly concerning trends for HSDCs. Neither HDD nor tape technology by itself is currently positioned to meet the enormous performance and capacity demands of future HSDC storage. High technical asset specificity requires significant R&D investment, yet offers limited ROI potential outside of hyperscalers.

HSDCs manage over 60% of the world’s data today, with a CAGR of 35–40% and a growing need for cost-effective secondary storage that still meets certain performance thresholds.

Vendors and manufacturers are disincentivized to invest in novel technology; the risk/reward is not attractive enough while HSDCs leverage their buying and bargaining power. Manufacturers would need to invest hundreds of millions of dollars over a long development cycle to bring innovative solutions to market, without a commitment from the HSDC market.

As a result, the secondary storage market is left with incremental investments in existing technologies and moves slowly.

The conditions are set for a widening gap between customer demands and product solutions in the secondary storage market.

The current “vendor-driven” strategy will not keep pace with HSDC requirements for secondary storage, as such offerings fall far behind HSDC demand curves. Photonics, DNA, glass, and holographic storage experiments are attempting to address the market, and have been in labs for decades, but most have drawbacks, and none are on the near-term horizon for customer deployment. These initiatives show that a change is needed to get ahead of the demand curve.

However, the opportunity also exists to mitigate this risk by bringing the interested parties together to share risk and reward. HSDCs need a quantum leap, which only comes with significant investment, best shared by the interested parties.

The Semiconductor Research Corporation (SRC) addressed the concept of vertical market failure in September 2021 in its published article “New Trajectories for Memory and Storage,” stating, “The prospect of vertical market failure can be mitigated by private sector market participants through risk-share agreements between customers and suppliers, as well as increased vertical integration.”

Without change, current technologies will fall far behind HSDC demand curves, and the current vendor-driven trajectory increases the likelihood of unmet demand and stagnating growth for all involved.



3 Big Takeaways from the Fujifilm and IBM 580TB Tape Demonstration

Reading Time: 5 minutes

January 19, 2021

By Rich Gadomski

In mid-December 2020, Fujifilm issued a press release to announce that, together with IBM Research, they had successfully achieved a record areal density of 317 Gbpsi (billion bits per square inch) on next-generation magnetic tape coated with next-generation Strontium Ferrite (SrFe) magnetic particles. This areal density achievement would yield an amazing native storage capacity of 580TB on a standard-sized data cartridge. That’s almost 50 times more capacity than what we have now with an LTO-8 tape based on Barium Ferrite (BaFe) at 12TB native.
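As a sanity check on the numbers above, the quoted capacity ratio follows directly from the two figures in the release (a quick sketch; the capacities come from the text, the rounding is mine):

```python
# Capacity ratio implied by the press release figures:
# 580 TB demonstrated (SrFe) vs. 12 TB native on LTO-8 (BaFe).
demo_capacity_tb = 580
lto8_native_tb = 12

ratio = demo_capacity_tb / lto8_native_tb
print(f"{ratio:.1f}x")  # 48.3x, i.e. "almost 50 times" more capacity
```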

Shortly after the news came out, I was on a call with a member of our sales team discussing the announcement, and he asked me when the 580TB cartridge would be available and whether there was any pricing information yet. He was also curious about transfer speed performance. I had to admit that those details are still TBD, so he asked me, “What are the 3 big takeaways from the release?” So let’s dive into what those takeaways are.

Tape has no fundamental technology roadblocks

To understand the magnitude of tape areal density reaching 317 Gbpsi, we have to understand just how small that is in comparison to HDD technology. Current HDD areal density is already at or above 1,000 Gbpsi while achieving 16TB to 20TB per drive on as many as nine disk platters. This level of areal density is approaching what is known as the “superparamagnetic limitation,” where the magnetic grain is so small that its magnetization can spontaneously flip polarity. Not ideal for long-term data preservation.

So to address this, HDD manufacturers have employed things like helium-filled drives, which allow closer spacing between disk platters, making room for more platters and therefore more capacity. HDD manufacturers are also increasing capacity with new recording techniques involving heat (HAMR) or microwaves (MAMR). As a result, HDD capacities are expected to reach up to 50TB within the next five years or so. The reason tape can potentially reach dramatically higher capacities is that a tape cartridge contains over 1,000 meters of half-inch-wide tape and therefore has far greater surface area than a stack of even eight or nine 3.5-inch disk platters.
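The surface-area argument can be made concrete with rough arithmetic. The tape figures (1,000 m of half-inch tape) come from the paragraph above; the platter dimensions (95 mm media diameter, 25 mm hub) are assumed typical values for 3.5-inch drives, not figures from the article:

```python
import math

# Recording surface of one tape cartridge: 1,000 m of half-inch-wide tape.
tape_length_m = 1000
tape_width_m = 0.5 * 0.0254            # half an inch, in meters
tape_area_m2 = tape_length_m * tape_width_m

# Recording surface of a nine-platter stack, both sides usable.
# Platter dimensions are assumed typical values, not from the article.
platter_outer_r = 0.095 / 2            # assumed 95 mm media diameter
platter_inner_r = 0.025 / 2            # assumed 25 mm hub diameter
per_side_m2 = math.pi * (platter_outer_r**2 - platter_inner_r**2)
stack_area_m2 = 9 * 2 * per_side_m2

print(f"tape: {tape_area_m2:.1f} m^2, platter stack: {stack_area_m2:.2f} m^2")
```

Under these assumptions the cartridge offers on the order of 100 times the recording surface of the platter stack, which is the intuition behind tape’s capacity headroom.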

But let’s also look at track density in addition to areal density. Think about the diameter of a single strand of human hair, which is typically 100 microns wide. If a single HDD data track is 50 nanometers wide, you are looking at 2,000 HDD data tracks across the width of a single strand of human hair! For tape, with a track width of approximately 1,200 nanometers, you are looking at just 84 data tracks. This is actually a positive for tape technology because it shows that tape has a lot of headroom in both areal density and track density, which will lead to higher capacities and help maintain tape’s low TCO.
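The hair-width comparison reduces to simple division (track widths as quoted above; note the exact quotient for tape is about 83, which the text rounds up to 84):

```python
hair_width_nm = 100_000   # a 100-micron human hair, in nanometers
hdd_track_nm = 50         # HDD track width quoted above
tape_track_nm = 1_200     # approximate tape track width quoted above

print(hair_width_nm // hdd_track_nm)          # 2000 HDD tracks per hair width
print(round(hair_width_nm / tape_track_nm))   # ~83 tape tracks per hair width
```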

But let me make it clear that this is not about HDD vs. tape. We are now in the zettabyte age, having shipped just over an impressive one zettabyte (1,000 exabytes) of new storage capacity, across all media types, into the global market in 2019. According to IDC, that number will balloon to a staggering 7.5 ZB by 2025. We will need a lot of HDDs and a lot of tape (and flash, for that matter) to store 7.5 ZB!
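Those two endpoints imply a growth rate consistent with the 35–40% CAGR cited for HSDC data in the first post above (a quick sketch using the figures from the text):

```python
# Implied annual growth rate behind the jump from ~1 ZB of capacity shipped
# in 2019 to IDC's projected 7.5 ZB in 2025.
start_zb, end_zb = 1.0, 7.5
years = 2025 - 2019

cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"{cagr:.0%}")  # roughly 40% per year
```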



Tape Storage: New Game, New Rules

Reading Time: < 1 minute

September 23, 2020

Modern tape storage has become the leading strategic and lowest-cost storage solution for massive amounts of archival and unstructured data. This bodes well for future tape growth, as archival data is piling up much faster than it is being analyzed. Over the past decade, the magnetic tape industry has successfully re-architected itself, delivering compelling technologies and functionality: cartridge capacity increases, vastly improved bit error rates yielding the highest reliability of any storage device, a media life of 30 years or more, and faster data transfer rates than any previous tape or HDD (Hard Disk Drive).

Many of these innovations have resulted from technologies borrowed from the HDD industry and have been used in the development of both LTO (Linear Tape Open) and enterprise tape products. Additional tape functionality including LTFS, RAIT, RAO, TAOS, smart libraries and the Active Archive adds further value to the tape lineup. HDD technology advancement has slowed while progress for tape, SSD (Solid State Disk) and other semiconductor memories is steadily increasing. Fortunately, today’s tape technology is nothing like the tape of the past.

For more information, check out this Horison Information Strategies White Paper “Tape Storage: It’s a New Game With New Rules.”


Flash, HDDs and Tape Slay Data Challenges

Reading Time: 3 minutes

By Rich Gadomski

At Storage Visions 2018, held in Santa Clara this past October, I had the opportunity to talk about the future outlook for tape, as attendees wanted to know how they were going to store all the data that’s being created. The session I participated in was entitled “Epic Battles with Classic Heroes – Flash, HDDs and Tape Slay Data Challenges.” As the title suggests, battling exponential data growth takes more than one storage media type to effectively handle the deluge of data being created (now estimated at 33 ZB in 2018 and growing to 175 ZB by 2025, according to IDC).

As our session moderator, Jean Bozeman from Hurwitz & Associates, pointed out in her introduction, a variety of storage workloads creates the need for a spectrum of storage technologies. Certainly, the need for speed at the top of the storage pyramid is being addressed by performance HDD and, increasingly, by ever-evolving solid state drives.

The need for longer-term storage at scale is the domain of capacity HDD and, of course, tape. Managing the data deluge is all about having the right data on the right storage medium at the right time. Not everything can or should be stored on expensive high-performance flash. You need high-capacity optimized media for long-term data retention, and that’s where HDD and tape come into play (often in a user-friendly active archive environment).

When it comes to the future of capacity in the domain of HDD, current perpendicular magnetic recording technology has reached “superparamagnetic” limitations, where increasing areal density to increase capacity is not a viable option. With helium-filled HDDs, more platters can fit in the same form factor as air-filled HDDs, but this alone has not allowed a significant leap in capacity. New technology concepts such as Heat-Assisted Magnetic Recording (HAMR) and Microwave-Assisted Magnetic Recording (MAMR) are on the horizon, but market availability has been delayed. There is also the potential of vacuum-sealed HDDs with better operating characteristics than helium that could help HAMR and MAMR HDDs reach 40–50 TB at some point in the future.

But fundamentally, increasing the capacity of a storage medium, and ultimately reducing its cost, is best achieved by increasing areal density. This is where magnetic tape technology really shines: today’s modern tape already reaches per-cartridge capacities as high as 20 TB while operating at very low areal densities compared to HDD.

Therefore, tape has a long runway before facing areal density limits, and as a result, future tape roadmaps have the potential to achieve up to 220 TB on a standard form factor cartridge using Barium Ferrite (BaFe) magnetic particles and up to 400 TB using next-generation Strontium Ferrite (SrFe). At the same time, both BaFe and SrFe can maintain magnetic signal strength integrity for at least 30 years, making them ideal not only for high capacity but for cost-effective long-term data retention as well.

“No wonder the cloud guys are all using tape now,” exclaimed an attendee in the audience during the Q&A. They certainly also use a lot of flash and a lot of disk too. It is an epic battle and it takes a spectrum of storage technologies to slay the data challenges.

