Blog

New Report by John Monroe of Furthur Market Research a Wake-up Call for the Storage Industry

Reading Time: 3 minutes

John Monroe, a long-time storage industry expert and Gartner analyst turned independent consultant, recently published a new report entitled “The Escalating Challenge of Preserving Enterprise Data”. The report, co-sponsored by Fujifilm and Twist Bioscience, looks at the supply of and demand for SSD, HDD and tape technologies from 2022 to 2030. The findings and conclusions in John’s report are surprising, to say the least, and should serve as a wake-up call for executives in both the end user and vendor communities. Below are summaries and excerpts from the report, along with a link to view or download the full report.


Read More

Long-Awaited Annual IT Executive Summit Returns to San Diego!

Reading Time: 4 minutes

After a two-year hiatus due to COVID, Fujifilm’s 12th Annual Global IT Executive Summit took place last week in beautiful, warm and sunny San Diego. This year’s Summit theme was “Optimizing storage in the post-COVID, zettabyte age,” an age in which organizations have to do more with fewer resources while the value, volume and retention periods of data continue to increase unabated. It was so good to once again interact face-to-face with members of the storage industry family, including a hundred or so customers, vendors, industry analysts, and storage industry experts, during the three-day event.

About The Summit
For those not familiar with the Summit, it is an educational conference featuring presentations from industry experts, analysts, vendors and end users about the latest trends, best practices and future developments in data management and storage. A concluding speaker panel with Q&A, along with peer-to-peer networking opportunities throughout the agenda, truly makes the Summit a unique storage industry event.

Key End User and Vendor Presentations
Similar to past Summits (we last convened in San Francisco in October of 2019), we enjoyed presentations from key end users including AWS, CERN, Meta/Facebook and Microsoft Azure. These end users are on the leading edge of innovation and in many ways are pioneering a path forward in the effective management of vast volumes of data growing exponentially every year.

From the vendor community, we were treated to the latest updates and soon-to-be-unveiled products and solutions from Cloudian, IBM, Quantum, Spectra Logic, Twist Bioscience (DNA data storage) and Western Digital (HDD). The tape vendors shone a light on the continuing innovations in tape solutions, including improvements in the ease of use and maintenance of automated tape libraries as reviewed by Quantum. New tape applications abound, from object storage on tape in support of hybrid cloud strategies, as explained by Cloudian and Spectra Logic, to the advantages of sustainable tape storage, presented by IBM. It’s not a question of if, but when, organizations will need to seriously address carbon emissions related to storage devices. After all, “no planet, no need for storage,” quipped one attendee. Also included in the tape application discussions were the massive cold data archiving operations presented by CERN and the hyperscale cloud service providers.

Finally, from the world of tape came a chilling, harrowing tale of a real-life ransomware attack experienced by Spectra Logic, and how its own tape products contributed to the safe protection of its data through the simple principle of a tape air gap.

Need for Archival Storage
We also heard about the latest updates in the progress of DNA data storage from Twist Bioscience and where the world of HDD is going from Western Digital. We are now firmly in the zettabyte age, with an expected 11 zettabytes of persistent data to be stored by 2025. Just one zettabyte would require 55 million 18 TB HDDs or 55 million LTO-9 tapes. As an industry, we are going to need a lot of archival storage! That includes future technologies like DNA, advanced HDDs, optical discs, and of course, highly advanced modern tape solutions. Tape will continue to deliver the lowest TCO, the lowest energy consumption and excellent cybersecurity, all while being supported by a roadmap of increasing cartridge capacities to meet market demand as it unfolds. Certainly, the cloud service providers will leverage all of these storage media at some point as they fine-tune their SLAs and prices for serving everything from hot data to cold archival data.
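As a rough sanity check on those media counts, here is a minimal back-of-the-envelope sketch (assuming decimal units and native, uncompressed capacities):

```python
# Back-of-the-envelope check of the media counts quoted above (decimal units assumed).
ZB = 10**21             # one zettabyte in bytes
TB = 10**12             # one terabyte in bytes
media_capacity_tb = 18  # native capacity of an 18 TB HDD or LTO-9 cartridge

units_per_zb = ZB / (media_capacity_tb * TB)
print(f"Media needed for 1 ZB:  ~{units_per_zb / 1e6:.1f} million units")       # ~55.6 million
print(f"Media needed for 11 ZB: ~{11 * units_per_zb / 1e6:.0f} million units")  # ~611 million
```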

Fred Moore, Horison Information Strategies

Analysts Share Future Vision
From the analyst community, we were treated to a visionary storage industry outlook from Fred Moore, president of Horison Information Strategies, who noted that 80% of all data quickly becomes archival and is best maintained in the lower tiers of his famed storage pyramid as an active archive or cold archive. Following Fred, Brad Johns Consulting presented important data showing the 18X sustainability advantage of eco-friendly tape systems compared to energy-intensive HDDs. While we need both technologies, and they are indeed complementary, a tremendous opportunity exists for the storage industry to reduce carbon emissions by simply moving cold, inactive data from HDD to tape systems.

Rounding out the analyst presentations was Philippe Nicolas of Coldago Research with some valuable insights into end user storage requirements and preferences in both the U.S. and Europe.

Innovation from an Industry Expert
From the realm of storage industry experts, we had a compelling talk from Jay Bartlett of Cozaint. With his expertise in the video surveillance market, Jay shared how the boom in video surveillance applications is becoming unsustainable from a content retention perspective. It will become increasingly cost-prohibitive to retain high-definition video surveillance footage on de facto standard HDD storage solutions. Jay revealed a breakthrough allowing for the seamless integration of tier 2 LTO tape with a cost savings benefit of 50%! No longer will we need to rely on grainy, compromised video evidence.

Final Thoughts
The Summit wrapped up with a speaker panel moderated by IT writer and analyst Philippe Nicolas. One big takeaway from this session was that while innovation is happening, it will need to continue in the future if we are to effectively store the zettabytes to come. Innovation means investment in R&D and in the production of new solutions, perhaps even hybrid models of existing technologies. That investment can’t come from the vendors alone, and the hyperscalers will need to have some skin in the game.

In conclusion, the Summit was long overdue. The storage ecosystem is a family, from end users to vendors to analysts and experts. As a family, we learn from each other and help each other. That’s what families do. Fujifilm was pleased to bring the family together from around the globe under one roof for frank and open discussion that will help solve the challenges we and our society are facing.

Read More

Commitment to Sustainability at Data Center World Includes How to Avoid CO2 Emissions in Long Term Storage with Modern Data Tape Technology

Reading Time: 5 minutes

I had the opportunity to present at AFCOM’s Data Center World (DCW) exhibit and conference in Austin, Texas yesterday. The first thing I have to share about this experience is how surreal it was to get back on an airplane! It was my first trip since COVID started two years ago, with many Zoom presentations and virtual conferences since then. But not much has changed about air travel. The seating is still cramped, the flight was packed full, and my dog gets more snacks in a four-hour period than I did on my four-hour flight!

Committed to Sustainability
Sustainability is a hot topic these days and was one of the main themes of this year’s DCW. It was also the topic I presented on, specifically “How to Avoid CO2 Emissions in Long Term Storage with Modern Data Tape Technology.” The good news is that the DCW attendees that I met and listened to in other sessions are genuinely concerned about the environment and worried about what kind of planet we will be leaving behind for our kids and grandchildren. They recognize the opportunity to improve sustainability in data center operations and are committed to it.

Key Questions about Storage
At the outset of my presentation, I asked for a show of hands from those directly involved in data storage. I was not surprised to see my suspicion confirmed: few if any attendees raised a hand, since AFCOM’s DCW is more about facilities management than storage management. But I was also glad to see this, because we need everyone to be an advocate for any possible sustainability improvements in IT operations. So I asked my audience to lean on their colleagues in storage and pose two simple questions to them: “If data has gone cold and is infrequently accessed, why are we keeping it on energy-intensive tiers of storage like constantly spinning and heat-producing HDD arrays? Why not move it to eco-friendly tape?” The attendees in my session admitted they can feel the power drain and heat produced by endless disk arrays in their data centers.

Climate Change and Global Warming
I began my presentation by setting the stage on global warming, from the forest fires in 2020, to the Texas deep freeze in early 2021, to the fact that July 2021 was the hottest month ever recorded on Earth. Add to this the dire reports from the U.N. in late 2021 and early 2022. All this has led to changing consumer sentiment demanding that governments do more. Thankfully, they are. Corporate attitudes are also changing from resistance to action on climate, and we will be seeing more CSOs (Chief Sustainability Officers) being appointed and implementing change from the top down. Even Wall Street and the SEC are getting in on the act, demanding reporting and disclosures on corporate sustainability initiatives.

Energy Intensive IT Industry
Next, I confirmed what we all know: the IT industry is energy intensive, and its demand for energy is rapidly increasing. The demand curve for energy looks similar to the demand curve for data storage. Driven by digital transformation, persistent data that needs to be stored is expected by IDC to grow from 2.0 ZB in 2016 to more than 11.0 ZB in 2025, a CAGR of 27%. Suffice it to say, no one in the audience really understood what a zettabyte was, or that just one zettabyte is equal to the capacity of 55 million LTO-9 data cartridges or 55 million 18.0 TB HDDs. That’s a lot of storage for just one zettabyte, let alone 11.0 zettabytes in 2025. We are going to need a lot of flash, disk and tape to handle that kind of volume!

Renewable Energy plus Conservation
Next came the conversation about renewables and how Greenpeace has done a great job advocating for more use of renewables in data centers, especially by the cloud hyperscalers. But from the looks of the progress being made on this front, renewable sources of energy likely can’t come online fast enough, cheaply enough, or in sufficient volume to satisfy the energy needs of the massive data center industry. While Fujifilm is a big fan of renewables (we use them ourselves at our LTO plant in Boston), what’s really needed is a combination of renewables and energy conservation. How about turning off those lights and HDDs before leaving the office each night!

The Data Life Cycle
When it comes to conserving energy in data storage, one needs to understand a few simple principles related to the “data lifecycle.” Data quickly goes cold, and access frequency drops off dramatically after 30, 60 or 90 days. At the same time, data retention periods are getting longer, sometimes becoming indefinite. This is where data tiering saves the day, as cold data can move from expensive, energy-intensive tiers of storage to economical, eco-friendly tiers like modern data tape.
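To make the tiering idea concrete, here is a minimal sketch of an age-based placement rule. The 30/90-day thresholds and tier names are hypothetical examples for illustration, not any vendor’s actual policy engine:

```python
from datetime import datetime, timedelta, timezone

# Illustrative age-based tiering rule. Thresholds and tier names are hypothetical;
# real policies live in the data management software, not in hard-coded rules like this.
TIER_RULES = [
    (timedelta(days=30), "flash/primary"),  # hot: accessed within the last 30 days
    (timedelta(days=90), "economy disk"),   # warm: 30-90 days since last access
]
COLD_TIER = "tape archive"                  # cold: everything older

def pick_tier(last_access: datetime, now: datetime) -> str:
    """Return the target storage tier for a file based on its last-access age."""
    age = now - last_access
    for max_age, tier in TIER_RULES:
        if age <= max_age:
            return tier
    return COLD_TIER

now = datetime.now(timezone.utc)
print(pick_tier(now - timedelta(days=200), now))  # -> "tape archive"
```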

Advantages of Eco-Friendly Tape
I then shared the research findings from Brad Johns Consulting’s two white papers, which show that tape consumes 87% less energy and produces 87% less CO2 than equivalent amounts of HDD storage. When analyzed over the total product lifecycle, from procurement of raw materials to production, distribution, usage and finally disposal, tape produces 95% less CO2 than HDD and produces 80% less e-waste. I also shared the results of an IDC study showing that migrating more cold data from HDD to tape could result in the avoidance of 664 million metric tons of CO2 on a global basis by 2030. That’s the CO2 equivalent of 144 million automobiles being taken off the road for a full year! I also referenced research by IBM showing a side-by-side comparison of a TS4500 tape library and a Bryce Canyon HDD system, in which the IBM tape gear produced 80% less CO2 over a ten-year period than the Bryce Canyon system. To round things out, I shared the end user perspective from an executive roundtable where Microsoft Azure stated:

“When you take the material savings and power savings, tape actually does offer quite a bit of advantage compared to other technologies that are on the market today.”
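As a side note, the cars-off-the-road equivalence cited a couple of paragraphs back checks out arithmetically if one assumes the commonly used figure of roughly 4.6 metric tons of CO2 per passenger vehicle per year (an assumption on my part, not a number from the IDC study):

```python
# Rough check of the equivalence: 664 million metric tons of avoided CO2 vs. cars off
# the road for a year, assuming ~4.6 metric tons of CO2 per typical passenger vehicle.
avoided_co2_tons = 664_000_000   # metric tons, the IDC figure cited above
co2_per_car_tons = 4.6           # assumed average annual emissions per vehicle

cars_equivalent = avoided_co2_tons / co2_per_car_tons
print(f"~{cars_equivalent / 1e6:.0f} million cars")  # ~144 million
```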

Since my audience wanted to know more, I briefly covered tape’s other benefits including:

  • Tape remains the lowest cost storage media on a $/GB basis
  • Tape storage supports air gap protection against ransomware
  • Tape can reliably store data for long periods with an excellent bit error rate
  • Tape technology has room to grow in areal density and therefore capacity, and has a well-defined roadmap

I concluded by saying that data growth is here to stay and the volumes of valuable data are getting enormous. What the industry needs to do in support of strategic data storage management and sustainability objectives is this:

“Get the right data, in the right place, at the right time, at the right cost, and…at the right energy consumption level.”

I think the attendees got the message and now see modern tape storage as part of the carbon reduction answer for the data centers of today and tomorrow. It was well worth the snack-deprived four-hour flight!


Read More

How Tape Technology Delivers Value in Modern Data-driven Businesses…in the Age of Zettabyte Storage

Reading Time: 3 minutes

The newly released whitepaper from IT analyst firm ESG (Enterprise Strategy Group), sponsored by IBM and Fujifilm and entitled “How Tape Technology Delivers Value in Modern Data-driven Businesses,” focuses on exciting new advances in tape technology that are now positioning tape for a critical role in effective data protection and retention in the age of zettabyte (ZB) storage. That’s right, zettabyte storage!

The whitepaper cites the need to store 17 ZB of persistent data by 2025. This includes “cold data” that is stored long-term and rarely accessed, estimated to account for 80% of all data stored today. Just one ZB is a tremendous amount of data, equal to one million petabytes, which would require 55 million 18 TB hard drives or 55 million 18 TB LTO-9 tapes to store. Just like the crew in the movie Jaws needed a bigger boat, the IT industry is going to need higher capacity SSDs and HDDs and higher density tape cartridges! On the tape front, help is on the way, as demonstrated by IBM and Fujifilm in the form of a potential 580 TB capacity tape cartridge. Additional highlights from ESG’s whitepaper are below.

New Tape Technology
IBM and Fujifilm set a new areal density record of 317 Gb/sq. inch on linear magnetic tape, translating to a potential native cartridge capacity of 580 TB. The demonstration featured a new magnetic particle called Strontium Ferrite (SrFe), with the ability to deliver capacities that extend well beyond disk, LTO, and enterprise tape roadmaps. SrFe magnetic particles are 60% smaller than the current de facto standard Barium Ferrite particles, yet they exhibit even better magnetic signal strength and archival life. On the hardware front, the IBM team has developed tape head enhancements and servo technologies that leverage even narrower data tracks, contributing to the increase in capacity.

The Case for Tape at Hyperscalers and Others
Hyperscale data centers are major new consumers of tape technologies due to their need to manage massive data volumes while controlling costs. Tape is allowing hyperscalers including cloud service providers to achieve business objectives by providing data protection for critical assets, archival capabilities, easy capacity scaling, the lowest TCO, high reliability, fast throughput, low power consumption, and air gap protection. But tape also makes sense for small to large enterprise data centers facing the same data growth challenges including the need to scale their environments while keeping their costs down.

Data Protection, Archive, Resiliency, Intelligent Data Management
According to an ESG survey revealed in the whitepaper, tape users identified reliability, cybersecurity, long archival life, low cost, efficiency, flexibility, and capacity as the top attributes of tape usage today and favor tape for its long-term value. Data is growing relentlessly, with longer retention periods, as the value of data increases thanks to the ability to apply advanced analytics and derive a competitive advantage. Data is often kept for longer periods to meet compliance, regulatory, and corporate governance requirements. Tape is also playing a role in cybercrime prevention with WORM, encryption, and air gap capabilities. Intelligent data management software, typical in today’s active archive environments, automatically moves data from expensive, energy-intensive tiers of storage to more economical and energy-efficient tiers based on user-defined policies.

ESG concludes that tape is the strategic answer to the many challenges facing data storage managers, including the growing amount of data as well as TCO, cybersecurity, scalability, reliability, energy efficiency, and more. IBM and Fujifilm’s technology demonstration ensures the continuing role of tape as data requirements grow and higher capacity media is required for cost control, with the added benefit of CO2 reduction, among others. Tape is a powerful solution for organizations that adopt it now!

To read the full ESG whitepaper, click here.


Read More

Taking Action Against Climate Change by Reducing CO2 Emissions with Eco-Friendly Tape Systems

Reading Time: 3 minutes

In early August of this year, a United Nations panel called the “Intergovernmental Panel on Climate Change (IPCC)” issued a new report, the Sixth Assessment Report, on climate change and global warming. You can explore the lengthy and technical full report here. But in short, a few key headline statements from the report include:

  • It is unequivocal that human influence has warmed the atmosphere, ocean, and land. Widespread and rapid changes in the atmosphere have occurred.
  • Global warming of 1.5 degrees C and 2.0 degrees C will be exceeded during the 21st century unless deep reductions in carbon dioxide (CO2) and other greenhouse gas emissions occur in the coming decades.
  • Many changes in the climate system become larger in direct relation to increasing global warming. They include increases in the frequency and intensity of hot extremes, marine heatwaves, and heavy precipitation, agricultural and ecological droughts in some regions, and intense tropical cyclones as well as reductions in Arctic sea ice, snow cover, and permafrost.
  • Many changes due to past and future greenhouse gas emissions are irreversible for centuries to millennia, especially changes in the ocean, ice sheets, and global sea level.

The U.N. report is pretty scary, especially that last bullet. But think about the severe weather events we experienced in 2020, only to be outdone by calamities in 2021 like the Texas deep freeze, the record heat in the Pacific Northwest, torrential floods in Europe, China and the U.S., and extreme storms, not to mention the worsening forest fires.

A Time for IT to Take Action on Climate Change

We as a society, as individuals, and as commercial organizations and governments need to take action. No effort is too small; even turning off a single light switch when it’s not needed is worthwhile. Collectively we can make a difference.

The IT industry is no exception and needs to take action. Data centers are major consumers of energy amid rapid and widespread digital transformation initiatives that are resulting in exponential data growth. While the IT industry has made significant strides in ramping up renewable sources of energy, renewables can’t come online fast enough or cheaply enough to make a big difference on their own. What is also needed is energy conservation, and storage is a good place to start.

Assessing the Eco-Friendly Advantages of Tape

Back in November of 2020, industry expert and consultant Brad Johns published a whitepaper on the energy advantage of today’s modern and highly advanced data tape systems. That paper, entitled “Reducing Data Center Energy Consumption and Carbon Emissions with Modern Tape,” showed:

  • Tape systems consume 87% less energy and therefore reduce CO2 emissions by 87% compared to equivalent capacities of HDD storage.
  • What’s more, the lower energy consumption of tape contributes to an 86% reduction in TCO.

More recently, Brad Johns did an even deeper dive into the energy advantage of tape in a second whitepaper on the subject, entitled “Improving Information Technology Sustainability with Modern Tape Storage.” This time, instead of just looking at energy consumption during the operational usage phase of tape vs. HDD, Brad looked at the energy consumption and environmental impact of tape vs. HDD from “cradle to grave,” that is, from sourcing of raw materials to manufacturing, distribution, usage, and disposal at end of life. Here are the key findings:

  • Tape produces 95% less CO2 than HDD during its lifecycle from manufacturing to disposal.
  • Electronic waste (e-waste) at the time of disposal is reduced by 80% for tape compared to HDD.
  • Ten-year TCO in this paper shows a 73% reduction for tape compared to HDD.

Brad also ran a “what if” scenario: what if industry best practices were truly observed and 60% of HDD data were moved to tape systems?

  • 72 million tons of CO2 would be avoided, a 57% reduction compared to keeping all the data on HDD!
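For readers who want to see where a figure like that could come from, here is a hedged back-of-the-envelope reconstruction: if 60% of HDD data moves to tape, and tape’s lifecycle CO2 is roughly 95% lower than HDD’s (per the bullets above), the overall reduction works out to about 57%. The all-HDD baseline below is back-solved from the 72-million-ton figure, not taken from the whitepaper.

```python
# Back-of-the-envelope reconstruction of the "what if" scenario (assumptions noted above).
fraction_moved_to_tape = 0.60  # industry best practice assumed in the scenario
tape_co2_reduction = 0.95      # tape's lifecycle CO2 advantage vs. HDD (from the bullets above)

overall_reduction = fraction_moved_to_tape * tape_co2_reduction
print(f"Overall CO2 reduction: {overall_reduction:.0%}")  # -> 57%

implied_baseline_mt = 72 / overall_reduction               # millions of tons of CO2
print(f"Implied all-HDD baseline: ~{implied_baseline_mt:.0f} million tons of CO2")  # ~126
```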

To download this whitepaper for complete details, click here.

While simply using more tape for cold and inactive data won’t solve climate change or make scary U.N. reports go away, it certainly is a positive contribution to the global effort. We all need to do whatever we can so that collectively we can make a difference.


Read More

Storage in the Age of Video Surveillance

Reading Time: 6 minutes

By Andrew Dodd, Guest Blogger, Worldwide Marketing Communications Manager
at Hewlett Packard Enterprise Storage

The presence of a ring of video surveillance cameras clinging to a vantage spot like a cluster of digital coconuts has long been a familiar sight in public spaces. And for many years, in both Hollywood and on television, countless storylines have turned on whether the detectives or investigators could access CCTV footage and solve the mystery by reviewing the tale of the tape.

But although the idea of cameras and surveillance has become an accepted feature of society (like it or not), what is less obvious perhaps is how much the market for video surveillance equipment is growing and how much the cameras themselves have changed. Both of these factors have profound implications for digital storage.

You had better be ready for your close-up

First, the market. A 2020 report from IDC entitled “Worldwide Video Surveillance Camera Forecast, 2020-2025” (#US46230720) estimates that by 2025, the worldwide video surveillance camera market will grow to $44 billion, up from $23.6 billion in 2019, with a five-year compound annual growth rate (CAGR) of nearly 13%. This is largely due to the increasing adoption of smart camera systems and analytical software that enables them to be utilized in a variety of roles — beyond simple surveillance. Another report, by research firm IHS Markit, predicts that by the end of 2021 alone, there will be 1 billion surveillance cameras installed globally, with over 50% of those in a single country: China.

The growth of 4K

In the past, video surveillance cameras have sometimes been criticized both for their ubiquity and for their usefulness: critics pointed out that although the cameras seemed to be proliferating in many public spaces, their benefit was undermined by poor image quality and resolution. Not anymore. The next-gen cameras that are driving the growth to 2025 will increasingly deliver HD and Ultra HD (4K) images of astonishing detail and clarity. In turn, this is opening up a wealth of new applications that can be managed by artificial intelligence systems: for example, monitoring industrial equipment, providing security, and (more controversially) real-time facial recognition.

Why are cameras being deployed?

Many of today’s larger organizations such as hospitals, airports, university campuses, and casinos find themselves needing a video surveillance system as either a replacement for an aging CCTV installation or as a brand-new installation. The ability to quickly and easily provide high-resolution video evidence of a security incident can be very relevant in narrowing down suspects in case of a crime. And the same video evidence can also limit the liability of an organization in case of a lawsuit. So there are clearly business benefits in upgrading to the latest surveillance technology.

The storage challenge

But if the number of cameras is increasing rapidly, and if the quality of the images they produce is becoming more refined and detailed, then all of this can only mean one thing: we’re going to need a lot more storage. Gone are the days when weeks of footage could be kept on a handful of old videotapes that could be wiped and reused at the end of the month. Today’s surveillance cameras primarily record to disk, and a single hour of RAW 4K video footage produced by just one unit consumes something in the region of 110 GB of disk capacity. Multiply this by millions of hours, and hundreds of millions of cameras, and it’s clear that video surveillance applications will require colossal amounts of storage, not just for the primary purpose of storing the original footage, but also for backing up and archiving that material.
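To give a feel for the scale, here is a minimal sizing sketch built on the ~110 GB/hour figure above. The camera count and retention period are hypothetical, and real deployments reduce the footprint with compression, motion-triggered recording and lower frame rates:

```python
# Illustrative storage sizing for continuous RAW 4K surveillance recording.
gb_per_camera_hour = 110   # ~110 GB per hour of RAW 4K footage (figure cited above)
cameras = 1_000            # hypothetical deployment size
retention_days = 30        # hypothetical retention window

total_gb = gb_per_camera_hour * 24 * retention_days * cameras
print(f"~{total_gb / 1e6:.0f} PB of primary capacity "
      f"for {cameras} cameras over {retention_days} days")  # ~79 PB
```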


Read More

Why Active Archiving is a Hot Concept in Storage Today

Reading Time: 2 minutes

The 2021 Active Archive Alliance annual market report has just been released, entitled “Saved by the Data. Active Archive Leads the Way in a Mid-Pandemic World”.

Certainly, the COVID pandemic was a shock to many companies and put tremendous strain on operations, revenue, and profit. But those companies that had already implemented a sensible active archive strategy were at a competitive advantage thanks to their ability to intelligently manage access to their data.

I think active archiving, the practice of keeping data online all the time and easily accessible to users, is a hot concept in storage right now because it is really about optimization – getting the right data in the right place, at the right time, and at the right cost.

We know that IT budgets are not keeping up with the relentless growth of data. We also know that 60% to 80% of data quickly becomes archival. Typically after 30, 60, or 90 days, files become static and the frequency of access drops off. So why keep that kind of data on expensive primary storage?

Why not let the intelligent data management software that is typical of an active archive solution move that data, by user-defined policy, from high-performance, expensive tiers to lower-performance but more cost-effective tiers like economy disk, tape systems, or even the cloud, all while maintaining transparent access for users?

We know that the value of data is increasing, retention periods are getting longer, and users want to maintain ready access to their data without IT staff intervention. But we also need to worry about the bottom line, about efficiency, compliance, sustainability, and cybersecurity! Active archiving provides the right solutions to these worries and that’s why it is such a hot concept in storage today.

But enough said; read the full report here and check out what Alliance members had to say in their related virtual conference.


Read More

Managing the Archival Upheaval

Reading Time: < 1 minute


Relentless digital data growth is inevitable. Data has become critical to all aspects of human life over the course of the past 30 years, and it promises to play a much greater role over the next 30 years. Much of this data will be stored forever, mandating the emergence of a more intelligent and highly secure long-term storage infrastructure. Data retention requirements vary widely based on the type of data, but archival data is rapidly piling up everywhere. Digital archiving is now a key strategy for larger enterprises and has become a required discipline for hyperscale data centers.

Many data types are being stored indefinitely in anticipation that their potential value will eventually be unlocked. Industry surveys indicate nearly 60% of businesses plan to retain data in some digital format for 50 years or more, and much of this data will never be modified or deleted. For many organizations, facing terabytes, petabytes and potentially exabytes of archive data for the first time can force the redesign of their entire storage strategy and infrastructure. As businesses, governments, societies, and individuals worldwide increase their dependence on data, data preservation and archiving have become a critical IT practice. Fortunately, the required technologies are now available to manage the archival upheaval.

For more information, check out this Horison Information Strategies White Paper “Managing the Archival Upheaval.”

Read More

Webinar: How Much Do You Really Know About Your Data?

Reading Time: 2 minutes

July 22, 2020

By Kevin Benitez

How much do you really know about your data? Is your data on the right storage type? How active is your data, and how is it being used?

From life sciences and media and entertainment to HPC/research, higher education, government and consumer products, virtually ALL enterprises struggle to manage data with fewer resources and at lower cost. Heterogeneous storage environments have added complexity and cost, and have made it difficult for IT managers to manage data.

Don’t let multi-vendor storage silos get in the way of effective data management.

This webinar series goes beyond just organizing your data. Across three short webinars, you’ll learn how to take control of, protect, and manage your data, all while enhancing workflow and reducing costs.

Join Floyd Christofferson, CEO of StrongBox Data Solutions, in a webinar series that will teach you how you can make the most of your data:

 

  1. Take Back Control of Your Data + LTFS
     Don’t let multi-vendor storage silos get in the way of effective data management.
     July 28, 2020, 12:00 PM – 12:45 PM Eastern Time

  2. Reduce Costs & Increase Data Protection!
     How to Better Manage Data Growth in a Multi-Vendor Storage Environment.
     August 4, 2020, 12:00 PM – 12:45 PM Eastern Time

  3. Workflow Magic!
     Techniques to better use your data and not waste time trying to wrangle it.
     August 11, 2020, 12:00 PM – 12:45 PM Eastern Time

Register Now

Read More

THE ASCENT TO HYPERSCALE – Part 2

Reading Time: 2 minutes

Part 2: CHARACTERISTICS OF THE HYPERSCALE DATA CENTER

In Part 1 of this series, we explored the definition of hyperscale data centers. Now we’ll take a look at some of their key characteristics.

HSDCs don’t publicly share an abundance of information about their infrastructure. For companies that will operate HSDCs, cost may be the major barrier to entry, but ultimately it isn’t the biggest issue – automation is. HSDCs must focus heavily on automated, self-healing environments, using AI and ML whenever possible to overcome inevitable and unexpected failures and delays. Unlike many enterprise data centers, which rely on a large full-time staff across a range of disciplines, HSDCs employ fewer tech experts because they have used technology to automate so much of the overall management process. HSDC characteristics include:

  • Small footprint, dense racks–HSDCs squeeze servers, SSDs (solid state drives) and HDDs (hard disk drives) directly into the rack itself, as opposed to separate SANs or DAS, to achieve the smallest possible footprint (heavy use of racks). HSDC racks are typically larger than standard 19” racks.
  • Automation–Hyperscale storage tends to be software-defined and is benefitting from AI, which delivers a higher degree of automation and self-healing, minimizing direct human involvement. AI will support automated data migration between tiers to further optimize storage assets.
  • Users–The HSDC typically serves millions of users with only a few applications, whereas in a conventional enterprise there are fewer users but many more applications.
  • Virtualization–The facilities also implement very high degrees of virtualization, with as many operating system images running on each physical server as possible.
  • Tape storage adoption–Automated tape libraries are on the rise to complement SSDs and HDDs: they easily scale capacity, help manage and contain out-of-control data growth, store archival and unstructured data, significantly lower infrastructure and energy costs, and provide air gap protection against cybercrime.
  • Fast scaling bulk storage–HSDCs require fast, easily scalable storage capacity. One petabyte using 15 TB disk drives requires 67 drives, and one exabyte requires 66,700 of them. Tape scales capacity by adding media; disk scales by adding drives.
  • Minimal feature set–Hyperscale storage has a minimal, stripped-down feature set and may even lack redundancy as the goal is to maximize storage space and minimize cost.
  • Energy challenges–High power consumption and increasing carbon emissions have forced HSDCs to develop new energy sources in order to reduce and more effectively manage energy expenses.

In Part 3 of this series, we’ll take a look at how the value of tape is rapidly rising as hyperscale data centers grow. For more information on this topic, download our white paper: The Ascent to Hyperscale.

Read More

LET’S DISCUSS YOUR NEEDS

We can help you reduce cost, decrease vendor lock-in, and increase productivity of storage staff while ensuring accessibility and longevity of data.

Contact Us >