Blog

LTO-9 Coming to Market at the Right Time with the Right Features to Address the Many Challenges Facing IT Today

Reading Time: 5 minutes

As recently announced by Fujifilm, LTO-9 has arrived and is available for immediate delivery. It comes at a time when the IT industry is challenged to manage rampant data growth, control costs, reduce carbon footprint, and fight off cyber-attacks. LTO-9 is coming to market just in time to meet these challenges with the right features: high capacity, low cost, energy efficiency, and cybersecurity.

What a Great Run for LTO
First of all, it is remarkable to look at how far LTO Ultrium technology has come since its introduction. LTO made its market debut in 2000 with the first-generation LTO-1 at 100/200 GB native/compressed capacity on 384 data tracks; transfer rates were just 20 MB/sec native and 40 MB/sec compressed. Fast forward 21 years to the availability of LTO-9, now with 18/45 TB native/compressed capacity on 8,960 data tracks and transfer rates of 400 MB/sec native and 1,000 MB/sec compressed. In terms of compressed capacity, that’s a 225X increase over LTO-1. Since 2000, Fujifilm alone has manufactured and sold over 170 million LTO tape cartridges, a pretty good run indeed.
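To put that progress in perspective, here is a quick back-of-envelope check of the math, using only the published LTO specifications quoted above (a minimal sketch in Python):

```python
# Back-of-envelope comparison of LTO-1 (2000) and LTO-9 (2021), using
# the published specifications quoted above.

LTO1_COMPRESSED_GB = 200        # LTO-1: 100 GB native / 200 GB compressed
LTO9_COMPRESSED_GB = 45_000     # LTO-9: 18 TB native / 45 TB compressed

print(f"Capacity increase: {LTO9_COMPRESSED_GB / LTO1_COMPRESSED_GB:.0f}x")  # 225x

LTO1_MBPS = 40      # compressed transfer rate, MB/sec
LTO9_MBPS = 1_000   # compressed transfer rate, MB/sec
print(f"Transfer rate increase: {LTO9_MBPS / LTO1_MBPS:.0f}x")               # 25x
```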

Capacity to Absorb Bloated Data Sets
We are firmly in the zettabyte age now, and it’s no secret that data is growing faster than most organizations can handle. With compound annual data growth rates of 30% to 60% for most organizations, keeping data protected for the long term is increasingly challenging. Just delete it, you say? That’s not an option, as the value of data is increasing rapidly thanks to the many analytics tools we now have to derive value from it. If we can derive value from that data, even older data sets, then we want to keep it indefinitely. But this data can’t economically reside on Tier 1 or Tier 2 storage. Ideally, it will move to a Tier 3 tape archive or active archive where online access can be maintained. LTO-9 is perfect for this application thanks to its large capacity (18 TB native, 45 TB compressed) and high data transfer rate (400 MB/sec native, 1,000 MB/sec compressed).

Lowest TCO to Help Control Costs
Understanding your true total cost of ownership (TCO) is of vital importance today as exponential data growth continues unabated. The days of just throwing more disk at storage capacity issues without any concern for cost are long gone. In fact, studies show that IT budgets on average are growing at less than 2.0% annually, yet data is growing in the range of 30% to 60%. That’s a major disconnect! When compared to disk or cloud options, automated tape systems have the lowest TCO profile even for relatively low data volumes of less than one petabyte, and for larger workloads the TCO is even more compelling. Thanks to LTO-9’s higher capacity and fast transfer rate, the efficiency of automated tape systems will only improve, keeping the TCO advantage firmly on tape’s side.
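To see how quickly that disconnect compounds, here is a minimal sketch using the roughly 2% budget growth and 30% data growth figures cited above (30% being the low end of the range):

```python
# Indexing budgets and data volume to 100 and compounding the growth
# rates cited above (2% budgets, 30% data, the low end of the range).

budget, data = 100.0, 100.0
for year in range(1, 6):
    budget *= 1.02
    data *= 1.30
    print(f"Year {year}: budget index {budget:6.1f}, data index {data:6.1f}")

# After five years the budget is up ~10% while data is up ~271%, so the
# cost per stored terabyte must fall sharply just to break even.
```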

Lowest Energy Profile to Reduce Carbon Footprint
Perhaps of even greater concern these days are the environmental impacts of energy-intensive IT operations and their contribution to global warming and climate change. You may have thought 2020 was a pretty bad year, tied with 2016 for the hottest year on record. Remember the raging forest fires out West and the frequency of hurricanes and tropical storms? Well, it turns out 2021 is just as bad if not worse, with the Caldor Fire and Hurricane Ida fresh in our memory.

Tape technology has a major advantage in terms of energy consumption: tape systems require no energy unless tapes are being read or written in a tape drive. Tapes sitting idle in a library slot or vaulted offsite consume no energy at all. As a result, the CO2 footprint is significantly lower than that of always-on disk systems, which spin constantly and generate heat that must be cooled. Studies show that tape systems consume 87% less energy, and therefore produce 87% less CO2, than equivalent amounts of disk storage in the usage phase alone. More recent studies show that across the total life cycle, from raw materials and manufacturing to distribution, usage, and disposal, tape actually produces 95% less CO2 than disk. When you consider that 60% to 80% of data quickly goes cold, with the frequency of access dropping off after just 30, 60, or 90 days, it only makes sense to move that data from expensive, energy-intensive tiers of storage to inexpensive, energy-efficient tiers like tape. The energy profile of tape only improves with higher-capacity generations such as LTO-9.

A Last Line of Defense Against Cybercrime
Once again, 2021 is proving just as bad as, if not worse than, 2020 when it comes to cybercrime and ransomware attacks. Every webinar you attend on this subject says something to the effect of: “it’s not a question of if; it’s a question of when you will become the next ransomware victim.” The advice from the FBI is pretty clear: “Backup your data, system images, and configurations, test your backups, and keep backups offline.”

This is where the tape air gap plays an increasingly important role. Tape cartridges have always been designed to be easily removable and portable in support of any disaster recovery scenario. Thanks to the low total cost of ownership of today’s high-capacity automated tape systems, keeping a copy of mission-critical data offline, and preferably offsite, is economically feasible – especially considering the prevalence of ransomware attacks and the associated costs of recovery, ransom payments, lost revenue and profit, and fines.

In the event of a breach, organizations can retrieve a backup copy from tape systems, verify that it is free from ransomware and effectively recover. The high capacity of LTO-9 makes this process even more efficient, with fewer pieces of media moving to and from secure offsite locations.

The Strategic Choice for a Transforming World
LTO-9 is the “strategic” choice for organizations because using tape to address long-term data growth and volume is strategic, while adding disk is simply a short-term tactical measure. It’s easy to just throw more disks at the problem of data growth, but if you are being strategic about it, you invest in a long-term tape solution.

The world is “transforming” amid the COVID pandemic: everyone has to do more with less, budgets are tight, digital transformation has accelerated, and we are now firmly in the zettabyte age, which means we have more data to manage efficiently, cost-effectively, and in an environmentally friendly way. The world is also transforming as new threats like cybercrime become a fact of life, not just a rare occurrence that happens to someone else. In this respect, LTO-9 indeed comes to market at the right time with the right features to meet all of these challenges.

Read More

Understanding Your True Cost of IT and Your Carbon Footprint

Reading Time: 4 minutes

I recently attended a webinar about why IT folks have a love/hate relationship with the cloud. They love the cloud for its on-demand flexibility, unlimited compute and storage capacity, elimination of CAPEX costs, and so on. They hate it, according to the presenter, because of costs that often produce “sticker shock.” Other irritants include regulatory compliance issues and cybersecurity concerns.

To be completely fair to the cloud, the presenter explained that discipline and accountability can be brought to bear to help control costs and that organizations need to establish “a cloud center of excellence.” At the same time, he showed data from a study suggesting that 58% of respondents were moving some cloud-based workloads back to on-premises, private cloud environments. Finally, he advised the audience to “understand your true cost of IT, TCO tools are out there!”

Getting Back to Hybrid Storage Strategies

I think the overall message of the webinar was that the cloud is great when used for the right applications and that a hybrid approach with a healthy mix of public and private cloud makes a lot of sense. In fact, the trend prior to COVID-19 was clearly toward hybrid: cloud repatriation was happening as IT managers realized that the cloud is not a panacea for everything. During the COVID period, private cloud data centers were understaffed and under-supported by vendors, so the path of least resistance was to over-leverage the public cloud once again. As we emerge from COVID lockdowns and IT staff return to the data center, attention is once again being paid to finding a healthy mix of public cloud and on-premises private cloud.

This approach makes sense and clearly reinforces that it is not an either-or scenario. In the case of storage, the cloud complements on-premises storage, including today’s highly advanced and automated tape systems. Cloud comes in handy, for example, when multiple clients frequently need on-demand access, while tape systems can manage large, less frequently accessed data sets needing long-term retention, including sensitive and mission-critical data that can be air-gapped as a cybersecurity best practice. Tape is particularly well suited for these applications thanks to tape’s:

  • High capacity
  • Ease of scalability
  • Ease of removability
  • Long archival life and reliability
  • Low TCO
  • Low energy consumption and low carbon footprint

TCO Tools are Out There

Getting back to the webinar story and the advice to “understand your true cost of IT”: indeed, TCO tools are out there, and Fujifilm is pleased to offer a free, web-based interactive TCO tool developed by IT economics expert Brad Johns Consulting, LLC. The tool compares the 5-year and 10-year TCO of automated tape systems against economy disk systems and cloud-based cold storage. Users can input the volume of data to be stored, the annual growth rate, and the percentage of data retrieved from the cloud, as well as other variables such as the local cost per kWh, the expense of full-time storage management staff, the number of copies of data, etc. The tool has been available for many years and has evolved over time to be as comprehensive as possible, covering the following CAPEX and OPEX cost variables (a simplified sketch of such a model appears after the list):

  • Media and hardware for disk and tape
  • Maintenance for disk and tape
  • Energy for disk and tape
  • Offsite vaulting for tape
  • Storage management for disk, tape, and cloud
  • Storage and retrieval fees for cloud
  • Data transfer fees for cloud
  • Business level support for cloud
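To make the structure of such a model concrete, here is a hypothetical, heavily simplified sketch. Every rate in it is a placeholder assumption, not a figure from the Fujifilm/Brad Johns tool:

```python
# Hypothetical, simplified 10-year tape TCO sketch. Every rate below is
# a placeholder assumption, NOT a figure from the Fujifilm / Brad Johns
# Consulting tool described above.

def tape_tco(initial_tb: float, annual_growth: float, years: int,
             media_cost_per_tb: float = 5.0,     # assumed $/TB, tape media
             hw_maint_per_tb_yr: float = 2.0,    # assumed $/TB/yr, hardware + maintenance
             energy_per_tb_yr: float = 0.10,     # assumed $/TB/yr (idle tape draws ~0 W)
             vaulting_per_tb_yr: float = 0.50,   # assumed $/TB/yr, offsite vaulting
             mgmt_per_tb_yr: float = 1.0) -> float:   # assumed $/TB/yr, staff time
    stored_tb = initial_tb
    total = stored_tb * media_cost_per_tb        # media for the initial data set
    for _ in range(years):
        total += stored_tb * (hw_maint_per_tb_yr + energy_per_tb_yr
                              + vaulting_per_tb_yr + mgmt_per_tb_yr)
        new_tb = stored_tb * annual_growth       # growth adds new media each year
        total += new_tb * media_cost_per_tb
        stored_tb += new_tb
    return total

print(f"Sketch 10-yr TCO for 5 PB at 30% growth: ${tape_tco(5_000, 0.30, 10):,.0f}")
```

A real comparison would also model the cloud-side variables listed above (retrieval, transfer, and support fees); the point of the sketch is simply that TCO is a multi-year sum of many small per-terabyte costs, which is why a tool beats a guess.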

Reducing Energy Consumption and CO2 with Tape

Regarding the cost of energy for disk and tape: this expense can be significant over time, especially for disk systems that spin 24/7, generating heat and therefore requiring cooling. Given the heightened awareness of global warming and climate change, organizations are looking for ways to reduce energy consumption and their carbon footprint. Data center operations are no exception and have been spotlighted for their energy-intensive applications. Making greater use of renewable energy is part of the answer, but renewable energy can’t come online fast enough, or cheaply enough, to keep up with exponential data growth. Conservation has an even bigger potential to make a difference, and that is where tape systems really shine.

Studies show that under certain scenarios, inclusive of data management servers and network infrastructure, tape consumes 87% less energy than equivalent amounts of disk storage, and therefore produces 87% less CO2, while also reducing TCO by 86%. Given that data quickly becomes static and frequency of access drops dramatically after just 30 to 90 days, it makes sense to move that data from energy-intensive, higher-cost tiers of storage like flash, performance disk, or even economy disk to lower-cost, energy-efficient tape systems. A good active archive architecture with intelligent data management software is a great way to achieve such storage optimization: getting the right data, in the right place, at the right time, and at the right cost.

To help highlight tape’s energy advantage and CO2 reduction, the Fujifilm TCO tool now includes a calculation focused purely on the storage hardware layer that shows the reduction in CO2 compared to disk systems; the example below is based on storing 5.0 PB for 10 years with 30% annual growth and 12% data retrieval from the cloud.
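The tool produces its own figures for that scenario; purely to show the shape of the calculation, here is a hedged sketch with illustrative inputs. The watts-per-terabyte and grid-emissions values below are assumptions chosen to reflect the ~87% usage-phase reduction cited earlier, not outputs of the Fujifilm tool:

```python
# Sketch of a storage-layer CO2 comparison for the scenario above:
# 5.0 PB initial, 30% annual growth, 10 years. The watts-per-terabyte
# and grid-emissions figures are illustrative assumptions (chosen to
# reflect the ~87% usage-phase reduction cited earlier), not outputs
# of the Fujifilm TCO tool.

DISK_W_PER_TB = 7.0     # assumed average draw of always-on disk, W/TB
TAPE_W_PER_TB = 0.9     # assumed average draw of a tape library, W/TB
KG_CO2_PER_KWH = 0.4    # assumed grid emission factor, kg CO2 per kWh

stored_tb, disk_kwh, tape_kwh = 5_000.0, 0.0, 0.0
for year in range(10):
    hours = 24 * 365
    disk_kwh += stored_tb * DISK_W_PER_TB / 1000 * hours
    tape_kwh += stored_tb * TAPE_W_PER_TB / 1000 * hours
    stored_tb *= 1.30                       # 30% annual growth

print(f"Disk CO2: {disk_kwh * KG_CO2_PER_KWH / 1000:,.0f} tonnes")
print(f"Tape CO2: {tape_kwh * KG_CO2_PER_KWH / 1000:,.0f} tonnes")
print(f"Reduction: {1 - tape_kwh / disk_kwh:.0%}")   # ~87%
```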

So not only is TCO reduced with automated tape systems compared to disk and cloud storage, but a meaningful reduction in CO2 can be achieved and that is exactly what we all need to be doing to help slow down the negative impacts of global warming and climate change.

Read More

Why Active Archiving is a Hot Concept in Storage Today

Reading Time: 2 minutes

The 2021 Active Archive Alliance annual market report has just been released, entitled “Saved by the Data. Active Archive Leads the Way in a Mid-Pandemic World”.

Certainly, the COVID pandemic was a shock to many companies and put tremendous strain on operations, revenue, and profit. But those companies that had already implemented a sensible active archive strategy were at a competitive advantage thanks to their ability to intelligently manage access to their data.

I think active archiving, the practice of keeping data online all the time and easily accessible to users, is a hot concept in storage right now because it is really about optimization – getting the right data in the right place, at the right time, and at the right cost.

We know that IT budgets are not keeping up with the relentless growth of data. We also know that 60% to 80% of data quickly becomes archival. Typically after 30, 60, or 90 days, files become static and the frequency of access drops off. So why keep that kind of data on expensive primary storage?

Why not let the intelligent data management software typical of an active archive solution move that data, by user-defined policy, from high-performance, expensive tiers to lower-performance but more cost-effective tiers like economy disk, tape systems, or even cloud, all while maintaining transparent access for users?
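At its core, such a policy can be surprisingly simple. Here is a minimal sketch (the paths are hypothetical, and real active archive software also keeps a catalog so users retain transparent access to moved files):

```python
# Minimal sketch of an age-based tiering policy: move files untouched
# for 90+ days from a primary tier to an archive tier. The paths are
# hypothetical; real active archive software also keeps a catalog so
# users can still see and transparently recall the moved files.

import shutil
import time
from pathlib import Path

PRIMARY = Path("/mnt/primary")    # hypothetical fast, expensive tier
ARCHIVE = Path("/mnt/archive")    # hypothetical tape / economy tier
AGE_LIMIT = 90 * 24 * 3600        # 90 days, in seconds

def tier_cold_files() -> None:
    now = time.time()
    for f in PRIMARY.rglob("*"):
        if f.is_file() and now - f.stat().st_atime > AGE_LIMIT:
            dest = ARCHIVE / f.relative_to(PRIMARY)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(f), dest)    # move, not copy: frees primary space
```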

We know that the value of data is increasing, retention periods are getting longer, and users want to maintain ready access to their data without IT staff intervention. But we also need to worry about the bottom line, about efficiency, compliance, sustainability, and cybersecurity! Active archiving provides the right solutions to these worries and that’s why it is such a hot concept in storage today.

But enough said, read the full report here and check out what Alliance members had to say in their related virtual conference.

Read More

Reducing IT’s Carbon Footprint via Tape While Improving Cybersecurity and Protecting the Bottom Line

Reading Time: 4 minutes

By Drew Robb, Guest Blogger

There is increasing pressure around the world to reduce emissions and lower mankind’s carbon footprint. It is up to the IT sector to do its part, and that means considerably lowering power usage. But that is easier said than done when you consider the statistics.

IDC predicts we will arrive at the mind-boggling figure of 175 zettabytes of data in the digital universe within four years. 175 ZB? Consider how long it takes most users to fill a one-terabyte drive; 175 ZB equates to approximately 175 billion of those drives.
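The arithmetic behind that comparison is straightforward:

```python
# Checking the comparison: in decimal units, 1 ZB = 10^21 bytes and
# 1 TB = 10^12 bytes, so one zettabyte holds a billion terabytes.
ZB = 10**21
TB = 10**12

drives = 175 * ZB / TB
print(f"175 ZB = {drives:,.0f} one-terabyte drives")   # 175,000,000,000
```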

The problem is this: how do you reduce IT’s overall power draw in the face of a massive and continual upsurge in data storage? Once 175 ZB of data exists, there is no possibility of containing electrical usage if the vast majority of storage is sitting on hard disk drives (HDDs). The only solution is to cure the industry’s addiction to disk.

Here are the numbers. Data centers alone account for close to 2% of all power consumed in the U.S., about 73 billion kilowatt-hours (kWh) in 2020. That is enough to set off alarm bells. Yet tremendous progress has been made over the past two decades in data center efficiency. When power consumption in data centers soared by 90% between 2000 and 2005, the industry acted forcefully. The rate of growth slowed to 24% between 2005 and 2010 and then fell to less than 5% for the entire decade between 2010 and 2020. That’s miraculous when you consider it was achieved during the largest surge in storage growth in history: smartphones, streaming video, texting, multi-core processors, analytics, the Internet of Things (IoT), cloud storage, big data, and other IT innovations all demanded the retention of more and more data.

Big strides were made in Power Usage Effectiveness (PUE, the ratio of total data center facility power to the power consumed by IT equipment). Data centers have largely done a good job of improving the efficiency of their operations. But the one area lagging badly behind is storage efficiency.
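For readers unfamiliar with the metric, here is a quick illustration of how PUE is computed (the facility figures below are made up for the example):

```python
# PUE = total facility power / IT equipment power. A PUE of 1.0 would
# mean every watt goes to IT gear; real facilities run higher because
# of cooling, power distribution losses, lighting, and so on.
# The figures below are made up for illustration only.

total_facility_kw = 1_500.0   # assumed total draw, including cooling
it_equipment_kw = 1_000.0     # assumed draw of servers/storage/network

pue = total_facility_kw / it_equipment_kw
print(f"PUE = {pue:.2f}")     # 1.50: half a watt of overhead per IT watt
```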

Read more

Read More

Reducing Carbon Emissions through the Data Tape Ecosystem

Reading Time: 5 minutes

By Rich Gadomski, Fujifilm, and Paul Lupino and Tom Trela, Iron Mountain

If there was ever a time for industries and governments around the world to come together and finally take steps to mitigate climate change, now would seem to be it. The return of the United States to the Paris Climate Agreement and the recent U.S.-China talks on climate change are all positive signs when it comes to moving the needle forward on sustainability initiatives. While fighting COVID-19 took center stage in 2020 and early 2021, our future depends on what we do collectively to reduce our environmental impact now and in the immediate years ahead.

It’s Hard to Deny Global Warming and Climate Change

According to an article that appeared in the Wall Street Journal earlier this year, NASA has ranked 2020 as tied with 2016 for the warmest year since record-keeping began in 1880. In a separate assessment, NOAA (National Oceanic and Atmospheric Administration), which relies on slightly different temperature records and methods, calculated that the global average temperature last year was the second highest to date – just 0.04 degrees Fahrenheit shy of tying the record set in 2016.

On top of the record number of hurricanes and the wildfires out west, the recent Texas deep freeze, which caused widespread power outages and other weather-related tragedies and calamities, seems to be just one more example of climate change. Weather patterns are becoming more unpredictable, which can result in extreme heat, cold and increased intensity of natural disasters.

It is widely acknowledged that global temperatures have been rising especially in the north polar region where we have seen a dramatic shrinking of the polar ice cap. When Arctic air warms, it sets off an atmospheric phenomenon that weakens the polar vortex (the normal jet stream of wind that keeps frigid air to the north) and allows cold air to fall…as far as Texas.

Data Center Energy Consumption and the Advantage of Modern Tape Technology

The key to mitigating the worst impacts of climate change is a reduction in the amount of greenhouse gases produced by humans. Producing energy is extremely resource-intensive, so reducing the amount of energy we consume in all aspects of our lives is of critical importance.

Data centers are significant consumers of energy, accounting for as much as 2% of global demand and rising to 8% by some estimates. Data centers can do their part to reduce energy consumption in many ways, including simply migrating the vast amounts of still valuable but rarely accessed “cold data” to more energy-efficient storage tiers.

Read more

Read More

New Video Surveillance TCO Tool Makes the Case for LTO Tape Tier in Video Surveillance Operations

Reading Time: 4 minutes

Recently my neighborhood had a rash of car break-ins by what turned out to be just a band of mischievous teenagers. But what struck me about this occurrence was the flood of homeowner video surveillance clips that appeared on social media and that were sent to the local police department to help identify the wrongdoers. It seems like everyone in the neighborhood has a home video surveillance system, perhaps to catch a doorstep package thief, or if nothing else, to catch the guilty dog walkers!

A Booming Market for Video Surveillance Solutions

Indeed, the video surveillance market is booming, not just in the relatively nascent consumer market but also, in a much bigger way, in the commercial market, where it has been growing for a long time. The reasons include more affordable cameras with resolutions soaring from 720p up to 4K and even 8K. Meanwhile, video surveillance systems are finding more and more applications: retail shopping malls, banks, hotels, city streets, transportation and highways, manufacturing and distribution operations, airport security, college dorm and campus security, corporate security, and police body and dash cams, to name just a few.

Video Retention Costs Soar

However, these higher-resolution cameras have sent the cost of video retention soaring, as high-resolution raw footage quickly fills up the hard disk drives commonly used to retain video surveillance content. According to a Seagate video surveillance calculator, an installation of 100 cameras recording eight hours a day at 30 frames per second and 1080p resolution, with a retention period of 90 days, would require 2,006 terabytes of storage. That’s 2.0 petabytes of expensive, energy-intensive hardware. Those with unlimited budgets can simply add more disk. But everyone else faces tough choices: shorten retention periods? Lower video resolution? Reduce the number of cameras or frames per second? None of these support the goals for which the video surveillance system was installed in the first place.
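Sizing calculations like Seagate’s follow a simple formula: storage scales linearly with camera count, recording hours, retention days, and per-camera bitrate. Here is a hedged sketch (the bitrates are illustrative assumptions; the Seagate calculator evidently assumes a richer stream than typical H.264 defaults, so these outputs will not reproduce its exact figure):

```python
# Sketch of a surveillance storage sizing formula. Storage scales
# linearly with camera count, recording hours, retention days, and
# per-camera bitrate. The bitrates below are illustrative assumptions;
# the Seagate calculator evidently assumes a richer stream, so these
# outputs will not reproduce its exact 2,006 TB figure.

def retention_tb(cameras: int, hours_per_day: float,
                 retention_days: int, mbps_per_camera: float) -> float:
    seconds = cameras * hours_per_day * 3600 * retention_days
    total_bytes = seconds * mbps_per_camera * 1e6 / 8   # bits -> bytes
    return total_bytes / 1e12                           # bytes -> TB

for mbps in (4, 8, 16):   # assumed per-camera 1080p/30fps bitrates
    tb = retention_tb(cameras=100, hours_per_day=8,
                      retention_days=90, mbps_per_camera=mbps)
    print(f"{mbps:2d} Mbps/camera -> {tb:,.0f} TB over 90 days")
```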

Read more

Read More

3 Big Takeaways from the Fujifilm and IBM 580TB Tape Demonstration

Reading Time: 5 minutes

In mid-December 2020, Fujifilm issued a press release to announce that, together with IBM Research, it had successfully achieved a record areal density of 317 Gbpsi (billion bits per square inch) on magnetic tape coated with next-generation Strontium Ferrite (SrFe) magnetic particles. This areal density achievement would yield an amazing native storage capacity of 580TB on a standard-sized data cartridge, almost 50 times more capacity than what we have now with an LTO-8 tape based on Barium Ferrite (BaFe) at 12TB native.

Shortly after the news came out, I was on a call with a member of our sales team discussing the announcement. He asked me when the 580TB cartridge would be available and whether there was any pricing information yet; he was also curious about transfer speed performance. I had to admit that those details are still TBD, so he asked me, “What are the 3 big takeaways from the release?” Let’s dive into what those takeaways are.

Tape has no fundamental technology roadblocks

To understand the magnitude of tape reaching an areal density of 317 Gbpsi, we have to see how small that figure is in comparison to HDD technology. Current HDD areal density is already at or above 1,000 Gbpsi while achieving 16TB to 20TB per drive on as many as nine disk platters. This level of areal density is approaching what is known as the “superparamagnetic limit,” where the magnetic particle is so small that it starts to flip its magnetic polarity back and forth, which is not ideal for long-term data preservation.

To address this, HDD manufacturers have employed techniques like helium-filled drives, which allow closer spacing between disk platters and thus room for more platters and more capacity. They are also increasing capacity with new recording techniques involving heat (HAMR) or microwaves (MAMR). As a result, HDD capacities are expected to reach up to 50TB within the next five years or so. The reason tape can potentially reach dramatically higher capacities is that a tape cartridge contains over 1,000 meters of half-inch-wide tape and therefore has far greater surface area than a stack of even eight or nine 3.5-inch disk platters.

But let’s also look at track density in addition to areal density. Think about the diameter of a single strand of human hair, typically 100 microns. If a single HDD data track is 50 nanometers wide, you could fit 2,000 HDD data tracks across the width of that single strand of hair. For tape, with a track width of approximately 1,200 nanometers, you are looking at just 84 data tracks. This is actually a positive for tape technology: it shows that tape has a lot of headroom in both areal density and track density, which will lead to higher capacities and help maintain tape’s low TCO.
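Those track counts are easy to verify (same figures as above):

```python
# Verifying the track-density comparison: how many data tracks fit
# across the ~100-micron (100,000 nm) width of a human hair.

HAIR_NM = 100_000
HDD_TRACK_NM = 50       # HDD track width quoted above
TAPE_TRACK_NM = 1_200   # tape track width quoted above

print(f"HDD tracks per hair width:  {HAIR_NM / HDD_TRACK_NM:.0f}")   # 2000
print(f"Tape tracks per hair width: {HAIR_NM / TAPE_TRACK_NM:.0f}")  # ~83, the ballpark above
```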

But let me make it clear that this is not about HDD vs. tape. We are now in the zettabyte age, having shipped just over an impressive one zettabyte (1,000 exabytes) of new storage capacity of all media types into the global market in 2019. According to IDC, that number will balloon to a staggering 7.5 ZB by 2025. We will need a lot of HDDs, a lot of tape, and a lot of flash, for that matter, to store 7.5 ZB!

Read more

Read More

5 Key Data Tape Storage Trends for 2021

Reading Time: 3 minutes

The past decade saw the renaissance of data tape technology with dramatic improvements to capacity, reliability, performance, and TCO giving rise to new industry adoptions and functionality. This trend will only continue in 2021 as data storage and archival needs in the post-COVID digital economy demand exactly what tape has to offer. Below are 5 key contributions tape will make to the storage industry in 2021.

Containing the Growing Cost of Storage
One lingering effect of the pandemic will be the need for more cost containment in already budget-strapped IT operations. We are well into the “zettabyte age,” and storing more data with tighter budgets will be more important than ever. Businesses will need to take an intelligent and data-centric approach to storage to make sure the right data is in the right place at the right time. This will mean storage optimization and tiering where high capacity, low-cost tape plays a critical role — especially in active archive environments.

A Best Practice in Fighting Ransomware
One of many negative side effects of COVID-19 has been the increasing activity of ransomware attacks, not only in the healthcare industry, which is most vulnerable at this time, but across many industries everywhere. Backup and DR vendors are no doubt adding sophisticated new anti-ransomware features to their software that can help mitigate the impact and expedite recovery. But as a last line of defense, removable tape media will increasingly provide air-gap protection in 2021, just in case the bad actors are one step ahead of the good guys.

Compatibility with Object Storage
Object storage is rapidly growing thanks to its S3 compatibility, scalability, relatively low cost and ease of search and access. But even object storage content eventually goes cold, so why keep that content on more expensive, energy-intensive HDD systems? This is where tape will play an increasing role in 2021, freeing up capacity on object storage systems by moving that content to a less expensive tape tier all while maintaining the native object format on tape.

Read more

Read More

3 Reasons Why 2010–2020 Was the Decade of Renaissance for Data Tape

Reading Time: 2 minutes

The past 10 years have been marked by explosive data growth and demand for storage. Meanwhile, the tape industry has experienced a renaissance thanks to significant advancements in capacity, reliability, performance, and functionality that have led to new applications and key industry adoption. Here’s a look at some of the key milestones.

Capacity

  • In terms of capacity, the decade started for LTO with LTO-5 at 1.5 TB native capacity and culminated most recently with LTO-8 at 12.0 TB and LTO-9 soon to be delivered at 18.0 TB.
  • Enterprise tape formats started the decade at 1.0 TB native and are currently at 20.0 TB native.
  • Barium Ferrite magnetic particles became a key enabler for multi-terabyte tapes and were demonstrated by IBM and Fujifilm in 2015 to have the potential to achieve 220 TB on a single tape cartridge. This signaled that tape technology had no fundamental areal density limitations for the foreseeable future.
  • By the end of the decade, IBM and Fujifilm demonstrated the ability to achieve a record areal density of 317 Gbpsi using the next generation of magnetic particles, Strontium Ferrite, with a potential cartridge capacity of 580 TB.

Reliability and Performance

  • During the decade, tape achieved the highest reliability rating as measured by bit error rate, with one error per 1 x 10^19 bits read, even better than enterprise HDD at one per 1 x 10^16 (see the sketch after this list).
  • Data transfer rates for tape also improved, from 140 MB/sec in 2010 to an impressive 400 MB/sec.
  • LTFS provided an open tape file system with media partitions for faster “disk-like” access and ease of interchangeability, making LTO a de facto standard in the Media & Entertainment industry.
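To make those exponents concrete, here is a quick sketch of what they mean in volume of data read per expected bit error:

```python
# What those bit error rates mean in practice: the expected volume of
# data read per uncorrectable bit error.

PETABYTE = 1e15   # bytes, decimal units

for name, exponent in (("Tape", 19), ("Enterprise HDD", 16)):
    bytes_per_error = 10**exponent / 8   # one error per 10^exponent bits
    print(f"{name}: one bit error per {bytes_per_error / PETABYTE:,.2f} PB read")
# Tape: one error per 1,250 PB (1.25 EB); HDD: one per 1.25 PB.
```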

New Applications and Key Industry Adoption

  • Storing objects on tape became a reality with object archive software solutions offering S3 compatibility; objects can now move to and from tape libraries in their native object format.
  • The concept of active archiving grew in popularity with tape as a key component complementing flash, HDD and cloud for cost-effectively maintaining online archives.
  • Tape was recognized for its ease of removability and portability, providing air gap protection in the escalating war against cybercrime.
  • Major U.S. hyperscalers began to rely on tape during the decade for both backup and deep archive applications. In one well-publicized example, Google restored Gmail data lost in a February 2011 outage from its tape backups. Microsoft adopted tape for Azure later in the decade. Tape became firmly established as a competitive advantage for these and other hyperscalers based on its scalability, long archival life, lowest TCO, low energy consumption, and air gap security.


With this steady technological advancement over the last decade, tape has been recognized for its complementary value to flash, HDD and cloud in tiered storage strategies for managing data in the zettabyte age.

Read More

Why are Two Thirds of Organizations Failing to Back Up and Archive Correctly?

Reading Time: 4 minutes

You would think, by now, that backup best practices would have moved into the same category as filling up the tank before a long drive or looking each way before crossing the street. But a new study indicates that most organizations continue to get it fundamentally wrong. How? By continuing to back up long-inactive data that should instead have been archived and removed from the backup schedule.

The 2020 Active Archive Alliance survey found that 66% of respondents were still using backup systems to store archive data. What’s wrong with that?

  • It greatly lengthens backup windows: repeatedly backing up unchanging archive data wastes storage resources and adds time to the backup process.
  • As data sets grow, a failure to distinguish between backup and archiving becomes increasingly expensive in terms of disk space.
  • Even those offloading backups to cheap cloud resources run up a large bill over time by unnecessarily backing up cold data.
  • Archiving, on the other hand, frees up expensive capacity by moving less frequently used data to more cost-effective storage locations.


Clearing Up Backup Confusions

One of the underlying reasons for this is confusion between backup and archiving. A backup provides a copy of organizational data for use in recovery from a data loss incident, cyberattack, or disaster. These days, it is generally copied onto disk or tape and either retained there or relayed to the cloud. A key point is that backup only copies data, leaving the source data in place, and it is also used to restore lost or deleted files rapidly.

Archiving is a different concept entirely. Rather than copying data, it moves data classed as inactive to a more cost-effective tier of storage, such as economy disk or tape. This frees up space on higher-tier storage systems such as fast disk or flash. In addition, it shortens the backup window and offers permanent, long-term protection from modification or deletion of data.
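The copy-versus-move distinction is the crux, and it can be illustrated in a few lines (hypothetical paths; real backup and archive tools add cataloging, scheduling, and retention controls):

```python
# The crux of the backup-vs-archive distinction, as a minimal sketch
# (hypothetical paths; real tools add cataloging, scheduling, and
# retention controls):

import shutil

# Backup COPIES data, leaving the source in place for fast restores.
shutil.copy2("/data/orders.db", "/backup/orders.db")

# Archiving MOVES inactive data to a cheaper tier, freeing primary
# capacity and shrinking every future backup window.
shutil.move("/data/2015_orders.db", "/archive/2015_orders.db")
```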

Read more

Read More

LET’S DISCUSS YOUR NEEDS

We can help you reduce cost, decrease vendor lock-in, and increase productivity of storage staff while ensuring accessibility and longevity of data.

Contact Us >