Blog

Is Online Object Storage Really Immune to Ransomware? Achieving True Object Storage Immutability with Tape

Reading Time: 3 minutes

By Chris Kehoe, Head of Infrastructure Engineering, FUJIFILM Recording Media U.S.A., Inc.


Object storage has many benefits. Near-infinite capacity, good metadata capabilities, and low cost have propelled it beyond its initial use cases of archiving and backup. More recently, it is being deployed to support compute processing at the edge, analytics, machine learning, disaster recovery, and regulatory compliance. However, one recent paper was perhaps a little over-enthusiastic in claiming that disk-based object storage provides an adequate safeguard against the threat of ransomware.

The basic idea proposed is that keeping multiple copies of object data protects against that kind of intrusion: if the object store suffers a ransomware incursion, the backup is there for recovery purposes. The flaw in this logic, however, is that no technology that is online can be considered immune to ransomware. Unless it is the work of an insider, any hacking attempt must enter via online resources. Any digital file or asset that is online – whether it is stored on a NAS filer, a SAN array, or object storage – is open to attack.

Keeping multiple copies of object data is certainly a wise strategy and does offer a certain level of protection. But if those objects are online on disk, a persistent connection exists that can be compromised. Even where spin-down disk is deployed, an automated electronic connection remains. As soon as a data request is made, the data is online and potentially exposed to the nefarious actions of cybercriminals.

Read More

How to Leverage the 3-2-1 Backup Rule and Modern Tape Technology in Backup Applications

Reading Time: 3 minutes

In case you were not aware of it, March 31st is World Backup Day. To be sure, a quick visit to the official website confirms that this day is just a reminder for consumers to back up their PCs and cell phones. According to the website, only 25% of consumers are protecting their precious memories. Surely the helpful recommendations for routine backup don’t apply to the storage professionals who keep our enterprise data safe and our websites up and running. Or do they?

When Disaster Strikes a Data Center

On Wednesday, March 10th, 2021, a fire broke out at the OVHCloud data center in Strasbourg, France. The fire quickly spread out of control and completely destroyed compute, network and storage infrastructure. According to some accounts, as many as 3.6 million websites, including those of government agencies, financial institutions and gaming sites, went dark. Others complained that years’ worth of data was permanently lost.

We know that the statistics regarding the cost of downtime and the number of companies that never recover from catastrophic data loss are alarming. The often-cited University of Texas study shows that 94% of companies suffering catastrophic data loss do not survive: 43% never reopen and 51% close within two years. That’s why the cardinal sin in data protection is not being able to recover data.

OVH reminds us that, however unlikely, data center disasters like an all-consuming fire can still happen. But these days a more sinister threat continues to loom, and it tends to grab the headlines and our attention: ransomware.

Read More

New Video Surveillance TCO Tool Makes the Case for LTO Tape Tier in Video Surveillance Operations

Reading Time: 4 minutes

Recently my neighborhood had a rash of car break-ins by what turned out to be just a band of mischievous teenagers. But what struck me about this occurrence was the flood of homeowner video surveillance clips that appeared on social media and that were sent to the local police department to help identify the wrongdoers. It seems like everyone in the neighborhood has a home video surveillance system, perhaps to catch a doorstep package thief, or if nothing else, to catch the guilty dog walkers!

A Booming Market for Video Surveillance Solutions

Indeed, the video surveillance market is booming – not just in the relatively nascent consumer market, but also, in a much bigger way and for far longer, in the commercial market. The reasons include more affordable cameras with better resolutions, soaring from 720p up to 4K and even 8K. Meanwhile, video surveillance systems are finding more and more applications. Retail shopping malls, banks, hotels, city streets, transportation and highways, manufacturing and distribution operations, airport security, college dorm and campus security, corporate security, police body and dash cams, to name just a few – all need good quality video surveillance.

Video Retention Costs Soar

However, these higher-resolution cameras have sent the cost of video retention soaring. High-resolution raw footage quickly fills up the hard disk drives commonly used to retain video surveillance content. According to a Seagate video surveillance calculator, an installation of 100 cameras recording eight hours a day at 30 frames per second and 1080p resolution, with a retention period of 90 days, would require 2,006 terabytes of storage. That’s 2.0 petabytes of expensive, energy-intensive hardware. Those with unlimited budgets can simply add more disks. But everyone else faces tough choices: shorten retention periods? Lower video resolution? Reduce the number of cameras or frames per second? None of these choices supports the goals for which the video surveillance system was installed in the first place.
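The arithmetic behind such calculators is simple enough to sketch. In the Python snippet below, the per-camera bitrate is an assumption (Seagate’s tool bakes in its own codec and compression settings); a rate near 62 Mbps happens to reproduce the roughly 2,006 TB figure, while typical H.264 1080p30 surveillance streams run far lower.

```python
# Minimal sketch of a video-retention storage estimator. The bitrate is
# an assumed input, so treat the output as illustrative only.
def retention_storage_tb(cameras: int, hours_per_day: float,
                         retention_days: int, bitrate_mbps: float) -> float:
    """Required storage in decimal terabytes for a retention window."""
    recorded_seconds = hours_per_day * 3600 * retention_days
    total_bytes = cameras * recorded_seconds * bitrate_mbps * 1e6 / 8
    return total_bytes / 1e12

# 100 cameras, 8 hours/day, 90-day retention, ~62 Mbps per camera:
print(f"{retention_storage_tb(100, 8, 90, 62):.0f} TB")  # ~2009 TB
```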

Read More

Air-Gapped Storage Solutions Simply Can’t Be Hacked

Reading Time: 2 minutes

The data protection industry has evolved from primarily backing up data in order to recover from hardware, software and network failures and human errors, to fighting a mounting wave of cybercrime. Over the years, hardware and software have significantly improved their reliability and resiliency, but security is a people problem, and people are committing the cybercrimes.

Cybercrime has now become the biggest threat to data protection, and the stakes are getting higher as anonymous individuals seek to profit from others’ valuable digital data. With a cease-fire in the cybercrime war highly unlikely, we are witnessing a rapid convergence of data protection and cybersecurity to counter rapidly growing and costly threats, including ransomware. The growing cybercrime wave has positioned air-gapped storage solutions as a key component of digital data protection – they simply can’t be hacked.

Traditional backup and archival data can be stored locally or in cloud environments. In contrast, a cyber-resilient copy of data must meet additional, more stringent requirements. This is where “air gapping” and tape technology are gaining momentum. The rise of cybercrime makes the offline copy of data stored on tape more valuable than ever, taking advantage of what is referred to as the tape air gap: an electronically disconnected or isolated copy of data in a robotic library or tape rack that prevents cybercriminals from attacking a backup, an archive or any other data.

Tape cartridges in a robotic tape library, or manually accessed tape cartridges in tape racks, are currently the only data-center-class air-gapped storage solution available.

For more information, check out this Horison Information Strategies White Paper “The Tape Air Gap: Protecting Your Data From Cybercrime.”

Read More

Tape Secures its Place in the Future of Enterprise Storage

Reading Time: 4 minutes

By Drew Robb

I recently read an article in StorageNewsletter entitled “End of Removable Storage Media,” and I agreed with many of its points, including the demise of removable consumer media such as floppies, Zip disks, CDs and DVDs. But I disagree that tape is on its way out like the rest of removable media from the past.

Tape was pronounced dead by Data Domain about 15 years ago, when deduplication first entered the scene. Yet tape has not only survived, it has thrived, particularly in the enterprise. Tape capacity shipments have been rising steadily for more than a decade: a record 114,079 PB of total compressed LTO tape capacity shipped in 2019, about four times more than in 2009.

Why is this?

Tape offers removability

In an era when data breaches are escalating and ransomware wreaks havoc, having an air gap between data and the network has become increasingly important. Whether it is a box of tapes stored by Iron Mountain or tapes kept on site in an automated tape library, physical tapes are easy to isolate from the network. Removability also makes tape easy to scale: you only need to add fresh media for more capacity, not more controllers, disk arrays and supporting hardware. Finally, because of its removability, tape is easily transported by truck or plane between data centers or between clouds, which is often faster and cheaper than using expensive bandwidth.

Tape offers high capacity

The latest generation of LTO tape cartridges can hold 18 TB native and 45 TB compressed per cartridge. To put that in perspective, one cartridge can hold 61.2 years of video recording running 24 hours per day, 4.78 billion human genomes worth of sequence information, or 2.88 years of data transmissions from the Hubble Space Telescope. Even larger cartridges are on the near horizon.

Tape underpins the cloud

The dirty little secret of the big cloud providers is that they rely on tape for high-volume, low-cost storage. These providers harness tape to hold multiple PBs of data, as do a great many large financial institutions. But that doesn’t mean tape is only the province of the few. Anyone with 100 TB or more of data will find value and efficiency with tape. In fact, vendors such as XenData are beginning to offer appliances that make it affordable to use tape for smaller workloads.

Read More

3 Big Takeaways from the Fujifilm and IBM 580TB Tape Demonstration

Reading Time: 5 minutes

In mid-December 2020, Fujifilm issued a press release announcing that, together with IBM Research, it had successfully achieved a record areal density of 317 Gbpsi (billion bits per square inch) on next-generation magnetic tape coated with Strontium Ferrite (SrFe) magnetic particles. This areal density would yield an amazing native storage capacity of 580 TB on a standard-sized data cartridge – almost 50 times more capacity than what we have now with an LTO-8 tape based on Barium Ferrite (BaFe) at 12 TB native.

Shortly after the news came out, I was on a call with a member of our sales team discussing the announcement. He asked me when the 580 TB cartridge would be available and whether there was any pricing information yet; he was also curious about transfer speed performance. I had to admit that those details are still TBD, so he asked me, “What are the three big takeaways from the release?” So let’s dive into what those takeaways are.

Tape has no fundamental technology roadblocks

To understand the magnitude of tape areal density reaching 317 Gbpsi, we have to understand just how small that is in comparison to HDD technology. Current HDD areal density is already at or above 1,000 Gbpsi while achieving 16 TB to 20 TB per drive on as many as nine disk platters. This level of areal density is approaching what is known as the superparamagnetic limit, where the magnetic grain is so small that thermal energy can spontaneously flip its magnetic orientation. Not ideal for long-term data preservation.

To address this, HDD manufacturers have employed helium-filled drives, which allow closer spacing between disk platters and therefore room for more platters and more capacity. They are also increasing capacity with new recording techniques involving heat (HAMR) or microwaves (MAMR). As a result, HDD capacities are expected to reach up to 50 TB within the next five years or so. The reason tape can potentially reach dramatically higher capacities is that a tape cartridge contains over 1,000 meters of half-inch-wide tape and therefore has far greater recording surface area than a stack of even eight or nine 3.5-inch disk platters.
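A back-of-the-envelope check makes the surface-area argument concrete. This is a rough sketch using only the figures above (over 1,000 meters of half-inch tape at 317 Gbpsi); it ignores servo tracks, error correction and other format overhead, which is why the raw number comes out well above the announced 580 TB native capacity.

```python
# Rough surface-area estimate from the figures in the paragraph above;
# real cartridges lose capacity to servo tracks, ECC and formatting.
TAPE_LENGTH_M = 1_000     # "over 1,000 meters" of tape per cartridge
TAPE_WIDTH_IN = 0.5       # half-inch-wide tape
BITS_PER_SQ_IN = 317e9    # demonstrated areal density (317 Gbpsi)

area_sq_in = TAPE_LENGTH_M * 39.37 * TAPE_WIDTH_IN
raw_tb = area_sq_in * BITS_PER_SQ_IN / 8 / 1e12
print(f"{raw_tb:.0f} TB raw")  # ~780 TB before format overhead
```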

But let’s also look at track density in addition to areal density. Consider the diameter of a single strand of human hair, typically 100 microns. If a single HDD data track is 50 nanometers wide, about 2,000 HDD data tracks fit in the width of that single hair. For tape, with a track width of approximately 1,200 nanometers, the figure is only about 83 data tracks. But this is actually a positive for tape technology: it shows that tape has a lot of headroom in both areal density and track density, which will lead to higher capacities and help maintain tape’s low TCO.
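For the curious, here is the same hair-width comparison in a few lines of Python, using the track widths assumed above:

```python
# Track-count comparison using the widths from the paragraph above.
HAIR_WIDTH_NM = 100_000   # ~100-micron human hair
HDD_TRACK_NM = 50         # assumed HDD track width
TAPE_TRACK_NM = 1_200     # approximate tape track width

print(HAIR_WIDTH_NM // HDD_TRACK_NM)   # 2000 HDD tracks per hair width
print(HAIR_WIDTH_NM // TAPE_TRACK_NM)  # 83 tape tracks per hair width
```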

But let me make it clear that this is not about HDD vs. tape. We are now in the zettabyte age, having shipped just over an impressive one zettabyte (1,000 exabytes) of new storage capacity, across all media types, into the global market in 2019. According to IDC, that number will balloon to a staggering 7.5 ZB by 2025. We will need a lot of HDDs and a lot of tape (and flash, for that matter) to store 7.5 ZB!

Read More

5 Key Data Tape Storage Trends for 2021

Reading Time: 3 minutes

The past decade saw the renaissance of data tape technology with dramatic improvements to capacity, reliability, performance, and TCO giving rise to new industry adoptions and functionality. This trend will only continue in 2021 as data storage and archival needs in the post-COVID digital economy demand exactly what tape has to offer. Below are 5 key contributions tape will make to the storage industry in 2021.

Containing the Growing Cost of Storage
One lingering effect of the pandemic will be the need for more cost containment in already budget-strapped IT operations. We are well into the “zettabyte age,” and storing more data with tighter budgets will be more important than ever. Businesses will need to take an intelligent, data-centric approach to storage to make sure the right data is in the right place at the right time. This will mean storage optimization and tiering, where high-capacity, low-cost tape plays a critical role – especially in active archive environments.

A Best Practice in Fighting Ransomware
One of many negative side effects of COVID-19 has been increasing ransomware activity, not only in the healthcare industry, which is most vulnerable at this time, but across many industries everywhere. Backup and DR vendors are no doubt adding sophisticated new anti-ransomware features to their software that can help mitigate the impact and expedite recovery. But as a last line of defense, removable tape media will increasingly provide air-gap protection in 2021, just in case the bad actors are one step ahead of the good guys.

Compatibility with Object Storage
Object storage is growing rapidly thanks to its S3 compatibility, scalability, relatively low cost, and ease of search and access. But even object storage content eventually goes cold, so why keep it on more expensive, energy-intensive HDD systems? This is where tape will play an increasing role in 2021: freeing up capacity on object storage systems by moving cold content to a less expensive tape tier, all while maintaining the native object format on tape, as the sketch below illustrates.
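As a rough illustration of the idea, here is a minimal sketch of demoting cold objects to a tape-backed bucket through an S3-compatible API. The endpoint, bucket names and age threshold are hypothetical, and real object-archive products automate this with policies rather than ad hoc scripts.

```python
# Illustrative sketch: move objects older than 180 days from a disk-backed
# bucket to a tape-backed bucket via an S3-compatible endpoint.
from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client("s3", endpoint_url="https://objects.example.internal")
cutoff = datetime.now(timezone.utc) - timedelta(days=180)

for page in s3.get_paginator("list_objects_v2").paginate(Bucket="hot-bucket"):
    for obj in page.get("Contents", []):
        if obj["LastModified"] < cutoff:
            # Server-side copy to the tape tier, then remove the hot copy.
            s3.copy_object(
                Bucket="tape-archive",
                Key=obj["Key"],
                CopySource={"Bucket": "hot-bucket", "Key": obj["Key"]},
            )
            s3.delete_object(Bucket="hot-bucket", Key=obj["Key"])
```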

Read More

3 Reasons Why 2010–2020 Was the Decade of Renaissance for Data Tape

Reading Time: 2 minutes

The past 10 years have been marked by explosive data growth and demand for storage. Meanwhile, the tape industry has experienced a renaissance thanks to significant advancements in capacity, reliability, performance, and functionality that have led to new applications and key industry adoption. Here’s a look at some of the key milestones.

Capacity

  • In terms of capacity, the decade started for LTO with LTO-5 at 1.5 TB native capacity and culminated most recently with LTO-8 at 12.0 TB and LTO-9 soon to be delivered at 18.0 TB.
  • Enterprise tape formats started the decade at 1.0 TB native and are currently at 20.0 TB native.
  • Barium Ferrite magnetic particles became a key enabler for multi-terabyte tapes and were demonstrated by IBM and Fujifilm in 2015 to have the potential to achieve 220 TB on a single tape cartridge. This signaled that tape technology had no fundamental areal density limitations for the foreseeable future.
  • By the end of the decade, IBM and Fujifilm demonstrated the ability to achieve a record areal density of 317 Gbpsi using the next generation of magnetic particles, Strontium Ferrite, with a potential cartridge capacity of 580 TB.

Reliability and Performance

  • During the decade, tape achieved the highest reliability rating as measured by Bit Error Rate, at 1 x 10^19, even better than enterprise HDD at 1 x 10^16.
  • Data transfer rates for tape also improved from 140 MB/sec. in 2010 to an impressive 400 MB/sec.
  • LTFS provided an open tape file system with media partitions for faster “disk-like” access and ease of interchangeability, making LTO a de facto standard in the Media & Entertainment industry.

New Applications and Key Industry Adoption

  • Storing objects on tape became a reality with object archive software solutions offering S3 compatibility; objects can now move to and from tape libraries in their native object format.
  • The concept of active archiving grew in popularity with tape as a key component complementing flash, HDD and cloud for cost-effectively maintaining online archives.
  • Tape was recognized for its ease of removability and portability, providing air gap protection in the escalating war against cybercrime.
  • Major U.S. hyperscalers began to rely on tape during the decade for both backup and deep archive applications. In one well-publicized example, Google restored lost Gmail data from its tape backups after a February 2011 outage. Microsoft adopted tape for Azure later in the decade. Tape became firmly established as a competitive advantage for these and other hyperscalers based on its scalability, long archival life, lowest TCO, low energy consumption, and air gap security.

With this steady technological advancement over the last decade, tape has been recognized for its complementary value to flash, HDD and cloud in tiered storage strategies for managing data in the zettabyte age.

Read More

Fujifilm’s Tape Evangelist on The Future of Tape Storage

Reading Time: 4 minutes

In this Executive Q&A, Rich Gadomski, Fujifilm’s Head of Tape Evangelism, discusses the evolution of tape storage and how recent innovations and advancements in tape technology are putting it at center stage.


Explain your role as Tape Evangelist for FUJIFILM Recording Media, U.S.A.

I believe tape technology has a compelling value proposition and a great story to tell. I want to be focused on telling that story and that is my primary role. Of course, it takes a team and we all do our part, but this is what I’m passionate about and want to focus on.

How has the tape storage industry changed over the years?

Tape technology has come a long way since its introduction in the 1950s, when reel-to-reel tape could hold just a few megabytes of data. Today we have LTO tape in the market at 12 TB native and enterprise tape at 20 TB. Looking just at LTO, which came out 20 years ago: LTO-1 arrived in 2000 with 100 GB of capacity, and we expect LTO-9 soon at 18 TB – a 180x increase. And just recently we announced a joint demonstration with IBM showing that we can reach up to 580 TB on a single cartridge with our next-generation magnetic particle, Strontium Ferrite. But it’s not just about capacity increases. Tape has also made advancements in transfer rate, now faster than HDD, and in reliability rating, now three or four orders of magnitude better than HDD. It is also the greenest form of storage, consuming far less energy than constantly spinning HDD, which contributes to its lowest total cost of ownership. But today it’s not about tape vs. HDD; in fact, the two technologies complement each other to help customers optimize their data storage.

Why is tape storage still relevant today? What is driving demand for this technology?

I would say first and foremost the incredible amount of data being created in this digital economy. Add to that the fact that the value of data is increasing, so we want to store more of it for longer periods of time. Yet IT budgets are flat or barely increasing, so you need a high-capacity, highly reliable storage medium that is cost-effective for long-term storage, and that’s what tape gives you. So tape plays a role in cold archiving, active archiving and, yes, backup as well. In fact, tape is seeing renewed interest from customers in the fight against cybercrime and ransomware, since customers can easily back up to tape and remove those backups from the network to an isolated and secure location that hackers can’t get to. That’s what we refer to as the tape air gap. Last but not least is energy consumption. Society is rightfully concerned about climate change. Tape consumes 87% less energy than disk, and that leads to 87% less CO2 emissions.

Read More

Tape Storage vs. Disk Storage: Getting the Facts Straight about Total Cost of Ownership Calculations

Reading Time: 3 minutes

Modern tape storage has long been recognized for its low cost. Several analyst white papers demonstrate the low cost of storing data on tape; for example, “Quantifying the Economic Benefits of LTO-8 Technology,” which can be found on the LTO.org website. However, occasionally a storage solution provider publishes a white paper claiming that their solution is less expensive than tape storage for a particular use case. A good example is a recent white paper published by a disk-based backup-as-a-service provider, who will remain unidentified out of respect for what they do. For the purposes of this blog, let’s call them “BaaS.” So let’s dig into their analysis, which makes several assumptions that result in higher costs for tape storage than most users would experience.

Total Cost of Ownership (TCO) Process

The first step in developing a Total Cost of Ownership (TCO) estimate is determining the amount of data to be stored. The BaaS white paper separates the amount of primary data, which we wish to protect, from backup data, which is the data physically stored on the backup media. It estimates the backup data residing in the tape library at two to four times the primary data, a result of using the old daily/weekly/monthly full backup methodology to estimate the amount of backup data. The result is that two to four times the amount of primary data ends up being stored on tape, raising tape hardware and media costs by the same factor.
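To make that multiplier concrete, here is a minimal sketch of how a retention schedule inflates the data physically stored on backup media relative to primary data. The retention counts and daily change rate are illustrative assumptions, not figures from the BaaS paper.

```python
# Illustrative only: a full-plus-incremental retention schedule multiplies
# the data stored on backup media relative to the primary data.
def stored_backup_tb(primary_tb: float, fulls_retained: int,
                     incrementals_retained: int,
                     daily_change_rate: float) -> float:
    """Total backup data on media for a simple retention schedule."""
    fulls = primary_tb * fulls_retained
    incrementals = primary_tb * daily_change_rate * incrementals_retained
    return fulls + incrementals

primary = 100.0  # TB of primary data to protect (hypothetical)
total = stored_backup_tb(primary, fulls_retained=3,
                         incrementals_retained=14, daily_change_rate=0.02)
print(total, total / primary)  # 328.0 TB stored, ~3.3x the primary data
```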

Read More
