FUJIFILM INSIGHTS BLOG

Data Storage

Tape Storage: New Game, New Rules

Reading Time: < 1 minute

Modern tape storage has become the leading strategic and lowest-cost storage solution for massive amounts of archival and unstructured data. This bodes well for future tape growth, as archival data is piling up much faster than it is being analyzed. Over the past decade, the magnetic tape industry has successfully re-architected itself, delivering compelling technologies and functionality: cartridge capacity increases, vastly improved bit error rates yielding the highest reliability of any storage device, a media life of 30 years or more, and faster data transfer rates than any previous tape or HDD (Hard Disk Drive).

Many of these innovations have resulted from technologies borrowed from the HDD industry and have been used in the development of both LTO (Linear Tape Open) and enterprise tape products. Additional tape functionality, including LTFS, RAIT, RAO, TAOS, smart libraries and the active archive, adds further value to the tape lineup. HDD technology advancement has slowed, while progress for tape, SSD (Solid State Disk) and other semiconductor memories continues steadily. Fortunately, today’s tape technology is nothing like the tape of the past.

For more information, check out this Horison Information Strategies White Paper “Tape Storage: It’s a New Game With New Rules.”

Read More

Why are Two Thirds of Organizations Failing to Backup and Archive Correctly?

Reading Time: 4 minutes

You would think, by now, that backup best practices would have moved into the same category as filling up the tank before a long drive or looking both ways before crossing the street. But a new study indicates that most organizations continue to get it fundamentally wrong. How? By continuing to back up long-inactive data that should have been archived instead of remaining in the backup schedule.

The 2020 Active Archive Alliance survey found that 66% of respondents were still using backup systems to store archive data. What’s wrong with that?

  • It greatly lengthens backup windows: repeatedly backing up unchanging archive data wastes storage resources and adds time to the backup process.
  • As data sets grow, failing to distinguish between backup and archiving becomes increasingly expensive in terms of disk space.
  • Even organizations offloading backups to cheap cloud resources run up a large bill over time by unnecessarily backing up cold data.
  • Archiving, on the other hand, frees up expensive capacity by moving less frequently used data to more cost-effective storage locations.


Clearing Up Backup Confusions

One of the underlying reasons for this is a confusion between backup and archiving. Backup provides a copy of organizational data for use in recovery from a data loss incident, cyberattack or disaster. These days, it is generally copied onto disk or tape and either retained there or relayed to the cloud. A key point is that backup only copies data, leaving the source data in place. It is also used to restore lost or deleted files rapidly.

Archiving is a different concept entirely. Rather than copying data, it moves data classed as inactive to a more cost-effective tier of storage such as economy disk or tape. This frees up space on higher-tier storage systems such as fast disk or flash. In addition, it shortens the backup window and offers permanent and long-term protection from modification or deletion of data.
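To make the move-versus-copy distinction concrete, here is a minimal Python sketch of an archive sweep that moves (rather than copies) files idle beyond a threshold to a cheaper storage tier. The directory layout and the 365-day cutoff are assumptions for illustration, not a policy from the article:

```python
import shutil
import time
from pathlib import Path

def archive_inactive(source_dir, archive_dir, max_idle_days=365):
    """Move files untouched for max_idle_days from source_dir to archive_dir.

    Unlike a backup, this frees space on the source (fast) tier.
    """
    cutoff = time.time() - max_idle_days * 86400
    archive = Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(Path(source_dir).iterdir()):
        if f.is_file() and f.stat().st_mtime < cutoff:
            shutil.move(str(f), str(archive / f.name))  # move, not copy
            moved.append(f.name)
    return moved
```

A real archive tier would be economy disk, tape, or object storage rather than a local directory, but the key behavior is the same: the data leaves the expensive tier instead of being duplicated on it.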


Read More

IoT and AI Generate Demand for Active Archive

Reading Time: 3 minutes

Perhaps lost in the current COVID-19 pandemic is another pandemic of a far less threatening kind: the pandemic of the Internet of Things, or IoT. It’s a good pandemic that will add value to everything we do and make. IoT is also one of the biggest drivers of the storage demand currently exploding across the globe. It has been reported that there were 23 billion IoT devices installed in 2018, with that number expected to grow to 75 billion by 2025. Driverless cars are a good example of internet-connected devices, with each vehicle generating terabytes of data every day, but also think about the smartphones that everyone has. Then think about smart everything else: smart homes, buildings, factories, machines, cities, airplanes, trains, trucks and so on. You start to get the picture, and the scope of IoT is truly mind-boggling.
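A quick back-of-the-envelope check of that forecast: growing from 23 billion devices in 2018 to 75 billion by 2025 implies a compound annual growth rate of roughly 18%:

```python
# Implied compound annual growth rate (CAGR) of the cited IoT forecast:
# 23 billion devices in 2018 growing to 75 billion by 2025.
start_devices = 23e9
end_devices = 75e9
years = 2025 - 2018  # 7-year span
cagr = (end_devices / start_devices) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 18% growth per year
```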

The market for IoT will continue to expand in 2020, especially as 5G networks start to proliferate. If you are like me and have been consuming more television than usual during this period of coronavirus lockdown, you may have noticed a lot of ads touting 5G networks. 5G promises to deliver 10X better performance than 4G, 100X better network density, and 100X more energy efficiency. With this key enabler rolling out, companies deploying IoT projects will need to plan for the data deluge that is coming from these billions of devices!


Read More

A New Age Dawns for Digital Archives: 2020 State of Industry Report

Reading Time: < 1 minute

The Active Archive Alliance recently released its 2020 Active Archive and the State of the Industry Report, which highlights the increased demand for new data management strategies as well as benefits and use cases for active archive solutions.

Here are a few takeaways from the report:

  • Today’s data demands an intelligent archive solution that leverages the advanced capabilities of data movement technology and scale-out hardware to realize its fast-increasing value.
  • The growing reliance on archival data makes it ground zero for unlocking game-changing data strategies.
  • New applications, such as the Internet of Things (IoT) with billions of nodes and boosted by the arrival of 5G networks, will help fuel insatiable demand for more intelligent active archives.
  • Key use cases for active archiving tell the real story, starting with healthcare and high-performance computing (HPC) in life sciences.
  • Other top use cases include security, business efficiency and continuity, media and entertainment, and IoT, including autonomous vehicles.

Sponsors of the report include Active Archive Alliance members Atempo, Fujifilm Recording Media USA, IBM, Iron Mountain, Harmony Healthcare IT, MediQuant, PoINT Software & Systems, Quantum, Qumulo, QStar Technologies, Spectra Logic, StrongBox Data Solutions and Western Digital.

You can read the full report here.

For more information on the Active Archive Alliance visit: www.activearchive.com.

Read More

Ransomware Protection Must Include an Air Gap

Reading Time: 4 minutes

Ransomware statistics can be frightening! Research studies suggest that over two million ransomware incidents occurred in 2019, with 60% of organizations surveyed experiencing a ransomware attack in the past year. To make matters worse, cybercriminals have moved up the food chain. Two-thirds of those attacked said the incident cost them $100,000 to $500,000. Another 20% said the price tag exceeded half a million. Overall, the losses are measured in billions of dollars per year. And it’s getting worse: Enterprise Strategy Group (ESG) reports that about half of all organizations have seen a rise in cyberattacks since the recent upsurge in people working from home.

Understandably, this is a big concern to the FBI, which has issued alerts about the dangers of ransomware. One of its primary recommendations to CEOs stresses the importance of backup, posing the following key questions:

“Do you backup all critical information? Are backups stored offline? Have you tested your ability to revert to backups during an incident?”

The key word in that line of questioning is “offline.” Hackers have gotten good at staging their attacks slowly over time. They infiltrate a system, quietly ensuring that backups are infected as well as operational systems. When ready, they encrypt the files and announce to the company that they are locked out of their files until the ransom is paid. Any attempt to recover data from disk or the cloud fails as the backup files are infected, too.

The answer is to make tape part of the 3-2-1 system: Three separate copies of data, stored on at least two different storage media with one copy off-site. This might mean, for example, one copy retained on onsite disk, another in the cloud, and one on tape; or one on onsite disk, one on onsite tape as well as tape copies stored offsite.
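As an illustrative sketch (not any vendor's tool), a 3-2-1 policy check can be expressed directly in code; representing each copy as a (media, location) pair is an assumption made for this example:

```python
def satisfies_3_2_1(copies):
    """Check the 3-2-1 rule for a list of (media, location) pairs.

    True when there are at least three copies, on at least two distinct
    media types, with at least one copy held offsite.
    """
    media_types = {media for media, _ in copies}
    offsite_copies = [loc for _, loc in copies if loc == "offsite"]
    return len(copies) >= 3 and len(media_types) >= 2 and len(offsite_copies) >= 1

# The article's first example: onsite disk, cloud copy, onsite tape
print(satisfies_3_2_1([("disk", "onsite"), ("cloud", "offsite"), ("tape", "onsite")]))  # True
```

Note that the rule alone does not guarantee an air gap; that depends on one of the copies (typically tape) being electronically disconnected.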


Read More

Webinar: How Much Do You Really Know About Your Data?

Reading Time: 2 minutes

July 22, 2020

By Kevin Benitez

How much do you really know about your data? Is your data on the right storage type? How active is your data, and how is it being used?

From life sciences and media and entertainment to HPC/research, higher education, government and consumer products, virtually all enterprises struggle to manage data with fewer resources and at less cost. Heterogeneous storage environments have added complexity and cost, making it difficult for IT managers to manage data.


This webinar series goes beyond just organizing your data. Throughout three short webinars, you’ll learn about how to take control, protect, and manage your data – all while enhancing workflow and reducing costs.

Join Floyd Christofferson, CEO of StrongBox Data Solutions, in a webinar series that will teach you how you can make the most of your data:


  1. Take Back Control of Your Data + LTFS

Don’t let multi-vendor storage silos get in the way of effective data management.

July 28, 2020 12:00 PM – 12:45 PM Eastern Time


  2. Reduce Costs & Increase Data Protection!

How to Better Manage Data Growth in a Multi-Vendor Storage Environment.

August 4, 2020 12:00 PM – 12:45 PM Eastern Time


  3. Workflow Magic!

Techniques to better use your data and not waste time trying to wrangle it.

August 11, 2020 12:00 PM – 12:45 PM Eastern Time


Register Now

Read More

THE ASCENT TO HYPERSCALE – Part 3

Reading Time: 2 minutes

Part 3: THE VALUE OF TAPE RISES RAPIDLY AS HYPERSCALE DATA CENTERS GROW

In Part 2 of this series, we looked at some of the key characteristics of hyperscale data centers. Now, we’ll explore how tape plays a role.

Today HSDCs are leveraging the many advantages of tape technology to manage massive data growth and long-term retention challenges. Keep in mind that most digital data doesn’t need to be immediately accessible and can optimally and indefinitely reside on tape subsystems. Some data requires secure, long-term storage for regulatory reasons or for the potential value it can provide through later content analysis. Advanced tape architectures allow HSDCs to achieve business objectives by providing data protection for critical assets, backup, recovery, archive, easy capacity scaling, the lowest TCO, the highest reliability, the fastest throughput, and cybersecurity protection via the air gap. These benefits are expected to increase for tape in the future.

Fighting the cybercrime epidemic has become a major problem for most data centers, and HSDCs are no exception. Tape can play a key role in prevention: its WORM (Write-Once-Read-Many) and encryption capabilities provide a secure storage medium for compliance, legal and other valuable files. Tape as an “air gap” solution has gained momentum because it provides an electronically disconnected copy of data that cybercrime attacks cannot reach. Disk systems that remain online 24×7 are the primary target, as they are always vulnerable to attack.

HSDCs are taking advantage of tiered storage by integrating high-performance SSDs, HDD arrays and automated tape libraries. Even though HSDCs are struggling with the exploding growth of disk farms, which devour IT budgets and overcrowd data centers, many continue to maintain expensive disks that are often half full of data with little or no activity for several years. Obviously, few data centers can afford to sustain this degree of inefficiency. The greatest benefits of tiered storage are achieved when tape is included, as its scalability, lower price and lower TCO play an increasing role as the size of the storage environment increases. For the hyperscale world, “adding disk is tactical – adding tape is strategic.”

For more information on this topic, check out our white paper: The Ascent to Hyperscale.


Read More

THE ASCENT TO HYPERSCALE – Part 2

Reading Time: 2 minutes

Part 2: CHARACTERISTICS OF THE HYPERSCALE DATA CENTER

In Part 1 of this series, we explored the definition of hyperscale data centers. Now, we’ll take a look at some of the key characteristics.

HSDCs don’t publicly share an abundance of information about their infrastructure. For companies that operate HSDCs, cost may be the major barrier to entry, but ultimately it isn’t the biggest issue – automation is. HSDCs must focus heavily on automated, self-healing environments, using AI and ML wherever possible to overcome inevitable and unexpected failures and delays. Unlike many enterprise data centers, which rely on a large full-time staff across a range of disciplines, HSDCs employ fewer tech experts because they have used technology to automate so much of the overall management process. HSDC characteristics include:

  • Small footprint, dense racks–HSDCs squeeze servers, SSDs (Solid State Disks) and HDDs (Hard Disk Drives) directly into the rack itself, as opposed to separate SANs or DAS, to achieve the smallest possible footprint. HSDC racks are typically larger than standard 19” racks.
  • Automation–Hyperscale storage tends to be software-defined and is benefitting from AI, delivering a higher degree of automation and self-healing while minimizing direct human involvement. AI will support automated data migration between tiers to further optimize storage assets.
  • Users–The HSDC typically serves millions of users with only a few applications, whereas a conventional enterprise has fewer users but many more applications.
  • Virtualization–The facilities also implement very high degrees of virtualization, with as many operating system images running on each physical server as possible.
  • Tape storage adoption–Automated tape libraries are on the rise to complement SSDs and HDDs: they easily scale capacity, manage and contain out-of-control data growth, store archival and unstructured data, significantly lower infrastructure and energy costs, and provide air-gapped protection against cybercrime.
  • Fast-scaling bulk storage–HSDCs require fast, easily scaled storage capacity. One petabyte using 15 TB disk drives requires 67 drives; one exabyte requires 66,700. Tape scales capacity by adding media; disk scales by adding drives.
  • Minimal feature set–Hyperscale storage has a minimal, stripped-down feature set and may even lack redundancy, as the goal is to maximize storage space and minimize cost.
  • Energy challenges–High power consumption and increasing carbon emissions have forced HSDCs to develop new energy sources and more effectively manage energy expenses.
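The drive-count arithmetic in the bulk-storage bullet above can be verified with a short sketch (decimal units assumed, 1 PB = 1,000 TB, matching the text):

```python
def drives_needed(capacity_tb, drive_tb=15):
    """Whole 15 TB drives required for a given capacity, rounding up."""
    return -(-capacity_tb // drive_tb)  # ceiling division on integers

print(drives_needed(1_000))      # one petabyte -> 67 drives
print(drives_needed(1_000_000))  # one exabyte -> 66,667 drives (~66,700 as cited)
```

The exabyte figure makes the scaling contrast vivid: a tape library reaches the same capacity by adding cartridges to existing slots rather than tens of thousands of powered drives.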

In Part 3 of this series, we’ll take a look at how the value of tape is rapidly rising as hyperscale data centers grow. For more information on this topic, download our white paper: The Ascent to Hyperscale.

Read More

Archive Massive Amounts of Data in This Era of Object Storage

Reading Time: 3 minutes

By Kevin Benitez

Data is quickly becoming a critical asset to more and more companies, and now those companies are looking for ways to store massive amounts of data for long periods of time at a low cost for digital preservation or archiving.

Today, large amounts of archived data are being stored in object storage at an ever-increasing rate, both on-premises and in remote data centers such as the cloud. Object storage has been particularly attractive for storing large amounts of digital information because of its flat architecture and easy access to metadata, which allows for easy indexing, finding, and use of digital content. Additionally, object stores can scale to hundreds of petabytes in a single namespace without any performance degradation.

With those needs in mind, Fujifilm has developed FUJIFILM Object Archive, which allows object storage to be written to and read from data tape instead of mainstream HDDs, thereby significantly reducing costs. FUJIFILM Object Archive uses OTFormat to leverage object storage and modern tape. The software uses an industry-standard S3-compatible API so that on-premises object storage can be used at a lower cost while maintaining the same operability as cloud storage.


Read More

The Ascent to Hyperscale

Reading Time: 2 minutes

Part 1: What Are Hyperscale Data Centers?

Hyperscale data centers have spread across the globe to meet unprecedented data storage requirements. In this three-part blog series, we take a look at how the industry is preparing for the next wave of hyperscale storage challenges.

The term “hyper” means extreme or excess. While there isn’t a single, comprehensive definition for HSDCs, they are significantly larger facilities than a typical enterprise data center. The Synergy Research Group reported that there were 390 hyperscale data centers worldwide at the end of 2017. The largest share of those facilities, 44%, is in the US, with China a distant second at 8%. Currently the world’s largest data center facility has 1.1 million square feet. To put this into perspective, the standard size for a professional soccer field is 60,000 square feet, making the facility equivalent to about 18.3 soccer fields. Imagine needing binoculars to look out over an endless array of computer equipment in a single facility. Imagine paying the energy bill!
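The soccer-field comparison is easy to verify from the two figures cited:

```python
facility_sqft = 1_100_000   # world's largest data center facility, per the text
soccer_field_sqft = 60_000  # standard professional soccer field, per the text
print(round(facility_sqft / soccer_field_sqft, 1))  # 18.3 fields
```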

Hyperscale refers to a computer architecture that massively scales compute power, memory, high-speed networking infrastructure, and storage resources, typically serving millions of users with relatively few applications. While most enterprises can rely on out-of-the-box infrastructures from vendors, hyperscale companies must personalize nearly every aspect of their environment. An HSDC architecture is typically made up of tens of thousands of small, inexpensive, commodity component servers or nodes, providing massive compute, storage and networking capabilities. HSDCs are implementing Artificial Intelligence (AI) and Machine Learning (ML) to help manage the load and are exploiting the storage hierarchy, including heavy tape usage for backup, archive, active archive and disaster recovery applications.

In Part 2 of this series, we’ll take a look at the characteristics of the hyperscale data center. For more information on this topic, download our white paper: The Ascent to Hyperscale.

Read More

LET’S DISCUSS YOUR NEEDS

We can help you reduce cost, decrease vendor lock-in, and increase productivity of storage staff while ensuring accessibility and longevity of data.

Contact Us >