Blog

Tiered Storage: Building the Optimal Storage Infrastructure

Reading Time: < 1 minute


Fortunately, as data continues to grow exponentially, the selection of data storage technologies has never been more robust. Choosing which storage device to use for which application at a given point in time is a balancing act, trading off frequency of access (performance) against cost and capacity. Storage tiering has become a key strategy that lets you optimize the use of storage resources, reduce costs and make the best use of storage technology for each data classification. The foundations of tiered storage were laid over 30 years ago, when disk, automated tape libraries and advanced policy-based data management software such as hierarchical storage management (HSM) combined to migrate less-active data to less expensive storage devices.
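The HSM approach described above, migrating less-active data to less expensive devices, reduces to a simple age-based policy. A minimal Python sketch; the tier names and thresholds are hypothetical, purely for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical tiers ordered fastest/most expensive first; thresholds are
# illustrative, not a recommendation.
TIERS = [
    ("flash", timedelta(days=30)),   # hot: accessed within the last month
    ("disk", timedelta(days=365)),   # warm: accessed within the last year
    ("tape", timedelta.max),         # cold: everything else
]

def choose_tier(last_access: datetime, now: datetime) -> str:
    """Return the fastest tier whose access-age policy the file still meets."""
    age = now - last_access
    for tier, max_age in TIERS:
        if age <= max_age:
            return tier
    return TIERS[-1][0]
```

A real HSM also tracks file size, ownership and retention rules, but the core decision is this kind of policy lookup over access age.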

Tiered storage integrates hardware and storage management software to provide seamless operation and to let customers realize the substantial TCO and ROI benefits available from optimized storage implementations. The business case for implementing tiered storage is compelling and becomes increasingly so as storage pools get larger. Today’s storage tiers span several technologies, ranging from ultra-high-capacity, low-cost storage at one end of the hierarchy to very high levels of performance and functionality at the other. The non-stop growth of data will require the continual evolution of new, more advanced approaches to tiered storage and management capabilities.

For more information, check out this Horison Information Strategies White Paper “Tiered Storage: Building the Optimal Storage Infrastructure.”

Read More

Tape Storage: New Game, New Rules

Reading Time: < 1 minute

Modern tape storage has become the leading strategic and lowest-cost storage solution for massive amounts of archival and unstructured data. This bodes well for future tape growth, as archival data is piling up much faster than it is being analyzed. Over the past decade, the magnetic tape industry has successfully re-architected itself, delivering compelling technologies and functionality including cartridge capacity increases, vastly improved bit error rates yielding the highest reliability of any storage device, a media life of 30 years or more, and faster data transfer rates than any previous tape or HDD (Hard Disk Drive).

Many of these innovations have resulted from technologies borrowed from the HDD industry and have been used in the development of both LTO (Linear Tape Open) and enterprise tape products. Additional tape functionality including LTFS, RAIT, RAO, TAOS, smart libraries and the Active Archive adds further value to the tape lineup. HDD technology advancement has slowed, while progress for tape, SSD (Solid State Drive) and other semiconductor memories continues steadily. Fortunately, today’s tape technology is nothing like the tape of the past.

For more information, check out this Horison Information Strategies White Paper “Tape Storage: It’s a New Game With New Rules.”

Read More

Why Are Two-Thirds of Organizations Failing to Back Up and Archive Correctly?

Reading Time: 4 minutes

You would think, by now, that backup best practices would have moved into the same category as filling up the tank before a long drive or looking each way before crossing the street. But a new study indicates that most organizations continue to get it fundamentally wrong. How? By continuing to back up long-inactive data that should have been archived instead of remaining in the backup schedule.

The 2020 Active Archive Alliance survey found that 66% of respondents were still using backup systems to store archive data. What’s wrong with that?

  • It greatly lengthens backup windows: repeatedly backing up unchanging archive data wastes storage resources and adds time to the backup process.
  • As data sets grow, a failure to distinguish between backup and archiving becomes increasingly expensive in terms of disk space.
  • Even those offloading backups to cheap cloud resources are still running up a large bill over time by unnecessarily backing up cold data.
  • Archiving, on the other hand, frees up expensive capacity by moving less frequently used data to more cost-effective storage locations.


Clearing Up Backup Confusion

One of the underlying reasons for this is confusion between backup and archiving. Backup provides a copy of organizational data for use in recovery from a data loss incident, cyberattack or disaster. These days, backup data is generally copied onto disk or tape and either retained there or relayed to the cloud. A key point is that backup only copies data, leaving the source data in place. It is also used to restore lost or deleted files rapidly.

Archiving is a different concept entirely. Rather than copying data, it moves data classed as inactive to a more cost-effective tier of storage such as economy disk or tape. This frees up space on higher-tier storage systems such as fast disk or flash. In addition, it shortens the backup window and offers permanent and long-term protection from modification or deletion of data.
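The copy-versus-move distinction can be made concrete in a few lines. A minimal Python sketch, assuming hypothetical directory names; `backup` copies and leaves the source intact, while `archive` relocates it to a cheaper tier:

```python
import shutil
from pathlib import Path

def backup(src: Path, backup_dir: Path) -> Path:
    """Backup COPIES the data; the source file stays in place."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy2(src, backup_dir))

def archive(src: Path, archive_dir: Path) -> Path:
    """Archive MOVES the data to a cheaper tier; the source is removed."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.move(str(src), str(archive_dir / src.name)))
```

After `archive` runs, the file no longer occupies space on the primary tier and no longer needs to be swept up by every backup pass, which is exactly why archiving shortens the backup window.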


Read More

A New Age Dawns for Digital Archives: 2020 State of Industry Report

Reading Time: < 1 minute

The Active Archive Alliance recently released its 2020 Active Archive and the State of the Industry Report, which highlights the increased demand for new data management strategies as well as benefits and use cases for active archive solutions.

Here are a few takeaways from the report:

  • Today’s data demands an intelligent archive solution that leverages the advanced capabilities of data movement technology and scale-out hardware to realize its fast-increasing value.
  • The growing reliance on archival data makes it ground zero for unlocking game-changing data strategies.
  • New applications, such as the Internet of Things (IoT) with billions of nodes and boosted by the arrival of 5G networks, will help fuel insatiable demand for more intelligent active archives.
  • Key usage cases for active archiving tell the real story, starting with healthcare and high-performance computing (HPC) in life sciences.
  • Other top use cases include security, business efficiency and continuity, media and entertainment, and IoT, including autonomous vehicles.

Sponsors of the report include Active Archive Alliance members Atempo, Fujifilm Recording Media USA, IBM, Iron Mountain, Harmony Healthcare IT, MediQuant, PoINT Software & Systems, Quantum, Qumulo, QStar Technologies, Spectra Logic, StrongBox Data Solutions and Western Digital.

You can read the full report here.

For more information on the Active Archive Alliance visit: www.activearchive.com.

Read More

Ransomware Protection Must Include an Air Gap

Reading Time: 4 minutes

Ransomware statistics can be frightening! Research studies suggest that over two million ransomware incidents occurred in 2019 with 60% of organizations surveyed experiencing a ransomware attack in the past year. To make matters worse, the cybercriminals have moved up the food chain. Two thirds of those attacked said the incident cost them $100,000 to $500,000. Another 20% said the price tag exceeded half a million. Overall, the losses are measured in billions of dollars per year. And it’s getting worse. Enterprise Strategy Group (ESG) reports that about half of all organizations have seen a rise in cyber attacks since the recent upsurge in people working from home.

Understandably, this is a big concern to the FBI. It has issued alerts about the dangers of ransomware. One of its primary recommendations to CEOs is the importance of backup with the following key questions:

“Do you backup all critical information? Are backups stored offline? Have you tested your ability to revert to backups during an incident?”

The key word in that line of questioning is “offline.” Hackers have gotten good at staging their attacks slowly over time. They infiltrate a system, quietly ensuring that backups are infected as well as operational systems. When ready, they encrypt the files and announce to the company that they are locked out of their files until the ransom is paid. Any attempt to recover data from disk or the cloud fails as the backup files are infected, too.

The answer is to make tape part of the 3-2-1 system: three separate copies of data, stored on at least two different storage media, with one copy off-site. This might mean, for example, one copy retained on onsite disk, another in the cloud, and one on tape; or one on onsite disk and one on onsite tape, with additional tape copies stored offsite.
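The 3-2-1 policy above reduces to three checks, which can be sketched in a few lines of Python; the `copies` structure here is hypothetical, purely for illustration:

```python
def satisfies_3_2_1(copies):
    """Check the 3-2-1 rule: at least 3 copies, on at least 2 distinct
    media types, with at least 1 copy stored offsite."""
    media_types = {c["media"] for c in copies}
    return (
        len(copies) >= 3
        and len(media_types) >= 2
        and any(c["offsite"] for c in copies)
    )
```

The offsite (and ideally offline) copy is what defeats the slow-infiltration attack described above: a tape sitting in a vault cannot be encrypted by malware that has compromised the live systems.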


Read More

Webinar: How Much Do You Really Know About Your Data?

Reading Time: 2 minutes

July 22, 2020

By Kevin Benitez

How much do you really know about your data? Is your data on the right storage type? How active is your data, and how is it being used?

From life sciences and media and entertainment to HPC/research, higher education, government and consumer products, virtually ALL enterprises struggle to manage data with fewer resources and at lower cost. Heterogeneous storage environments have added complexity and cost and have made it difficult for IT managers to manage data.

Don’t let multi-vendor storage silos get in the way of effective data management.

This webinar series goes beyond just organizing your data. Across three short webinars, you’ll learn how to take control of, protect, and manage your data – all while enhancing workflow and reducing costs.

Join Floyd Christofferson, CEO of StrongBox Data Solutions, in a webinar series that will teach you how you can make the most of your data:

 

  1. Take Back Control of Your Data + LTFS

Don’t let multi-vendor storage silos get in the way of effective data management.

July 28, 2020 12:00 PM – 12:45 PM Eastern Time

  2. Reduce Costs & Increase Data Protection!

How to Better Manage Data Growth in a Multi-Vendor Storage Environment.

August 4, 2020 12:00 PM – 12:45 PM Eastern Time

  3. Workflow Magic!

Techniques to better use your data and not waste time trying to wrangle it.

August 11, 2020 12:00 PM – 12:45 PM Eastern Time

 

Register Now

Read More

Busting Myths and Taking Names: Fujifilm, Spectra Logic and Iron Mountain Kick Off Series on Reintroducing Tape to the Modern Data Center

Reading Time: < 1 minute
Modern tape libraries are part of an overall data management lifecycle strategy that offers many benefits, including lower cost, energy savings, increased security and long-term shelf life.

We’re excited to partner with Spectra Logic and Iron Mountain on this new Storage Switzerland eBook: Reintroducing Tape to the Modern Data Center. The first chapter debunks some of the common myths of tape storage around reliability, access and operations. Read more about it here.

Stay tuned over the next few weeks as we reveal the next four chapters covering topics such as disaster recovery and backup, performance, cost, and offsite storage.

Interested in learning more on this topic? Register for our webinar 5 Reasons Modern Data Centers Need Tape on September 26th at 11:00 am EDT.
Read More

The Impact of GDPR on Your Data Management Strategy

Reading Time: 2 minutes

By Floyd Christofferson,
SVP of Products at Strongbox Data

It seems that every time you turn around there is another report of a high-profile hack of sensitive personal data, impacting hundreds of millions of people all over the world. The recent Equifax hack released personal financial data of over 143 million consumers, but that was not an isolated incident. So far in 2016 and 2017, there have been at least 26 major hacks around the world that have released the personal data of more than 700 million people. These include hacks of telecommunication companies, financial institutions, government agencies, universities, shopping sites, and more.

The hacks are not a new problem. But in a global economy with often conflicting political and economic priorities at stake, there has been no comprehensive approach to ensuring people have the right to protect, and delete if they want, all of their personal data.

The European Union’s new GDPR (General Data Protection Regulation) went into effect in May 2018. Although GDPR is designed to protect European citizens, the rules and penalties apply to any company from any country that does business in Europe. And the penalties are significant: companies risk being fined up to 4% of their global annual gross revenues or €20 million (whichever is greater) for failing to comply with strict right-to-be-forgotten and privacy protections for customer data.

As a result, there is growing panic among businesses as they try to figure out how to solve this problem in time, and how to do so with existing data management and storage resources that were not designed for this task. And the concern is not only in Europe. Companies in the US and around the world that have customers in Europe are also scrambling to ensure they are in full compliance. But according to Gartner, by the end of 2018 over 50% of companies affected by the GDPR worldwide will not be in full compliance with its requirements.

In this paper we offer an overview of the key provisions of GDPR that impact storage and data management for both structured and unstructured data. In subsequent technical briefs, we will go into more detail about specific technical solutions to help ensure your data environment is in compliance, even with your existing storage and data infrastructure.

Read More

LET’S DISCUSS YOUR NEEDS

We can help you reduce cost, decrease vendor lock-in, and increase productivity of storage staff while ensuring accessibility and longevity of data.

Contact Us >