Blog

Archive Massive Amounts of Data in This Era of Object Storage

Reading Time: 3 minutes

By Kevin Benitez

Data is quickly becoming a critical asset for more and more companies, and those companies are now looking for ways to store massive amounts of data for long periods at low cost for digital preservation and archiving.

Today, large amounts of archived data are being stored in object storage at an ever-increasing rate, both on-premises and in remote data centers such as the cloud. Object storage has been particularly attractive for storing large amounts of digital information because of its flat architecture and easy access to metadata, which makes digital content simple to index, find, and use. Additionally, object stores can scale to hundreds of petabytes in a single namespace without any performance degradation.

With those needs in mind, Fujifilm has developed FUJIFILM Object Archive, which allows objects to be written to and read from data tape instead of mainstream HDDs, thereby significantly reducing costs. FUJIFILM Object Archive uses OTFormat to combine object storage with modern tape. The software uses an industry-standard S3-compatible API, so on-premises object storage can be run at a lower cost while maintaining the same operability as cloud storage.
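
Because the software exposes an S3-compatible API, existing S3 tooling can generally point at it the same way it would point at any other S3 endpoint. As a minimal sketch (the endpoint URL, bucket name, and credentials below are placeholders for illustration, not values from FUJIFILM's documentation), writing and reading an archived object with boto3 might look like this:

```python
import boto3

# Hypothetical endpoint and credentials for an S3-compatible object archive.
s3 = boto3.client(
    "s3",
    endpoint_url="https://object-archive.example.local",  # placeholder endpoint
    aws_access_key_id="ARCHIVE_KEY",                      # placeholder credentials
    aws_secret_access_key="ARCHIVE_SECRET",
)

# Write an object, attaching metadata that can later be used for indexing and search.
with open("render-assets.tar", "rb") as f:
    s3.put_object(
        Bucket="cold-archive",
        Key="projects/2020/render-assets.tar",
        Body=f,
        Metadata={"project": "2020-campaign", "retention": "10y"},
    )

# Read it back with the same calls used against any other S3-compatible store.
obj = s3.get_object(Bucket="cold-archive", Key="projects/2020/render-assets.tar")
data = obj["Body"].read()
```

From the application's point of view, the archive tier behaves like any other S3 target; the tape handling happens behind the API.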

Read More

The Ascent to Hyperscale

Reading Time: 2 minutes

Part 1: What Are Hyperscale Data Centers?

Hyperscale data centers have spread across the globe to meet unprecedented data storage requirements. In this three-part blog series, we take a look at how the industry is preparing for the next wave of hyperscale storage challenges.

The term “hyper” means extreme or excess. While there isn’t a single, comprehensive definition for hyperscale data centers (HSDCs), they are significantly larger facilities than a typical enterprise data center. The Synergy Research Group Report indicated there were 390 hyperscale data centers worldwide at the end of 2017. An overwhelming majority of those facilities, 44%, are in the US, with China a distant second at 8%. Currently, the world’s largest data center facility spans 1.1 million square feet. To put this into perspective, a standard professional soccer field is about 60,000 square feet, so that single facility covers the equivalent of roughly 18.3 soccer fields. Imagine needing binoculars to look out over an endless array of computer equipment in a single facility. Imagine paying the energy bill!
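
For the curious, the soccer-field comparison is just a simple division of the square footage figures quoted above (a throwaway check, nothing more):

```python
# Quick check of the soccer-field comparison using the figures cited above.
facility_sq_ft = 1_100_000    # world's largest data center facility
soccer_field_sq_ft = 60_000   # standard professional soccer field, as cited

print(facility_sq_ft / soccer_field_sq_ft)  # -> roughly 18.3 fields
```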

Hyperscale refers to a computer architecture that massively scales compute power, memory, high-speed networking infrastructure, and storage resources, typically serving millions of users with relatively few applications. While most enterprises can rely on out-of-the-box infrastructures from vendors, hyperscale companies must personalize nearly every aspect of their environment. An HSDC architecture is typically made up of tens of thousands of small, inexpensive, commodity component servers or nodes, providing massive compute, storage, and networking capabilities. HSDCs are implementing Artificial Intelligence (AI) and Machine Learning (ML) to help manage the load and are exploiting the storage hierarchy, including heavy tape usage for backup, archive, active archive, and disaster recovery applications.

In Part 2 of this series, we’ll take a look at the characteristics of the hyperscale data center. For more information on this topic, download our white paper: The Ascent to Hyperscale.

Read More

Back to the Future: New-Old Tech Protects Data in the Zettabyte Age

Reading Time: 3 minutes

By Peter Faulhaber

May 7, 2020

I’ve become accustomed to odd looks and lots of questions when I meet new people and tell them I’m in the data tape business. “Really? Tape? In 2020?” is a common response.

I can forgive some people: those who last touched a consumer VHS tape or audiocassette in the late ’90s or early 2000s. I’ve come to really enjoy expanding their perspective, though, when I tell them that tape is a major workhorse in “the cloud” and that most of the household-name technology and internet companies are tape users. Business continuity, including several data protection applications, is a big part of the reason why, along with tape’s low total cost of ownership and low energy consumption. I think we can all agree that economics and preserving the environment are key to continuity in their own right!

Information Is Currency in the Zettabyte Age
The worldwide datasphere is currently around 35 zettabytes (that’s 35 billion terabytes) and is expected to reach 175 ZB by 2025, an estimated compound annual growth rate of roughly 30%. The odds are good you’re seeing a similar rate of data explosion in your own business. Everything today is born digital: not just “structured” data like databases but “unstructured” data such as spreadsheets, documents, presentations, video, audio, and photographs. Add to that the appliances and devices in the “Internet of Things”: smart vehicles, smart planes, smartphones, smart homes, factories, and cities. Then add to the mix artificial intelligence, machine learning, ecommerce, email, social media, gaming, surveillance, VR, mobile, and more, and you can see the path we’re on.
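
As a back-of-the-envelope check on that growth rate, the standard compound annual growth rate formula makes the figure easy to verify. The six-year horizon below (roughly 2019 to 2025) is an assumption chosen to match the ~30% figure quoted above, not a number from the post:

```python
# CAGR sanity check for the datasphere figures quoted above.
start_zb, end_zb = 35, 175   # worldwide datasphere, in zettabytes
years = 6                    # assumed horizon: roughly 2019 to 2025

cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Compound annual growth rate: {cagr:.1%}")  # -> about 30.8%
```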

We keep all this data around for years and sometimes decades because it is potentially valuable enough to justify archiving or keeping online in an active archive. Whether your business relies on archival video footage or photos, harvests data for sale to outside parties, or uses information for internal streamlining, strategy, or planning, it has become impossible to imagine a modern business without data that is increasing in value.

Read More

Breaking Down Data Silos — Highlights From SC18

Reading Time: 3 minutes

By Kevin Benitez

I had the opportunity to attend SC18 last month in Dallas. Every year the Supercomputing Conference brings together the latest in supercomputing technology and the most brilliant minds in HPC. People from all over the world and from different backgrounds converged this year for the 30th Supercomputing Conference.

As you can imagine, some of the demonstrations were absolutely mind-blowing and worth sharing. For starters, power consumption in data centers is becoming more of a challenge as data rates continue to surge. Fortunately, 3M was live on the trade show floor tackling this issue by demonstrating immersion cooling for data centers, which has the potential to slash energy use and cost by up to 97%. As this technology continues to evolve, we could see huge gains in performance and in reducing environmental impacts.

The race to dominate quantum computing continues! IBM’s 50-qubit quantum computer made an appearance at this year’s show. What does it mean to have a computer with 50 qubits working perfectly? (Side note: in quantum computing, a qubit is the basic unit of quantum information.) According to Robert Schoelkopf, a Yale professor, if you had 50 or 100 qubits you could “do unfathomable calculations that can’t be replicated on any classical machine, now or ever.” Although the quantum computer churns out enough computational power to rank within the top ten supercomputers in the world, the device can only compute for 100 milliseconds due to a short-lived power supply.

StrongBox Data’s flagship product, StrongLink, was demonstrated on the show floor as a way to store and manage the vast amounts of data that research universities and laboratories are producing. StrongLink is a software solution that simplifies and reduces the cost of managing multi-vendor storage environments. StrongLink provides multi-protocol access across any file system, object storage, tape, and cloud in a global namespace. Users maintain a constant view of files regardless of where they are stored, which maximizes their storage environment for performance and cost.

Recently the University of Southampton’s supercomputer Iridis 5 teamed up with StrongLink to get more value out of its data. Oz Parchment, Director of the University’s iSolutions IT support division, commented in March saying: “One way StrongLink interested us was its cognitive component, the ability to look at and match up metadata at scale, which gets interesting when you combine that with different data infrastructures. Our set up currently includes large-scale tape stores, large-scale disc stores, some of that being active data, some of that being nearline data, some being effectively offline data. But then, by linking these into the [Iridis] framework, which StrongLink allows us to do, we can connect these various data lakes that we have across the research side of the organization, and begin to create an open data space for our community where people in one discipline can look through data and see what kinds of data are available in other communities.”

Never has HPC been more crucial. As we say here at Fujifilm, “Never Stop Transforming Ourselves and the World.”

Read More

LET’S DISCUSS YOUR NEEDS

We can help you reduce cost, decrease vendor lock-in, and increase productivity of storage staff while ensuring accessibility and longevity of data.

Contact Us >