FUJIFILM INSIGHTS BLOG

Data Storage

Ransomware Protection Must Include an Air Gap

Reading Time: 4 minutes

Ransomware statistics can be frightening! Research studies suggest that over two million ransomware incidents occurred in 2019, with 60% of organizations surveyed experiencing a ransomware attack in the past year. To make matters worse, cybercriminals have moved up the food chain. Two-thirds of those attacked said the incident cost them $100,000 to $500,000. Another 20% said the price tag exceeded half a million. Overall, the losses are measured in billions of dollars per year. And it’s getting worse. Enterprise Strategy Group (ESG) reports that about half of all organizations have seen a rise in cyber attacks since the recent upsurge in people working from home.

Understandably, this is a big concern to the FBI, which has issued alerts about the dangers of ransomware. One of its primary recommendations to CEOs concerns the importance of backup, posed as the following key questions:

“Do you backup all critical information? Are backups stored offline? Have you tested your ability to revert to backups during an incident?”

The key word in that line of questioning is “offline.” Hackers have gotten good at staging their attacks slowly over time. They infiltrate a system, quietly ensuring that backups are infected as well as operational systems. When ready, they encrypt the files and announce to the company that they are locked out of their files until the ransom is paid. Any attempt to recover data from disk or the cloud fails as the backup files are infected, too.

The answer is to make tape part of the 3-2-1 rule: three separate copies of data, stored on at least two different storage media, with one copy off-site. This might mean, for example, one copy retained on onsite disk, another in the cloud, and one on tape; or one on onsite disk and one on onsite tape, with additional tape copies stored offsite.
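The 3-2-1 rule is mechanical enough to check in code. Here is a minimal sketch (the `Copy` type and the media names are illustrative, not any vendor's API):

```python
from dataclasses import dataclass

@dataclass
class Copy:
    media: str      # e.g. "disk", "tape", "cloud"
    offsite: bool

def satisfies_3_2_1(copies):
    """True if there are >= 3 copies, on >= 2 distinct media, with >= 1 offsite."""
    return (len(copies) >= 3
            and len({c.media for c in copies}) >= 2
            and any(c.offsite for c in copies))

# The disk + cloud + tape example from the text passes the check:
copies = [Copy("disk", False), Copy("cloud", True), Copy("tape", True)]
print(satisfies_3_2_1(copies))  # True
```

Three disk copies in one building would fail the media-diversity test, which is exactly the weakness the FBI's "offline" question is probing.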


Read More

Webinar: How Much Do You Really Know About Your Data?

Reading Time: < 1 minute

July 22, 2020

By Kevin Benitez

How much do you really know about your data? Is your data on the right storage type? How active is your data, and how is it being used?

From life sciences and media and entertainment to HPC/research, higher education, government, and consumer products, virtually ALL enterprises struggle to manage more data with fewer resources and at lower cost. Heterogeneous storage environments add complexity and cost, making it difficult for IT managers to manage data effectively.

Don’t let multi-vendor storage silos get in the way of effective data management.

This webinar series goes beyond just organizing your data. Across three short webinars, you’ll learn how to take control of, protect, and manage your data – all while enhancing workflow and reducing costs.

Join Floyd Christofferson, CEO of StrongBox Data Solutions, in a webinar series that will teach you how you can make the most of your data:


  1. Take Back Control of Your Data + LTFS

Don’t let multi-vendor storage silos get in the way of effective data management.

July 28, 2020 12:00 PM – 12:45 PM Eastern Time


  2. Reduce Costs & Increase Data Protection!

How to Better Manage Data Growth in a Multi-Vendor Storage Environment.

August 4, 2020 12:00 PM – 12:45 PM Eastern Time


  3. Workflow Magic!

Techniques to better use your data and not waste time trying to wrangle it.

August 11, 2020 12:00 PM – 12:45 PM Eastern Time


Read More

THE ASCENT TO HYPERSCALE – Part 3

Reading Time: 2 minutes

Part 3: THE VALUE OF TAPE RISES RAPIDLY AS HYPERSCALE DATA CENTERS GROW

In Part 2 of this series, we looked at some of the key characteristics of hyperscale data centers. Now, we’ll explore how tape plays a role.

Today HSDCs are leveraging the many advantages of tape technology to manage massive data growth and long-term retention challenges. Keep in mind that most digital data doesn’t need to be immediately accessible and can reside optimally, and indefinitely, on tape subsystems. Some data requires secure, long-term storage for regulatory reasons, or because of the potential value the data can provide through content analysis at a later date. Advanced tape architectures allow HSDCs to achieve business objectives by providing data protection for critical assets, backup, recovery, archive, easy capacity scaling, the lowest TCO, high reliability, fast throughput, and cybersecurity protection via the air gap. These benefits are expected to increase for tape in the future.

Fighting the cybercrime epidemic has become a major problem for most data centers, and HSDCs are no exception. Tape can play a key role in prevention, offering WORM (Write-Once-Read-Many) and encryption capabilities that provide a secure storage medium for compliance, legal, and other valuable files. Tape as an “air gap” solution has gained momentum by providing an electronically disconnected copy of data that a cyberattack cannot reach. Disk systems that remain online 7×24 are the primary target, as they are always vulnerable to attack.

HSDCs are taking advantage of tiered storage by integrating high-performance SSDs, HDD arrays, and automated tape libraries. Even as HSDCs struggle with the exploding growth of disk farms that devour IT budgets and overcrowd data centers, many continue to maintain expensive disks that are often half full of data with little or no activity for several years. Obviously, few data centers can afford to sustain this degree of inefficiency. The greatest benefits of tiered storage are achieved when tape is included, as its scalability, lower price, and lower TCO play an increasing role as the storage environment grows. For the hyperscale world, “adding disk is tactical – adding tape is strategic.”
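As a rough illustration of how a tiering decision like this might be automated, here is a minimal sketch that assigns data to a tier by last-access age. The 30-day and 365-day thresholds are assumptions for illustration, not a recommendation:

```python
from datetime import date, timedelta

def tier_for(last_access: date, today: date) -> str:
    """Pick a storage tier based on how long ago the data was last accessed."""
    age = today - last_access
    if age < timedelta(days=30):
        return "SSD"    # hot: frequently accessed
    if age < timedelta(days=365):
        return "HDD"    # warm: occasionally accessed
    return "tape"       # cold: archive, lowest cost per TB

# Data untouched for six months lands on the HDD tier:
print(tier_for(date(2020, 1, 1), date(2020, 7, 1)))  # HDD
```

A real policy engine would also weigh object size, compliance holds, and retrieval-latency requirements, but the cost logic is the same: the colder the data, the stronger the case for tape.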

For more information on this topic, check out our white paper: The Ascent to Hyperscale.


Read More

THE ASCENT TO HYPERSCALE – Part 2

Reading Time: 2 minutes

Part 2: CHARACTERISTICS OF THE HYPERSCALE DATA CENTER

In Part 1 of this series, we explored the definition of hyperscale data centers. Now, we’ll take a look at some of their key characteristics.

HSDCs don’t publicly share an abundance of information about their infrastructure. For companies that operate HSDCs, cost may seem like the major barrier to entry, but ultimately it isn’t the biggest issue – automation is. HSDCs must focus heavily on automated, self-healing environments, using AI and ML wherever possible to overcome inevitable and unexpected failures and delays. Unlike many enterprise data centers, which rely on a large full-time staff across a range of disciplines, HSDCs employ fewer tech experts because they have used technology to automate so much of the overall management process. HSDC characteristics include:

  • Small footprint, dense racks–HSDCs squeeze servers, SSDs (Solid State Drives), and HDDs (Hard Disk Drives) directly into the rack itself, rather than into separate SANs or DAS, to achieve the smallest possible footprint. HSDC racks are typically larger than standard 19” racks.
  • Automation–Hyperscale storage tends to be software-defined and is benefitting from AI, which delivers a higher degree of automation and self-healing and minimizes direct human involvement. AI will support automated data migration between tiers to further optimize storage assets.
  • Users–The HSDC typically serves millions of users with only a few applications, whereas a conventional enterprise has fewer users but many more applications.
  • Virtualization–The facilities also implement very high degrees of virtualization, with as many operating system images running on each physical server as possible.
  • Tape storage adoption–Automated tape libraries are on the rise to complement SSDs and HDDs: they easily scale capacity, help manage and contain out-of-control data growth, store archival and unstructured data, significantly lower infrastructure and energy costs, and provide cybercrime protection via the tape air gap.
  • Fast-scaling bulk storage–HSDCs require fast, easily scaling storage capacity. One petabyte on 15 TB disk drives requires 67 drives, and one exabyte requires roughly 66,700 of them. Tape scales capacity by adding media; disk scales by adding drives.
  • Minimal feature set–Hyperscale storage has a minimal, stripped-down feature set and may even lack redundancy, as the goal is to maximize storage space and minimize cost.
  • Energy challenges–High power consumption and increasing carbon emissions have forced HSDCs to develop new energy sources and manage energy expenses more effectively.
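The drive-count arithmetic in the fast-scaling bullet above is easy to reproduce (15 TB drives, decimal units):

```python
import math

TB_PER_DRIVE = 15
TB_PER_PB = 1_000        # terabytes in a petabyte (decimal)
TB_PER_EB = 1_000_000    # terabytes in an exabyte

drives_per_pb = math.ceil(TB_PER_PB / TB_PER_DRIVE)
drives_per_eb = math.ceil(TB_PER_EB / TB_PER_DRIVE)
print(drives_per_pb)  # 67
print(drives_per_eb)  # 66667 (the text rounds this to ~66,700)
```

At exabyte scale the difference between adding tens of thousands of powered drives and adding shelves of unpowered tape media is where the footprint and energy arguments come from.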

In Part 3 of this series, we’ll take a look at how the value of tape is rapidly rising as hyperscale data centers grow. For more information on this topic, download our white paper: The Ascent to Hyperscale.

Read More

Archive Massive Amounts of Data in This Era of Object Storage

Reading Time: 3 minutes

By Kevin Benitez

Data is quickly becoming a critical asset to more and more companies, and now those companies are looking for ways to store massive amounts of data for long periods of time at a low cost for digital preservation or archiving.

Today, large amounts of archived data are being stored in object storage at an ever-increasing rate, both on-premises and in remote data centers such as the cloud. Object storage has been particularly attractive for storing large amounts of digital information because of its flat architecture and easy access to metadata, which allows digital content to be easily indexed, found, and used. Additionally, object stores can scale to hundreds of petabytes in a single namespace without significant performance degradation.
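As a toy illustration of why a flat namespace plus per-object metadata makes content easy to index and find, here is a sketch using a plain dictionary. The keys, metadata fields, and functions are hypothetical, not any product's API:

```python
store = {}  # object key -> (data, metadata); flat namespace, no directory tree

def put(key, data, **metadata):
    """Store an object under a flat key along with arbitrary metadata."""
    store[key] = (data, metadata)

def find(**query):
    """Return keys whose metadata matches every query field."""
    return [key for key, (_, md) in store.items()
            if all(md.get(field) == value for field, value in query.items())]

put("scan-001", b"...", project="genomics", year=2020)
put("scan-002", b"...", project="genomics", year=2019)
print(find(project="genomics", year=2020))  # ['scan-001']
```

Because every object carries its own metadata, retrieval is a query rather than a walk through nested folders, and the namespace can grow without restructuring anything.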

With those needs in mind, Fujifilm has developed FUJIFILM Object Archive, which allows objects to be written to and read from data tape instead of mainstream HDDs, thereby significantly reducing costs. FUJIFILM Object Archive uses OTFormat to leverage object storage and modern tape. The software uses an industry-standard S3-compatible API, so on-premises object storage can be used at a lower cost while maintaining the same operability as cloud storage.


Read More

The Ascent to Hyperscale

Reading Time: 2 minutes

Part 1: What Are Hyperscale Data Centers?

Hyperscale data centers have spread across the globe to meet unprecedented data storage requirements. In this three-part blog series, we take a look at how the industry is preparing for the next wave of hyperscale storage challenges.

The term “hyper” means extreme or excess. While there isn’t a single, comprehensive definition for HSDCs, they are significantly larger facilities than a typical enterprise data center. The Synergy Research Group reported that there were 390 hyperscale data centers worldwide at the end of 2017. An overwhelming majority of those facilities, 44%, are in the US, with China a distant second at 8%. Currently, the world’s largest data center facility has 1.1 million square feet. To put this into perspective, the standard size of a professional soccer field is 60,000 square feet, so the facility is equivalent to about 18.3 soccer fields. Imagine needing binoculars to look out over an endless array of computer equipment in a single facility. Imagine paying the energy bill!

Hyperscale refers to a computer architecture that massively scales compute power, memory, high-speed networking infrastructure, and storage resources, typically serving millions of users with relatively few applications. While most enterprises can rely on out-of-the-box infrastructures from vendors, hyperscale companies must personalize nearly every aspect of their environment. An HSDC architecture is typically made up of tens of thousands of small, inexpensive, commodity component servers or nodes, providing massive compute, storage, and networking capabilities. HSDCs are implementing Artificial Intelligence (AI) and Machine Learning (ML) to help manage the load, and are exploiting the storage hierarchy, including heavy tape usage, for backup, archive, active archive, and disaster recovery applications.

In Part 2 of this series, we’ll take a look at the characteristics of the hyperscale data center. For more information on this topic, download our white paper: The Ascent to Hyperscale.

Read More

“Never Stop” Philosophy Guides FUJIFILM

Reading Time: 2 minutes

By Kara Buzzeo

“These are unprecedented times,” a phrase we all have heard so many times, we might have even started saying it in our sleep. Throughout these unprecedented times, Fujifilm has made a commitment to doing its part to combat the uncertainty, chaos, and panic caused by COVID-19.

On March 30th, just 20 days after the World Health Organization declared a global pandemic, Fujifilm started its phase III clinical trials in Japan, to evaluate the safety and efficacy of influenza antiviral drug “Avigan”* for patients with COVID-19.

Since then, FUJIFILM Pharmaceuticals U.S.A., Inc. initiated a phase II clinical trial of Avigan in the U.S. for patients with COVID-19. Patients will be enrolled in the study at three Massachusetts-area hospitals.

FUJIFILM Recording Media U.S.A., Inc. (FRMU) has been able to assist nearby hospitals by providing logistics support during this time, storing PPE in secure, safe locations.

One of the most outstanding moments during this time of crisis was our employees’ desire to contribute. Inspired by acts of kindness seen around the world and within the organization, FRMU employees wanted to make sure that they were doing their part to help their communities. At the Bedford manufacturing plant, employees took the initiative to manufacture face shields that were then donated to hospitals in Massachusetts, New Hampshire, and Washington D.C. Normally, FRMU hosts a kick-off meeting every year where employee appreciation gifts are given out. This year, however, the money was allocated to Amazing Grace Food Pantry in Middletown, CT to help families put food on the table during this difficult time.

Through collaboration, technology, and our collective expertise, we will continue to work towards improving human health and helping bring the pandemic to an end.

* Avigan is not FDA approved in the United States.

Read More

Back to the Future: New-Old Tech Protects Data in the Zettabyte Age

Reading Time: 3 minutes

By Peter Faulhaber

May 7, 2020

I’ve become accustomed to odd looks and lots of questions when I meet new people and tell them I’m in the data tape business. “Really? Tape? In 2020?” is a common response.

I can forgive some people — those who last touched a consumer VHS tape or audiocassette in the late 90s or early 2000s. I’ve come to really enjoy expanding their perspective, though, when I tell them that tape is a major workhorse in “the cloud” and that most of the household-name technology and internet companies are tape users. Business continuity, including several data protection applications, is a big part of the reason why, along with tape’s low total cost of ownership and low energy consumption. I think we can all agree that economics and preserving the environment are key to continuity in their own right!

Information is Currency in the Zettabyte Age
The worldwide datasphere is currently around 35 Zettabytes (that’s 35 billion Terabytes) and expected to reach 175 ZB by 2025 — an estimated compound annual growth rate of roughly 30%. The odds are good you’re seeing a similar rate of data explosion in your own business. Everything today is born digital: not just “structured” data like databases, but “unstructured” data such as spreadsheets, documents, presentations, video, audio, and photographs. Add to that the appliances and devices in the “Internet of Things” — smart vehicles, smart planes, smart phones, smart homes, factories, and cities. Then add to the mix artificial intelligence, machine learning, ecommerce, email, social media, gaming, surveillance, VR, mobile, and more, and you can see the path we’re on.
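Those figures are easy to sanity-check: 35 ZB compounding at roughly 30% per year over about six years lands close to the 175 ZB forecast.

```python
# Compound growth check: 35 ZB at ~30%/year for ~6 years (roughly 2019 -> 2025).
start_zb = 35
rate = 0.30
years = 6

projected = start_zb * (1 + rate) ** years
print(round(projected))  # 169 -- in the neighborhood of the 175 ZB forecast
```

The exact year span and rate vary by study, so treat this as a back-of-the-envelope confirmation rather than a precise forecast.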

We keep all this data around for years and sometimes decades because it is potentially valuable enough to justify archiving or keeping online in an active archive.  Whether your business relies on archival video footage or photos, harvests data for sale to outside parties or uses information for internal streamlining, strategy, or planning, it’s become impossible to even imagine a modern business without data that is increasing in value.


Read More

Whiteboard Video: Using Artificial Intelligence in Cybersecurity

Reading Time: < 1 minute

April 29, 2020

Ransomware continues to threaten the security of enterprise IT infrastructures. In this Fujifilm Summit video, storage analyst George Crump talks to IBM’s Chris Bontempo about how artificial intelligence and machine learning are helping improve cybersecurity by identifying and stopping potential threats.

Watch the video here:

Read More

Whiteboard Video: Data – Use It or Lose It

Reading Time: < 1 minute

April 15, 2020

According to IDC, global data will reach 175 Zettabytes by 2025. Of that data, approximately 7.5 ZB will be archived or stored.

In this Fujifilm Summit video, storage analyst George Crump talks to IBM’s Shawn Brume about the importance of getting the right data and making sure it is being stored in the right way.

Watch the video here.


Read More

LET’S DISCUSS YOUR NEEDS

We can help you reduce cost, decrease vendor lock-in, and increase productivity of storage staff while ensuring accessibility and longevity of data.

Contact Us >