By Tony Ling, Director of Sales, Fujifilm Recording Media U.S.A., Inc.
After the initial shock and disruption of the pandemic, one industry that has rebounded nicely is Media & Entertainment (M&E). Hollywood, video production, and post-production companies have adapted to making films in a COVID environment. At the same time, streaming grew significantly with most of the population homebound. Services such as Netflix, Hulu, Disney+, and Paramount+ all reported record subscriber growth over the last 15 months, driving up the demand for new and original content.
Today, the retention and accessibility of digital assets and video content are incredibly vital to maintaining a competitive advantage. As a result, many modern M&E companies continue to assign starring roles to LTO data tape in their workflows to combat the rising expense associated with retaining and protecting capacity-intensive high-res content. 4K, 8K, 3D, and special effects can result in petabytes of storage for a single production!
With its high capacity, reliability, interchangeability, and security, LTO tape has long been the industry standard for deliverables: anything from daily camera footage to post/edited work, approval copies, second copies, versions, the final product, and archival copies. LTO tape is truly a de facto standard and an accepted part of the workflow in the M&E world.
Why are leading M&E companies turning to tape?
More M&E companies are recognizing the advantages of LTO tape, which can store massive amounts of data and combat ever-increasing storage costs across production, post-production, distribution, or archiving. Tape’s starring roles include:
- Extremely cost-effective, with the lowest TCO in the industry
LTO tape is an ideal solution for M&E companies. LTO is an open format designed for interoperability, and together with the Linear Tape File System (LTFS) it provides easy data access and management, making it well suited to file sharing, high performance, and improved workflows.
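Because LTFS presents an LTO cartridge as an ordinary mounted filesystem, moving a deliverable to tape can be sketched as plain file operations. The following is a minimal Python sketch under that assumption; the `archive_deliverable` helper and the mount-point path are illustrative, not part of any vendor tool:

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def archive_deliverable(src: Path, mount_point: Path) -> Path:
    """Copy a deliverable onto an LTFS-mounted tape and verify the copy.

    `mount_point` would be wherever the LTFS volume is mounted
    (e.g. /mnt/ltfs); to the code it is just a directory path.
    """
    dest = mount_point / src.name
    shutil.copy2(src, dest)          # copy2 preserves timestamps/metadata
    if sha256(src) != sha256(dest):  # read-back verification
        raise IOError(f"checksum mismatch archiving {src.name}")
    return dest
```

Real deployments add tape-library scheduling and cataloging on top, but the point stands: once LTFS is mounted, standard file tooling works.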
So, the next time you are streaming Star Trek Discovery on Paramount+ or The Mandalorian on Disney+, just remember that somewhere along the way of the making of that show, an LTO tape played a starring role!
Recently my neighborhood had a rash of car break-ins by what turned out to be just a band of mischievous teenagers. But what struck me about this occurrence was the flood of homeowner video surveillance clips that appeared on social media and that were sent to the local police department to help identify the wrongdoers. It seems like everyone in the neighborhood has a home video surveillance system, perhaps to catch a doorstep package thief, or if nothing else, to catch the guilty dog walkers!
A Booming Market for Video Surveillance Solutions
Indeed, the video surveillance market is booming, and not just in the relatively nascent consumer market: the commercial market has been growing for a long time, and in a much bigger way. One reason is more affordable cameras with better resolutions, soaring from 720p up to 4K and even 8K. Meanwhile, video surveillance systems are finding more and more applications. Retail shopping malls, banks, hotels, city streets, transportation and highways, manufacturing and distribution operations, airport security, college dorm and campus security, corporate security, and police body and dash cams, to name just a few, all need good-quality video surveillance.
Video Retention Costs Soar
However, these higher-resolution cameras have sent the costs of video retention soaring. So much high-resolution raw footage quickly fills up the hard disk drives commonly used to retain video surveillance content. According to a Seagate video surveillance calculator, an installation of 100 cameras recording eight hours a day at 30 frames per second, 1080p resolution, with a retention period of 90 days would require 2,006 terabytes of storage. That's 2.0 petabytes of expensive, energy-intensive hardware. Those with unlimited budgets can simply add more disks. But everyone else faces tough choices: Shorten retention periods? Lower video resolution? Reduce the number of cameras or frames per second? None of these choices supports the goals that led to installing the video surveillance system in the first place.
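The arithmetic behind such a calculator is straightforward: per-camera bitrate, times recording seconds, times retention period, summed across cameras. A rough Python sketch follows; the 8 Mbps figure is an illustrative H.264 1080p30 bitrate assumption, not Seagate's input (their 2,006 TB result implies a much higher effective bitrate), and real bitrates vary widely with codec and scene complexity:

```python
def surveillance_storage_tb(cameras: int,
                            hours_per_day: float,
                            retention_days: int,
                            bitrate_mbps: float) -> float:
    """Estimate total retained footage in decimal terabytes.

    bitrate_mbps is the average encoded stream rate per camera;
    8 Mbps is a common ballpark for 1080p30 H.264.
    """
    seconds = hours_per_day * 3600 * retention_days
    bytes_per_camera = bitrate_mbps * 1e6 / 8 * seconds  # bits -> bytes
    return cameras * bytes_per_camera / 1e12

# 100 cameras, 8 h/day, 90-day retention at an assumed 8 Mbps
print(round(surveillance_storage_tb(100, 8, 90, 8.0), 1))  # → 259.2
```

Even at this conservative bitrate the installation needs hundreds of terabytes, which is why retention cost dominates the conversation.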
Explosive data growth continues to be a top challenge for today’s organizations and this growth is only going to increase in the future. In fact, according to analyst firm IDC, by 2025 worldwide data will grow 61% to 175 zettabytes, with as much of the data residing in the cloud as in data centers.
New technologies and approaches are continually being created to help address this data storage deluge. Members of the Active Archive Alliance, including Fujifilm Recording Media U.S.A., Inc., Spectra Logic, StrongBox Data, and Quantum, recently shared their insights into what the future looks like for active archives and data storage in 2019. Here are some of their top predictions:
Artificial Intelligence Creates Demand for Active Archives
The evolution of deep learning, machine learning, and artificial intelligence will continue to expand in 2019 across every industry as the digital transformation wave produces an explosion of big data. With these AI tools, organizations will be able to extract more value from their data than ever before, giving rise to an insatiable demand for more data, more analytics, and more competitive advantage. A dramatic increase in storage, and specifically in active archives, will be required to cost-effectively and efficiently provide access to big data at scale.
Flash Will Gain Wide-Scale Adoption, But a Need to Store Everything Will Make Secondary Storage More Important Than Ever
In the coming year we will see wide-scale adoption of flash storage. Organizations of all sizes will adopt solid-state drives (SSDs) for greater performance, energy savings, space efficiency, and reduced management. New technologies like integrated data protection, storage federation/automation, policy-based provisioning, tiered data movement, and public cloud integration will be built on top of this flash foundation.
With the increased adoption of flash, organizations will also face the challenge of affordably storing data that is not mission-critical but still has value and therefore cannot be deleted. As they move to flash, organizations will rely on a secondary storage tier to affordably manage all of the organization's data, and this will happen through intelligent data management software designed to move data to a more affordable tier without sacrificing access to, or searchability of, that data.
Shift From Managing Your Storage to Managing Your Data
Data, not the underlying physical storage, is what matters. However, traditional storage systems are “big dumb buckets” that provide precious little insight into what data is growing, what applications or users are accessing it, or what is consuming storage performance and why.
Next-generation storage systems are "data-aware," with real-time analytics built directly into the storage itself, delivering real-time information on data and performance at massive scale. As organizations better understand their data (how it is being generated, at what pace, by whom, and for what project), they are better informed about how to plan and budget for future data growth, and better able to move data to different tiers based on customized policies.
Cross-platform Storage Automation Reduces Costs, Increases Productivity
The reality is that there is no "one-size-fits-all" storage solution that addresses the multiple requirements faced by most organizations. As a result, large environments typically rely on multiple storage vendors and point solutions to cover the different performance and cost profiles their data requires. The problem is that this adds complexity for IT managers, requiring them to do more with static or shrinking operational budgets. This trend is driving demand for solutions that automate data and storage resource management across any storage type from any vendor. Such solutions leverage policy engines and management tools driven by multiple types of metadata about files and their business value as they evolve over time. These automation tools help data managers know what they have and give them control of cross-platform data migration, tiering, active archiving, and protection without interrupting users. This type of metadata-driven automation will be an increasing trend over the next few years because it provides demonstrable ROI: it reduces OPEX and complexity for IT, breaks storage vendor lock-in, and increases storage utilization efficiency and user productivity.
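At its core, the metadata-driven policy engine described above maps file metadata to a storage tier. The snippet below is a deliberately simplified sketch of that idea; the tier names, thresholds, and `FileMeta` fields are invented for illustration, and real products expose far richer, configurable policy languages:

```python
from dataclasses import dataclass

@dataclass
class FileMeta:
    name: str
    days_since_modified: int
    days_since_accessed: int

def assign_tier(meta: FileMeta) -> str:
    """Pick a storage tier from simple, illustrative policy rules.

    Rules are evaluated top-down; the 30- and 180-day thresholds are
    placeholders a real policy engine would make configurable.
    """
    if meta.days_since_accessed <= 30:
        return "flash"           # hot: keep on primary flash
    if meta.days_since_modified <= 180:
        return "disk"            # warm: secondary disk tier
    return "active_archive"      # cold: tape/object active archive

files = [
    FileMeta("scene_042.mov", 400, 400),
    FileMeta("budget.xlsx", 10, 2),
    FileMeta("dailies_0615.mxf", 90, 60),
]
for f in files:
    print(f.name, "->", assign_tier(f))
```

The value of the automation lies in running rules like these continuously across every vendor's storage, so data drifts to the cheapest tier that still meets its access requirements.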
Rich Media Content Will Grow Exponentially, Across Many Industries
Video now constitutes 50% of all data. Rich media includes video surveillance; consumer images, voice, and video; medical imagery; IoT; entertainment; and social media. Large, unstructured data sets are often 50 times the size of the average corporate database or more. Video is unique, and it is not typically a good fit for traditional backup: it cannot be compressed or deduplicated, it doesn't work well with replication, snapshots, or clones, and ingest speed is critical. Rich media is projected to surpass 100 zettabytes worldwide by 2020. Expect enterprise data services to be increasingly optimized for large, rich media data sets, with infrastructure optimized for ingest processing and full life cycle management of all forms of rich media.