FUJIFILM INSIGHTS BLOG

Data Storage

Spotlighting Active Archive with Solutions from SAVARTUS

Reading Time: 9 minutes

Executive Q & A with Rick Bump, CEO of SAVARTUS

Q1: Welcome Rick to this Fujifilm Insights Blog Executive Q & A! Please tell us a bit about SAVARTUS and your current role as CEO.

Ans: I co-founded SAVARTUS with Christopher Rence to address the pressing challenges in data management. We recognized a growing need driven by the trajectory of data and its evolving demands—challenges the status quo simply can’t handle. SAVARTUS delivers storage solutions blending innovative optical technology and LTO options, all accessible through our Unified Data Interface (UDI) software. We all know that data is surging—IDC estimated over 180 zettabytes globally by 2025, fueled by AI generating massive datasets and businesses digitizing operations. Costs are exploding too; traditional cloud storage and HDDs rack up steep bills, especially with egress and other API fees for AWS archival storage. Compliance pressures, like GDPR or HIPAA, demand long-term retention and secure destruction, while environmental impact is a rising concern—data centers eat up 1-2% of global electricity, clashing with sustainability goals. The old approaches—power-hungry disks or costly cloud tiers—lag behind, failing to balance affordability, speed, and eco-friendliness. That’s why Christopher and I launched SAVARTUS. Our optical active archive offers fast, cost-effective retrieval while using near-zero idle power, tape keeps deep storage at the lowest total cost of ownership (TCO), and our UDI software manages it all. I’m thrilled to lead us in tackling these gaps.

Q2: As you seek to provide data management solutions for organizations, from your perspective, what is driving data growth and what is challenging the status quo?

Ans: From our perspective, data growth is being propelled by a mix of technological, regulatory, and behavioral forces, all reshaping how organizations handle their information, while an evolving need to manage digital assets from creation through intentional destruction adds further complexity.

First, digital transformation is a primary driver. Businesses are digitizing operations—supply chains, customer interactions, and more—producing vast structured and unstructured data. IoT devices in manufacturing or healthcare generate continuous sensor streams, and AI workloads, especially generative models, create massive datasets for training and content production. IDC’s forecast of over 180 zettabytes by 2025 is proving conservative as these technologies scale.

Second, regulatory and compliance demands are extending data lifecycles. Frameworks like GDPR, HIPAA, and SEC Rule 17a-4 require retention—sometimes decades-long—pushing organizations to archive data cost-effectively and securely. This ties directly to the need for end-to-end digital asset management: data must be tracked from creation (for example, for patient records or financial transactions), preserved for legal holds, and eventually destroyed intentionally to comply with privacy laws or reduce liability. A healthcare provider, for instance, might retain records indefinitely but must ensure secure deletion when permissible, amplifying storage demands.

Third, user behavior is exploding data volume. Remote work has surged cloud collaboration data—emails, videos, documents—while consumers fuel petabytes through social media, streaming, and e-commerce, captured for analytics. This “datafication” creates assets that organizations must manage holistically: created by users, stored for insights, and destroyed when obsolete, not just left to pile up.

What’s challenging the status quo? Traditional approaches to data management—HDDs and online cloud—are faltering under these pressures when it comes to long-term archival storage at large capacities. Cost is the disruptor: AWS Deep Archive offers $0.00099/GB/month, but retrieval fees and 12–48-hour latency frustrate users needing quick access. HDDs consume power and hit scalability walls as data centers face energy and environmental constraints. Security is shaking things up too. Ransomware, up 13% in 2024 per Verizon’s DBIR, exploits online storage, driving demand for offline, air-gapped solutions. The old “store everything online” model is crumbling as breaches expose vulnerabilities, especially when compliance demands auditable, immutable records from creation to destruction—records that must be both secure and securely disposed of on schedule.

A critical challenge is the need to access traditionally archived data for AI, ML, or other business requirements, driving active archive capabilities. The status quo treated archives as static vaults—data locked away on tape or deep cloud tiers, retrieved only for audits. Now, organizations want to mine historical data for training AI models, running analytics, or feeding real-time applications. A media company might pull decades-old footage to train a content recommendation engine, or a retailer might analyze archived transactions for predictive trends. Multi-hour retrieval SLAs and cloud retrieval costs clash with these active use cases, demanding faster, more accessible archiving.

Sustainability is another disruptor. Data centers, consuming 1-2% of global electricity, face ESG scrutiny. HDDs’ constant energy draw does not cut it, while our optical discs—near-zero idle power, 50–100-year lifespan—offer a greener alternative.

In essence, data growth stems from digitization, regulation, and behavior, intensified by the need to manage assets from creation to destruction. The status quo is challenged by cost inefficiencies, security risks, sustainability pressures, and the shift to active archiving for AI and ML. Our Archive-as-a-Service (AaaS) offering, for example, with 2–5-minute retrieval, offline security, and low-energy optical storage, addresses these drivers and disruptions, enabling organizations to handle data’s full lifecycle smarter and faster.
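
To make the cost trade-off Rick describes concrete, here is a minimal back-of-the-envelope sketch in Python using the per-GB rates quoted above. The archive size, the monthly retrieval volume, and the cloud retrieval and egress fees are illustrative assumptions, not published prices.

```python
# Illustrative monthly-cost comparison using the per-GB rates quoted in the interview.
# Retrieval and egress fees below are hypothetical placeholders, not published prices.

ARCHIVE_TB = 500                      # archive size under comparison (assumption)
GB_PER_TB = 1000

deep_archive_rate = 0.00099           # $/GB/month, cloud deep-archive tier (as quoted)
optical_rate      = 0.004             # $/GB/month, SAVARTUS optical figure (as quoted)

retrieved_tb_per_month = 20           # data pulled back each month (assumption)
cloud_retrieval_fee    = 0.02         # $/GB retrieval fee (assumption)
cloud_egress_fee       = 0.09         # $/GB egress to on-prem (assumption)

gb_stored    = ARCHIVE_TB * GB_PER_TB
gb_retrieved = retrieved_tb_per_month * GB_PER_TB

cloud_monthly   = gb_stored * deep_archive_rate + gb_retrieved * (cloud_retrieval_fee + cloud_egress_fee)
optical_monthly = gb_stored * optical_rate      # on-prem active archive: no per-GB retrieval fee

print(f"Cloud deep archive : ${cloud_monthly:,.2f}/month")
print(f"Optical archive    : ${optical_monthly:,.2f}/month")
```

Under these assumed access patterns the retrieval and egress fees dominate; with little or no retrieval the cloud deep-archive tier stays far cheaper, which is exactly the cold-versus-active distinction discussed later in this Q&A.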

Q3: You mentioned that SAVARTUS delivers storage solutions based on optical and tape technologies. Can you elaborate on the ideal scenario for each of those diverse solutions?

Ans: At SAVARTUS, we leverage optical and tape technologies to provide tailored storage solutions, each excelling in distinct scenarios, complemented by our software for unified data access. These offerings address the diverse needs of organizations managing explosive data growth, from creation to destruction, while tackling challenges like cost, security, and accessibility. Here’s how each technology fits into an ideal scenario.

Our optical storage shines in scenarios demanding random access, robust security, and sustainable long-term retention for moderate to large data volumes. The ideal use case is an organization needing an on-premise active archive—data that’s not accessed daily but requires quick availability for operational or analytical purposes, such as AI/ML training, compliance audits, or historical analysis. Imagine a media company managing decades of video assets. They need to archive raw footage (perhaps 100 GB files) for potential reuse in new projects or AI-driven content curation. Our optical library, with a time-to-first-byte of 2-5 minutes, delivers this data far faster than cloud deep-archive SLAs of up to 12 hours (as with AWS Deep Archive). The offline, air-gapped nature of optical discs protects against ransomware, critical for high-value digital assets, while near-zero idle power aligns with sustainability goals. Another prime scenario is regulated industries like healthcare or finance, where data must be retained for 7-10+ years, accessed occasionally for audits, and destroyed securely per GDPR or HIPAA. Our Data Lifecycle Management (DLM) software components enhance this by managing the lifecycle—encrypting data on write, tracking retention periods, and ensuring compliant deletion. With features like customer-managed keys and WORM compliance, we justify a $0.004/GB/month premium over Deep Archive ($0.00099), offering enterprise-grade security and speed at a fraction of Veritas’s $5-$15/GB/month.

Tape storage, a stalwart in our portfolio, is ideal for deep, cold storage—massive datasets retained for compliance or disaster recovery with minimal access needs. The perfect scenario is a large enterprise or government agency archiving petabytes of historical data (tax records or seismic surveys, for example) that’s rarely touched but must be preserved indefinitely at the lowest possible cost. Consider a financial institution mandated to keep transactional data for decades. Tape’s ultra-low cost—comparable to Deep Archive’s $0.00099/GB/month—makes it unbeatable for static, multi-PB archives. Our tape solutions, paired with DLM software modules, automate tiering from active systems to tape, ensuring data moves seamlessly from creation to long-term storage. The physical air gap provides unmatched resilience against cyber threats, and tape’s 30+ year lifespan suits archival needs where retrieval time-to-first-byte (TTFB) isn’t urgent. For example, restoring a backup after a disaster fits tape’s strengths—high capacity (18 TB per LTO-9 cartridge, for instance), high 400 MB/s transfer rates, and durability outweigh a higher TTFB. Tape also excels for organizations prioritizing sustainability and cost over speed. A research institute storing climate data might use tape to keep 50 PB at a fraction of cloud or HDD costs, with our DLM software modules cataloging metadata for eventual retrieval or destruction, ensuring compliance with grant mandates.
Regarding the role of Data Lifecycle Management software: our DLM components tie these solutions together, managing data, as I like to say, with intention from creation through destruction. For optical, it enables encryption, key management, and fast indexing, supporting active use cases like AI-driven insights from archived data. For tape, it automates migration to deep storage, tracks retention, and schedules secure deletion, optimizing cost and compliance. This holistic approach—optical for accessibility and physical WORM, tape for cost and scale—lets organizations adapt to diverse needs.

Why both? Optical storage suits mid-tier archiving (approximately 100 TB-5 PB) with frequent-enough access, offering speed and security at $0.004/GB/month. Tape dominates ultra-large, static archives (1 PB+). Together, with our DLM software components, SAVARTUS delivers a spectrum of solutions—reasonably fast, secure (physical WORM) optical for active archives, and economical tape for deep storage and retrieval with high transfer rates—meeting the full lifecycle demands of modern data management.
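
For a quick sense of scale on the 50 PB research-institute example above, a rough media count can be derived from the LTO-9 native capacity figure quoted in the answer. This ignores compression, so treat it as an upper bound on cartridges.

```python
# Rough cartridge count for a 50 PB climate-data archive on LTO-9 media,
# using the 18 TB native capacity per cartridge quoted above.
archive_pb = 50
tb_per_cartridge = 18

cartridges = (archive_pb * 1000) / tb_per_cartridge
print(f"{archive_pb} PB is roughly {cartridges:,.0f} LTO-9 cartridges (before compression)")
# -> about 2,778 cartridges
```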

Q4: Your customers can implement your solutions as cold storage, more of a traditional archive, or as an active archive. Can you explain the advantages of cold archive vs. active archive?

Ans: We recognize that organizations will want to use our optical and tape solutions in different ways—cold storage, traditional archiving, or an active archive. Cold archiving is what we lean on for data you stash away and rarely touch, like with our tape or the budget tier of our optical setup. It’s all about keeping costs crazy low—think tape matching AWS Deep Archive at $0.00099/GB/month. Say a government agency’s got decades of tax records to store; they don’t need them often, maybe just for an audit every few years. Cold storage is perfect here because it’s dirt cheap—tape and optical barely use power when idle—and it’s built to last, with tape good for 30+ years and our discs up to 100 years. Plus, being offline keeps it safe from hackers. Our DLM software components seal the deal by tracking retention and scheduling secure deletion, like for GDPR compliance. The trade-off? Retrieval times depend on file size and are higher for optical than with tape.

Active archiving, though, is where tape and optical each have their individual advantages. Our optical solution, at $0.004/GB/month, really pops for random-access data and small files you want to grab more often. Tape, on the other hand, shines with its low cost and high transfer rate, making it specifically attractive for retrieving large files and high capacities. Both optical and tape are available with a Glacier front end, supporting on-prem and hybrid environments. With our DLM software components adding encryption on write and fast search, it’s great for things like a retailer pulling archived sales data to predict trends. The catch is it costs more than cold—$0.004 vs. $0.00099—but you’re paying for that quick access and extra utility.

Basically, cold storage is your friend for cheap, long-term storage of stuff like backups or regulatory records, while active archiving is the way to go if you’re actively using archived data, like for AI or quick audits. Our Unified Data Interface software keeps it all smooth, managing the whole lifecycle either way.
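
The file-size trade-off Rick describes can be sketched with a simple retrieval-time model: total time is time-to-first-byte plus transfer time. The optical TTFB range and the 400 MB/s tape transfer rate come from this Q&A; the tape TTFB and the optical throughput in the sketch below are placeholder assumptions for illustration only.

```python
# Toy retrieval-time model: time-to-data = time-to-first-byte + size / throughput.
# Optical TTFB (2-5 min) and tape transfer rate (400 MB/s) are quoted in the interview;
# the tape TTFB and optical throughput used here are placeholder assumptions.

def retrieval_minutes(size_gb, ttfb_min, throughput_mb_s):
    transfer_min = (size_gb * 1000) / throughput_mb_s / 60
    return ttfb_min + transfer_min

for size_gb in (1, 100, 10_000):
    optical = retrieval_minutes(size_gb, ttfb_min=3, throughput_mb_s=150)    # assumed optical throughput
    tape    = retrieval_minutes(size_gb, ttfb_min=10, throughput_mb_s=400)   # assumed tape TTFB
    print(f"{size_gb:>6} GB  optical ~{optical:7.1f} min   tape ~{tape:7.1f} min")
```

With numbers like these, small files come back noticeably faster from optical (the TTFB dominates), while very large files favor tape once its higher transfer rate outweighs the longer time to first byte.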

Q5: You mentioned AI as a driver of data storage demand, how does active archive support AI workflows based on your product solutions?

Ans: You’re right—AI is a huge driver of data storage demand these days, and our active archive solution at SAVARTUS is built to support those workflows at the back end. AI creates heavy demand for high-performance storage (HDD/flash) during data collection, preparation, model development, training/retraining, and model deployment. At the same time, after deployment the large volumes of model data must be archived for future reference, retraining, and regulatory reasons. The S3/Glacier interface of our tape and optical archive offerings is well suited to the object-storage-based front ends typical of AI pipelines.
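
As a minimal sketch of that backend flow, the snippet below writes a post-deployment model artifact to an S3-compatible archive endpoint with retention metadata attached. The endpoint URL, bucket, key, file name, and tag values are hypothetical; the boto3 calls themselves are standard S3 API operations.

```python
# Sketch: archive a trained-model checkpoint through an S3-compatible interface,
# recording why it must be retained. Endpoint, bucket, and values are hypothetical.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://archive.example.internal",   # S3/Glacier-style archive front end (assumed)
)

with open("model-v42.ckpt", "rb") as artifact:          # hypothetical checkpoint file
    s3.put_object(
        Bucket="ai-model-archive",
        Key="models/recsys/v42/model-v42.ckpt",
        Body=artifact,
        Metadata={"training-run": "2025-03-18", "retention": "7y"},
        Tagging="purpose=retraining",
    )
```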

Q6: SAVARTUS recently joined the Active Archive Alliance (AAA). What benefits do you hope to enjoy as a part of AAA?

Ans: SAVARTUS’s membership in the Active Archive Alliance is a strategic move, and we anticipate several meaningful benefits from this affiliation. The Active Archive Alliance brings together a knowledgeable group of industry leaders focused on advancing active archiving, and we see this as an opportunity to strengthen our position in the storage landscape.

One key benefit is the chance to network with this expert community and gain insights into the future of active archiving. With our optical solution we’re eager to collaborate with peers like IBM or PoINT Software and Systems to refine our approach. Their perspectives could help us enhance features like retrieval speed or indexing, critical for customers accessing historical data for AI-driven analytics, such as a retailer optimizing predictive models with years of sales records.

Membership also elevates our influence. By engaging with the Alliance, we can contribute to shaping industry standards, showcasing how our optical technology complements tape’s established role while meeting modern demands—think fast, secure access alongside cost-effective deep storage. This could resonate with enterprises, like a financial institution managing audit-ready archives.

Finally, the Active Archive Alliance’s credibility enhances our reputation. The Alliance is all about vendor-neutral, trusted solutions, so affiliation with a respected body signals trust to clients and reinforces our DLM software components’ role in managing data lifecycles end-to-end. For a healthcare provider securing patient data, that trust could be decisive. Being part of that club tells customers we’re serious about solving their data growth headaches—cost, security, you name it. It’s like a stamp of approval that could sway a media company picking us over a cloud-only option for their video assets.

In short, we aim to leverage expertise, influence trends, and build credibility through the Active Archive Alliance—sharpening our game, getting our name out there, and keeping our optical solution shining as the go-to for active archiving.

Q7: Finally, when you are not slaving away for SAVARTUS, what do you enjoy doing in your free time these days?

Ans: When I’m not working, I hunt, fish, camp, hike, scuba dive, skydive, and train our Malinois. I love being outside!

Thanks for your time, Rick, and we wish you a lot of success with SAVARTUS and in the Active Archive Alliance. Check out the SAVARTUS website at: https://savartus.com

Read More

Revolutionizing Data Management Through Embedded Metadata Extraction

Reading Time: 7 minutes


Executive Q&A with David Cerf, Chief Data Evangelist and Head of U.S. Operations, GRAU DATA

Q1: Welcome, David, to this Fujifilm Insights Blog – Executive Q&A! Please tell us a bit about GRAU DATA and your current role as Chief Data Evangelist and Head of U.S. Operations and International Sales.


Ans: Thanks for having me, Rich! At GRAU DATA, we’re addressing one of today’s most critical challenges: managing the explosive growth of unstructured data. Our mission is to tackle these challenges and help organizations prepare for the exponential data growth expected over the next decade.

In my role, I wear two hats. As Chief Data Evangelist, I focus on educating the market about the transformative potential of metadata in modern data management. Metadata is often the key to unlocking insights from complex datasets. As Head of U.S. Operations and International Sales, I’m dedicated to expanding the global reach of our solutions, like MetadataHub, which empowers organizations to unlock the full value of their unstructured data by capturing its content and context.


Q2: We have been hearing a lot about MetadataHub. Can you describe that product for us?

Ans: MetadataHub is a game-changer for organizations drowning in unstructured data. Whether that data comes from sensors, microscopes, or satellites, the challenge isn’t just storing it—it’s figuring out how to use it. That’s where MetadataHub shines.

It connects directly to your storage systems—like SMB, NFS, and S3—and automatically extracts all types of metadata, including the critical embedded metadata that tells you the content and context of your files. MetadataHub can uniquely open files, including application-specific files, to extract the content and contextual value needed to derive insights for analytics, AI, and training LLMs. Embedded metadata is often overlooked or not easily accessible, but it provides the crucial content and context that make unstructured data truly actionable.
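
As a generic illustration of what embedded metadata looks like (this is not MetadataHub’s API), the short Python sketch below uses Pillow to pull the EXIF fields and dimensions hidden inside an image file. The file name is hypothetical.

```python
# Generic sketch of extracting embedded metadata from an image with Pillow:
# the kind of content and context (camera, capture time, dimensions) that lives
# inside files. Not MetadataHub's API, just an illustration of the concept.
from PIL import Image, ExifTags

def embedded_metadata(path):
    with Image.open(path) as img:
        record = {"format": img.format, "width": img.width, "height": img.height}
        for tag_id, value in img.getexif().items():
            tag = ExifTags.TAGS.get(tag_id, tag_id)   # numeric tag -> readable name
            record[str(tag)] = str(value)
    return record

print(embedded_metadata("sample_micrograph.tif"))     # hypothetical file
```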

Think of MetadataHub as a dynamic, searchable repository that acts as a smart proxy for your unstructured data, eliminating the need to constantly recall full files from storage. It integrates seamlessly with modern tools like Snowflake, Databricks, Jupyter Notebooks, KNIME, and LLMs, automating data provisioning to feed data pipelines. By delivering rich metadata without moving the file, MetadataHub accelerates decision-making, streamlines workflows, and maximizes the value of your data. For industries like life sciences, HPC, or manufacturing—where petabytes of research data or billions of sensor readings are the norm—MetadataHub isn’t just helpful; it’s transformative.


Q3: Can you elaborate on some of the key benefits for end users by leveraging embedded metadata and MetadataHub specifically?

Ans: Leveraging embedded metadata with MetadataHub unlocks a wide range of benefits for end users. By extracting critical insights directly from files, MetadataHub provides the content and context organizations need to optimize operations and enhance decision-making. For organizations struggling with scattered storage systems or siloed data, this means reduced costs, better data quality, and improved accessibility.

The most immediate impact users see is in data preparation time. By automating metadata extraction, we’ve managed to reduce data preparation time by up to 90%. This automation means researchers and IT administrators can focus on applying their expertise rather than spending countless hours organizing and preparing data. They get faster results and can access the needed data without waiting for manual processing.

Resource efficiency is another crucial benefit. MetadataHub delivers metadata directly to applications without moving entire files, dramatically reducing infrastructure demands. Since the extracted insights are typically 1,000 times smaller than the original file, we see up to 30% reduction in network, CPU/GPU, and storage loads.
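
A back-of-the-envelope calculation shows how a 1,000x size difference translates into reduced data movement. The access mixes below are assumptions for illustration only; the actual savings depend on how often a workflow still needs the full file.

```python
# Back-of-the-envelope view: if the metadata proxy is ~1/1000 the file size,
# bytes moved depend mostly on how often the full file is still recalled.
# The recall fractions below are assumptions, not measured figures.

file_gb = 10
proxy_gb = file_gb / 1000             # "typically 1,000 times smaller"

for full_file_fraction in (0.0, 0.3, 0.7):
    bytes_moved = full_file_fraction * file_gb + (1 - full_file_fraction) * proxy_gb
    saving = 1 - bytes_moved / file_gb
    print(f"full-file recalls {full_file_fraction:.0%}: ~{saving:.1%} less data moved per access")
```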

This efficiency translates directly into cost savings. Our system enables intelligent storage decisions based on a file’s actual content and business value, rather than simple metrics like age or access history. When combined with data movers and orchestration tools, MetadataHub automates workflows and facilitates smarter data migration. Organizations can move data from expensive storage to archives based on true business value, reducing storage costs by up to 30%.

The impact on data accessibility is equally significant. MetadataHub breaks down traditional data silos that often plague large organizations. Users can seamlessly discover and access data across the entire organization, regardless of where it’s stored. This global accessibility ensures that valuable insights don’t remain trapped in isolated systems.

Perhaps most importantly for today’s organizations, MetadataHub makes data AI-ready. It transforms raw data into actionable insights that can be immediately used in analytics, artificial intelligence, and machine learning workflows. These insights are formatted and structured to work seamlessly with the tools and applications that modern data scientists and analysts rely on daily.


Q4: Can you describe your typical target customers for MetadataHub?

Ans: Our ideal customers are organizations managing massive volumes of unstructured data. This includes scientific and research institutions such as aerospace agencies, pharmaceutical companies, life sciences organizations, national labs, universities, high-performance computing facilities, and manufacturers. These organizations often generate large datasets from specialized equipment like electron microscopes, genomic sequencers, and satellite imaging systems.

MetadataHub is also an excellent fit for organizations with extensive data archives, particularly those utilizing tape archives or deep storage systems. It enables these customers to capture critical insights before files are moved to cost-effective storage, ensuring that data remains actionable even in long-term storage.

Additionally, MetadataHub is invaluable for any organization with unstructured data seeking to enhance data quality for AI and large language model (LLM) training, making their data more usable and impactful for advanced analytics and machine learning workflows.


Q5: You just mentioned the benefit of getting the data “AI-ready.” Can you explain how that works?

Ans: The journey to reliable AI begins with data quality. When organizations work with poor-quality data, they inevitably face unreliable AI outputs, slower workflows, and missed opportunities. We’ve found that the key to solving this challenge lies in embedded metadata – those crucial details hidden within files that provide essential context and content for AI applications.

Making unstructured data truly AI-ready requires addressing three fundamental challenges. First, there’s the complexity of managing diverse file types. Unstructured data comes in countless formats, each with its own unique structure and complications that make standardization challenging. Second, organizations must handle increasingly massive data volumes. We’re talking about processing millions or even billions of files efficiently – a scale that demands robust automation.

The third and most critical challenge involves extracting embedded metadata effectively. Modern scientific instruments and sensors generate files that are rich with metadata, often containing hundreds or thousands of vital elements. These include experimental conditions, equipment settings, and measurement parameters – all crucial details that determine the quality and usability of the data for AI applications.

MetadataHub tackles these challenges through automated processing at scale. Our system harmonizes both content and context, making data immediately actionable regardless of its source or format. We place special emphasis on data provenance – tracking the complete history of data from its origin through every transformation. This comprehensive tracking builds the trust, repeatability, and accountability that are absolutely essential for successful AI and machine learning applications.


Q6: I’ve heard you talk about relieving “performance anxiety” when it comes to deep storage such as data tape libraries. Can you tell us about that benefit in more detail?

Ans: Tape archives often come with “performance anxiety” due to the latency in recalling files. However, you don’t need the entire file in most cases—just the insights within it.

MetadataHub solves this by acting as a proxy, capturing critical metadata, and making it instantly accessible. This ensures that all content and context are readily available for AI workflows or applications without the need to recall files from the archive unless absolutely necessary.

This approach creates an “active archive,” where critical insights are captured immediately, enabling files to move to archival storage sooner. Those insights remain accessible regardless of where the file resides, so organizations can migrate files off expensive performance storage earlier, saving costs without losing quick access to the data they need.

For example, the Zuse Institute manages 1 PB of high-performance SSD storage and over 200 PB on tape. By using MetadataHub, they capture critical metadata from files on performance storage, providing instant access to insights without recalling the original files. Once the metadata is captured, Zuse migrates the files to secure, low-cost archival solutions—enterprise tape—while retaining immediate access to actionable metadata.


Q7: You also have a tool included in MetadataHub that you refer to as your “Data Landscape Report.” What exactly does that report do, and how does it benefit end users?

Ans: Great question! The Data Landscape Report is about solving one of the biggest challenges our customers face—”Where is my data?” It provides a clear, 360-degree view of all your unstructured data, no matter where it’s stored—on-premises, in the cloud, or across vendor storage systems like SMB, NFS, and S3.

We hear this from organizations of all sizes. They’ve got storage scattered between on-site systems and the cloud, but there’s no single view to show where everything is or how it’s being used. That’s where the Data Landscape Report comes in—it consolidates all that information into one place.

Now, this isn’t just about tracking file counts or sizes. The report delivers actionable insights, such as file types, data age, usage patterns, and storage efficiency. These insights empower organizations to optimize their data management in several ways. Teams can intelligently move data to the most cost-effective storage tiers, ensuring efficient resource utilization. The report’s comprehensive view makes data migrations straightforward and seamless, eliminating the guesswork typically involved in these projects. Additionally, it strengthens compliance and governance efforts by providing clear visibility into data provenance and history.
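
For a sense of the kind of inventory such a report consolidates, here is a minimal sketch (not the Data Landscape Report itself) that walks a mounted share and totals capacity by file type and age bucket. The mount path is hypothetical.

```python
# Sketch: aggregate capacity on a mounted share by file extension and age bucket,
# a simplified view of the inventory a data-landscape report consolidates.
import os, time
from collections import defaultdict

ROOT = "/mnt/research-share"          # hypothetical SMB/NFS mount point
now = time.time()
by_type, by_age = defaultdict(int), defaultdict(int)

for dirpath, _, filenames in os.walk(ROOT):
    for name in filenames:
        try:
            st = os.stat(os.path.join(dirpath, name))
        except OSError:
            continue                                  # skip unreadable files
        ext = os.path.splitext(name)[1].lower() or "<none>"
        age_years = (now - st.st_mtime) / (365 * 24 * 3600)
        bucket = "<1y" if age_years < 1 else "1-5y" if age_years < 5 else ">5y"
        by_type[ext] += st.st_size
        by_age[bucket] += st.st_size

for ext, size in sorted(by_type.items(), key=lambda kv: -kv[1])[:10]:
    print(f"{ext:10s} {size / 1e12:8.2f} TB")
print({k: round(v / 1e12, 2) for k, v in by_age.items()})
```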

What sets the Data Landscape Report apart is its speed—customers can see results within hours, making it an invaluable tool for quickly gaining clarity over their data landscape.


Q8: Finally, when you are not slaving away for GRAU DATA, what do you enjoy doing in your free time these days?

Ans: Oregon is my year-round playground. In the summer, I’m often mountain biking through scenic trails or adventure riding on rugged backroads. When winter comes, I swap the wheels for a snowboard and hit the slopes. Exploring Oregon’s landscapes keeps me energized and inspires the creativity and problem-solving approach I bring to GRAU DATA, helping us tackle some of the most complex data challenges out there.

Thank you, David, for your time. We wish you continued success with GRAU DATA and MetadataHub!

For more information on MetadataHub, visit GRAU DATA’s website or http://Moremetadata.com

Additional resources:
How Data’s DNA Drives Innovation: Transform how your organization discovers, analyzes, and innovates.

2025: The Year Metadata Transforms Data Management


Read More

Rapid Change in Storage Landscape and New Breed of Tape Libraries Drive Innovative Archival Data Storage Solutions from Versity

Reading Time: 6 minutes

Executive Q & A with Bruce Gilpin, CEO of Versity

Welcome, Bruce, to this Fujifilm Insights Blog Executive Q & A!

    1. Please tell us about Versity’s archival storage products and your role and responsibility as CEO.

    Answer:

    One of the really fun and at the same time painful things about starting your own company is that you and your co-founders will have every job at some point. At the beginning, Harriet Coverston was our CTO and lead developer and I did everything else, including finance, accounting, marketing, HR, product management, sales, and business development. Over the past 13 years the company has grown tremendously, and we now have a fully staffed management team of very high-caliber professionals, so my job these days is focused more on product strategy, company strategy, and managing my team.

    In terms of product, Versity is laser focused on the large-archive data management market segment. Fortunately for us, this is the part of the archival storage market that is still growing, and with the dawn of AI, our HPC business is absolutely booming. Scale Out Archive Manager (ScoutAM) is our main product: a comprehensive, modern, super-scale solution for mass storage and large archives. We are rapidly consolidating a market that was traditionally spread more or less equally among IBM’s HPSS and Spectrum Archive products, HPE’s DMF product, Quantum’s archive manager, and Oracle’s OHSM and SAM-QFS products. Until Versity came along, there was never a clear leader in the market for various reasons, including lack of product differentiation and high switching costs.

    2. Versity has always delivered archival storage solutions to typical HPC customers like the National Labs and Department of Defense. Can you share a bit about the evolving customer profile you are now seeing in your customer base?

    Answer:

    Yes, we are seeing strong growth in our U.S. Government HPC business, and in addition we have branched out and gained a lot of traction within the military and civilian government sectors of U.S. allies such as the UK, Australia, New Zealand, Germany, and France. These customers value air-gapped data for security and they need modern solutions that can be supported within classified enclaves. With our global partners, such as Dell Technologies, we are able to meet the stringent requirements of this customer base. Outside of AI and traditional HPC, we are seeing growth in the media and entertainment, genomics, and banking and finance industries. Recently we won our first high-frequency trading contract, and this niche is likely to grow. People love to say that it’s a gloomy time in the archival data world or the tape storage business. We don’t see that at all. Our business has grown over 30% every single year like clockwork, and recently that growth has accelerated. My conclusion is that at the high end of the market, where the amount of data is over 100 PB, the growth is accelerating. The issue for our competitors is that very few products function well or really have a solid value proposition at those capacity levels, where things tend to get very complex and performance sensitive.

    3. Among your expanding customer base, Versity is recognized for its scalability, performance, ease of use, and cost-effectiveness. What else is driving demand among your prospects and target customers?

    Answer:

    Environmental sustainability is becoming a critical factor. Our solutions, like ScoutAM, are designed to optimize power consumption and overall storage expenses, aligning with organizations’ goals to reduce their carbon footprint. AI systems are power hungry, as we all know, so people are looking very hard at ways to balance the power-intensive part of the data center with something that is dramatically more efficient. We can deliver that with the combination of ScoutAM and the new class of rack-modular tape libraries with field-replaceable components.

    4. It’s probably not that long ago that your customers had the challenge of managing archives in the tens or hundreds of petabytes range. But I understand you are now seeing multiple exabyte requirements these days; what is driving the increased storage demand?

    Answer:

    We used to hear about a one exabyte bid or requirement once every year or two and we would be really excited to work on a project of that scale. Today we are working on six different projects that are over one exabyte each and one of them is a greenfield build out for 10 exabytes of capacity. It has really been interesting to see this acceleration take place, and the timing is absolutely perfect for Versity since we gambled with our product plan four years ago and decided to put all of our resources into creating a super scalable solution aimed at the highest capacity users. It was not obvious at the time that this would pay off for us, but it put us in the leadership position at just the right moment. What is driving this is AI and the exponentially increasing capability of various sensors to capture data. We have much more sophisticated sensors proliferating on new sensor platforms – on the ground, in the air, in the ocean, and in space. They all generate data that needs to be effectively managed and secured in a stable long-term repository.

    5. You mentioned AI as a key driver of increasing storage demand among others. But everyone talks about high-performance storage like SSDs associated with AI models. What is the archival demand that is being driven by AI?

    Answer:

    Flash is used in the scratch tier to power AI clusters and it is super effective. Companies like Vast and Weka have done a great job advancing the state of the art for scratch storage – meaning the storage that is closest to the CPUs and GPUs. But nobody has enough money to store the source data on flash, so we see smaller 1-10 PB flash systems paired with 100-1,000 PB archival systems, and in this configuration the archival data is the persistent copy. Usually, we are keeping a persistent copy of the original source data from sensors, and sometimes we are keeping copies of checkpoints or the output of the AI cluster, whether that is a visualization or some other analysis.

    Q6: Tape systems have always been a standard component that Versity software manages. How are the new tape library systems like Spectra’s Cube or IBM’s Diamondback changing the tape value proposition among your customer base compared to the previous generations of tape libraries?

    Answer:

    These new rack modular systems are very interesting to our user base. I think it is common knowledge at this point that all of the hyperscalers have shifted totally to this model, so clearly there are benefits, including field-serviceable robotics. The rack modular option is not a fit for every use case by any stretch, but for very large sites that do a lot of random reads, it has a pretty compelling value proposition, and it opens up the tape market for sites that need simplicity and scalability. Each rack can hold around 30 PB of LTO-9 media, so these make very useful building blocks. Versity offers a per-rack pricing model to help sites keep all of the cost elements within a modular framework, and our S3-to-tape capability is also a great fit for sites that want the modular libraries but still want to maintain the advantages of feeding the libraries with an independent vendor solution. This allows them to mix and match library vendors or maintain a dual-supplier relationship to balance risk and optimize pricing on very large systems.
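
    As a quick sizing illustration using the roughly 30 PB-per-rack figure above (and the exabyte-scale projects mentioned earlier in this Q&A), the sketch below estimates how many rack-modular libraries a given capacity implies. The capacity targets are illustrative.

    ```python
    # Rough rack count from the ~30 PB of LTO-9 media per rack quoted above.
    import math

    pb_per_rack = 30
    for target_eb in (1, 10):                       # illustrative targets
        racks = math.ceil(target_eb * 1000 / pb_per_rack)
        print(f"{target_eb} EB -> about {racks} rack-modular libraries")
    # -> roughly 34 racks per exabyte; about 334 for a 10 EB build-out
    ```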

    Q7: Finally, when you are not slaving away for Versity, what do you enjoy doing in your free time?

    Answer:

    Well, I’m really lucky to live in a part of the world where there is a vibrant outdoor sports community. I am hooked on a newer winter sport called snow kiting and in the summer, I like to kite foil, run, hike, and mountain bike. I have been known to disappear on a meditation retreat once in a while. My two children are in college now, so I have a lot of freedom and I am enjoying every second of it!

    Thanks for your time Bruce, and we wish you a lot of success with Versity’s innovative archival solutions!

    Check out Versity’s website for more info!

    To listen to the audio version of this Q & A, CLICK here.

    Read More

    New Tape-as-a-Service from Geyser Data Delivers Benefits of Tape in a Cloud Based Subscription Model

    Reading Time: 4 minutes

    Executive Q & A with
    Nelson Nahum,
    CEO of Geyser Data

    Q1: Welcome Nelson to this Fujifilm Insights Blog Executive Q & A! Please tell us about Geyser Data’s new tape-as-a-service model and your role and responsibility as CEO.

    Ans: Thank you for having me, Rich. Let me begin with my story on why we created Geyser Data. In a world now forever changed by AI, extracting value from data has become easier than ever before. At the same time, the rate of creation of new data is explosive, so we need to look at tape as a great solution for certain storage workloads. Tape is low-cost media, it requires minimal power, and it can be protected from cybersecurity threats by simply air gapping it. However, historically tape lost ground to hard disks because tape libraries were difficult to manage and required a lot of capex. At Geyser Data, this is what we solve: people can use tape-as-a-service without the need to buy and manage tape libraries. And it can be used with the simplicity of S3 APIs and on a pay-per-month subscription model.

    Q2: Your most recent background prior to Geyser Data is with Zadara. What was your role there and how did that experience help establish your vision for Geyser Data?

    Ans: I co-founded Zadara and ran it for many years until we reached a substantial size with more than 500 cloud locations. At Zadara we pioneered “On premise-as-a-service-storage”. Although Geyser is a different type of storage, my experience allowed me to bring this new idea to light very fast, hire a magnificent team, and build the strategic partnerships necessary to be successful.

    Q3: Tape can currently be consumed in a cloud model by simply engaging some of the well-known deep archive cloud service providers. What makes Geyser Data different?

    Ans: The reason we called the company Geyser is because we allow the user to extract the hot value of the data underneath the glacier. To that extent, some of the critical differences between Geyser and the cold archive tiers of the cloud providers are that we don’t charge egress fees or retrieval fees, so there are substantial cost savings. Our “Cloud Tape Libraries” have dedicated tapes per user, so the user knows even the barcodes of their tapes and can ask to have the tapes returned to them. Our Cloud Tape Libraries can be “air gapped” for cyber protection and “remounted” instantly when needed. In addition, our Cloud Tape Libraries are multi-cloud, so they can be connected to the traditional cloud to copy the data for further processing in the cloud. In short, there are many differences. Our Cloud Tape Libraries are really good for people who want to use their data from time to time and who want to make sure they are in control of their own data. They also want to save money, which is always a key motivator!

    Q4: Tell us about your go to market strategy and who are your target customers?

    Ans: Our go-to-market strategy is via the reseller channel. We have two types of customers. The first are customers that use tape on premise today and want to move to the cloud or have DR in the cloud. The other target customers are those that don’t use tape today but have workloads like archive or active archive sitting on disk or in cloud storage. Now, they can easily use Geyser Data with the same S3 interface and save a lot of money.

    Q5: What are your plans beyond the U.S. market?

    Ans: We are definitely building a global Cloud Archive; we have multiple international partners that are ready to go. We will start making announcements soon, probably even before the end of the year and in early 2025.

    Q6: What are you seeing in the world of data storage that is creating the need for this service offering?

    Ans: Cold data is 70% of all data. But cold data becomes “warmer” when you really want to extract value from it. I believe there is not enough manufacturing capacity of disk drives to store all the data that customers want to store. There would not be enough IT budget or energy available either! By making tape easier to use, I believe tape will have a much more prominent role as the demand for long term massive storage continues to explode.

    Q7: I understand you have had a successful launch within Digital Realty. What makes tape-as-a-service attractive to co-location data center service providers?

    Ans: As I mentioned before, tape is very low power, and availability of energy is a big concern for colos, especially as AI deployments increase. The Spectra Logic Cube library that we use can store 30 PB using only 1.2 kW, an insignificant amount compared to equivalent HDD storage! Also, tape is much denser than hard disk, so it consumes less floor space too. Finally, Digital Realty has this amazing network fabric that interconnects more than 700 data centers. Any customer of these data centers can establish a private connection today to Geyser Cloud Tape Libraries with just a few clicks.
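
    Taking the figures Nelson quotes at face value, a short calculation shows the energy intensity they imply for the tape tier. This is derived only from the 30 PB and 1.2 kW numbers above, not a vendor specification.

    ```python
    # Energy per stored terabyte implied by the quoted figures:
    # 30 PB of capacity behind a library drawing about 1.2 kW.
    capacity_tb = 30 * 1000
    power_kw = 1.2
    hours_per_year = 24 * 365

    kwh_per_year = power_kw * hours_per_year
    print(f"{kwh_per_year:,.0f} kWh/year for {capacity_tb:,} TB "
          f"= {kwh_per_year / capacity_tb:.2f} kWh per TB per year")
    # -> ~10,512 kWh/year, i.e. roughly 0.35 kWh/TB/year for the tape tier
    ```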

    Q8: Finally, when you are not slaving away for Geyser Data, what do you enjoy doing in your free time?

    Ans: I love Formula 1 racing, I love watching soccer and I’m really excited for the upcoming World Cup in the U.S.!

    Thanks for your time, Nelson, and we wish you a lot of success with Geyser Data’s innovative tape-as-a-service solution!

    Check out Geyser Data’s website for more info!

    Read More

    Cutting-Edge Data Storage Solutions from XenData Include LTO Data Tape

    Reading Time: 5 minutes

    Executive Q & A with Dr. Phil Storey, CEO of XenData

    Q1: Welcome Phil to this Fujifilm Insights Blog Executive Q & A! Please tell us a bit about XenData and your role and responsibility as CEO.

    Ans: Thank you, Rich. Our CTO, Mark Broadbent, and I started XenData over 20 years ago, and our product concept has not really changed in that time.

    We wanted to develop software to manage tape libraries and RAID for long-term secure storage of files and to combine the best characteristics of each: disk access times combined with the speed, security, and longevity of tape.

    The other thing that we wanted was a different business approach. Remember that we started XenData just after the dot com era when there were a huge number of risky businesses that started and then failed. We wanted a solid business with reliable products and great support – which I think we have.

    Q2: XenData has been a long-time sponsor of the Active Archive Alliance and you continue to innovate in the area of active archiving. Can you tell us about your LTO Archive Appliances?

    Ans: I mentioned that our business concept was to develop software. Well, we learned along the way that customers want solutions that are as turnkey as possible. So today, we mainly sell archive appliances managed by XenData software. By supplying the combined hardware and software, we are able to guarantee performance and minimize any possible problems that come with unbounded hardware options.

    Our LTO archive appliances manage one or more LTO libraries and include managed RAID, from a few TB to a PB of disk. We support almost all tape libraries, including from Dell, HPE, IBM, Spectra, Quantum and Qualstar. Our appliances make writing to LTO just like writing to disk on a network. We also have a private cloud interface. We can replicate to public cloud, etc. I could go on. But in summary, by combining disk, tape and cloud we do provide strategic active archive options for our customers.

    Q3: So it seems LTO tape still has a lot to offer, what are the key features and benefits appreciated by your customer base?

    Ans: In summary, I would say LTO supports high-performance active archiving at a reasonable cost. Most of our customers have at least several hundred terabytes. It is at that volume of data and above where LTO is particularly attractive from a cost perspective. Of course, the fundamentals of tape, high reliability and long life, are a must for our customers. And in these times of climate change, the low energy profile and low carbon footprint of tape are attractive too.

    I should add that we have over 1,500 installations with LTO libraries and about 90% of these are for Media & Entertainment type applications. Our customers include TV stations, Hollywood studios, video production and post-production companies as well as many marketing departments for large corporations and governmental organizations.

    Q4: You mentioned Media & Entertainment customers and their appreciation of your LTO tape solutions, tell us about your new Media Portal software?

    Ans: This is yet another innovative interface into our LTO archive systems. We have our core file-folder interface, which can be accessed via standard Windows network protocols like SMB and NFS, and an option for a private cloud interface that makes accessing the archive like writing to and reading from Amazon Web Services S3; now Media Portal adds a web interface. A user can browse the file-folder structure of their archive, see previews of video files and image files, and then download the files that they need. We also have a search capability based on file and folder names. Media Portal will be released later this year and I am very excited about it, especially as it opens up all sorts of options for future development, including AI options like converting speech to text and then searching on the text.

    Q5: You mentioned that most of your archive installations are for Media & Entertainment applications. Would you tell us more about some of the other application areas?

    Ans: We have installations in video surveillance, healthcare and life sciences applications. A solution that combines both disk and LTO tape is particularly attractive for large video surveillance installations because it offers longer retention periods with massive scalability at a very reasonable cost.  

    Q6: You recently returned from the IBC show in Amsterdam in mid-September. What can you share about the show; what was the buzz overall and in storage specifically?

    Ans: Not surprisingly, AI was a key theme at the show, with a dedicated AI Tech Zone. The show also addressed sustainability, including energy efficiency in devices and delivery systems. So for us, it was super busy as there is lots of demand for long-term energy efficient storage. One of the recurring themes was the realization among our customers and prospective customers that cloud is so expensive for users even with just a few hundred TBs of content.

    Q7: Finally, when you are not globe-trotting and running XenData, what do you like to do in your spare time?

    Ans: Last year, my wife and I moved from California to just outside of Minneapolis to be close to our two lovely granddaughters. As you know, the winters are brutal in Minnesota, but we had a plan for that. Three years ago, we started designing and building a house on the beach in the Yucatan, just north of Merida. We only completed the project in July of this year. So, the answer to your question, ‘what do we do now’ is just one word: relax!

    Thanks for your time Phil, and we wish you a lot of success with your innovative LTO tape solutions!

    Read More

    Tape Gets Even Greener with New Recycling Program for End-of-Life Data Cartridges

    Reading Time: 5 minutes

    Executive Q & A with Gavin Griffiths,
    Founder & Managing Director, Insurgo Technology

    Q1: Welcome Gavin to this Fujifilm Insights Executive Q & A! Please tell us a bit about your role and responsibility as Founder & Managing Director of Insurgo Technology.

    Ans:

    I have been involved with tape media for over 20 years, first as a salesperson selling the old open-reel tapes and the 3480/90 and 3590 media we manufactured here in Wales, UK. This was just about when LTO-1 hit the marketplace to rival S-DLT technology. A 100 GB tape back then was a fantastic feat!

    I started Insurgo Technology nearly 15 years ago. We spent a couple of months working from my house before opening an office for sales and delivery of tape media worldwide. With advancements in tape technology, we recognized the need for a secure destruction solution for tape. Popular methods for end-of-life tape destruction were outdated and no longer matched the technological advances made by manufacturers. Our focus shifted to developing tangible services that would add value for customers. We explored tools for services like testing, repair, and recovery, along with managing and storing tapes. However, it became clear that the secure destruction of tapes required a more progressive approach to ensure data protection and compliance. I now drive this initiative.

    We set up an R&D division in 2011, headed by our Technical Director and inventor, Roy Spiller. Roy has been in the industry twice as long as me and had manufactured the physical tape as well as servicing the machines which made the tape film. We set out to develop a secure data destruction service, in which we could, hand on heart, issue a certificate of destruction with 100% certainty, that the data on the tape was wiped of all data and traces of information.

    While Insurgo was founded on tape sales and some associated aftermarket services, today we have developed the most secure and traceable systems for tape disposal. We can save over half the CO2 impacts of current shredding and incineration whilst ensuring a complete solid chain of custody.

    That’s 300 metric tons of CO2 saved per 1 million tapes.


    Q2: Can you tell us about the breakthrough work you are doing in recycling of data tape cartridges at the end-of-life stage?

    Ans:

    We have developed technology, patented in 53 countries, that dramatically reduces CO2 emissions during the destruction process. Our solution is over 800 times more efficient than traditional shredding machines and 10 times more efficient than standard degaussers. It has the unique ability to time- and date-stamp every step of the process through the chain of custody, irrevocably linking the drives to each tape being processed.

    As part of our ongoing R&D to enhance these processes, we have begun dismantling tapes and repurposing all their main components through an auditable process. We can already demonstrate, through life cycle assessment methodology, a CO2 reduction of over 50% compared to onsite shredding and incineration. We are working on various aspects of the tape film to improve these numbers even further. Our ultimate goal is to find a new purpose for all the tape components.

    Q3: Tape has been around for 70+ years, but I am not aware of tape recycling programs prior to your program. What has been the driver for this initiative?

    Ans:

    Interestingly, we have been repurposing tape media long before plastic recycling became a trend! Our new recycling service paired with our destruction process helps businesses to mitigate their CO2 impact further.

    Despite these environmental efforts, data security must remain the top priority. It is crucial to recognize that the tape landscape has evolved dramatically. The capacity alone went from 100GB to 50TB in 20 years, yet the methods for destruction used two decades ago should be considered obsolete.

    We now offer customers the opportunity to significantly reduce their environmental impact by over half, while ensuring data security and full traceability. It is the ultimate win-win situation!

    Q4: How do you see differences in demand for this recycling program in the EU vs. the U.S.A. and/or the rest of the world?

    Ans:

    The EU enforces stricter controls, requiring emissions targets to be met and published. Many global companies we engage with are considering integrating these practices into their worldwide policies, anticipating future global regulatory requirements. Even without physical recycling, our service benefits both security and CO2 reduction.

    Moving forward, we would like to see a reduction in the data industry’s “end of life” emissions, and to see this approach adopted worldwide. Insurgo Technology is now becoming recognized by global companies, especially banking corporations based in Europe, as they pursue a more secure and environmentally friendly solution to tape end of life.

    Q5: What do you see as the biggest objections to adoption of your recycling program and how do you address those?

    Ans: It is a great question.

    Honestly, I cannot see any reason why anyone would not want to switch to our system immediately! We have spent over 12 years refining and perfecting our technology, providing a compelling story and journey for all who know us. The systems are portable, the software is online, easy to follow and implement, but most importantly secure.

    We have developed our technology, now we need to expand our message and progress our marketing reach much further, making our proposition concise and simple to understand.

    We have the highest level of security and traceability, and we will soon add library automation to further improve process times and handle higher volumes, including hyperscalers’. Add in all the environmental benefits we touched on, and our technology package is the end-to-end world leader in secure data tape destruction.

    Q6: Where can readers get more information about Insurgo data tape cartridge recycling?

    Ans: Details of our technology systems, Investigo software, and scanners can be found directly on the Insurgo website:

    https://insurgo.co.uk/secure-disposal-solutions/

    Q7: Finally, when you are not slaving away for Insurgo, what do you enjoy doing in your free time?

    Ans:

    In fairness, I currently have a great work-life balance. The team at Insurgo Technology works as one to achieve the common goals we set at the start of the financial year, and we all share in the responsibility for these goals.

    With an active teenage family, time flies by: one has newly acquired an eye for interior design, and another enjoys soccer and video gaming, which is something I can relate to and enjoy as a father as well!

    I have a local squash club keeping me in half-decent shape throughout the week, and with both the soccer and football seasons just around the corner, weekends watching the Premier League and the NFL will make up the rest of my time.

    Thank you for the opportunity to discuss all these aspects with you Rich.

    Thanks for your time, Gavin, and we wish you a lot of success with your data tape cartridge recycling program!

    Read More

    There is New Value in Old Data Amid AI/ML Boom

    Reading Time: 7 minutes

    Executive Q & A with Chuck Sobey, Chief Scientist of ChannelScience

    Q1: Welcome, Chuck, to this Fujifilm Insights Executive Q & A! Please tell us a bit about your role and responsibility as Chief Scientist at ChannelScience.

    Ans: Thank you, Rich – I appreciate the opportunity to talk with you.

    ChannelScience is the consulting firm I started in 1996 to provide R&D services for emerging memory and storage technologies. At that time, the Internet boom was just starting and the fear of Y2K was building. My initial focus was hard disk drive (HDD) technology. This grew from my prior experience as a designer of thin film magnetic recording heads for HDDs at Applied Magnetics, near Santa Barbara; and then as a read channel architect at Texas Instruments, in Dallas. Data storage technology was growing even faster than semiconductors at that time, so it was an exciting field.

    To promote my consulting work, I wrote storage technology classes for KnowledgeTek and taught them at practically every company related to data storage over the next two decades. This enabled me to engage with large and small companies to help them develop a wide variety of novel storage innovations. These spanned from ever-smaller HDDs, to laser optical tape, to solid-state memory and storage.

    My current responsibilities at ChannelScience include staying current with the state-of-the-art in storage and memory, signal processing, and error correction coding (ECC), and connecting with customers to help develop their new technologies and prepare them for the market. We are also early proponents of semiconductor chiplets and offer pathfinding and strategy consulting on this rapidly growing technology.

    Q2: Can you tell us about the breakthrough work you are doing in recovery of old data from obsolete tape stock?

    Ans: The “state-of-the-art” of recovering legacy data formats is to locate several vintage drives (often on eBay) and scavenge/refurbish them to make one working drive.  There are several challenges with this, in addition to finding and refurbishing the drives. A sufficient supply of vintage heads and rollers must be secured, because these wear out with use. Operators and technicians must be trained to work with a wide variety of drives that do not have support.

    I often point out that if you refurbish a 1970s tape drive, when you have done a perfect job what you have is a 1970s tape drive. Unfortunately, the 50-year-old vintage tapes are no longer in original condition, so this performance level can be insufficient. That said, it is remarkable how well properly-cared-for tapes have held up. It is my belief that some of what we are learning about decaying magnetic patterns on tapes can be used to continue the improvement of modern tape and drive development.

    Based on this state-of-the-art, we recognized that a modern, multi-format tape reader that could read vintage tapes better than the original equipment would address all of these issues. With my background in head design and read channel signal processing, the answer was clear to me: Use modern, sensitive magnetoresistive (MR) heads and pair them with the latest signal processing algorithms for data detection. Furthermore, with such sensitive heads, we believed we could have minimal contact between the head and tape and still get sufficient signal fidelity for improved detection. Minimal contact means we are gentler with delicate tapes, and the system may need less-frequent head cleaning.

    Furthermore, ChannelScience had already developed methods for extreme recoveries for HDDs, DVDs, and solid-state drives (SSDs) and flash (see links below).

    [http://www.channelscience.com/files/Drive-Independent_Data_Recovery.pdf

    http://www.channelscience.com/files/Drive%20Independent%20Data%20Recovery%20Sobey%20Orto%20Sakaguchi%20TMRC%202005%20D5%20PREPRINT.pdf ]

    Q3:  What was the genesis of your multi-format “Do-No-Harm” legacy tape reader?

    Ans: During the pandemic, a Department of Energy (DOE) Funding Opportunity Announcement (FOA, DE-FOA-0002360, issued December 14, 2020) was published that was seeking proposals for “Digitizing and Analyzing Legacy Seismo-Acoustic Data.” It was from DOE’s Office of Defense Nuclear Nonproliferation Research and Development.

    The Comprehensive Nuclear Test Ban Treaty was signed in 1996, and the last tests in the US were conducted in 1992. A wealth of seismic data was recorded for each test. This information went to two places: Paper graphs and 9-track tape. These test results now represent irreproducible scientific data. Other types of irreproducible data are from scientific instruments that no longer exist, such as specific particle accelerators, telescopes, or seismic exploration of no-longer-accessible locations.

    These data sets have new value now – more than they did decades ago – for a simple reason: AI/ML. It is now possible to examine the entire corpus of data for a range of experiments and train and refine new machine learning (ML) models to do new science and make better predictions and classifications. For example, the ability to distinguish a rogue nuclear detonation from an earthquake or a mine excavating explosion can be vastly improved.
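
    To make this concrete, below is a minimal, hypothetical sketch in Python of the kind of workflow described here: extract simple spectral features from digitized waveforms and train a classifier to separate event types. The frequency bands, labels, model choice, and data are illustrative assumptions (the data is synthetic), not ChannelScience’s or DOE’s actual methods.

        # Hypothetical sketch: classifying digitized seismo-acoustic waveforms
        # (e.g., explosion vs. earthquake). Features, labels, and model choice
        # are illustrative assumptions; the data below is synthetic.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        def spectral_features(waveform: np.ndarray, sample_rate: float) -> np.ndarray:
            """Summarize a 1-D waveform with a few coarse spectral-band energies."""
            spectrum = np.abs(np.fft.rfft(waveform)) ** 2
            freqs = np.fft.rfftfreq(len(waveform), d=1.0 / sample_rate)
            bands = [(0.1, 1.0), (1.0, 5.0), (5.0, 10.0), (10.0, 20.0)]  # Hz, illustrative
            return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])

        # Synthetic stand-in data: 200 waveforms, 60 s each at 40 Hz sampling.
        rng = np.random.default_rng(0)
        waveforms = rng.standard_normal((200, 2400))
        labels = rng.integers(0, 2, size=200)  # 0 = earthquake, 1 = explosion (dummy labels)

        X = np.vstack([spectral_features(w, sample_rate=40.0) for w in waveforms])
        X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

        model = RandomForestClassifier(n_estimators=100, random_state=0)
        model.fit(X_train, y_train)
        print(f"held-out accuracy on dummy data: {model.score(X_test, y_test):.2f}")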

    We are grateful for the support of the Department of Energy for our tape reader project. They awarded us three SBIR (Small Business Innovation Research) grants to develop our breakthrough technologies. We received a Phase I award (DE-SC0021850) to apply machine learning to waveforms from damaged tapes. And we received Phase I and Phase II awards (DE-SC0021879) to develop the prototype of our multi-format legacy tape reader. DOE also provided excellent business training through their Energy I-corps and Phase Shift programs.

    We are now seeking first customers to fund the productization of our prototype. What we currently have is a scientific instrument that is operated by Ph.D. scientists. Our next step is to turn this into a robust product that can be shipped and used by adequately trained operators and technicians. If we are successful with our product, we can “make obsolete media obsolete!”

    [At right is the current ChannelScience multi-format legacy tape reader prototype, shown with 1” analog instrumentation tape mounted.]

    Q4: Beyond the value for AI/ML, what other applications are there and what type of organizations might be interested in this unique capability?

    Ans: The ability to train and refine AI/ML models on rare data sets is certainly the driver for this funding. There are many types of irreproducible experiments that organizations want to use data from. In addition to nuclear weapons tests, these include particle accelerator data, telemetry from space missions, medical records, demographics, business records, and many others.

    Another area that I believe may have even more potential is audio and video tape. Although there are many vintage units still available, they are getting scarcer, and key components are wearing out. As always, the data is deteriorating, so better signal fidelity and signal processing than the vintage equipment can provide are needed. The image below shows the resolution we are able to get out of our prototype system. We can resolve individual transitions and the inter-track gaps.

    [Magnetic force microscope-like image of a 9-track ½” digital data tape, created from ChannelScience’s prototype multi-format tape reader.]

    Surprisingly, international diplomacy is another area where we’ve discovered unique opportunities. For example, there is a wealth of under-utilized data in former Eastern Bloc countries. It is stored in non-Western formats and there has been much less focus on recovering these rare data sets. With targeted development, we are confident that our tape reader can recover any such legacy data. Providing technology to access a country’s valuable legacy data is a diplomatic approach the US Department of State has used in the past.

    Another unique opportunity, “sovereign AI,” was described by Jensen Huang (CEO of NVIDIA) in a recent interview. He envisions every country training its own large language model (LLM, like ChatGPT), based on their language, laws, customs, and unique history. This will need as much of each country’s legacy data as possible for training.  

    Q5: Where can readers get more information about this innovative solution?

    Ans: A direct link to an overview slide deck is here.  We will be adding more information to our website over time. A YouTube video of my recent talk at the Vintage Computer Festival Southwest was just posted.

    In addition, I share new information on LinkedIn. For example, I have posted some behind-the-scenes photos of my recent visits to George Blood, the Library of Congress, and the Smithsonian. I invite your readers to connect with me at https://linkedin.com/in/ChuckSobey  

    Q6: You are also deeply involved in one of the largest IT Trade Shows out there, Flash Memory Summit, now known as “FMS: the Future of Memory and Storage.” What can you tell us about FMS and how FMS is evolving?

    Ans: 2024 is the 18th year of FMS. I have been the General Chair since 2017 and an organizer and advisor for several years before that. Registration is open now for this August 6-8, 2024, event at the Santa Clara Convention Center (SCCC).

    I’d like to thank you, Rich, for your help this year, and last, in putting together our cold data and archive sessions. People like you are helping us expand our scope beyond flash (hence, the name-change to simply “FMS”). Our coverage now includes DRAM, HDD, tape, and many other emerging nonvolatile memory technologies – from MRAM to DNA – as well as the applications, such as AI, that continue to drive their adoption. We believe FMS is a special show, where old friends and new meet to reconnect and move the industry forward. It is the best networking opportunity in the industry.

    Coming out of the pandemic, I co-founded another growing IT event, Chiplet Summit. We will hold our 3rd annual event at SCCC on January 21-23, 2025. It is exciting to help this hardware development method expand and grow into a new ecosystem for the rapid development and deployment of leading-edge semiconductor process technologies.

    I invite your readers to attend both of these events!

    Q7: Finally, when you are not slaving away for ChannelScience, FMS, or Chiplet Summit, what do you enjoy doing in your free time?

    Ans: You are right that there is not much free time! However, when both time and Texas weather permit, I try to go mountain biking. That is harder than it sounds in Plano, which in Spanish means flat! I also love playing with and training our two wonderful German Shepherd Dogs.

    [Ina and Lola preparing for another game of tag.]

    Thanks for your time, Chuck, and we wish you a lot of success with your legacy tape reader, FMS, and Chiplet Summit!


    Why LTO Data Tape is a Perfect Fit for the Massive Video Surveillance Market

    Reading Time: 6 minutes

    Executive Q & A with Jay Jason Bartlett, CEO, Cozaint

    Q1: Welcome Jay to this Fujifilm Insights Executive Q & A! Please tell us a bit about your role and responsibility as CEO of Cozaint.

    Ans:  Thanks Rich for the invite. I head up a great team of engineers and professionals that have a ton of experience with intelligent surveillance solutions, product development, data storage, and physical security. I’m fortunate to drive the efforts of Cozaint and help this team deliver a market disruptive video surveillance storage solution.

    Q2: So, you are pioneering the use of today’s modern LTO data tape for video surveillance content retention. Wasn’t tape the de facto standard before HDDs took over?

    Ans: Well, VHS tape was indeed the de facto storage medium back in the analog days of video surveillance, say before 2012, when storage largely became the domain of HDDs. However, we are pushing an innovative new use of LTO data tape media within the video surveillance market to address today’s pain points of costly, energy-intensive HDDs.

    Cozaint has a patent pending on implementing LTO data tape storage in a video surveillance infrastructure in such a way that the video management software (VMS) is able to recall and play back all recorded video, without any extra steps or IT personnel needed.

    And yes, we do believe this is a ground-breaking approach to utilize VMS aware LTO data tape in the video surveillance market.

    Q3:  What is different about today’s market compared to 2016 when LTO was tried in the VS market?

    Ans:

    Because so many managers and executives responsible for the physical security / video surveillance infrastructure have been around for a while, when the word “tape” storage is used, modern LTO-9 data tape technology gets confused with old-fashioned analog VHS tape storage.

    With LTO-9 digital data tape now offering a capacity of 18 terabytes on a single cartridge, the ability to record, store, and manage large amounts of video data on inexpensive, eco-friendly LTO data cartridges is a significant advancement. It addresses higher-resolution cameras and the longer retention periods needed for things like AI analytics.
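
    As a rough sizing sketch of why an 18 TB cartridge matters for high-resolution cameras and long retention, the arithmetic below uses an assumed camera count, bitrate, and retention period; none of these figures come from the interview.

        # Rough, illustrative sizing: how much video a camera fleet generates
        # and how many LTO-9 cartridges (18 TB native) it would fill.
        # The camera count, bitrate, and retention period are assumptions.
        CAMERAS = 100
        BITRATE_MBPS = 8          # an assumed continuous 4K H.265 stream
        RETENTION_DAYS = 90
        LTO9_NATIVE_TB = 18

        # Mb/s -> MB/s, times seconds per day, times days, then MB -> TB.
        tb_per_camera = BITRATE_MBPS / 8 * 86_400 * RETENTION_DAYS / 1_000_000
        total_tb = tb_per_camera * CAMERAS
        cartridges = -(-total_tb // LTO9_NATIVE_TB)  # ceiling division

        print(f"{tb_per_camera:.1f} TB per camera over {RETENTION_DAYS} days")
        print(f"{total_tb:.0f} TB total, roughly {cartridges:.0f} LTO-9 cartridges")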

    In years past, vendors did try to use LTO in the video surveillance market, and some do today, but purely for very long-term archive. However, those IT professionals didn’t really have an in-depth understanding of how video needed to be available for easy playback, which is more of a warm active archive than a cold archive use case. Therefore, the implementations were IT-centric instead of video surveillance operations-centric. Unfortunately, those prior attempts to use LTO were met with resistance by the users and operators of those VMS systems. They really needed seamless performance when pulling content from an HDD tier or a tape tier.

    Another huge difference between 2016 video storage and today’s systems is how much more important ESG or “green” solutions have become. How much a video storage system eats up in energy, cooling, and management costs has skyrocketed. This is yet another advantage for LTO, as there is absolutely no “greener” storage medium than LTO. The TCO savings should make anyone give such an LTO solution a long, serious look.

    Q4: You said LTO tape must be VMS aware, what does that mean?

    Ans:

    Cozaint has learned from those previous attempts to utilize LTO that the center of the universe of any video surveillance infrastructure is the video management software (VMS). Meaning, the VMS user or operator who needs to recall and play back recorded video must be able to do so directly and easily via the VMS software’s “timeline” feature.

    This timeline is where the VMS operator will scroll back and forth (some call it scrubbing the timeline) to search for the event of interest within the recorded video.

    Unlike other systems, such as video editing tools in the media and entertainment industry, the VMS operator really does not know exactly what they are looking for or where it is. They have a general idea, but need to bounce around the VMS timeline to find the event.

    This timeline scrubbing creates a specific challenge when attempting to implement LTO storage. The VMS and the underlying infrastructure need to be flexible enough, yet sophisticated enough, to know which of a number of LTO tapes the video of interest is stored on, and then be able to quickly load the right tape and seek to the video.

    This is where LTO storage libraries come into focus within such an infrastructure. With multiple LTO drives available in libraries of various sizes, each with LTO cartridge “slots” (think of a classic record jukebox), the needed video can be found significantly faster.

    This LTO library, with multiple LTO drive capabilities, delivers a level of scalability that is just not affordable in a hard-disk-only video storage system.

    And this scalability is what provides for a dramatically lower cost to store video with LTO.

    Q5: Tell us about Marcia and how is it a breakthrough enabler for LTO to work in the VS industry?

    Ans:

    Cozaint’s MARCIA™ middleware software sits behind any file-based VMS system and manages a multi-tier storage infrastructure, for example HDD + LTO. Again, with years of understanding how video surveillance storage is managed, we have learned that the ideal setup is a two-tier storage approach.

    The first tier of storage is hard-disk based to allow the VMS operator to recall and play back their most recent recorded video. The customer or organization determines the retention period they need on tier 1 for, let’s call it, instant gratification in video playback, while all the video recordings are held on tier 2, which consists of scalable LTO storage.

    A large expense of a video surveillance infrastructure is the hard-disk storage portion of the overall solution. Even though hard drives seem “cheap,” once you start needing more than a couple hundred terabytes of video storage, hard disk storage becomes expensive quickly, not to mention the power and cooling.

    Therefore, when we create a 2-tier video storage solution with minimal hard disk storage and scalable LTO storage, we can deliver a significantly more affordable system.

    MARCIA is the middleware that manages all of this multi-tier storage so that the VMS operator does not need to think about where any of their video is recorded and stored. MARCIA keeps track of what is on tier 1 and tier 2 and quickly loads and plays the video the operator is requesting. And as the operator “bounces around the timeline,” MARCIA is able to deliver the requested video either instantly, as in a typical solution, or within just a couple of minutes if the video is located on LTO.
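
    To illustrate the flow described above, here is a minimal sketch of a tier-aware recall, assuming a hypothetical catalog that maps each camera’s time ranges either to a disk path (tier 1) or to an LTO cartridge (tier 2). The names and structures are illustrative only, not Cozaint’s actual MARCIA implementation.

        # Minimal, hypothetical sketch of a two-tier (HDD + LTO) recall flow.
        # Names and data structures are illustrative, not Cozaint's MARCIA code.
        from dataclasses import dataclass
        from datetime import datetime, timedelta

        @dataclass
        class Segment:
            camera: str
            start: datetime
            end: datetime
            tier: str          # "hdd" (tier 1) or "lto" (tier 2)
            location: str      # file path, or "cartridge_id:offset" for tape

        class TierCatalog:
            def __init__(self, segments: list[Segment]):
                self.segments = segments

            def find(self, camera: str, when: datetime) -> Segment | None:
                """Locate the segment covering the requested timeline position."""
                for seg in self.segments:
                    if seg.camera == camera and seg.start <= when < seg.end:
                        return seg
                return None

        def recall(catalog: TierCatalog, camera: str, when: datetime) -> str:
            seg = catalog.find(camera, when)
            if seg is None:
                return "no recording for that time"
            if seg.tier == "hdd":
                return f"play immediately from {seg.location}"
            # Tier 2: the library loads the cartridge into a free drive, seeks, and streams.
            return f"load {seg.location.split(':')[0]}, seek, and play within minutes"

        # Example: the operator scrubs the VMS timeline to a point 30 days back.
        now = datetime(2024, 6, 1)
        catalog = TierCatalog([
            Segment("lobby-cam", now - timedelta(days=14), now, "hdd", "/tier1/lobby/recent.mp4"),
            Segment("lobby-cam", now - timedelta(days=365), now - timedelta(days=14), "lto", "LTO9_0042:1200"),
        ])
        print(recall(catalog, "lobby-cam", now - timedelta(days=30)))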

    What this really delivers for the organization is better outcomes for the overall usage of the recorded video surveillance. If you think about why you are doing the video recordings in the first place, being able to quickly and easily recall and play back video is important.

    More important is the quality of that recorded video. Ever since the industry moved away from those analog VHS tapes, hard-disk-based video storage vendors have been coercing compromises in video quality to make up for the expensive cost of hard disk storage.

    Motion-only recordings, low-frame-rate recordings, low-resolution recordings even from very high-resolution cameras: all of these “normal” practices in video surveillance recording are compromises driven by the expense of hard-disk-based storage systems.

    It’s actually quite amazing how users have just become desensitized to these poor-quality compromises.

    Q6: Any idea how much HDD capacity is shipped into the global market just for VS data retention?

    Ans:

    According to IDC, HDD capacity shipments into video surveillance applications were 111 exabytes in 2021, about 8.0% of total HDD shipments, followed by 79 EB or 7% in 2022, and an estimated 100 EB or about 7.5% in 2023.

    By comparison, LTO capacity shipments in 2023 were 153 EB assuming 2.5:1 compression, or roughly 61 EB native. Therefore, the HDD market just for VS applications is larger than the entire LTO market.
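
    For readers who want to check that comparison, the back-of-the-envelope arithmetic looks like this (2.5:1 is the compression ratio the LTO program assumes when reporting capacity shipments):

        # Back-of-the-envelope check of the HDD-for-VS vs. LTO shipment comparison.
        lto_2023_compressed_eb = 153   # reported LTO capacity shipments at 2.5:1 compression
        lto_2023_native_eb = lto_2023_compressed_eb / 2.5
        hdd_vs_2023_eb = 100           # IDC estimate of HDD capacity shipped for VS in 2023

        print(f"LTO native shipments: ~{lto_2023_native_eb:.0f} EB")              # ~61 EB
        print(f"HDD shipped just for video surveillance: {hdd_vs_2023_eb} EB")
        print(f"VS-only HDD exceeds all LTO native by ~{hdd_vs_2023_eb - lto_2023_native_eb:.0f} EB")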

    That’s a lot of video data. We want to help organizations record, store, and manage that video in the most cost-efficient way possible, while making it easy to play back at the same time. I think we have solved that issue.

    Q7: Where can readers find more information about your solutions?

    Ans: Sure, just go to www.cozaint.com and also check out the VS TCO calculator comparing the cost of VS retention with and without LTO at: www.bradjohnsconsulting.com/vs-tco

    Q8: Finally, when you are not slaving away for Cozaint, what do you enjoy doing in your free time?

    Ans:

    It’s hard to believe we have been building Cozaint for over six years now and in the video surveillance industry since 2008. I’m obviously older now and my joy of being on a basketball court or out on the water on a sailboat has been replaced with three wonderful adult children and their spouses and now four adorable grandchildren. There really is nothing better than being “Papa.”

    Thanks for your time Jay, and we wish you a lot of success with your software and getting LTO seeded in the Video Surveillance market!


    ISC West 24 Reveals Pain Points of Video Surveillance Retention and How LTO Data Tape with Cozaint’s Marcia is Part of the Solution

    Reading Time: 5 minutes

    By Rich Gadomski, Head of Tape Evangelism, FUJIFILM Data Storage Solutions

    The International Security Conference held its annual West Coast trade show in Las Vegas on April 10-12. ISC West is the premier showcase for the latest innovations and solutions spanning a broad range of security technologies for security professionals, system integrators, manufacturers, and consultants. This year’s expo featured some 700 exhibitors and more than 35,000 eager attendees who crowded the aisles and booths and created long lines everywhere. In a nutshell, the show was red-hot with lots of cool stuff to see and learn about.

    No Shortage of Innovative Video Surveillance Solutions

    ISC West is more than just video surveillance. One might be amazed by things like gunshot detection or anti-drone devices for example. But video surveillance (VS) is a big part of the show. Below are just some of the evolving VS capabilities and innovations:

    AI-Powered Analytics: Advanced video surveillance systems are leveraging artificial intelligence for real-time video analytics, object recognition, and behavior analysis. These systems can automatically detect and alert for potential security threats, suspicious activities, or anomalies. One amazing, albeit sad example of “object recognition” is the capability of AI to recognize a firearm, say amid a crowded hall of high school students on lunch break.

    High-Resolution Imaging: Surveillance cameras continue to morph in size, shape and capabilities with continued advancements in high-resolution imaging technologies, including data intensive 4K, 8K and even 12K resolution cameras. These increasingly affordable cameras provide clearer and more detailed video footage for enhanced surveillance and forensic analysis. VS cameras even come in “explosion proof” versions, meaning that their electronics are insulated so they can’t spark an explosion in highly flammable environments like oil and chemical refining for example.

    360-Degree Cameras: Security pros are increasingly adopting 360-degree cameras and panoramic video surveillance solutions, which offer comprehensive coverage and eliminate blind spots in large areas or challenging environments.

    Edge Computing: Surveillance cameras are a classic example of an “edge” device, and the integration of edge computing capabilities into video surveillance devices enables on-device processing for faster response times, reduced bandwidth requirements, and improved overall system efficiency.

    Integration with IoT and Smart Devices: Also becoming more commonplace now is the integration of video surveillance systems with Internet of Things (IoT) devices and smart sensors for enhanced environmental awareness and proactive security measures. This could include integration with access control systems, environmental sensors, and other IoT devices to create a more comprehensive security ecosystem.

    Cloud-Based Solutions: Not unlike the IT industry for certain applications, there is a growing adoption of cloud-based video surveillance platforms, offering scalability, remote accessibility, and easier management of surveillance footage across multiple locations. This is especially true for smaller operations before escalating volumes of content favor a hybrid of on-prem and cloud-based solutions.

    Cybersecurity Features: No industry is immune from cyber criminals, so it was no surprise to see an increased focus on cybersecurity features and protocols to protect video surveillance systems from hacking, data breaches, and other cyber threats. This includes encryption, multi-factor authentication, regular software updates, and offsite/offline storage.

    All the above innovation happening in VS applications sounds terrific and it got a lot of attention at the show. Associated vendors that I spoke to were downright giddy about it.

    But as soon as I asked about storage, the mood suddenly changed to one of concern. Cost-effectively retaining more and more VS content is becoming a bit of a nightmare. All of the advances in VS cameras are great; however, they come at a cost when recording all that high-res content. 360-degree cameras, IoT-enabled systems, and edge computing all require more and more storage.

    Video analytics capabilities have historically acted on live video, but AI/ML is teaching everyone the value of “large datasets” to learn from. That means storing more video for longer periods of time in order to run new analytics and create more business intelligence.

    Keeping increasing volumes of VS content on de facto standard hard disk drive systems is simply not sustainable from a cost and energy perspective. Cloud offerings have already shifted from “cloud storage” to “cloud managed” because overall cloud costs have become prohibitive too.

    Relief is on the Way Via LTO and Cozaint

    Fortunately, pain relief for video surveillance storage continues to be developed by Cozaint, a video surveillance expert and innovator. At ISC West, Cozaint CEO Jay Bartlett debuted his company’s new Marcia™ software platform. Marcia dramatically reduces storage costs via integration of a tier 2 LTO data tape library. It is really an active archive solution that makes video playback and review operator-friendly compared to previous attempts to integrate LTO purely as an archive.

    Marcia intelligently stores VS content across multiple storage tiers, including HDD, LTO data tape libraries, and even cloud if required. Marcia abstracts the backend storage hardware from the front-end video management software (VMS) solution, allowing for seamless, fully integrated, and managed storage as well as easy video playback.

    The Marcia LTO integration solution is compatible with popular VMS systems including Network Optix NxWitness, Digital Watchdog Spectrum IPVMS, Hanwha Wisenet Wave, Cook Security Piko VMS, and Cozaint’s own BOBBY VMS.

    The Cozaint solution is also compatible with major LTO automated data tape library systems from vendors such as Dell, HPE, IBM, Overland, Quantum, Qualstar and others.

    Key Benefits of Cozaint Marcia with Integrated LTO Tape Tier

    In the IT world outside of the security halls, LTO data tape is a tried-and-true solution for backup, active archive, and deep archive applications. LTO is widely acknowledged as having the lowest TCO of any storage solution, including cloud, and is the most eco-friendly form of storage, as LTO tapes consume no energy unless being read or written in an LTO tape drive.

    But in the security industry, LTO tape has a very small footprint, as the transition from VHS tape to HDD systems was made in the early 2000s and the feature-rich LTO tape format was overlooked along the way.

    Previous attempts to integrate LTO into a video surveillance infrastructure used traditional IT backup and archiving processes, which did not take into account how video surveillance operators play back and review video. Thus, those previous attempts ran into user experience issues and did not produce positive outcomes.

    Now, the industry has learned that LTO storage must be VMS-aware for users to accept and embrace the LTO advantages. With more VMS support becoming available, LTO will increase in user adoption, especially when all the favorable acquisition costs and operating costs are realized.

    With a Marcia-enabled LTO storage tier, users can benefit from a significant reduction in storage costs, typically in the range of 50% compared to an all-HDD system. Thanks to Marcia, playback and review of all recorded video footage from LTO tape libraries is virtually effortless, without the latency of previous VS solutions. Very high-capacity LTO tapes such as LTO-9 at 18 TB offer scalability for growth, feature the industry’s best reliability ratings, and can be easily stored offline for cybersecurity purposes.
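
    One simple way to see where a saving of roughly that size could come from is to compare an all-HDD build against a small HDD tier plus a full LTO tier. The per-terabyte prices and the tier split below are assumptions chosen purely for illustration, not quoted figures.

        # Illustrative comparison of an all-HDD build vs. a two-tier HDD + LTO build.
        # The per-TB prices and the 20% tier-1 split are assumptions, not quotes.
        TOTAL_TB = 1_000
        HDD_COST_PER_TB = 30.0     # assumed cost of an enterprise HDD video storage system
        LTO_COST_PER_TB = 10.0     # assumed cost of LTO media plus amortized library/drives

        all_hdd = TOTAL_TB * HDD_COST_PER_TB

        tier1_tb = TOTAL_TB * 0.20            # only recent video kept on disk
        tier2_tb = TOTAL_TB                   # full retention held on LTO
        two_tier = tier1_tb * HDD_COST_PER_TB + tier2_tb * LTO_COST_PER_TB

        savings = 1 - two_tier / all_hdd
        print(f"all-HDD:  ${all_hdd:,.0f}")
        print(f"HDD+LTO:  ${two_tier:,.0f}")
        print(f"savings:  {savings:.0%}")   # ~47% with these assumptions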

    It’s Time for More, Not Less

    In a society that sadly needs better security systems to protect life and property, widely deployed, high resolution VS systems are critical. But we need more, not less. We need better quality not worse quality. We need more long-term storage of VS content to support AI, not less.

    Integrating a cost-effective, eco-friendly LTO tape-based storage tier is a very cool solution for a red-hot market need.

    Check out more about Marcia and Cozaint at www.cozaint.com/marcia

    Check out everything you need to know about LTO data tape here.


    New Research from ESG Highlights the Impact of AI/ML on Active Archiving

    Reading Time: 2 minutes

    As artificial intelligence (AI) and machine learning (ML) workloads begin to permeate and drive business processes and decision-making at all levels, effective data management is becoming imperative. Data needs to be stored for a long time and it needs to be available to be actively accessed during its lifespan.

    Active archive solutions are ideal for managing information for AI and ML frameworks as they can provide customized optimization for storage, security, and performance. In addition, AI tools can be applied in an active archive to automate and streamline data management processes in various ways. For example, AI can cleanse, normalize, and categorize long-term data for AI workloads, automate metadata tagging and indexing for inactive data, and identify and archive sensitive information.

    A new eBook from ESG Research, sponsored by the Active Archive Alliance, explores the general state of data archiving and the benefits and challenges shaping modern environments and strategies. It also examines how active archiving integrates into modern data archiving practices and the role it plays in determining the success of AI/ML initiatives.

    Key findings include:

    • On average, organizations have 6.7 PB of data on corporate servers, with 44% of the data on an active archive.
    • Active archive data growth is in lockstep with overall data growth, with archive data doubling roughly every 3.5 years (a quick conversion after this list puts that figure in annual terms).
    • AI/ML capabilities are a key accelerator for the adoption of active archives.
    • The top three use cases for active archives are analytics, business efficiency, and cyber-resilience.
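
    To put the doubling figure in annual terms, the quick conversion below uses the 3.5-year figure from the findings; the formula itself is standard compound growth.

        # Convert "archive data doubling roughly every 3.5 years" into an annual growth rate.
        doubling_years = 3.5
        annual_growth = 2 ** (1 / doubling_years) - 1
        print(f"implied compound annual growth: {annual_growth:.1%}")   # ~21.9%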

    Ultimately, AI depends on well-organized data for success, which underscores why effective data management through an active archive is crucial for an AI future. Organizations without intelligent data management processes that feed into business intelligence workloads risk being left behind by competitors who do.

    For more information, access the eBook here: Impact of AI/ML on Active Archiving.


    LET’S DISCUSS YOUR NEEDS

    We can help you reduce cost, decrease vendor lock-in, and increase productivity of storage staff while ensuring accessibility and longevity of data.

    Contact Us >