FUJIFILM INSIGHTS BLOG

Data Storage

Spotlighting Active Archive with Solutions from SAVARTUS

Reading Time: 9 minutes

Executive Q & A with Rick Bump, CEO of SAVARTUS

Q1: Welcome, Rick, to this Fujifilm Insights Blog Executive Q & A! Please tell us a bit about SAVARTUS and your current role as CEO.

Ans: I co-founded SAVARTUS with Christopher Rence to address the pressing challenges in data management. We recognized a growing need driven by the trajectory of data and its evolving demands—challenges the status quo simply can’t handle. SAVARTUS delivers storage solutions blending innovative optical technology and LTO options, all accessible through our Unified Data Interface (UDI) software.

We all know that data is surging—IDC estimated over 180 zettabytes globally by 2025, fueled by AI generating massive datasets and businesses digitizing operations. Costs are exploding too; traditional cloud storage and HDDs rack up steep bills, especially with egress and other API fees for AWS archival storage. Compliance pressures, like GDPR or HIPAA, demand long-term retention and secure destruction, while environmental impact is a rising concern—data centers eat up 1-2% of global electricity, clashing with sustainability goals. The old approaches—power-hungry disks and costly cloud tiers—lag behind, failing to balance affordability, speed, and eco-friendliness.

That’s why Christopher and I launched SAVARTUS. Our optical active archive offers fast, cost-effective retrieval while using near-zero idle power, tape keeps deep storage at the lowest total cost of ownership (TCO), and our UDI software manages it all. I’m thrilled to lead us in tackling these gaps.

Q2: As you seek to provide data management solutions for organizations, from your perspective, what is driving data growth and what is challenging the status quo?

Ans: From our perspective, data growth is being propelled by a mix of technological, regulatory, and behavioral forces, all reshaping how organizations handle their information, while an evolving need to manage digital assets from creation through intentional destruction adds further complexity.

First, digital transformation is a primary driver. Businesses are digitizing operations—supply chains, customer interactions, and more—producing vast structured and unstructured data. IoT devices in manufacturing or healthcare generate continuous sensor streams, and AI workloads, especially generative models, create massive datasets for training and content production. IDC’s forecast of over 180 zettabytes by 2025 is proving conservative as these technologies scale.

Second, regulatory and compliance demands are extending data lifecycles. Frameworks like GDPR, HIPAA, and SEC Rule 17a-4 require retention—sometimes decades-long—pushing organizations to archive data cost-effectively and securely. This ties directly to the need for end-to-end digital asset management: data must be tracked from creation (for patient records or financial transactions, for example), preserved for legal holds, and eventually destroyed intentionally to comply with privacy laws or reduce liability. A healthcare provider, for instance, might retain records indefinitely but must ensure secure deletion when permissible, amplifying storage demands.

Third, user behavior is exploding data volume. Remote work has surged cloud collaboration data—emails, videos, documents—while consumers fuel petabytes through social media, streaming, and e-commerce, captured for analytics. This “datafication” creates assets that organizations must manage holistically: created by users, stored for insights, and destroyed when obsolete, not just left to pile up.

So what’s challenging the status quo? Traditional approaches—HDDs and online cloud storage—are faltering under these pressures when it comes to long-term archival storage at large capacities. Cost is the disruptor: AWS Deep Archive offers $0.00099/GB/month, but retrieval fees and 12–48-hour latency frustrate users needing quick access. HDDs consume power and hit scalability walls as data centers face energy and environmental constraints.

Security is shaking things up too. Ransomware, up 13% in 2024 per Verizon’s DBIR, exploits online storage, driving demand for offline, air-gapped solutions. The old “store everything online” model is crumbling as breaches expose vulnerabilities, especially when compliance demands auditable, immutable records from creation to destruction—records that must be both secure and securely disposed of on schedule.

A critical challenge is the need to access traditionally archived data for AI, ML, or other business requirements, driving active archive capabilities. The status quo treated archives as static vaults—data locked away on tape or deep cloud tiers, retrieved only for audits. Now, organizations want to mine historical data for training AI models, running analytics, or feeding real-time applications. A media company might pull decades-old footage to train a content recommendation engine, or a retailer might analyze archived transactions for predictive trends. Multi-hour retrieval SLAs and the cloud’s retrieval fees clash with these active use cases, demanding faster, more accessible archiving.

Sustainability is another disruptor. Data centers, consuming 1-2% of global electricity, face ESG scrutiny. HDDs’ constant energy draw does not cut it, while our optical discs—near-zero idle power, 50–100-year lifespan—offer a greener alternative.

In essence, data growth stems from digitization, regulation, and behavior, intensified by the need to manage assets from creation to destruction. The status quo is challenged by cost inefficiencies, security risks, sustainability pressures, and the shift to active archiving for AI and ML. For example, our Archive-as-a-Service (AaaS) offering, with 2–5-minute retrieval, offline security, and low-energy optical storage, addresses these drivers and disruptions, enabling organizations to handle data’s full lifecycle smarter and faster.
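To make the cost trade-off concrete, here is a rough Python sketch comparing a deep-archive cloud tier with an on-prem archive tier. The at-rest rates ($0.00099 and $0.004/GB/month) are the figures quoted in this interview; the retrieval and egress fees are hypothetical placeholders, not published AWS pricing.

```python
# Illustrative monthly TCO comparison for 100 TB of archive data.
# Storage rates come from the interview; the retrieval and egress
# fees below are assumed values for illustration only.

def monthly_cost(gb_stored, storage_rate, gb_retrieved=0.0,
                 retrieval_fee=0.0, egress_fee=0.0):
    """Total monthly cost: at-rest storage plus per-GB retrieval and egress."""
    return (gb_stored * storage_rate
            + gb_retrieved * (retrieval_fee + egress_fee))

GB = 100 * 1000  # 100 TB expressed in GB (decimal)

# Deep-archive cloud tier: cheap at rest, fees on the way out (assumed fees).
cloud = monthly_cost(GB, 0.00099, gb_retrieved=5000,
                     retrieval_fee=0.02, egress_fee=0.09)

# On-prem optical active archive: higher at-rest rate, no retrieval fees.
optical = monthly_cost(GB, 0.004)

print(f"cloud deep archive: ${cloud:,.2f}/month")
print(f"optical archive:    ${optical:,.2f}/month")
```

With even modest monthly retrievals, the per-GB fees can dominate the deep-archive bill, which is the dynamic described above.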

Q3: You mentioned that SAVARTUS delivers storage solutions based on optical and tape technologies. Can you elaborate on the ideal scenario for each of those diverse solutions?

Ans: At SAVARTUS, we leverage optical and tape technologies to provide tailored storage solutions, each excelling in distinct scenarios, complemented by our software for unified data access. These offerings address the diverse needs of organizations managing explosive data growth, from creation to destruction, while tackling challenges like cost, security, and accessibility. Here’s how each technology fits into an ideal scenario.

Our optical storage shines in scenarios demanding random access, robust security, and sustainable long-term retention for moderate to large data volumes. The ideal use case is an organization needing an on-premise active archive—data that’s not accessed daily but requires quick availability for operational or analytical purposes, such as AI/ML training, compliance audits, or historical analysis. Imagine a media company managing decades of video assets. It needs to archive raw footage (perhaps 100 GB files) for potential reuse in new projects or AI-driven content curation. Our optical library, with a time-to-first-byte of 2–5 minutes, delivers this data far faster than the cloud’s 12-hour retrieval SLA (as with AWS Deep Archive). The offline, air-gapped nature of optical discs protects against ransomware, critical for high-value digital assets, while near-zero idle power aligns with sustainability goals. Another prime scenario is regulated industries like healthcare or finance, where data must be retained for 7-10+ years, accessed occasionally for audits, and destroyed securely per GDPR or HIPAA. Our Data Lifecycle Management (DLM) software components enhance this by managing the lifecycle: encrypting data on write, tracking retention periods, and ensuring compliant deletion. With features like customer-managed keys and WORM compliance, we justify a $0.004/GB/month premium over Deep Archive ($0.00099), offering enterprise-grade security and speed at a fraction of Veritas’s $5-$15/GB/month.

Tape storage, a stalwart in our portfolio, is ideal for deep, cold storage—massive datasets retained for compliance or disaster recovery with minimal access needs. The perfect scenario is a large enterprise or government agency archiving petabytes of historical data (tax records or seismic surveys, for example) that’s rarely touched but must be preserved indefinitely at the lowest possible cost. Consider a financial institution mandated to keep transactional data for decades. Tape’s ultra-low cost—comparable to Deep Archive’s $0.00099/GB/month—makes it unbeatable for static, multi-PB archives. Our tape solutions, paired with DLM software modules, automate tiering from active systems to tape, ensuring data moves seamlessly from creation to long-term storage. The physical air gap provides unmatched resilience against cyber threats, and tape’s 30+ year lifespan suits archival needs where time-to-first-byte (TTFB) isn’t urgent. For example, restoring a backup after a disaster fits tape’s strengths—high capacity (LTO-9, for instance, at 18 TB per cartridge), a high 400 MB/s transfer rate, and durability outweigh a higher TTFB. Tape also excels for organizations prioritizing sustainability and cost over speed. A research institute storing climate data might use tape to keep 50 PB at a fraction of cloud or HDD costs, with our DLM software modules cataloging metadata for eventual retrieval or destruction, ensuring compliance with grant mandates.

As for the role of Data Lifecycle Management software: our DLM components tie these solutions together, managing data, as I like to say, with intention from creation through destruction. For optical, DLM enables encryption, key management, and fast indexing, supporting active use cases like AI-driven insights from archived data. For tape, it automates migration to deep storage, tracks retention, and schedules secure deletion, optimizing cost and compliance.

Why both? This holistic approach—optical for accessibility and physical WORM, tape for cost and scale—lets organizations adapt to diverse needs. Optical storage suits mid-tier archiving (approximately 100 TB-5 PB) with frequent-enough access, offering speed and security at $0.004/GB/month. Tape dominates ultra-large, static archives (1 PB+). Together, with our DLM software components, SAVARTUS delivers a spectrum of solutions—reasonably fast, secure (physical WORM) optical for active archives, and economical tape for deep storage and retrieval with high transfer rates—meeting the full lifecycle demands of modern data management.
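The tape figures quoted above (18 TB per LTO-9 cartridge, roughly 400 MB/s per drive) lend themselves to quick back-of-the-envelope sizing. A minimal sketch, assuming a hypothetical four-drive library and decimal units:

```python
# Rough sizing for a tape tier using the LTO-9 figures quoted above:
# 18 TB/cartridge native capacity, ~400 MB/s per drive.

import math

def cartridges_needed(capacity_tb, cartridge_tb=18):
    """Cartridges required to hold a given capacity, rounding up."""
    return math.ceil(capacity_tb / cartridge_tb)

def restore_hours(data_tb, drives=4, mb_per_s=400):
    """Hours to stream data back at full rate across parallel drives."""
    total_mb = data_tb * 1_000_000          # decimal TB -> MB
    return total_mb / (drives * mb_per_s) / 3600

print(cartridges_needed(50_000))        # cartridges for a 50 PB archive
print(round(restore_hours(100), 1))     # hours for a 100 TB restore, 4 drives
```

A 50 PB climate archive like the one described needs on the order of 2,800 LTO-9 cartridges, and a 100 TB disaster restore streams back in well under a day on a handful of drives, which is why a higher TTFB matters less in that scenario.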

Q4: Your customers can implement your solutions as cold storage, more of a traditional archive, or as an active archive. Can you explain the advantages of cold archive vs. active archive?

Ans: We recognize that organizations will want to use our optical and tape solutions in different ways—cold storage, traditional archiving, or an active archive. Cold archiving is what we lean on for data you stash away and rarely touch, like with our tape or the budget tier of our optical setup. It’s all about keeping costs crazy low—think tape matching AWS Deep Archive at $0.00099/GB/month. Say a government agency’s got decades of tax records to store; they don’t need them often, maybe just for an audit every few years. Cold storage is perfect here because it’s dirt cheap—tape and optical barely use power when idle—and it’s built to last, with tape good for 30+ years and our discs up to 100 years. Plus, being offline keeps it safe from hackers. Our DLM software components seal the deal by tracking retention and scheduling secure deletion, like for GDPR compliance. The trade-off? Retrieval times depend on file size and can be higher for optical than for tape.

Active archiving, though, is where tape and optical each have their own advantages. Our optical solution, at $0.004/GB/month, really pops for random-access data and small files you want to grab more often. Tape, on the other hand, shines with its low cost and high transfer rate, making it especially attractive for retrieving large files at high capacities. Both optical and tape are available with a Glacier front end, supporting on-prem and hybrid environments. With our DLM software components adding encryption on write and fast search, it’s great for things like a retailer pulling archived sales data to predict trends. The catch is it costs more than cold—$0.004 vs. $0.00099—but you’re paying for that quick access and extra utility. Basically, cold storage is your friend for cheap, long-term storage of stuff like backups or regulatory records, while active archiving is the way to go if you’re actively using archived data, like for AI or quick audits.
Our Unified Data Interface software keeps it all smooth, managing the whole lifecycle either way.
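For readers curious what a Glacier front end implies in practice: clients request restores using the standard S3 RestoreObject request shape. The sketch below builds such a request; the bucket, key, and endpoint are hypothetical, and the live client call is left commented out since it requires credentials and a reachable endpoint.

```python
# Sketch of requesting a restore through an S3/Glacier-style front end.
# The request shape follows the public S3 RestoreObject API; all names
# below (bucket, key, endpoint) are invented for illustration.

def build_restore_request(days, tier="Standard"):
    """Restore parameters: how long to keep the staged copy, and how fast."""
    assert tier in ("Expedited", "Standard", "Bulk")
    return {"Days": days, "GlacierJobParameters": {"Tier": tier}}

req = build_restore_request(days=7, tier="Standard")
print(req)

# With boto3 against a live endpoint, the call would look like:
# import boto3
# s3 = boto3.client("s3", endpoint_url="https://archive.example.local")
# s3.restore_object(Bucket="media-archive",
#                   Key="footage/2001/raw.mxf",
#                   RestoreRequest=req)
```

Because the interface is standard S3/Glacier, existing backup and archive tools that already speak Glacier can point at an on-prem or hybrid target without code changes.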

Q5: You mentioned AI as a driver of data storage demand, how does active archive support AI workflows based on your product solutions?

Ans: You’re right—AI is a huge driver of data storage demand these days, and our active archive solution at SAVARTUS is built to support those workflows on the back end. AI creates heavy demand for high-performance storage (HDD/flash) during data collection, preparation, model development, training/retraining, and model deployment. At the same time, after deployment the large-capacity model data must be archived for future reference, retraining, and regulatory reasons. The S3/Glacier interface of our tape and optical archive offerings is perfectly suited to the object-storage-based front ends typical of AI pipelines.
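The lifecycle split described above can be sketched as a toy stage-to-tier mapping: hot flash/HDD while a model is in active development, archive tiers once it is deployed or retired. The stage and tier names here are illustrative, not product terminology.

```python
# Toy mapping from AI-pipeline lifecycle stage to storage tier,
# reflecting the split described in the answer above. All names are
# invented for illustration.

TIER_BY_STAGE = {
    "collection":  "flash",
    "preparation": "flash",
    "training":    "flash",
    "deployed":    "optical-active-archive",   # occasional retraining pulls
    "retired":     "tape-deep-archive",        # regulatory retention only
}

def storage_tier(stage):
    """Pick a tier for a lifecycle stage; default to hot storage."""
    return TIER_BY_STAGE.get(stage, "flash")

print(storage_tier("training"))
print(storage_tier("deployed"))
```

A real policy engine would of course weigh access history and compliance rules, but the shape of the decision is the same.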

Q6: SAVARTUS recently joined the Active Archive Alliance (AAA). What benefits do you hope to enjoy as a part of AAA?

Ans: SAVARTUS’s membership in the Active Archive Alliance is a strategic move, and we anticipate several meaningful benefits from this affiliation. The Active Archive Alliance brings together a knowledgeable group of industry leaders focused on advancing active archiving, and we see this as an opportunity to strengthen our position in the storage landscape.

One key benefit is the chance to network with this expert community and gain insights into the future of active archiving. With our optical solution, we’re eager to collaborate with peers like IBM or PoINT Software and Systems to refine our approach. Their perspectives could help us enhance features like retrieval speed or indexing, critical for customers accessing historical data for AI-driven analytics, such as a retailer optimizing predictive models with years of sales records.

Membership also elevates our influence. By engaging with the Alliance, we can contribute to shaping industry standards, showcasing how our optical technology complements tape’s established role while meeting modern demands—think fast, secure access alongside cost-effective deep storage. This could resonate with enterprises, like a financial institution managing audit-ready archives.

Finally, the Alliance’s vendor-neutral credibility enhances our reputation. Affiliation with a respected body signals to customers that we’re serious about solving their data growth headaches—cost, security, you name it—and reinforces our DLM software components’ role in managing data lifecycles end-to-end. For a healthcare provider securing patient data, that trust could be decisive. It’s like a stamp of approval that could sway a media company picking us over a cloud-only option for their video assets. So yes, it’s about sharpening our game, getting our name out there, and building trust—all while keeping our optical solution shining as the go-to for active archiving.

Q7: Finally, when you are not slaving away for SAVARTUS, what do you enjoy doing in your free time these days?

Ans: When I’m not working, I hunt, fish, camp, hike, scuba dive, skydive, and train our Malinois. I love being outside!

Thanks for your time, Rick, and we wish you a lot of success with SAVARTUS and in the Active Archive Alliance. Check out the SAVARTUS website at: https://savartus.com


Revolutionizing Data Management Through Embedded Metadata Extraction

Reading Time: 7 minutes


Executive Q&A with David Cerf, Chief Data Evangelist and Head of U.S. Operations, GRAU DATA

Q1: Welcome, David, to this Fujifilm Insights Blog – Executive Q&A! Please tell us a bit about GRAU DATA and your current role as Chief Data Evangelist and Head of U.S. Operations and International Sales.


Ans: Thanks for having me, Rich! At GRAU DATA, we’re addressing one of today’s most critical challenges: managing the explosive growth of unstructured data. Our mission is to tackle these challenges and help organizations prepare for the exponential data growth expected over the next decade.

In my role, I wear two hats. As Chief Data Evangelist, I focus on educating the market about the transformative potential of metadata in modern data management. Metadata is often the key to unlocking insights from complex datasets. As Head of U.S. Operations and International Sales, I’m dedicated to expanding the global reach of our solutions, like MetadataHub, which empowers organizations to unlock the full value of their unstructured data by capturing its content and context.


Q2: We have been hearing a lot about MetadataHub. Can you describe that product for us?

Ans: MetadataHub is a game-changer for organizations drowning in unstructured data. Whether coming from sensors, microscopes, or satellites, the challenge isn’t just storing all that unstructured data—it’s figuring out how to use it. That’s where MetadataHub shines.

It connects directly to your storage systems—like SMB, NFS, and S3—and automatically extracts all types of metadata, including the critical embedded metadata that tells you the content and context of your files. MetadataHub can uniquely open files, including application-specific files, to extract the content and contextual value needed to derive insights for analytics, AI, and training LLMs. Embedded metadata is often overlooked or not easily accessible, but it provides the crucial content and context that make unstructured data truly actionable.

Think of MetadataHub as a dynamic, searchable repository that acts as a smart proxy for your unstructured data, eliminating the need to constantly recall full files from storage. It integrates seamlessly with modern tools like Snowflake, Databricks, Jupyter Notebooks, KNIME, and LLMs, automating data provisioning to feed data pipelines. By delivering rich metadata without moving the file, MetadataHub accelerates decision-making, streamlines workflows, and maximizes the value of your data. For industries like life sciences, HPC, or manufacturing—where petabytes of research data or billions of sensor readings are the norm—MetadataHub isn’t just helpful; it’s transformative.
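The "smart proxy" idea can be sketched in a few lines: keep extracted metadata in a searchable catalog so queries never recall the full file. The fields and records below are invented for illustration; MetadataHub's actual schema is not described in the interview.

```python
# Minimal sketch of a metadata catalog acting as a proxy for files in
# deep storage: queries run against extracted metadata, not the files.
# All paths and fields are hypothetical examples.

catalog = {}   # path -> embedded-metadata record

def ingest(path, metadata):
    """Record a file's extracted embedded metadata."""
    catalog[path] = metadata

def search(**criteria):
    """Return paths whose metadata matches every given field."""
    return [p for p, m in catalog.items()
            if all(m.get(k) == v for k, v in criteria.items())]

ingest("/archive/run_001.tif", {"instrument": "SEM-04", "voltage_kv": 20})
ingest("/archive/run_002.tif", {"instrument": "SEM-04", "voltage_kv": 30})

print(search(instrument="SEM-04", voltage_kv=30))
```

The payoff is that an analyst can answer "which runs used 30 kV on that microscope?" from the catalog alone, touching storage only when a matching file actually needs to be recalled.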


Q3: Can you elaborate on some of the key benefits for end users by leveraging embedded metadata and MetadataHub specifically?

Ans: Leveraging embedded metadata with MetadataHub unlocks a wide range of benefits for end users. By extracting critical insights directly from files, MetadataHub provides the content and context organizations need to optimize operations and enhance decision-making. For organizations struggling with scattered storage systems or siloed data, this means reduced costs, better data quality, and improved accessibility.

The most immediate impact users see is in data preparation time. By automating metadata extraction, we’ve managed to reduce data preparation time by up to 90%. This automation means researchers and IT administrators can focus on applying their expertise rather than spending countless hours organizing and preparing data. They get faster results and can access the needed data without waiting for manual processing.

Resource efficiency is another crucial benefit. MetadataHub delivers metadata directly to applications without moving entire files, dramatically reducing infrastructure demands. Since the extracted insights are typically 1,000 times smaller than the original file, we see up to 30% reduction in network, CPU/GPU, and storage loads.

This efficiency translates directly into cost savings. Our system enables intelligent storage decisions based on a file’s actual content and business value, rather than simple metrics like age or access history. When combined with data movers and orchestration tools, MetadataHub automates workflows and facilitates smarter data migration. Organizations can move data from expensive storage to archives based on true business value, reducing storage costs by up to 30%.

The impact on data accessibility is equally significant. MetadataHub breaks down traditional data silos that often plague large organizations. Users can seamlessly discover and access data across the entire organization, regardless of where it’s stored. This global accessibility ensures that valuable insights don’t remain trapped in isolated systems.

Perhaps most importantly for today’s organizations, MetadataHub makes data AI-ready. It transforms raw data into actionable insights that can be immediately used in analytics, artificial intelligence, and machine learning workflows. These insights are formatted and structured to work seamlessly with the tools and applications that modern data scientists and analysts rely on daily.
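The resource-efficiency arithmetic above is easy to make concrete: if extracted metadata is roughly 1,000 times smaller than the source file, serving metadata instead of recalling files shrinks transfer volume by the same factor. The file counts and sizes below are made-up examples, not customer figures.

```python
# Rough arithmetic behind the "insights are ~1,000x smaller" claim:
# compare moving full files versus serving only extracted metadata.
# The workload numbers are invented for illustration.

def transfer_gb(files, avg_file_gb, metadata_ratio=1000):
    """GB moved when recalling full files vs. serving metadata only."""
    full = files * avg_file_gb
    proxy = full / metadata_ratio
    return full, proxy

full, proxy = transfer_gb(files=10_000, avg_file_gb=2)
print(f"recalling files:  {full:,.0f} GB")
print(f"serving metadata: {proxy:,.1f} GB")
```

Moving gigabytes instead of tens of terabytes is where the reductions in network, CPU/GPU, and storage load come from.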


Q4: Can you describe your typical target customers for MetadataHub?

Ans: Our ideal customers are organizations managing massive volumes of unstructured data. This includes scientific and research institutions such as aerospace agencies, pharmaceutical companies, life sciences organizations, national labs, universities, high-performance computing facilities, and manufacturers. These organizations often generate large datasets from specialized equipment like electron microscopes, genomic sequencers, and satellite imaging systems.

MetadataHub is also an excellent fit for organizations with extensive data archives, particularly those utilizing tape archives or deep storage systems. It enables these customers to capture critical insights before files are moved to cost-effective storage, ensuring that data remains actionable even in long-term storage.

Additionally, MetadataHub is invaluable for any organization with unstructured data seeking to enhance data quality for AI and large language model (LLM) training, making their data more usable and impactful for advanced analytics and machine learning workflows.


Q5: You just mentioned the benefit of getting the data “AI-ready.” Can you explain how that works?

Ans: The journey to reliable AI begins with data quality. When organizations work with poor-quality data, they inevitably face unreliable AI outputs, slower workflows, and missed opportunities. We’ve found that the key to solving this challenge lies in embedded metadata – those crucial details hidden within files that provide essential context and content for AI applications.

Making unstructured data truly AI-ready requires addressing three fundamental challenges. First, there’s the complexity of managing diverse file types. Unstructured data comes in countless formats, each with its own unique structure and complications that make standardization challenging. Second, organizations must handle increasingly massive data volumes. We’re talking about processing millions or even billions of files efficiently – a scale that demands robust automation.

The third and most critical challenge involves extracting embedded metadata effectively. Modern scientific instruments and sensors generate files that are rich with metadata, often containing hundreds or thousands of vital elements. These include experimental conditions, equipment settings, and measurement parameters – all crucial details that determine the quality and usability of the data for AI applications.

MetadataHub tackles these challenges through automated processing at scale. Our system harmonizes both content and context, making data immediately actionable regardless of its source or format. We place special emphasis on data provenance – tracking the complete history of data from its origin through every transformation. This comprehensive tracking builds the trust, repeatability, and accountability that are absolutely essential for successful AI and machine learning applications.
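Provenance tracking of the kind described can be illustrated with a hash-chained log, where each transformation records the digest of the previous entry so the history is tamper-evident. This is a generic sketch of the concept, not MetadataHub internals.

```python
# Hash-chained provenance log sketch: each entry commits to the one
# before it, so any later edit to the history breaks the chain.
# Event descriptions are invented examples.

import hashlib
import json

def add_event(chain, event):
    """Append an event, linking it to the previous entry's digest."""
    prev = chain[-1]["digest"] if chain else ""
    body = json.dumps({"event": event, "prev": prev}, sort_keys=True)
    chain.append({"event": event, "prev": prev,
                  "digest": hashlib.sha256(body.encode()).hexdigest()})
    return chain

chain = []
add_event(chain, "created by sequencer run 42")
add_event(chain, "normalized units")
add_event(chain, "migrated to tape")

print(len(chain), chain[-1]["prev"] == chain[-2]["digest"])
```

Chaining is what turns a plain audit log into the trust, repeatability, and accountability story that AI pipelines need.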


Q6: I’ve heard you talk about relieving “performance anxiety” when it comes to deep storage such as data tape libraries. Can you tell us about that benefit in more detail?

Ans: Tape archives often come with “performance anxiety” due to the latency in recalling files. However, you don’t need the entire file in most cases—just the insights within it.

MetadataHub solves this by acting as a proxy, capturing critical metadata, and making it instantly accessible. This ensures that all content and context are readily available for AI workflows or applications without the need to recall files from the archive unless absolutely necessary.

This approach creates an “active archive,” where critical insights are captured immediately, enabling files to move to archival storage sooner. Because those insights remain accessible, organizations can migrate files off expensive performance storage earlier, saving costs without losing quick access to the data they need, regardless of where the file resides.

For example, the Zuse Institute manages 1 PB of high-performance SSD storage and over 200 PB on tape. By using MetadataHub, they capture critical metadata from files on performance storage, providing instant access to insights without recalling the original files. Once the metadata is captured, Zuse migrates the files to secure, low-cost archival solutions—enterprise tape—while retaining immediate access to actionable metadata.


Q7: You also have a tool included in MetadataHub that you refer to as your “Data Landscape Report.” What exactly does that report do, and how does it benefit end users?

Ans: Great question! The Data Landscape Report is about solving one of the biggest challenges our customers face—”Where is my data?” It provides a clear, 360-degree view of all your unstructured data, no matter where it’s stored—on-premises, in the cloud, or across vendor storage systems like SMB, NFS, and S3.

We hear this from organizations of all sizes. They’ve got storage scattered between on-site systems and the cloud, but there’s no single view to show where everything is or how it’s being used. That’s where the Data Landscape Report comes in—it consolidates all that information into one place.

Now, this isn’t just about tracking file counts or sizes. The report delivers actionable insights, such as file types, data age, usage patterns, and storage efficiency. These insights empower organizations to optimize their data management in several ways. Teams can intelligently move data to the most cost-effective storage tiers, ensuring efficient resource utilization. The report’s comprehensive view makes data migrations straightforward and seamless, eliminating the guesswork typically involved in these projects. Additionally, it strengthens compliance and governance efforts by providing clear visibility into data provenance and history.

What sets the Data Landscape Report apart is its speed—customers can see results within hours, making it an invaluable tool for quickly gaining clarity over their data landscape.
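A tiny flavor of what a landscape scan computes: walk a tree and aggregate file counts and bytes by extension. A real report adds age, usage patterns, and placement across SMB/NFS/S3; this stdlib-only sketch is a generic illustration, not the product's implementation.

```python
# Aggregate file counts and total bytes by extension under a root
# directory, a simplified version of a "data landscape" scan.

import os
from collections import defaultdict

def landscape(root):
    """Map extension -> {'files': count, 'bytes': total size}."""
    stats = defaultdict(lambda: {"files": 0, "bytes": 0})
    for dirpath, _, names in os.walk(root):
        for name in names:
            ext = os.path.splitext(name)[1].lower() or "<none>"
            path = os.path.join(dirpath, name)
            stats[ext]["files"] += 1
            stats[ext]["bytes"] += os.path.getsize(path)
    return dict(stats)

# Example: summarize the current directory
for ext, s in sorted(landscape(".").items()):
    print(ext, s["files"], s["bytes"])
```

Even this crude grouping is enough to spot, say, terabytes of stale raw captures that belong on a cheaper tier, which is the kind of decision the report is meant to drive.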


Q8: Finally, when you are not slaving away for GRAU DATA, what do you enjoy doing in your free time these days?

Ans: Oregon is my year-round playground. In the summer, I’m often mountain biking through scenic trails or adventure riding on rugged backroads. When winter comes, I swap the wheels for a snowboard and hit the slopes. Exploring Oregon’s landscapes keeps me energized and inspires the creativity and problem-solving approach I bring to GRAU DATA, helping us tackle some of the most complex data challenges out there.

Thank you, David, for your time. We wish you continued success with GRAU DATA and MetadataHub!

For more information on MetadataHub, visit GRAU DATA’s website or http://Moremetadata.com

Additional resources:
How Data’s DNA Drives Innovation: Transform how your organization discovers, analyzes, and innovates.

2025: The Year Metadata Transforms Data Management



Rapid Change in Storage Landscape and New Breed of Tape Libraries Drive Innovative Archival Data Storage Solutions from Versity

Reading Time: 6 minutes

Executive Q & A with Bruce Gilpin, CEO of Versity

Welcome, Bruce, to this Fujifilm Insights Blog Executive Q & A!

    1. Please tell us about Versity’s archival storage products and your role and responsibility as CEO.

    Answer:

    One of the really fun and at the same time painful things about starting your own company is that you and your co-founders will have every job at some point. At the beginning, Harriet Coverston was our CTO and lead developer and I did everything else including finance, accounting, marketing, HR, product management, sales, and business development. Over the past 13 years the company has grown tremendously and now we have a fully staffed management team of very high caliber professionals so my job these days is focused more on product strategy, company strategy, and managing my team.

    In terms of product, Versity is laser focused on the large-archive data management market segment. Fortunately for us, this is the part of the archival storage market that is still growing, and with the dawn of AI, our HPC business is absolutely booming. Scale Out Archive Manager (ScoutAM) is our main product: a comprehensive, modern, super-scale solution for mass storage and large archives. We are rapidly consolidating a market that was traditionally spread more or less equally among IBM’s HPSS and Spectrum Archive products, HPE’s DMF product, Quantum’s archive manager, and Oracle’s OHSM and SAM-QFS products. Until Versity came along, there was never a clear leader in the market, for various reasons including lack of product differentiation and high switching costs.

    2. Versity has always delivered archival storage solutions to typical HPC customers like the National Labs and Department of Defense. Can you share a bit about the evolving customer profile you are now seeing in your customer base?

    Answer:

    Yes, we are seeing strong growth in our U.S. Government HPC business, and in addition we have branched out and gained a lot of traction within the military and civilian government sectors of U.S. allies such as the UK, Australia, New Zealand, Germany, and France. These customers value air-gapped data for security, and they need modern solutions that can be supported within classified enclaves. With our global partners such as Dell Technologies, we are able to meet the stringent requirements of this customer base. Outside of AI and traditional HPC, we are seeing growth in the media and entertainment, genomics, and banking and finance industries. Recently we won our first high-frequency trading contract, and this niche is likely to grow. People love to say that it’s a gloomy time in the archival data world or the tape storage business. We don’t see that at all. Our business has grown over 30% every single year like clockwork, and recently that growth has accelerated. My conclusion is that at the high end of the market, where the amount of data is over 100 PB, growth is accelerating. The issue for our competitors is that very few products function well or really have a solid value proposition at those capacity levels, where things tend to get very complex and performance sensitive.

    Q3: Among your expanding customer base, Versity is recognized for its scalability, performance, ease of use, and cost-effectiveness. What else is driving demand among your prospects and target customers?

    Answer:

    Environmental sustainability is becoming a critical factor. Our solutions, like ScoutAM, are designed to optimize power consumption and overall storage expenses, aligning with organizations’ goals to reduce their carbon footprint and energy costs. AI systems are power hungry, as we all know, so people are looking very hard at ways to balance the power-intensive part of the data center with something that is dramatically more efficient. We can deliver that with the combination of ScoutAM and the new class of rack modular tape libraries with field-replaceable components.

    Q4: It’s probably not that long ago that your customers had the challenge of managing archives in the tens or hundreds of petabytes range. But I understand you are now seeing multiple exabyte requirements these days; what is driving the increased storage demand?

    Answer:

    We used to hear about a one-exabyte bid or requirement once every year or two, and we would be really excited to work on a project of that scale. Today we are working on six different projects that are over one exabyte each, and one of them is a greenfield build-out for 10 exabytes of capacity. It has really been interesting to see this acceleration take place, and the timing is absolutely perfect for Versity since we gambled with our product plan four years ago and decided to put all of our resources into creating a super-scalable solution aimed at the highest-capacity users. It was not obvious at the time that this would pay off for us, but it put us in the leadership position at just the right moment. What is driving this is AI and the exponentially increasing capability of various sensors to capture data. We have much more sophisticated sensors proliferating on new sensor platforms – on the ground, in the air, in the ocean, and in space. They all generate data that needs to be effectively managed and secured in a stable long-term repository.

    Q5: You mentioned AI as a key driver of increasing storage demand among others. But everyone talks about high-performance storage like SSDs associated with AI models. What is the archival demand that is being driven by AI?

    Answer:

    Flash is used in the scratch tier to power AI clusters, and it is super effective. Companies like Vast and Weka have done a great job advancing the state of the art for scratch storage, meaning the storage that is closest to the CPUs and GPUs. But nobody has enough money to store the source data on flash, so we see smaller 1-10 PB flash systems paired with 100-1,000 PB archival systems, and in this configuration the archival data is the persistent copy. Usually, we are keeping a persistent copy of the original source data from sensors, and sometimes we are keeping copies of checkpoints or the output of the AI cluster, whether that is a visualization or some other analysis.

    Q6: Tape systems have always been a standard component that Versity software manages. How are the new tape library systems like Spectra’s Cube or IBM’s Diamondback changing the tape value proposition among your customer base compared to the previous generations of tape libraries?

    Answer:

    These new rack modular systems are very interesting to our user base. I think it is common knowledge at this point that all of the hyperscalers have shifted entirely to this model, so clearly there are benefits, including field-serviceable robotics. The rack modular option is not a fit for every use case by any stretch, but for very large sites that do a lot of random reads, it has a pretty compelling value proposition, and it opens up the tape market for sites that need simplicity and scalability. Each rack can hold around 30 PB of LTO-9 media, so these make very useful building blocks. Versity offers a per-rack pricing model to help sites keep all of the cost elements within a modular framework, and our S3-to-tape capability is also a great fit for sites that want the modular libraries but still want to maintain the advantages of feeding the libraries with an independent vendor solution. This allows them to mix and match library vendors or maintain a dual-supplier relationship to balance risk and optimize pricing on very large systems.
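    As a rough sketch of the capacity math behind these building blocks: the 30 PB per rack figure comes from this answer, and LTO-9 cartridges hold 18 TB native. The helper functions below are purely illustrative, not part of any Versity tool.

    ```python
    # Capacity-planning sketch for rack-modular tape libraries.
    # Figures: ~30 PB per rack (from the interview), 18 TB native per LTO-9
    # cartridge. Helper names here are illustrative assumptions.
    import math

    LTO9_NATIVE_TB = 18      # native capacity of one LTO-9 cartridge
    RACK_CAPACITY_PB = 30    # approximate capacity of one rack module

    def cartridges_per_rack() -> int:
        """Approximate LTO-9 cartridge count implied by a 30 PB rack."""
        return round(RACK_CAPACITY_PB * 1000 / LTO9_NATIVE_TB)

    def racks_for_archive(archive_pb: float) -> int:
        """Whole rack modules needed to hold an archive of the given size (PB)."""
        return math.ceil(archive_pb / RACK_CAPACITY_PB)

    print(cartridges_per_rack())    # ~1667 cartridges per rack
    print(racks_for_archive(1000))  # a 1 EB archive needs 34 racks
    ```

    At this scale the per-rack pricing model maps cleanly onto capacity: each additional 30 PB increment is one more rack, one more line item.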

    Q7: Finally, when you are not slaving away for Versity, what do you enjoy doing in your free time?

    Answer:

    Well, I’m really lucky to live in a part of the world where there is a vibrant outdoor sports community. I am hooked on a newer winter sport called snow kiting and in the summer, I like to kite foil, run, hike, and mountain bike. I have been known to disappear on a meditation retreat once in a while. My two children are in college now, so I have a lot of freedom and I am enjoying every second of it!

    Thanks for your time, Bruce, and we wish you a lot of success with Versity’s innovative archival solutions!

    Check out Versity’s website for more info!

    To listen to the audio version of this Q & A, click here.

    Read More

    New Tape-as-a-Service from Geyser Data Delivers Benefits of Tape in a Cloud Based Subscription Model

    Reading Time: 4 minutes

    Executive Q & A with
    Nelson Nahum,
    CEO of Geyser Data

    Q1: Welcome Nelson to this Fujifilm Insights Blog Executive Q & A! Please tell us about Geyser Data’s new tape-as-a-service model and your role and responsibility as CEO.

    Ans: Thank you for having me, Rich. Let me begin with the story of why we created Geyser Data. In a world now forever changed by AI, extracting value from data has become easier than ever before. At the same time, the rate of creation of new data is explosive, so we need to look at tape as a great solution for certain storage workloads. Tape is low-cost media, it requires minimal power, and it can be protected from cybersecurity threats by simply air gapping it. However, tape historically lost ground to hard disks because tape libraries were difficult to manage and required a lot of capex. At Geyser Data, this is what we solve: people can use tape-as-a-service without the need to buy and manage tape libraries. And it can be used with the simplicity of S3 APIs on a pay-per-month subscription model.

    Q2: Your most recent background prior to Geyser Data is with Zadara. What was your role there and how did that experience help establish your vision for Geyser Data?

    Ans: I co-founded Zadara and ran it for many years until we reached a substantial size, with more than 500 cloud locations. At Zadara we pioneered “on-premises storage-as-a-service.” Although Geyser is a different type of storage, my experience allowed me to bring this new idea to light very fast, hire a magnificent team, and build the strategic partnerships necessary to be successful.

    Q3: Tape can currently be consumed in a cloud model by simply engaging some of the well-known deep archive cloud service providers. What makes Geyser Data different?

    Ans: The reason we called the company Geyser is that we allow the user to extract the hot value of the data underneath the glacier. To that end, some of the critical differences between Geyser and the cold archive tiers of the cloud providers are that we don’t charge egress fees or retrieval fees, so there are substantial cost savings. Our “Cloud Tape Libraries” have dedicated tapes per user, so users know even the barcodes of their tapes and can ask to have the tapes returned to them. Our Cloud Tape Libraries can be “air gapped” for cyber protection and “remounted” instantly when needed. In addition, our Cloud Tape Libraries are multi-cloud, so they can be connected to the traditional cloud to copy the data for further processing. In short, there are many differences. Our Cloud Tape Libraries are really good for people who want to use their data from time to time and who want to make sure they are in control of their own data. They also want to save money; that is always a key motivator!
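    To make the egress-fee point concrete, here is a back-of-the-envelope restore-cost sketch. The per-GB rates are assumed placeholder figures for a generic metered deep-archive tier, not quoted prices from any provider; the point is simply that removing egress and retrieval fees zeroes out the per-restore cost term.

    ```python
    # Hypothetical restore-cost comparison; rates below are ASSUMPTIONS
    # for illustration, not actual provider pricing.
    ASSUMED_RETRIEVAL_PER_GB = 0.02  # $/GB, hypothetical restore fee
    ASSUMED_EGRESS_PER_GB = 0.09     # $/GB, hypothetical internet egress fee

    def restore_cost_usd(terabytes: float,
                         retrieval: float = ASSUMED_RETRIEVAL_PER_GB,
                         egress: float = ASSUMED_EGRESS_PER_GB) -> float:
        """Cost of restoring data once from a metered deep-archive tier."""
        return terabytes * 1000 * (retrieval + egress)

    # Restoring 100 TB once under these assumed rates, vs. a zero-fee model:
    print(round(restore_cost_usd(100)))        # 11000 (dollars)
    print(restore_cost_usd(100, 0.0, 0.0))     # 0.0
    ```

    The gap widens linearly with every restore, which is why workloads that touch their archive "from time to time" feel these fees the most.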

    Q4: Tell us about your go to market strategy and who are your target customers?

    Ans: Our go-to-market strategy is via the reseller channel. We have two types of customers: customers that use tape on premises today and want to move to the cloud or have DR in the cloud, and customers that don’t use tape today but have workloads like archive or active archive on disk or cloud storage. Now they can easily use Geyser Data with the same S3 interface and save a lot of money.

    Q5: What are your plans beyond the U.S. market?

    Ans: We are definitely building a global Cloud Archive; we have multiple international partners that are ready to go. We will start making announcements soon, probably even before the end of the year and in early 2025.

    Q6: What are you seeing in the world of data storage that is creating the need for this service offering?

    Ans: Cold data is 70% of all data. But cold data becomes “warmer” when you really want to extract value from it. I believe there is not enough manufacturing capacity of disk drives to store all the data that customers want to store. There would not be enough IT budget or energy available either! By making tape easier to use, I believe tape will have a much more prominent role as the demand for long term massive storage continues to explode.

    Q7: I understand you have had a successful launch within Digital Realty. What makes tape-as-a-service attractive to co-location data center service providers?

    Ans: As I mentioned before, tape is very low power, and availability of energy is a big concern for colos, especially as AI deployments increase. The Spectra Logic Cube library that we use can store 30 PB using only 1.2 kW, an insignificant amount compared to equivalent HDD storage! Also, tape is much denser than hard disk, so it consumes less floor space too. Finally, Digital Realty has this amazing network fabric that interconnects more than 700 data centers. Any customer of these data centers can establish a private connection today to Geyser Cloud Tape Libraries with just a few clicks.
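    A rough power-density sketch puts the 30 PB at 1.2 kW figure in context. The tape numbers come from this answer; the HDD watts-per-terabyte value is an assumed estimate (roughly 10 W per 20 TB drive including enclosure overhead), used purely for illustration.

    ```python
    # Power-density comparison sketch. Tape figures from the interview;
    # the HDD figure is an ASSUMED estimate, not a measured value.
    TAPE_WATTS = 1200.0            # 1.2 kW for one Cube library
    TAPE_CAPACITY_TB = 30_000.0    # 30 PB
    ASSUMED_HDD_W_PER_TB = 0.5     # assumption: ~10 W per 20 TB drive

    tape_w_per_tb = TAPE_WATTS / TAPE_CAPACITY_TB
    print(tape_w_per_tb)                                  # 0.04 W per TB on tape
    print(round(ASSUMED_HDD_W_PER_TB / tape_w_per_tb, 1))  # ~12.5x advantage for tape
    ```

    Even under conservative HDD assumptions, tape comes out an order of magnitude ahead on watts per terabyte, which is exactly the property colos are short on.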

    Q8: Finally, when you are not slaving away for Geyser Data, what do you enjoy doing in your free time?

    Ans: I love Formula 1 racing, I love watching soccer and I’m really excited for the upcoming World Cup in the U.S.!

    Thanks for your time, Nelson, and we wish you a lot of success with Geyser Data’s innovative tape-as-a-service solution!

    Check out Geyser Data’s website for more info!

    Read More

    Cutting-Edge Data Storage Solutions from XenData Include LTO Data Tape

    Reading Time: 5 minutes

    Executive Q & A with Dr. Phil Storey, CEO of XenData

    Q1: Welcome Phil to this Fujifilm Insights Blog Executive Q & A! Please tell us a bit about XenData and your role and responsibility as CEO.

    Ans: Thank you Rich. Our CTO, Mark Broadbent, and I started XenData over 20 years ago, and our product concept has not really changed in that time.

    We wanted to develop software to manage tape libraries and RAID for long-term, secure storage of files and to combine the best characteristics of each: the access times of disk combined with the speed, security, and longevity of tape.

    The other thing that we wanted was a different business approach. Remember that we started XenData just after the dot com era when there were a huge number of risky businesses that started and then failed. We wanted a solid business with reliable products and great support – which I think we have.

    Q2: XenData has been a long-time sponsor of the Active Archive Alliance and you continue to innovate in the area of active archiving. Can you tell us about your LTO Archive Appliances?

    Ans: I mentioned that our business concept was to develop software. Well, we learned along the way that customers want solutions that are as turnkey as possible. So today, we mainly sell archive appliances managed by XenData software. By supplying the combined hardware and software, we are able to guarantee performance and minimize any possible problems that come with unbounded hardware options.

    Our LTO archive appliances manage one or more LTO libraries and include managed RAID, from a few TB to a PB of disk. We support almost all tape libraries, including from Dell, HPE, IBM, Spectra, Quantum and Qualstar. Our appliances make writing to LTO just like writing to disk on a network. We also have a private cloud interface. We can replicate to public cloud, etc. I could go on. But in summary, by combining disk, tape and cloud we do provide strategic active archive options for our customers.

    Q3: So it seems LTO tape still has a lot to offer, what are the key features and benefits appreciated by your customer base?

    Ans: In summary, I would say LTO supports high-performance active archiving at a reasonable cost. Most of our customers have at least several hundred terabytes. It is at that volume of data and above where LTO is particularly attractive from a cost perspective. Of course, the fundamentals of tape are a must for our customers: high reliability and long life. And in these times of climate change, the low energy profile and low carbon footprint of tape are attractive too.

    I should add that we have over 1,500 installations with LTO libraries and about 90% of these are for Media & Entertainment type applications. Our customers include TV stations, Hollywood studios, video production and post-production companies as well as many marketing departments for large corporations and governmental organizations.

    Q4: You mentioned Media & Entertainment customers and their appreciation of your LTO tape solutions, tell us about your new Media Portal software?

    Ans: This is yet another innovative interface into our LTO archive systems. We have our core file-folder interface that can be accessed via standard Windows network protocols like SMB and NFS, an option for a private cloud interface that makes accessing the archive like writing to and reading from Amazon Web Services S3, and now Media Portal adds a web interface. A user can browse the file-folder structure of their archive, see previews of video and image files, and then download the files they need. We also have a search capability based on file and folder names. Media Portal will be released later this year and I am very excited about it, especially as it opens up all sorts of options for future development, including AI options like converting speech to text and then searching on the text.

    Q5: You mentioned that most of your archive installations are for Media & Entertainment applications. Would you tell us more about some of the other application areas?

    Ans: We have installations in video surveillance, healthcare and life sciences applications. A solution that combines both disk and LTO tape is particularly attractive for large video surveillance installations because it offers longer retention periods with massive scalability at a very reasonable cost.  

    Q6: You recently returned from the IBC show in Amsterdam in mid-September. What can you share about the show; what was the buzz overall and in storage specifically?

    Ans: Not surprisingly, AI was a key theme at the show, with a dedicated AI Tech Zone. The show also addressed sustainability, including energy efficiency in devices and delivery systems. So for us, it was super busy as there is lots of demand for long-term energy efficient storage. One of the recurring themes was the realization among our customers and prospective customers that cloud is so expensive for users even with just a few hundred TBs of content.

    Q7: Finally, when you are not globe-trotting and running XenData, what do you like to do in your spare time?

    Ans: Last year, my wife and I moved from California to just outside of Minneapolis to be close to our two lovely granddaughters. As you know, the winters are brutal in Minnesota, but we had a plan for that. Three years ago, we started designing and building a house on the beach in the Yucatan, just north of Merida. We only completed the project in July of this year. So, the answer to your question, ‘what do we do now’ is just one word: relax!

    Thanks for your time Phil, and we wish you a lot of success with your innovative LTO tape solutions!

    Read More

    Executive Q & A with Michael Arnone, Director of Marketing, FUJIFILM Data Storage Solutions

    Reading Time: 4 minutes

    How long have you been working at Fujifilm and what is your role in the company?

    Michael Arnone - Director of Marketing

    I have been with the FUJIFILM Data Storage division for 3.5 years now as Director of Marketing for the FUJIFILM-branded Linear Tape Open (LTO) product line, and I’m located at the FUJIFILM North America headquarters in Valhalla, NY. Prior to Fujifilm, I spent 13 years in business aviation marketing making sure that sales take off!

    What do you like most about your job?

    I have really enjoyed installing a digital marketing infrastructure from the ground up. This is what we call the “DX project” and it involves a lot of new technology, processes and organizational improvements. This allows our sales & marketing teams to be more efficient and results in a better customer experience when seeking information about our products, services and value proposition. But I also like the creative side and the development of content that we use in the DX project.

    Can you talk about some of your recent marketing campaigns?

    Our current campaign is called “Built on Tape” and started in early 2023. This ANA Award-winning campaign features eye-catching creative and is designed to build awareness of the fundamental advantages of modern data tape. We believe organizations can build solid storage strategies with tape as a building block, especially when it comes to cool or cold data. Hopefully readers of this blog page will have seen our banner ads throughout the IT world.


    Read More

    Executive Q & A: Tom Nakatani, President of FUJIFILM Recording Media U.S.A., Inc.

    Reading Time: 4 minutes

    In this executive Q & A, Tom Nakatani, president of FUJIFILM Recording Media U.S.A., Inc. (FRMU) discusses how tape technology plays a vital role in the world of data storage now and in the future.

    Q1) Tell us a bit about yourself, your career at Fujifilm and how you ended up as president of FRMU? 

    I don’t like to age myself, but I have been in the Recording Media Business for 25 years since I joined Fujifilm in 1997. I worked primarily in international sales and marketing, responsible for key customers and partners. I also spent about six years at the European headquarters in Germany. Most recently I was assigned as VP of Sales and Marketing in the U.S. in September of 2020, before being appointed president of FRMU as of July 1st of this year. As president, I am responsible for the sales and profitability of this division, including our Bedford manufacturing facility. I’m pleased to say Bedford is a world-class operation with many cutting-edge technologies and sustainability initiatives in place. It’s also the world’s largest LTO manufacturing facility, producing the greenest form of storage. But our biggest asset is our team of employees across the organization, from coast to coast, dedicated to exceptional customer satisfaction.

    Q2) What are some of the biggest challenges facing the data storage industry today?

    I think the biggest challenge starts with the ongoing and escalating digital transformation that is generating more data than we ever could have imagined even ten years ago. We are now firmly in the zettabyte age where we have a tendency to keep everything indefinitely and we’re afraid to delete anything. And rightfully so, as the value of data has increased and in many ways it is the new currency in this digital economy.

    But the question is, how can we continue to manage ever increasing volumes of data that are growing exponentially? How can the industry afford it from a TCO perspective and from an energy consumption perspective? The IT industry needs to reduce its impact on global warming and climate change. And how do we protect the data from theft or ransomware? The IT industry needs a cost-effective way to prevent unauthorized access by securing data in offline, offsite locations.

    These are significant challenges but tape solutions are part of the answer. It simply requires a strategic approach to data management and getting the right data in the right place at the right time. Why keep inactive data on 24/7 spinning disk that costs a lot and consumes a lot of energy? Why not move it to modern automated tape systems to reduce cost and CO2 footprint? This will free up HDD space for new, active data! Why not make a low cost copy of the data on tape and send it offsite for cyber security reasons? These solutions are available and are being practiced by the most technologically advanced and data intensive customers in the world today including the major hyperscalers.

    Q3) How is FRMU innovating to address these challenges?

    Together with our global Recording Media colleagues around the world we continue to bring innovative new products and solutions to market. Our tape technology provides the world’s leading companies with high-capacity data storage solutions to help them manage the increasing volumes of valuable data that we just discussed. Our recent release of LTO-9 with 18 TB native and up to 45 TB of compressed capacity is a good example. According to recent studies by industry experts, LTO-9 is even more energy efficient than previous generations of LTO and when compared to HDD can reduce CO2e by more than 95%. In addition, our Bedford facility has come up with innovative ways to custom package our tape products according to specific customer requirements for ease of use and sustainability goals. Our engineering teams have developed diagnostic tools to maximize performance of tape systems for some of our largest customers. We are also very excited about the innovation we are bringing to the object storage market.  Our S3 compatible Object Archive software enables access to low cost tape storage with high reliability and security for long term archiving and preservation of valuable but low access data sets.
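    The arithmetic behind the LTO-9 figures above is simple to verify: the 45 TB compressed capacity follows from the LTO nominal 2.5:1 compression ratio, and a ">95% CO2e reduction" means the equivalent HDD footprint is roughly twenty times tape's. A quick sketch:

    ```python
    # Sanity-check of the LTO-9 figures quoted above. The compression
    # ratio is the LTO nominal 2.5:1; real-world ratios vary by data type.
    NATIVE_TB = 18            # LTO-9 native capacity
    COMPRESSION_RATIO = 2.5   # nominal LTO compression ratio
    CO2E_REDUCTION = 0.95     # ">95%" per the studies cited above

    print(NATIVE_TB * COMPRESSION_RATIO)    # 45.0 TB compressed
    print(round(1 / (1 - CO2E_REDUCTION)))  # HDD footprint is ~20x tape's
    ```

    Actual compressed capacity depends on how compressible the data is; pre-compressed media files, for example, gain little over the native 18 TB.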

    Q4) What role do you think tape will play in the future?

    We believe organizations and enterprises of all kinds will continue to rely on our products for long-term, reliable, secure, eco-friendly and cost-effective data protection and retention. This includes backup for cybersecurity and ransomware protection to active archive for infrequently accessed data to cold archive for so much of the data that is rarely accessed but still has value and can’t be deleted. We have the fundamental building blocks in place to continue increasing areal density and capacity of magnetic linear tape well into the future based on magnetic particle science such as Barium Ferrite, Strontium Ferrite and even Epsilon Ferrite in the more distant future. Our most recent technology demo with IBM showed the potential for 580 TB of native capacity on a single LTO sized cartridge. That’s a lot of data but it’s what will eventually be needed to store and protect data beyond the zettabyte age in an economical and energy efficient manner. I’m sure that advancements will also continue in flash and HDD or new technologies like DNA data storage will come along. But I believe all these technologies will be needed and will complement each other.

    For more information, visit: https://datastorage-na.fujifilm.com/lto-tape-data-storage/

    Read More

    LET’S DISCUSS YOUR NEEDS

    We can help you reduce cost, decrease vendor lock-in, and increase productivity of storage staff while ensuring accessibility and longevity of data.

    Contact Us >