Improving High-Performance Computing With Storage Modernization

Organization: Sandia National Laboratories
Environment: High-Performance Computing (HPC)
Project Value: $2.5M+
Engagement Period: 2021–2027

Products & Technologies:

Dell PowerEdge servers
Dell EMC storage arrays
ScoutAM

Region:
United States — Southwest Region
(New Mexico)


Problem

Sandia National Laboratories needed to modernize an aging data archive supporting high-performance computing workloads. The existing storage environment struggled to keep pace with growing data volumes, bandwidth demands, and reliability requirements for large-scale scientific analysis.

Strategy

A scalable, high-performance storage architecture was designed to improve data throughput, increase capacity, and enhance reliability. The approach integrated high-density storage, high-speed networking, and structured data management to support both current HPC workloads and future growth.

Results

The modernized environment delivered significantly improved performance, scalability, and reliability. Sandia gained a high-throughput, resilient storage platform capable of supporting large datasets, accelerating data access, and enabling continued growth in scientific computing workloads.


The Challenge: Aging Data Archive Infrastructure

Sandia National Laboratories required modernization of a legacy storage environment supporting a portion of its High-Performance Computing (HPC) infrastructure. The existing data archive system faced growing demands from large scientific datasets, simulation outputs, and high-throughput computational workloads.
To support future research initiatives, Sandia needed a storage architecture capable of handling increased data volumes, higher bandwidth requirements, and improved system reliability. The new solution also needed to integrate seamlessly with existing HPC compute resources while maintaining high availability and consistent performance for both front-end user workloads and backend system operations.

Storage Architecture Strategy

The modernization effort focused on building a scalable, high-performance storage architecture designed to support large-scale data archiving and rapid data access for HPC workloads.

Key architectural priorities included:

  • Expanding high-capacity storage infrastructure to support growing scientific datasets
  • Improving network throughput between compute nodes and storage systems
  • Designing a highly available storage environment with redundant networking
  • Supporting scalable data management across multiple hosts and systems
  • Establishing infrastructure capable of supporting future hybrid storage architectures

To meet these requirements, the architecture incorporated high-density storage arrays, high-speed networking infrastructure, and scalable storage management software capable of supporting large distributed compute environments.
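To illustrate the kind of bandwidth sizing behind these priorities, a back-of-the-envelope calculation shows how daily ingest volume translates into required aggregate network throughput. The figures below are hypothetical, not Sandia's actual workload:

```python
def required_throughput_gbps(daily_ingest_tb: float, ingest_window_hours: float) -> float:
    """Aggregate throughput (Gb/s) needed to land a day's data within a
    fixed ingest window. All figures here are illustrative only."""
    bits = daily_ingest_tb * 1e12 * 8      # terabytes -> bits
    seconds = ingest_window_hours * 3600   # window in seconds
    return bits / seconds / 1e9            # bits per second -> Gb/s

# Hypothetical example: 100 TB of simulation output ingested over 8 hours
print(round(required_throughput_gbps(100, 8), 1))  # → 27.8
```

Halving the ingest window doubles the required throughput, which is why network switching capacity and storage bandwidth are sized together rather than independently.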

Implementation

Wildflower managed the procurement, integration, and installation of the storage and networking infrastructure supporting the HPC archive environment. The deployment included high-capacity storage platforms, high-bandwidth network switching infrastructure, and associated compute and connectivity components.
The implementation emphasized reliability and performance by integrating redundant network paths, high-speed connectivity, and scalable storage management software capable of supporting large numbers of hosts and clients.
Structured infrastructure design practices—including organized cabling, system documentation, and monitoring procedures—were implemented to ensure long-term maintainability and support future storage expansion.

Results

The modernized storage environment significantly improved the performance, scalability, and reliability of Sandia’s HPC data archive platform.

Key outcomes included:

  • Increased storage capacity to support expanding scientific datasets
  • Higher network throughput enabling faster data ingestion and retrieval
  • Improved reliability through high-availability architecture and redundant networking
  • Reduced data retrieval latency for HPC workloads
  • A scalable storage platform capable of supporting future hybrid and cloud-integrated storage models

Mission Impact

Scientific computing environments depend on reliable access to massive datasets generated by simulations, experiments, and analytical models. The upgraded storage architecture provides Sandia National Laboratories with a resilient and scalable data archive platform capable of supporting ongoing research, high-performance simulations, and future data-intensive workloads.

Key Architecture Components

The modernized storage environment incorporated several architectural elements designed to support high-performance scientific computing workloads:

  • High-Capacity Data Archive Storage for large-scale scientific datasets

  • High-Bandwidth Network Switching Infrastructure supporting high-throughput data movement

  • High-Availability Storage Architecture with redundant networking paths

  • Scalable Storage Management Software supporting multi-host access

  • Structured Cabling and Infrastructure Design to support long-term scalability

  • High-Speed Data Connectivity reducing latency in data retrieval
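The high-availability components above rest on a standard reliability argument: independent redundant paths multiply failure probabilities. A minimal sketch, using assumed illustrative availability figures rather than measured values, makes that concrete:

```python
def combined_availability(path_availability: float, num_paths: int) -> float:
    """Availability of N independent redundant paths: the system is up
    as long as at least one path is up. Figures are illustrative only."""
    return 1 - (1 - path_availability) ** num_paths

# Illustrative: two independent network paths, each 99.9% available,
# yield roughly 99.9999% combined availability ("six nines")
print(round(combined_availability(0.999, 2), 6))
```

This is why the architecture pairs redundant networking with redundant storage paths: each added independent path shrinks the joint failure probability multiplicatively.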

PARTNERSHIPS

Dell APEX Solutions For Federal Agencies
Forescout

Let's Talk Through Your Storage Environment

From architecture to acquisition, our team can help you align your environment with mission needs, compliance requirements, and future growth. Wildflower Solutions Architects are here to help with every step.