Elm Cold Storage Service
We're thrilled to introduce our brand-new Elm Storage service, which is opening gradually to early adopters while we finalize the last details. Placing an order adds you to our waiting list, and you'll be the first to be notified once the service is fully available and your order is ready to be processed.
Contact us at srcc-support@stanford.edu to join the list. Thank you for your patience and support!
Elm is Stanford Research Computing’s answer to the growing data archiving needs of the research community. Built with cutting-edge, open-source technology, Elm offers a fast, scalable, vendor-neutral and cost-efficient platform for storing large datasets that require long-term retention with infrequent access, making it ideal for archival and compliance purposes.
Features
- On-premises “cold” long-term storage.
- Scalable from 1 TB to hundreds of petabytes.
- Store up to 100,000 objects (files) for every 1 TB.
- Fast data ingestion via MinIO S3; the system automatically replicates stored data from the disk tier to tape (see the upload sketch after this list).
- Data protection via erasure coding, checksums, and encryption ensures that data stays intact and secure throughout its lifespan.
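Because ingestion goes through an S3-compatible interface, any standard S3 client should work. The following is a minimal sketch using Python and boto3; the endpoint URL, bucket name, and credentials shown are illustrative placeholders, not actual Elm values, which the Elm Storage team provides during onboarding.

```python
# Minimal sketch: upload an archive to an S3-compatible (MinIO) endpoint with boto3.
# The endpoint URL, bucket name, and credentials below are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://elm.example.stanford.edu",  # placeholder endpoint
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

# Packing many small files into one archive before upload helps stay
# within the 100,000-objects-per-TB limit noted above.
s3.upload_file("field_study_2024.tar.gz", "my-lab-archive", "field_study_2024.tar.gz")

# List the bucket to confirm the object landed on the disk tier;
# replication to tape then happens automatically on the server side.
for obj in s3.list_objects_v2(Bucket="my-lab-archive").get("Contents", []):
    print(obj["Key"], obj["Size"])
```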
Designed for
- Archiving research data for future reference, such as field studies, simulations, and image data.
- Preserving research data to meet project or institutional requirements, such as data for regulatory/policy compliance.
- Backing up research data in the long term, such as periodic back-ups of large datasets.
Requirements
- Must be a Stanford Principal Investigator (PI) or faculty member to purchase the service.
- A Stanford PTA (associated with a department, a PI/faculty budget, or a grant) for monthly payments.
Data security
May be used to store Low, Moderate, and High Risk Non-PHI Data, as well as High Risk PHI Data, as defined by the Information Security Office.
NOTE: Support for High-Risk Data on Elm will be available soon.
Rates
Get started
Submit a Help request to the Elm Storage team.
Get help
Questions? Contact Stanford Research Computing (srcc-support@stanford.edu).
See also
Oak, for nearline High-Performance Computing (HPC) storage, mounted on the Sherlock and SCG clusters.