AWS Machine Learning Blog · 9h ago | Research & Papers · Business & Industry

Scaling Seismic Foundation Models on AWS

TGS achieved near-linear scaling for distributed training and expanded context windows for their Vision Transformer-based Seismic Foundation Model (SFM) using Amazon SageMaker HyperPod, cutting training time from 6 months to 5 days.

💡

Why it matters

This news highlights the potential for cloud-based distributed training to accelerate the development of advanced AI models for seismic analysis, with significant time and cost savings.

Key Points

  • Distributed training of seismic foundation models on AWS using Amazon SageMaker HyperPod
  • Achieved near-linear scaling, reducing training time from 6 months to 5 days
  • Enabled analysis of larger seismic volumes than previously possible

Details

This article describes how TGS, a geoscience data company, scaled the training of its Vision Transformer-based Seismic Foundation Model (SFM) using Amazon SageMaker HyperPod on AWS. By distributing training across many accelerators, TGS achieved near-linear scaling, cutting training time from 6 months to just 5 days. The speedup also let the team expand the SFM's context windows, enabling analysis of larger seismic volumes than was previously possible. The joint TGS and AWS solution demonstrates the value of cloud-based distributed training for advancing AI-driven seismic analysis in the oil and gas industry.
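The headline numbers can be sanity-checked with a quick scaling-efficiency calculation. A minimal sketch, assuming 6 months ≈ 180 days; the worker count below is a hypothetical illustration, since the article reports only the 6-month-to-5-day reduction, not the cluster size:

```python
def scaling_efficiency(baseline_days: float, scaled_days: float, workers: int) -> float:
    """Fraction of ideal linear speedup achieved when training is
    spread across `workers` parallel workers (1.0 = perfectly linear)."""
    speedup = baseline_days / scaled_days
    return speedup / workers

# 6 months (~180 days) down to 5 days is a ~36x speedup.
print(180 / 5)  # 36.0

# If that speedup came from, say, 40 hypothetical nodes,
# it would correspond to 90% of ideal linear scaling.
print(scaling_efficiency(180, 5, 40))  # 0.9
```

"Near-linear" here means the efficiency stays close to 1.0 as workers are added; communication overhead between nodes is what usually erodes it at scale.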

