Development of a high throughput cloud-based data pipeline for 21 cm cosmology

R. Byrne, D. Jacobs

Research output: Contribution to journal › Article › peer-review

3 Scopus citations


We present a case study of a cloud-based computational workflow for processing large astronomical data sets from the Murchison Widefield Array (MWA) cosmology experiment. Cloud computing is well-suited to large-scale, episodic computation because it offers extreme scalability in a pay-for-use model. This facilitates fast turnaround times for testing computationally expensive analysis techniques. We describe how we have used the Amazon Web Services (AWS) cloud platform to efficiently and economically test and implement our data analysis pipeline. We discuss the challenges of working with the AWS spot market, which reduces costs at the expense of longer processing turnaround times, and we explore this tradeoff with a Monte Carlo simulation.
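The paper's exploration of the spot-market tradeoff can be illustrated with a small Monte Carlo sketch: jobs run on cheap spot instances that may be preempted each hour, forcing a restart and lengthening turnaround. All parameters here (prices, preemption rate, job length) are illustrative assumptions, not figures from the paper.

```python
import random

def simulate_job(job_hours, preempt_prob, rng):
    """Return total wall-clock hours for one job, restarting from scratch on preemption."""
    total = 0
    while True:
        for _ in range(job_hours):
            total += 1
            if rng.random() < preempt_prob:  # instance reclaimed mid-run
                break
        else:
            return total  # attempt finished without interruption

def monte_carlo(n_jobs=1000, job_hours=4, preempt_prob=0.05,
                spot_price=0.10, ondemand_price=0.30, seed=0):
    """Compare mean spot-instance wall-clock time and cost against on-demand.

    Prices are per instance-hour; preempt_prob is the per-hour chance of
    losing the spot instance. All values are hypothetical.
    """
    rng = random.Random(seed)
    hours = [simulate_job(job_hours, preempt_prob, rng) for _ in range(n_jobs)]
    mean_hours = sum(hours) / n_jobs
    spot_cost = mean_hours * spot_price
    ondemand_cost = job_hours * ondemand_price
    return mean_hours, spot_cost, ondemand_cost

mean_hours, spot_cost, od_cost = monte_carlo()
print(f"mean spot wall-clock: {mean_hours:.2f} h (vs 4 h on-demand); "
      f"mean cost: {spot_cost:.2f} vs {od_cost:.2f} USD")
```

With these toy numbers the spot run takes longer on average (restarts inflate wall-clock time) but still costs less per job, which is the tradeoff the paper quantifies.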

Original language: English (US)
Article number: 100447
Journal: Astronomy and Computing
State: Published - Jan 2021


Keywords

  • Cloud computing
  • Cosmology
  • Data analysis
  • Reionization

ASJC Scopus subject areas

  • Astronomy and Astrophysics
  • Computer Science Applications
  • Space and Planetary Science

