By Ruchi Munshi, Senior Software Product Manager, Data Sciences Platform

For a long time now, the Cromwell workflow engine has supported popular high-performance computing (HPC) platforms such as SGE, LSF, and SLURM. When it comes to clouds, however, Cromwell has only had native support for the Google Cloud Platform (GCP), via the Google Pipelines API.

Today, I’m thrilled to announce that over the span of this year, we’ve expanded Cromwell's native cloud support into a suite of several cloud backends, paralleling our suite of supported HPC backends. The newly released Cromwell version 35 supports three cloud platforms natively: GCP, with an upgraded backend that supports the new Pipelines API v2; Amazon Web Services (AWS), through a brand new backend (in beta status) that connects to the Batch API; and Alibaba Cloud, whose Cromwell backend was introduced in a previous version earlier this year.


The new AWS backend orchestrates workflows through the AWS Batch API. This enables us to deliver Cromwell's effortless workflow experience to the many users who are already working within the Amazon cloud. Want to learn more? We recently recorded a joint webinar with the AWS team that developed most of the backend code; the webinar covers all you need to know to get started with the new backend, including basic configuration and core capabilities, such as managing data in S3 buckets, taking advantage of cheap computing via the Amazon EC2 Spot market and more. You can find the recording here.
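For readers who want a feel for what enabling the new backend involves, a backend stanza in the Cromwell configuration file might look something like the sketch below. This is an illustrative sketch only: the bucket name and job queue ARN are placeholders, and the exact setting names for your version should be taken from the AWS backend documentation and webinar linked above.

```hocon
backend {
  default = "AWSBatch"
  providers {
    AWSBatch {
      # Backend implementation that submits jobs to the AWS Batch API
      actor-factory = "cromwell.backend.impl.aws.AwsBatchBackendLifecycleActorFactory"
      config {
        # Placeholder S3 bucket where execution inputs/outputs are staged
        root = "s3://my-cromwell-bucket/cromwell-execution"
        # Placeholder Batch job queue; point this at a queue backed by
        # Spot instances to take advantage of cheap EC2 Spot computing
        default-runtime-attributes {
          queueArn = "arn:aws:batch:us-east-1:123456789012:job-queue/my-queue"
        }
      }
    }
  }
}
```

The webinar recording walks through the full configuration, including credentials and S3 data management, in more detail.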

On the GCP front, the major update is the adoption of the Google Pipelines API v2, which enables us to provide some eagerly awaited features. For example, many of our users have asked for the ability to request custom machine types, which can be more cost-effective than computing on standard machines; with the new version, custom machines are requested by default. We have made sure that upgrading the backend is easy for all users: all it takes is a small one-time change to your Cromwell configuration file to use the Pipelines API v2 backend, which is documented in the release notes.
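As a rough illustration, that one-time change amounts to pointing your backend stanza at the v2 actor factory. The project and bucket values below are placeholders, and the release notes remain the authoritative reference for the exact class name and settings in your version:

```hocon
backend {
  default = "PAPIv2"
  providers {
    PAPIv2 {
      # Pipelines API v2 backend (v1 used a different actor-factory class)
      actor-factory = "cromwell.backend.google.pipelines.v2alpha1.PipelinesApiLifecycleActorFactory"
      config {
        project = "my-gcp-project"                           # placeholder GCP project
        root = "gs://my-cromwell-bucket/cromwell-execution"  # placeholder GCS bucket
      }
    }
  }
}
```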



Mon 1 Oct 2018