By Moran Cabili, product manager, Data Sciences Platform at the Broad Institute

We heard from many of you --both new FireCloud users and experienced WDL pipeline developers-- that you need to be able to quickly test that a WDL workflow can be run successfully on FireCloud. Until now, you were required to reference an entity in the workspace data model, which took extra effort and tended to confuse newcomers. We are happy to announce that this speed bump has been eliminated; you can now bypass the data model entirely, and even upload a JSON file of inputs, to get your WDL up and running in record time.


You don't need a populated data model to run a workflow

You can run your workflows with just your method and your inputs specified directly in the method config. Before you launch, just uncheck the checkbox ‘Configure inputs/outputs using the Workspace Data Model’ in the Method Configurations tab. If you’d like to continue using the workspace data model, leave it checked, and it will operate as usual.

Note that referencing the data model is still needed for launching submissions that include multiple workflows, for example when you want to run the same workflow on multiple samples in a sample set in parallel.

You can upload a JSON file of inputs to populate the method configuration

JSON is the format most commonly used to specify inputs for a WDL workflow outside of FireCloud, so this is especially useful if your workflow was developed and tested outside of FireCloud and therefore already comes with a JSON file of inputs. For example, this feature makes it easy for you to run a WDL from Dockstore if it is accompanied by a JSON file, as is usually the case (see the Test file section in the Dockstore repository of interest). Note that if you are trying to use a WDL with a JSON file that was tailored for a different platform, you'll need to make sure you update the paths to any files to point to locations in Google Cloud storage that FireCloud can access.
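To illustrate that last point, here is a minimal Python sketch of rewriting local file paths in an inputs JSON to Google Cloud Storage (gs://) paths that FireCloud can access. The workflow name (MyWorkflow), input names, and bucket paths are all made up for illustration; substitute the names from your own WDL and your own bucket.

```python
import json

# Hypothetical inputs JSON as exported from another platform,
# with file paths pointing at a local filesystem.
inputs = {
    "MyWorkflow.input_bam": "/data/sample1.bam",
    "MyWorkflow.ref_fasta": "/refs/hg38.fasta",
    "MyWorkflow.min_quality": 20,
}

# Map each local path prefix to the Google Cloud Storage location
# where the corresponding files have been uploaded.
prefix_map = {
    "/data/": "gs://my-bucket/data/",
    "/refs/": "gs://my-bucket/refs/",
}

def relocalize(inputs, prefix_map):
    """Rewrite string inputs that start with a known local prefix
    to the matching gs:// prefix; leave everything else untouched."""
    updated = {}
    for key, value in inputs.items():
        updated[key] = value
        if isinstance(value, str):
            for local, gcs in prefix_map.items():
                if value.startswith(local):
                    updated[key] = gcs + value[len(local):]
                    break
    return updated

gcs_inputs = relocalize(inputs, prefix_map)
print(json.dumps(gcs_inputs, indent=2))
```

Non-file parameters (like the numeric quality threshold above) pass through unchanged; only paths under a known prefix are rewritten.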

You can download a JSON file to copy and re-use lists of input files and parameters

You can now download your JSON file too, which comes in handy when you want to use the same inputs across method configurations. For example, if you create a new method configuration, normally you’d have to enter all of the inputs manually every time, even if this new configuration will use many of the same inputs as an existing configuration. Now you can download the inputs from the prior configuration as a JSON and upload that to populate the new configuration.
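As a sketch of that workflow: load the JSON you downloaded from the existing configuration, override only the inputs that differ, and save the result for upload to the new configuration. Again, the workflow and input names here are hypothetical.

```python
import json

# Hypothetical: inputs downloaded from an existing method configuration.
downloaded = {
    "MyWorkflow.input_bam": "gs://my-bucket/data/sample1.bam",
    "MyWorkflow.ref_fasta": "gs://my-bucket/refs/hg38.fasta",
    "MyWorkflow.min_quality": 20,
}

# Start the new configuration from the old inputs and change only what differs;
# shared inputs (the reference, the threshold) carry over automatically.
new_inputs = dict(downloaded)
new_inputs["MyWorkflow.input_bam"] = "gs://my-bucket/data/sample2.bam"

# Write the JSON file to upload into the new method configuration.
with open("new_config_inputs.json", "w") as f:
    json.dump(new_inputs, f, indent=2)
```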

Admittedly it would be even better to be able to copy inputs from one configuration to the other within FireCloud itself --and we're looking at ways to make that happen in the future-- but at least the ability to save inputs to file and re-upload is a big step up from having to fill out each field one by one manually! Right?


In short, we hope these new capabilities will make it easier for you to get your workflows up and running quickly, whether it's your first time using FireCloud or you're an old hand testing new WDLs. We do still recommend making use of the power of the data model for batch processing, especially when you have a lot of data to crank through your pipelines. In fact, we also added a convenience for when you choose to use the data model: you can now have your workflow's outputs automatically written to data model attributes named after the corresponding output names in the WDL.

To get you fully up to speed, we’ve updated our Quick Start Guide on method configurations, as well as the tutorials on launching an analysis and configuring a method in a workspace.


Mon 6 Aug 2018

