A production-ready suite of tools for single-sample variant filtration

After training on a massive amount of data, the neural network models at the heart of CNNvariant have learned enough to be released from beta.

This suite of tools for single-sample variant filtration uses machine learning to differentiate between real variants and artifacts of the sequencing process, a fairly new approach that is especially effective at correctly calling indels. To enable the models to accurately score and filter variants from VCF files, we trained on validated VCFs (from truth sets including SynDip, Genome in a Bottle, and Platinum Genomes) paired with unvalidated VCFs aligned to different reference builds (hg19, hg38), sequenced on different machines, and generated with different protocols. The result is a single-sample tool that complements, rather than replaces, VQSR (our "traditional" cohort-based variant filtering algorithm).

What’s in the CNN toolbox?

Pre-trained and train-your-own: CNNScoreVariants is a pre-trained convolutional neural network ready to score variants. After scoring, you can filter your VCF by applying a sensitivity threshold with the tool FilterVariantTranches. Or you can train your own models with the tools CNNVariantWriteTensors and CNNVariantTrain, as long as you have validated VCFs to use as training data.
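
To make that concrete, here is a minimal command-line sketch of the pre-trained route, assuming a recent GATK 4.1+ install is on your PATH. The reference, input VCF, and resource file names are placeholders, and the tranche values are illustrative sensitivity targets that you should tune to your own data.

```
# Step 1: annotate each variant with a CNN score. The default 1D model uses the
# reference context and variant annotations only, so no BAM is required.
gatk CNNScoreVariants \
    -R reference.fasta \
    -V input.vcf.gz \
    -O scored.vcf.gz

# Step 2: apply sensitivity tranches, using VCFs of known good sites as resources.
gatk FilterVariantTranches \
    -V scored.vcf.gz \
    --resource hapmap_sites.vcf.gz \
    --resource mills_indels.vcf.gz \
    --info-key CNN_1D \
    --snp-tranche 99.95 \
    --indel-tranche 99.4 \
    -O filtered.vcf.gz
```

FilterVariantTranches marks variants that fall outside the requested tranches in the FILTER column rather than removing them, so you can always relax the threshold later. The train-your-own path (CNNVariantWriteTensors followed by CNNVariantTrain) takes your validated VCFs and produces model files that you can point CNNScoreVariants at instead of the bundled ones.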

While the current release is only recommended for single sample variant filtration, in the future we hope to explore ways a similar model could be applied to joint-called VCFs.

Under the hood

For details on how our deep learning tool works and how it stacks up to other variant filtering tools (both deep learning and traditional), as well as a bit of history on deep learning in genomics, see our other blog post.



Weichi on 12 Feb 2019


It's great to hear that! Thanks! Just a reminder that there is an error in the [CNNScoreVariants documentation](https://software.broadinstitute.org/gatk/documentation/tooldocs/current/org_broadinstitute_hellbender_tools_walkers_vqsr_CNNScoreVariants.php): the value for `-tensor-type` should be `read_tensor`, not `read-tensor`. I got the error below:

> A USER ERROR has occurred: Argument tensor-type has a bad value: read-tensor. 'read-tensor' is not a valid value for TensorType. Possible values: {
> reference (1 Hot encoding of a reference sequence.)
> read_tensor (Read tensor are 3D tensors spanning aligned reads, sites and channels. The maximum number of reads is a hyper-parameter typically set to 128. There are 15 channels in the read tensor. They correspond to the reference sequence data (4), read sequence data (4), insertions and deletions (2) read flags (4) and mapping quality (1).)
> }

Regards,
Weichi
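
For anyone who hits the same error, the corrected flag usage looks roughly like this; the file paths are placeholders, and the 2D `read_tensor` model also needs the aligned reads supplied with `-I`:

```
# Note the underscore: read_tensor, not read-tensor.
gatk CNNScoreVariants \
    -R reference.fasta \
    -I sample.bam \
    -V input.vcf.gz \
    --tensor-type read_tensor \
    -O scored_2d.vcf.gz
```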



