This post was co-authored with Bharath Ramsundar.
We are excited to announce the release of DeepChem 2.6.0! This release brings a number of improvements to DeepChem's overall usability, including better documentation, new and re-organized tutorials, and new features and models. Improved test coverage also makes DeepChem 2.6.0 more production-ready than previous releases.
The release contains contributions from many people. We would like to thank our amazing community of developers, users and well-wishers who have made it possible.
Detailed Overview of Changes
Improved Production Readiness
One of the major improvements in 2.6.0 is a broad increase in test coverage. DeepChem's CI now runs a wide range of tests for all components of DeepChem, including many slow tests that weren't being run on the CI previously. Separate CI workflows were created for individual components such as TensorFlow, Jax and PyTorch models, as well as for other checks like docstrings. All CI workflows pass on Linux, guaranteeing increased robustness and making this release stable for production usage. Overall code health has also improved through stricter adherence to mypy type checks and the flake8 and yapf style conventions. DeepChem 2.6.0 supports Python 3.7–3.9 on Ubuntu and macOS. On Windows, DeepChem is stable on Python 3.7.
This release upgrades DeepChem's backend to support TensorFlow 2.7 and PyTorch 1.9.0. The current version is also compatible with the latest NumPy 1.22, paving the way toward a more unified array API (ref).
Support for Jax Models
DeepChem 2.6.0 also adds Jax as a machine learning backend. We have added support for Jax models via the dc.models.jax_models API. Support for Jax is limited to Ubuntu and macOS, as Jax does not currently support Windows.
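Here is a minimal sketch of training a Jax model, assuming the dc.models.JaxModel wrapper accepts a Haiku-transformed forward function, its initial parameters and a loss function (the exact signature may differ; consult the API reference):

```python
import haiku as hk
import jax
import jax.numpy as jnp
import numpy as np
import deepchem as dc
from deepchem.models import JaxModel

# Toy regression dataset.
X = np.random.rand(100, 10).astype(np.float32)
y = np.random.rand(100, 1).astype(np.float32)
dataset = dc.data.NumpyDataset(X, y)

# Define a small MLP with Haiku and transform it into a pure function.
def forward(x):
    net = hk.nets.MLP([64, 32, 1])
    return net(x)

init_fn, apply_fn = hk.transform(forward)
params = init_fn(jax.random.PRNGKey(0), jnp.array(X[:1]))

# Mean-squared-error loss; the (outputs, labels, weights) signature is
# an assumption based on the 2.6.0 docstrings.
def mse_loss(outputs, labels, weights):
    return jnp.mean((outputs - labels) ** 2)

model = JaxModel(apply_fn, params, loss=mse_loss, batch_size=32,
                 learning_rate=1e-3)
model.fit(dataset, nb_epoch=5)
```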
Upgraded Tutorials
Many quality improvements have been made to the tutorials, as they are a core component of DeepChem's usability. Some existing tutorials were improved, and new tutorials were added to cover more aspects and use cases of DeepChem. All tutorials can be run in Google Colab for learning purposes. Some of the new tutorials are:
- Introduction to Material Science
- Protein Deep Learning
- Physics Informed Neural Networks
- Introduction to Molecular Attention Transformer
- Distributed Multi-GPU training of DeepChem Models with LitMatter
- Multisequence Alignments
Also, the entire set of tutorials has been re-organized around DeepChem use cases like Molecular Machine Learning, Modeling Proteins, Protein-Ligand Modeling, Material Science, Quantum Chemistry, Bioinformatics and Physics Informed Neural Networks. This time, we also have a couple of DeepChem YouTube tutorials that complement the existing set of interactive Jupyter notebook tutorials.
Documentation
DeepChem 2.6.0 improves its API reference documentation by adding more usage examples and explanations where required. Documentation has also been improved to cover infrastructure-related aspects like making a release, CI, running test suites, and getting started with contributing to DeepChem.
Datasets and DataLoader Improvements
As part of the MoleculeNet suite of datasets, we have added the FreeSolv dataset (#2576) and the USPTO dataset (#2546). Improvements have been made to our deepchem.data.DataLoader classes to handle a wider variety of data, such as .zip files (#2446).
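For example, the new FreeSolv dataset can be loaded through the standard MoleculeNet interface (a minimal sketch; the loader follows the usual molnet conventions):

```python
import deepchem as dc

# MoleculeNet loaders return the task names, the train/valid/test
# splits, and the transformers applied to the data.
tasks, datasets, transformers = dc.molnet.load_freesolv()
train, valid, test = datasets
print(tasks, train.X.shape)
```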
Improvements to Featurizers
Featurizers are one of DeepChem's key strengths. In this release, we have added the following new featurizers:
- DeepChem now supports a DummyFeaturizer via the dc.feat.DummyFeaturizer API. This featurizer simply returns a datapoint without performing any featurization on it. It comes in handy when working with datasets that do not require featurization (see the sketch after this list).
- Addition of the PAGTN featurizer (dc.feat.PagtnMolGraphFeaturizer), which prepares molecules for the PAGTN graph network.
- Addition of RobertaFeaturizer for transformer models.
- Addition of BertFeaturizer for transformer models.
- Addition of RxnFeaturizer for chemical reaction models.
- OneHotFeaturizer (dc.feat.OneHotFeaturizer) can now be used to encode any arbitrary string or molecule as a one-hot array. This can be very useful for a wide variety of applications like protein modeling.
A transformer has also been added for splitting chemical reaction SMILES into the source and target strings required for machine translation tasks (#2597).
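As an illustration, here is a minimal sketch of the DummyFeaturizer and the generalized OneHotFeaturizer (the explicit charset shown is an assumption for brevity; a ZINC-derived charset is used by default):

```python
import deepchem as dc

# DummyFeaturizer passes datapoints through unchanged, which is handy
# when the dataset is already in its final form.
dummy = dc.feat.DummyFeaturizer()
print(dummy.featurize(["CCO", "c1ccccc1"]))  # returns the inputs as-is

# OneHotFeaturizer now encodes arbitrary strings as one-hot arrays.
ohe = dc.feat.OneHotFeaturizer(charset=["C", "O", "c", "1"], max_length=20)
encoded = ohe.featurize(["CCO"])
print(encoded.shape)
```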
Models and Layers
DeepChem has added a number of new models:
- Pagtn model: Graph property prediction (dc.models.PagtnModel; see the sketch after the layer list below)
- MolGAN model: A generative model for small molecular graphs
- PINNModel: Partial differential equation solvers
- Molecular Attention Transformer: Molecule property prediction
And new layers like:
- A linear layer in Jax (#2634)
- ScaleNorm
- MATEncoderLayer
- MultiHeadedMATAttention
- MATEmbedding
- MATGenerator
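Here is a hedged sketch of training the new Pagtn model on graph-featurized molecules (the constructor arguments are assumptions; see the dc.models.PagtnModel documentation):

```python
import numpy as np
import deepchem as dc

# Featurize a few molecules into the graph representation PAGTN expects.
featurizer = dc.feat.PagtnMolGraphFeaturizer()
smiles = ["CCO", "c1ccccc1", "CC(=O)O"]
X = featurizer.featurize(smiles)
y = np.random.rand(len(smiles), 1)
dataset = dc.data.NumpyDataset(X, y)

# Constructor arguments here are assumptions; check the API reference.
model = dc.models.PagtnModel(n_tasks=1, mode='regression', batch_size=2)
model.fit(dataset, nb_epoch=1)
```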
DeepChem is moving towards supporting fully differentiable layers (link). Support for NumPy 1.22 will enable us to add generic layers in future releases.
Minor Improvements
DeepChem also received a number of minor improvements, such as fixes to small CI errors and to the saving and loading of models.
- Utility functions have been added, such as utilities for using graph convolution models and for finding the shortest path between atoms in a molecule.
- New loss functions such as Huber loss and squared hinge loss have been added (see the sketch after this list).
- Optimizers like AdamW (Adam with weight decay) and the sparse Adam optimizer have been added to give users a wider variety of choices for training models.
- Hyperparameter optimization methods have been improved and made consistent across the different hyperparameter optimization techniques.
- Docking: Using Vina has been quite challenging in the past. The AutoDock Vina team released a Python API, which has been integrated with DeepChem for docking-related use cases (#2741).
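For instance, the new losses and optimizers can be attached when wrapping a model. This is a minimal sketch assuming the HuberLoss and AdamW class names from the 2.6.0 API:

```python
import tensorflow as tf
import deepchem as dc
from deepchem.models.losses import HuberLoss
from deepchem.models.optimizers import AdamW

# Wrap a plain Keras network, attaching the new loss and optimizer.
net = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model = dc.models.KerasModel(net, loss=HuberLoss(),
                             optimizer=AdamW(learning_rate=1e-3))
```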
Improved Logging and Error Messages
DeepChem now produces better error messages and improved logging for failing cases and invalid inputs. A Weights & Biases logger (#2520) has been integrated with DeepChem to log training loss and validation metrics, in addition to the existing TensorBoard support. Logging for hyperparameter optimizers has also been improved to log all the tested models.
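A minimal sketch of the new logger (the module path and the wandb_logger keyword are assumptions based on the 2.6.0 API):

```python
import deepchem as dc
from deepchem.models.wandblogger import WandbLogger

# Stream training loss and validation metrics to Weights & Biases.
logger = WandbLogger(name="demo-run", project="deepchem")
model = dc.models.GraphConvModel(n_tasks=1, mode='regression',
                                 wandb_logger=logger)
```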
Pull Request List
Here is a full list of Pull Requests (PRs) describing the changes to the deepchem repository.
Dataset and DataLoader Improvements
- https://github.com/deepchem/deepchem/pull/2446
- https://github.com/deepchem/deepchem/pull/2546
- https://github.com/deepchem/deepchem/pull/2565 (Improvements to FASTA Loader)
- https://github.com/deepchem/deepchem/pull/2576
Improvements to Testing
- https://github.com/deepchem/deepchem/pull/2461
- https://github.com/deepchem/deepchem/pull/2486
- https://github.com/deepchem/deepchem/pull/2538
- https://github.com/deepchem/deepchem/pull/2568
- https://github.com/deepchem/deepchem/pull/2578
- https://github.com/deepchem/deepchem/pull/2604 (tests for Jax Models)
- https://github.com/deepchem/deepchem/pull/2610
- https://github.com/deepchem/deepchem/pull/2757
Changes in CI
- https://github.com/deepchem/deepchem/pull/2525
- https://github.com/deepchem/deepchem/pull/2573
- https://github.com/deepchem/deepchem/pull/2607
- https://github.com/deepchem/deepchem/pull/2645
- https://github.com/deepchem/deepchem/pull/2715
- https://github.com/deepchem/deepchem/pull/2720
- https://github.com/deepchem/deepchem/pull/2722
- https://github.com/deepchem/deepchem/pull/2793
Models
New Models, Layers and Modules
- https://github.com/deepchem/deepchem/pull/2426 (molGAN model)
- https://github.com/deepchem/deepchem/pull/2508 (Pagtn model)
- https://github.com/deepchem/deepchem/pull/2622 (Attention module for MAT)
- https://github.com/deepchem/deepchem/pull/2624 (MAT Embedding and Generator Layer)
- https://github.com/deepchem/deepchem/pull/2691 (MATModel integration with deepchem)
- https://github.com/deepchem/deepchem/pull/2658 (PINNModel)
Jax Models
Existing Model Updates
- https://github.com/deepchem/deepchem/pull/2559 (conversion of MultitaskRegressor and MultitaskClassifier to PyTorch)
Error Messages and Logging
- https://github.com/deepchem/deepchem/pull/2442
- https://github.com/deepchem/deepchem/pull/2520 (WandB logger addition)
- https://github.com/deepchem/deepchem/pull/2586
- https://github.com/deepchem/deepchem/pull/2766
- https://github.com/deepchem/deepchem/pull/2725 (Improvements to GridHyperparameter logging)
Tutorial Updates
Updates to existing tutorials:
- https://github.com/deepchem/deepchem/pull/2435
- https://github.com/deepchem/deepchem/pull/2445
- https://github.com/deepchem/deepchem/pull/2535
- https://github.com/deepchem/deepchem/pull/2637 (tutorial re-organization)
- https://github.com/deepchem/deepchem/pull/2729
- https://github.com/deepchem/deepchem/pull/2746
New Tutorials:
- https://github.com/deepchem/deepchem/pull/2483
- https://github.com/deepchem/deepchem/pull/2626
- https://github.com/deepchem/deepchem/pull/2711
- https://github.com/deepchem/deepchem/pull/2786
- https://github.com/deepchem/deepchem/pull/2805
- https://github.com/deepchem/deepchem/pull/2682
Minor Improvements
- https://github.com/deepchem/deepchem/pull/2450
- https://github.com/deepchem/deepchem/pull/2460
- https://github.com/deepchem/deepchem/pull/2462
- https://github.com/deepchem/deepchem/pull/2463
- https://github.com/deepchem/deepchem/pull/2466
- https://github.com/deepchem/deepchem/pull/2479
- https://github.com/deepchem/deepchem/pull/2482
- https://github.com/deepchem/deepchem/pull/2484
- https://github.com/deepchem/deepchem/pull/2488
- https://github.com/deepchem/deepchem/pull/2493
- https://github.com/deepchem/deepchem/pull/2497
- https://github.com/deepchem/deepchem/pull/2506
- https://github.com/deepchem/deepchem/pull/2516
- https://github.com/deepchem/deepchem/pull/2522
- https://github.com/deepchem/deepchem/pull/2539
- https://github.com/deepchem/deepchem/pull/2563
- https://github.com/deepchem/deepchem/pull/2566
- https://github.com/deepchem/deepchem/pull/2567
- https://github.com/deepchem/deepchem/pull/2584
- https://github.com/deepchem/deepchem/pull/2595
- https://github.com/deepchem/deepchem/pull/2605
- https://github.com/deepchem/deepchem/pull/2607
- https://github.com/deepchem/deepchem/pull/2614
- https://github.com/deepchem/deepchem/pull/2620
- https://github.com/deepchem/deepchem/pull/2628
- https://github.com/deepchem/deepchem/pull/2641
- https://github.com/deepchem/deepchem/pull/2648
- https://github.com/deepchem/deepchem/pull/2652
- https://github.com/deepchem/deepchem/pull/2654
- https://github.com/deepchem/deepchem/pull/2671
- https://github.com/deepchem/deepchem/pull/2695
- https://github.com/deepchem/deepchem/pull/2694
- https://github.com/deepchem/deepchem/pull/2732
- https://github.com/deepchem/deepchem/pull/2742
- https://github.com/deepchem/deepchem/pull/2759
- https://github.com/deepchem/deepchem/pull/2767
- https://github.com/deepchem/deepchem/pull/2769
- https://github.com/deepchem/deepchem/pull/2777
- https://github.com/deepchem/deepchem/pull/2790
- https://github.com/deepchem/deepchem/pull/2724
- https://github.com/deepchem/deepchem/pull/2799
- https://github.com/deepchem/deepchem/pull/2802
- https://github.com/deepchem/deepchem/pull/2545
- https://github.com/deepchem/deepchem/pull/2807
- https://github.com/deepchem/deepchem/pull/2741
Dependency and Setup Fixes
Most of these fixes are related to version bumping and pinning, CI improvements, and installation-related fixes.
- https://github.com/deepchem/deepchem/pull/2524
- https://github.com/deepchem/deepchem/pull/2560 (Jax setup)
- https://github.com/deepchem/deepchem/pull/2590
- https://github.com/deepchem/deepchem/pull/2618
- https://github.com/deepchem/deepchem/pull/2733
- https://github.com/deepchem/deepchem/pull/2740
- https://github.com/deepchem/deepchem/pull/2789
Documentation improvements
- https://github.com/deepchem/deepchem/pull/2469
- https://github.com/deepchem/deepchem/pull/2471
- https://github.com/deepchem/deepchem/pull/2473
- https://github.com/deepchem/deepchem/pull/2491
- https://github.com/deepchem/deepchem/pull/2492
- https://github.com/deepchem/deepchem/pull/2503
- https://github.com/deepchem/deepchem/pull/2571
- https://github.com/deepchem/deepchem/pull/2585
- https://github.com/deepchem/deepchem/pull/2587
- https://github.com/deepchem/deepchem/pull/2591
- https://github.com/deepchem/deepchem/pull/2592
- https://github.com/deepchem/deepchem/pull/2599
- https://github.com/deepchem/deepchem/pull/2621
- https://github.com/deepchem/deepchem/pull/2630
- https://github.com/deepchem/deepchem/pull/2643
- https://github.com/deepchem/deepchem/pull/2649
- https://github.com/deepchem/deepchem/pull/2651
- https://github.com/deepchem/deepchem/pull/2664
- https://github.com/deepchem/deepchem/pull/2698
- https://github.com/deepchem/deepchem/pull/2712
- https://github.com/deepchem/deepchem/pull/2713
- https://github.com/deepchem/deepchem/pull/2751
- https://github.com/deepchem/deepchem/pull/2752
- https://github.com/deepchem/deepchem/pull/2768
- https://github.com/deepchem/deepchem/pull/2774
Featurizers and Transformers
New Featurizers
- https://github.com/deepchem/deepchem/pull/2544 (MATFeaturizer)
- https://github.com/deepchem/deepchem/pull/2570 (Dummy Featurizer)
- https://github.com/deepchem/deepchem/pull/2656 (RxnFeaturizer)
- https://github.com/deepchem/deepchem/pull/2597 (RxnSplitTransformer)
- https://github.com/deepchem/deepchem/pull/2523 (RobertaFeaturizer)
- https://github.com/deepchem/deepchem/pull/2642 (BertFeaturizer)
Improvements