Attendees: Bharath, Peter, Vignesh, Seyone
Summary: Bharath spent a good chunk of the last week working on torchchem, a new set of PyTorch models intended to work with MoleculeNet. Bharath mentioned that the other next step was to start factoring MoleculeNet out into its own repo. The goal of these efforts is to make it easier to use DeepChem’s infrastructure in your own projects, so you can pick and choose which parts you’d actually like to use.
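Since torchchem is still taking shape, here is a minimal, hypothetical sketch of the kind of multitask PyTorch model such a package might expose for MoleculeNet-style data. The class name, constructor arguments, and featurization assumptions are all illustrative, not the actual torchchem API.

```python
# Hypothetical sketch only: a simple multitask network over fixed-length
# molecular feature vectors (e.g. ECFP fingerprints). Not the torchchem API.
import torch
import torch.nn as nn

class MultitaskModel(nn.Module):
    """Shared trunk with a linear output head covering all tasks."""
    def __init__(self, n_features: int, n_tasks: int, hidden: int = 64):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        self.heads = nn.Linear(hidden, n_tasks)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # One prediction per task for each molecule in the batch.
        return self.heads(self.trunk(x))

model = MultitaskModel(n_features=1024, n_tasks=12)  # e.g. 12 Tox21-style tasks
x = torch.randn(8, 1024)   # batch of 8 featurized molecules
y = model(x)
print(tuple(y.shape))      # (8, 12)
```

The appeal of this style is that the model is a plain `nn.Module`, so it composes with standard PyTorch training loops rather than requiring framework-specific machinery.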
Bharath also briefly mentioned that there were two options for documentation efforts. One is to focus documentation on the website at https://deepchem.io/. The other is to make a separate readthedocs site for each package. Peter and Vignesh said they had no strong preference between the two.
Peter merged the TensorFlow 2.X transition PR into head! This was a major push forward, since it got TensorFlow 2.X support working for most models. However, the graph and atomic convolutions are still broken. Peter mentioned he was going to shift from infrastructure work to implementing a general multitask convolution class for DeepChem. Bharath is going to take a crack at debugging the broken graph and atomic convolutions.
Vignesh mentioned that he was still busy with thesis work, but should hopefully have some time in a few weeks. He said he’d take a look at the ongoing torchchem work (see discussion).
We had a new attendee, Seyone, a student working on molecular machine learning methods. Bharath asked Seyone which parts of the library he’d found useful so far, and Seyone mentioned that the Colab notebooks were a very useful resource for newcomers since they lowered the installation barrier.
In other miscellaneous updates, DeepChem head shifted support to Python 3.6 and 3.7. This fixed Travis CI issues we were having with some of our dependencies not supporting Python 3.5.
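For reference, a Travis CI build matrix covering this kind of version shift might look like the fragment below; this is an illustrative sketch, not DeepChem’s actual `.travis.yml`.

```yaml
# Illustrative Travis CI config: test against Python 3.6 and 3.7 only,
# dropping 3.5 from the build matrix.
language: python
python:
  - "3.6"
  - "3.7"
```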