Attendees: Bharath, Peter, Vignesh, Seyone
Summary: Bharath’s been continuing work on the examples PR, which cleans up the collection of DeepChem examples. He’s working to expand the set of examples and is finding a number of small bugs and infelicities in DeepChem along the way. He’s creating a number of issues proposing cleanup and API improvements as a result (issue, issue, issue, issue, issue).
Peter’s been working to improve the interoperability of the Dataset class with other machine learning frameworks. He added this PR, which enables the Dataset class to be converted to a PyTorch dataset. He also added this PR, which allows Dataset objects to be converted to/from Pandas dataframes. This will be useful for future integration with TorchChem and other ML frameworks.
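The Pandas round trip can be sketched without DeepChem itself. The array names and column layout below are illustrative assumptions for exposition, not the PR's actual API: a DeepChem-style dataset is essentially a feature matrix X plus labels y and ids, which map naturally onto dataframe columns.

```python
import numpy as np
import pandas as pd

# Toy stand-in for a Dataset: feature matrix X, labels y, and molecule ids.
# (Column names X1/X2 and the id strings are made up for illustration.)
X = np.arange(6.0).reshape(3, 2)
y = np.array([0.0, 1.0, 0.0])
ids = np.array(["mol1", "mol2", "mol3"])

# Dataset -> DataFrame: one column per feature, plus label and id columns.
df = pd.DataFrame(X, columns=["X1", "X2"])
df["y"] = y
df["ids"] = ids

# DataFrame -> Dataset: recover the arrays from the named columns.
X2 = df[["X1", "X2"]].to_numpy()
y2 = df["y"].to_numpy()
assert np.allclose(X, X2) and np.allclose(y, y2)
```

Once the data is in a dataframe, the usual Pandas tooling (filtering, joins, CSV export) applies, which is what makes the conversion useful for interoperability.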
Vignesh put together a set of pretrained weights for Chemception models. These will be useful for enabling easy experimentation with pretrained architectures. Bharath and Vignesh will figure out how to get these up onto AWS and enable easy download. Vignesh also said he might be able to add Chemception models to TorchChem in a few weeks once he has bandwidth. Pretraining looks to be more convenient in PyTorch since there’s no need to manage session objects, so it might be easier to expand pretraining support in TorchChem.
Seyone’s been continuing work on his tutorial for applying BERT-style methods to molecular property prediction. He’s got an architecture for making predictions on Tox21 up and running but is still wrestling with a few tricky issues in getting the model to train correctly. He hopes to have those figured out and the tutorial up for review soon.
As a quick reminder to anyone reading along, the DeepChem developer calls are open to the public! If you’re interested in attending, please send an email to X.Y@gmail.com, where X=bharath, Y=ramsundar.