Use GPT-3 to solve chemistry problems. Most of the repo is currently not intended for use as a library, but as documentation of our experiments. Over time, we will factor the experiments (which come with tricky dependencies) out into their own repository.
```python
from gptchem.gpt_classifier import GPTClassifier
from gptchem.tuner import Tuner

classifier = GPTClassifier(
    property_name="transition wavelength",  # this is the property name we will use in the prompt template
    tuner=Tuner(n_epochs=8, learning_rate_multiplier=0.02, wandb_sync=False),
)

classifier.fit(["CC", "CDDFSS"], [0, 1])
predictions = classifier.predict(["CCCC", "CCCCCCCC"])
```
The time these calls take can vary because the methods call the OpenAI API under the hood. In situations of high load, we have also experienced waiting times of several hours in the queue.
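Because of this, long-running calls may occasionally fail or time out. A minimal retry wrapper could look like the sketch below; `predict_with_retry`, its parameters, and the broad `except Exception` are purely illustrative and not part of gptchem:

```python
import time


def predict_with_retry(classifier, smiles, max_retries=3, wait_seconds=60):
    """Retry predictions a few times, waiting longer before each attempt.

    ``classifier`` is a fitted GPTClassifier as in the example above;
    the retry parameters are illustrative, not part of gptchem.
    """
    for attempt in range(max_retries):
        try:
            return classifier.predict(smiles)
        except Exception:
            if attempt == max_retries - 1:
                raise
            # increase the wait before each retry
            time.sleep(wait_seconds * (attempt + 1))


predictions = predict_with_retry(classifier, ["CCCC", "CCCCCCCC"])
```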
The most recent code and data can be installed directly from GitHub with:
```shell
$ pip install git+https://github.com/kjappelbaum/gptchem.git
```
The installation should only take a few seconds to minutes. You can install additional dependencies using the extras `experiments` and `eval`.
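For example, assuming the extras are declared in `setup.cfg`, both can be pulled in with a single command (the exact extras syntax below is an illustration, not verified against the repository):

```shell
$ pip install "gptchem[experiments,eval] @ git+https://github.com/kjappelbaum/gptchem.git"
```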
Contributions, whether filing an issue, making a pull request, or forking, are appreciated. See CONTRIBUTING.md for more information on getting involved.
The code in this package is licensed under the MIT License.
If you found this package useful, please cite our preprint:

```bibtex
@inproceedings{Jablonka_2023,
  doi = {10.26434/chemrxiv-2023-fw8n4},
  url = {https://doi.org/10.26434/chemrxiv-2023-fw8n4},
  year = 2023,
  month = {feb},
  booktitle = {ChemRxiv},
  author = {Kevin Maik Jablonka and Philippe Schwaller and Andres Ortega-Guerrero and Berend Smit},
  title = {Is {GPT} all you need for low-data discovery in chemistry?}
}
```
See the developer instructions below. This final section of the README is for you if you want to get involved by making a code contribution.
To install in development mode, use the following:
```shell
$ git clone https://github.com/kjappelbaum/gptchem.git
$ cd gptchem
$ pip install -e .
```
After cloning the repository and installing `tox` with `pip install tox`, the unit tests in the `tests/` folder can be run reproducibly with:

```shell
$ tox
```
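If you prefer to iterate without tox, the tests can also be invoked with pytest directly; this assumes `pytest` and the package's test dependencies are installed in your environment:

```shell
$ pip install pytest
$ pytest tests/
```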
Additionally, these tests are automatically re-run with each commit in a GitHub Action.
The documentation can be built locally using the following:
```shell
$ git clone https://github.com/kjappelbaum/gptchem.git
$ cd gptchem
$ tox -e docs
$ open docs/build/html/index.html
```
Building the documentation automatically installs the package as well as the `docs` extra specified in `setup.cfg`. Sphinx plugins like `texext` can be added there. Additionally, they need to be added to the `extensions` list in `docs/source/conf.py`.
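For example, registering a new plugin would look roughly like this in `docs/source/conf.py` (the list contents below are illustrative; check the file for the extensions actually configured):

```python
# docs/source/conf.py (excerpt, illustrative)
extensions = [
    "sphinx.ext.autodoc",  # existing extensions stay in place
    "texext",              # newly added plugin
]
```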
After installing the package in development mode and installing `tox` with `pip install tox`, the commands for making a new release are contained within the `finish` environment in `tox.ini`. Run the following from the shell:

```shell
$ tox -e finish
```
This script does the following:
- Uses Bump2Version to switch the version number in `setup.cfg`, `src/gptchem/version.py`, and `docs/source/conf.py` to not have the `-dev` suffix
- Packages the code in both a tar archive and a wheel using `build`
- Uploads to PyPI using `twine`. Be sure to have a `.pypirc` file configured to avoid the need for manual input at this step (see the sketch after this list)
- Pushes to GitHub. You'll need to make a release going with the commit where the version was bumped.
- Bumps the version to the next patch. If you made big changes and want to bump the version by minor, you can use `tox -e bumpversion minor` afterwards.
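As referenced in the upload step above, a minimal `~/.pypirc` might look like the following (a sketch only; prefer an API token over a password and adjust to your own account):

```ini
[distutils]
index-servers =
    pypi

[pypi]
username = __token__
password = pypi-<your-api-token>
```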
This package was created with @audreyfeldroy's cookiecutter package using @cthoyt's cookiecutter-snekpack template.