---
license: apache-2.0
---
SimSon model, pre-trained on 100M SMILES from ZINC, Enamine, and PubChem, and on 1M polymers from the P1M dataset (separate checkpoint). 4 hidden layers, 12 attention heads.
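As a minimal sketch of the stated architecture (4 hidden layers, 12 attention heads), the shape can be reproduced with a BERT-style encoder config from `transformers`. This is an assumption for illustration only; the card does not specify the exact model class, and the repo id for `from_pretrained` would be this model's actual Hub path.

```python
# Hypothetical sketch: build a BERT-style encoder matching the shape
# described on this card (4 hidden layers, 12 attention heads).
# This is NOT the official loading code for the SimSon checkpoint.
from transformers import BertConfig, BertModel

config = BertConfig(
    num_hidden_layers=4,     # as stated on the card
    num_attention_heads=12,  # as stated on the card
)
model = BertModel(config)  # randomly initialised, same shape as described

# Loading the real weights would instead use something like:
# model = AutoModel.from_pretrained("<this-repo-id>")  # hypothetical
```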