
# ESM Cambrian 600M

```python
from transformers import AutoModel
from esm.models.esmc import ESMC
from esm.sdk.api import ESMProtein, LogitsConfig
import torch

# Load the ESM-C 600M backbone and the ProtEnrich enrichment head.
client = ESMC.from_pretrained("esmc_600m")
protenrich = AutoModel.from_pretrained("SaeedLab/ProtEnrich-ESMC-600M", trust_remote_code=True)

sequence = "MKTFFVLLL"
protein = ESMProtein(sequence=sequence)

with torch.no_grad():
    protein_tensor = client.encode(protein)
    outputs = client.logits(
        protein_tensor, LogitsConfig(sequence=True, return_embeddings=True)
    )
    # Drop the BOS/EOS special-token positions, then mean-pool the
    # per-residue embeddings into a single vector for the protein.
    pooled = outputs.embeddings[0, 1:-1].mean(dim=0)
    enriched = protenrich(pooled)

print('H enrich:', enriched.h_enrich)
print('H anchor:', enriched.h_anchor)
print('H align:', enriched.h_algn)
print('Structure:', enriched.struct)
print('Dynamics:', enriched.dyn)
```
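ESM-C wraps each sequence with special tokens, so the pooling step above slices off the first and last positions before averaging. A minimal sketch with a dummy tensor illustrates the slicing; the hidden size of 1152 for the 600M model is an assumption here, and the random tensor stands in for real model output:

```python
import torch

# Dummy embeddings shaped [batch, seq_len, hidden]: a 9-residue protein
# plus 2 special-token positions; 1152 is assumed for ESM-C 600M.
emb = torch.randn(1, 9 + 2, 1152)

# Drop position 0 (BOS) and position -1 (EOS), then mean-pool over the
# remaining residue positions to get one fixed-size vector per protein.
pooled = emb[0, 1:-1].mean(dim=0)
print(tuple(pooled.shape))  # (1152,)
```

The resulting vector is what gets passed to the ProtEnrich head in the snippet above.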
Model size: 13.8M params (Safetensors, F32)
