Encoder-only version of the ANKH2 large model (the ANKH2 paper has not been released yet). The encoder-only version is well suited for protein representation tasks.
from transformers import T5EncoderModel, AutoTokenizer

model_path = 'Synthyra/ANKH2_large'
# Load the encoder-only checkpoint and its tokenizer from the Hugging Face Hub
model = T5EncoderModel.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)
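As a minimal usage sketch for extracting representations (the example sequence and mean pooling are illustrative, and this assumes the tokenizer accepts a raw amino-acid string; some T5-based protein tokenizers instead expect residues split into characters or separated by spaces):

import torch

sequence = 'MKTAYIAKQRQISFVKSHFSRQ'  # illustrative protein sequence
inputs = tokenizer(sequence, return_tensors='pt')
with torch.no_grad():
    outputs = model(**inputs)
residue_embeddings = outputs.last_hidden_state        # (1, seq_len, hidden_dim) per-residue representations
protein_embedding = residue_embeddings.mean(dim=1)    # simple mean pooling for a whole-protein vector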
We are working on a FlexAttention implementation of T5-based PLMs once learned relative position bias (which T5 uses) is supported. Stay tuned.