ReGPT-125M-200G

This model is based on GPT-Neo-125M and was trained with Mengzi Retrieval LM.

For more details, please refer to this document.

How to use

You need to use the forked transformers library: https://github.com/Langboat/transformers

```python
from transformers import Re_gptForCausalLM

model = Re_gptForCausalLM.from_pretrained('Langboat/ReGPT-125M-200G')
```
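A minimal generation sketch follows, assuming the forked transformers exposes the standard `generate()` API on `Re_gptForCausalLM` and that the model repo ships a GPT-Neo-compatible tokenizer loadable with `AutoTokenizer` (both are assumptions; check the fork's documentation):

```python
# Sketch only: assumes a standard tokenizer is available in the model repo
# and that Re_gptForCausalLM supports the usual generate() interface.
from transformers import AutoTokenizer, Re_gptForCausalLM

tokenizer = AutoTokenizer.from_pretrained('Langboat/ReGPT-125M-200G')
model = Re_gptForCausalLM.from_pretrained('Langboat/ReGPT-125M-200G')

# Encode a prompt and generate a short continuation.
inputs = tokenizer("Retrieval-augmented language models", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```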