import spaces
import gradio as gr
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
import torch

title = """# 👋🏻Welcome To 🌟Tonic's🌐Aya-101"""
description = """
Try this space to build downstream applications with [CohereForAI/aya-101](https://huggingface.co/CohereForAI/aya-101). You can also use it via the API ;-)
"""

checkpoint = "CohereForAI/aya-101"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Move the model to GPU when available and use half precision to save memory.
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
if device == "cuda":
    model = model.half()


@spaces.GPU
def translate(text):
    """
    Translates the input text to English using the Aya model.
    Assumes the model can automatically detect the input language.
    """
    inputs = tokenizer.encode(text, return_tensors="pt").to(device)
    outputs = model.generate(inputs, max_new_tokens=128)
    translation = tokenizer.decode(outputs[0], skip_special_tokens=True)
    return translation


def main():
    with gr.Blocks() as demo:
        gr.Markdown(title)
        gr.Markdown(description)
        with gr.Row():
            input_text = gr.Textbox(label="Input Text")
            output_text = gr.Textbox(label="🌐Aya", interactive=False)
        # Re-run the translation whenever the input text changes.
        input_text.change(fn=translate, inputs=input_text, outputs=output_text)
    demo.launch()


if __name__ == "__main__":
    main()
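
# --- Hedged usage sketch (illustrative, not part of the app) -------------------
# The description above mentions calling the Space via API. A minimal sketch
# using gradio_client is shown below, commented out so it never runs as part of
# this script. The Space id "Tonic/Aya-101" and the "/translate" api_name are
# assumptions; check the "Use via API" panel of the deployed Space for the
# actual values.
#
#   from gradio_client import Client
#
#   client = Client("Tonic/Aya-101")  # hypothetical Space id
#   result = client.predict("Bonjour le monde", api_name="/translate")  # hypothetical endpoint name
#   print(result)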