---
license: mit
datasets:
- tatsu-lab/alpaca
tags:
- generated_from_trainer
- text2text-generation
model-index:
- name: T5R-base
results: []
pipeline_tag: text2text-generation
language:
- en
widget:
- text: |
Instruction: X
Output: Adolf Hitler (German: [ˈadɔlf ˈhɪtlɐ] (listen); 20 April 1889 – 30 April 1945) was an Austrian-born German politician who was the dictator of Germany from 1933 until his suicide in 1945. He rose to power as the leader of the Nazi Party,[a] becoming the chancellor in 1933 and then taking the title of Führer und Reichskanzler in 1934.[b] During his dictatorship, he initiated World War II in Europe by invading Poland on 1 September 1939. He was closely involved in military operations throughout the war and was central to the perpetration of the Holocaust: the genocide of about six million Jews and millions of other victims.
X:
example_title: Example 1
- text: |
Instruction: X
Output: 1- Base your meals on higher fibre starchy carbohydrates. 2- Eat lots of fruit and veg. 3- Eat more fish, including a portion of oily fish.
What kind of instruction could this be the answer to?
X:
example_title: Example 2
---
# T5-Reverse (T5R)
This model can generate prompts (instructions) for any text!
This model is an instruction-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on the [Alpaca dataset](https://huggingface.co/datasets/tatsu-lab/alpaca), but in **reverse format**: instead of mapping an instruction to its output, it maps an output back to the instruction that could have produced it.
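To make the reverse format concrete, here is a minimal sketch of how an Alpaca-style record could be flipped into a (source, target) pair. The prompt template is inferred from the widget examples on this page, and the `to_reverse_pair` helper and the sample record are purely illustrative; the exact preprocessing used for training may differ.

```python
# Illustrative sketch (not the official preprocessing): flip an Alpaca-style record
# into a reverse-format (source, target) pair, following the template shown in the
# widget examples above.

def to_reverse_pair(record: dict) -> dict:
    # The model reads the output text and is asked to recover the instruction "X".
    source = (
        "Instruction: X\n"
        f"Output: {record['output']}\n"
        "What kind of instruction could this be the answer to?\n"
        "X:"
    )
    # The target is the original instruction the output was written for.
    target = f"Instruction: {record['instruction']}"
    return {"source": source, "target": target}

example = {
    "instruction": "Give three tips for staying healthy.",
    "output": "1. Eat a balanced diet. 2. Exercise regularly. 3. Get enough sleep.",
}
print(to_reverse_pair(example)["source"])
```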
## How to Use the Model
You can use the `transformers` library to load the T5-Reverse (T5R) model and generate an instruction (prompt) for a given text. Here's an example:
```python
>>> # Import required libraries
>>> import torch
>>> from transformers import pipeline
>>> # Load the model and tokenizer using the pipeline from Hugging Face Hub
>>> inference = pipeline("text2text-generation", model="kargaranamir/T5R-base")
>>> # Example instruction and prompt
>>> sample = '''
... Instruction: X
... Output: 1- Base your meals on higher fibre starchy carbohydrates. 2- Eat lots of fruit and veg. 3- Eat more fish, including a portion of oily fish.
... What kind of instruction could this be the answer to?
... X:
... '''
>>> # Generate a response using the model
>>> res = inference(sample)
>>> # Print the generated response
>>> print(res)
[{'generated_text': 'Instruction: Generate three recommendations for a healthy diet.'}]
```
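If you want to apply the model to arbitrary text, a small wrapper that builds the prompt template for you can be convenient. The sketch below is a hedged suggestion, not part of the official model card: the `generate_instruction` helper and the `max_length` value are illustrative choices.

```python
# Hypothetical convenience wrapper: builds the reverse-format prompt for any output
# text and returns the instruction guessed by the model.
from transformers import pipeline

inference = pipeline("text2text-generation", model="kargaranamir/T5R-base")

def generate_instruction(output_text: str, max_length: int = 64) -> str:
    prompt = (
        "Instruction: X\n"
        f"Output: {output_text}\n"
        "What kind of instruction could this be the answer to?\n"
        "X:\n"
    )
    result = inference(prompt, max_length=max_length)
    return result[0]["generated_text"]

print(generate_instruction(
    "1- Base your meals on higher fibre starchy carbohydrates. "
    "2- Eat lots of fruit and veg. 3- Eat more fish, including a portion of oily fish."
))
```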
## Citation
If you find this model or approach useful, please cite it by linking to this Hugging Face model page.