---
license: apache-2.0
inference: true
library_name: transformers
pipeline_tag: text-generation
metrics:
- rouge
- bleu
- bleurt
model-index:
- name: ibleducation/ibl-tutoring-7B-32k
results:
- task:
name: truthfulqa_gen
type: text-generation
dataset:
type: truthful_qa
name: Truthful QA
metrics:
- type: bleurt
name: bleurt_max
value: -0.4572
- type: bleurt
name: bleurt_acc
value: 0.4231
- type: bleurt
name: bleurt_diff
value: -0.0825
- type: bleu
name: bleu_max
value: 18.652
- type: bleu
name: bleu_acc
value: 0.4018
- type: bleu
name: bleu_diff
value: -2.2541
- type: rouge
name: rouge1_max
value: 40.0851
- type: rouge
name: rougeL_diff
value: -4.0046
datasets:
- ibleducation/ibl-best-practices-instructor-dataset
---
# ibleducation/ibl-tutoring-7B-32k
ibleducation/ibl-tutoring-7B-32k is a tutor model fine-tuned from [amazon/MistralLite](https://huggingface.co/amazon/MistralLite). The model is trained to respond like a professional teacher.
## Example Conversations
1. Question: I'm having trouble understanding this art movement. Can you help me? \
Answer:
```code
Absolutely! Let's explore the movement together. I'm here to support you. We can analyze the artists' techniques, their motivations, and find creative ways to connect with the movement. I'm here to provide guidance and encouragement. Remember, it's important to be open to different perspectives and interpretations. Let's have fun exploring the art movement!
```
2. Question: I am thinking of stopping school, do you think it is right? \
Answer:
```code
I understand that school can be challenging, but I believe that education is important. Let's work together to find ways to make school more enjoyable and less stressful. I'm here to support you
```
3. Question: How do I analyze algebraic expressions? \
Answer:
```code
Analyzing algebraic expressions can be challenging, but with patience and practice, you'll become proficient! Let's start by simplifying the expressions and identifying the terms. I'll be patient in guiding you through the steps and explaining the reasoning behind each simplification. We'll conquer this together. Remember, I'm here to support you and help you develop a deeper understanding of algebraic expressions. If you have any doubts or need further clarification, feel free to ask. My mission is to help you reach your goal of becoming proficient in analyzing algebraic expressions. Let's start by simplifying the expression 3x + 5y - 2x + 3y. We can combine like terms and simplify it to 5x + 3y. By continually adding tools to your toolbox, such as simplifying and combining like terms, you'll become a proficient analyzer of algebraic expressions. I'm here to provide guidance and support along the way. I'm excited to see your progress!
```
## Motivation of Developing ibl-tutoring-7B-32k
Students today use LLMs in their learning and research. However, most models are not trained to behave and respond to conversations with the virtues a teacher must possess. ibl-tutoring-7B-32k is fine-tuned
on top of amazon/MistralLite to alter its behaviour so that it converses the way a teacher should.
MistralLite was chosen because of its performance relative to Mistral-7B-Instruct, especially with regard to supported token length.
## Model Details
- **Developed by:** [IBL Education](https://ibl.ai)
- **Model type:** [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
- **Base Model:** [MistralLite](https://huggingface.co/amazon/MistralLite)
- **Language:** English
- **Finetuned from weights:** [MistralLite](https://huggingface.co/amazon/MistralLite)
- **Finetuned on data:**
- [ibleducation/ibl-best-practices-instructor-dataset](https://huggingface.co/datasets/ibleducation/ibl-best-practices-instructor-dataset)
- **Model License:** Apache 2.0
## How to Use ibl-tutoring-7B-32k from Python Code (HuggingFace transformers)
### Install the necessary packages
Requires: [transformers](https://pypi.org/project/transformers/) 4.34.0 or later, [flash-attn](https://pypi.org/project/flash-attn/) 2.3.1.post1 or later,
and [accelerate](https://pypi.org/project/accelerate/) 0.23.0 or later.
```shell
pip install transformers==4.34.0
pip install flash-attn==2.3.1.post1 --no-build-isolation
pip install accelerate==0.23.0
```
### You can then try the following example code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import transformers
import torch

model_id = "ibleducation/ibl-tutoring-7B-32k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    use_flash_attention_2=True,
    device_map="auto",
)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)

# Wrap the question in the model's prompt template (see below).
prompt = "<|prompter|>What makes a good teacher?</s><|assistant|>"
sequences = pipeline(
    prompt,
    max_new_tokens=400,
    do_sample=False,          # greedy decoding
    return_full_text=False,   # return only the generated answer
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(seq["generated_text"])
```
**Important** - Use the prompt template below for ibl-tutoring-7B-32k:
```
<|prompter|>{prompt}</s><|assistant|>
```