ibleducation/ibl-tutoring-7B-128k

ibleducation/ibl-tutoring-7B-128k is a model finetuned on top of openchat/openchat_3.5.

This model is finetuned to respond in a manner befitting a professional teacher. It is finetuned to exhibit characteristics and virtues such as compassion, encouragement, and friendliness.

Example Conversations

  1. Question: What is the formula for calculating the area of a triangle?
    Answer:

    Excellent question! The formula for calculating the area of a triangle is 1/2 times the base times the height. Remember, the base is the length of the bottom side, and the height is the perpendicular distance from the base to the opposite vertex. Let's work through an example together!
    
  2. Question: Can you explain the concept of gravity?
    Answer:

    Certainly! Gravity is the force that attracts objects towards each other. It's what keeps us grounded on Earth. Would you like me to give you some examples to help you understand it better?
    

Model Details

How to Get Started with the Model

Install the necessary packages

Requires transformers 4.35.0 or later, accelerate 0.23.0 or later, and flash_attn.

pip install transformers==4.35.0
pip install accelerate==0.23.0
pip install flash_attn
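
To verify the environment before loading the model, an optional sanity check:

import transformers, accelerate, flash_attn  # all three should import without error
print(transformers.__version__)              # expect 4.35.0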

You can then try the following example code:

from transformers import AutoModelForCausalLM, AutoTokenizer
import transformers
import torch

model_id = "ibleducation/ibl-tutoring-7B-128k"

# Load the tokenizer and model; bfloat16 and flash attention 2 keep memory
# usage manageable, and device_map="auto" places the weights on available GPUs.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
  model_id,
  use_flash_attention_2=True,
  torch_dtype=torch.bfloat16,
  device_map="auto",
  trust_remote_code=True
)
pipe = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)
# Wrap the prompt in the model's template (see the prompt template section below).
prompt = "<s>What makes a good teacher?</s>"

# The pipeline returns a list with one dict per generated sequence.
response = pipe(prompt)
print(response[0]['generated_text'])

In cases where the runtime GPU does not support flash attention, the use_flash_attention_2 argument can be omitted, though at a possible performance cost.
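
For example, a minimal sketch of the same load without flash attention (all other arguments unchanged):

# Fallback loading for GPUs without flash attention support.
# Generation still works, just potentially more slowly on long contexts.
model = AutoModelForCausalLM.from_pretrained(
  model_id,
  torch_dtype=torch.bfloat16,
  device_map="auto",
  trust_remote_code=True
)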

Important - Use the prompt template below for ibl-tutoring-7B-128k:

<s>{prompt}</s>
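
To avoid wrapping prompts by hand, a small helper can apply the template before generation; the format_prompt function below is illustrative, not part of the model's API:

# Hypothetical helper that wraps a question in the <s>...</s> template
# expected by ibl-tutoring-7B-128k.
def format_prompt(question: str) -> str:
    return f"<s>{question}</s>"

# Reuses the pipe object from the example above.
response = pipe(format_prompt("Can you explain the concept of gravity?"))
print(response[0]['generated_text'])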