---
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
inference: true
tags:
  - pytorch
  - mistral
  - finetuned
---

# Mistral 7B - Holodeck

## Model Description

Mistral 7B-Holodeck is a finetune created using Mistral's 7B model.

## Training data

The training data contains around 3000 ebooks in various genres. Most parts of the dataset have been prepended using the following text: `[Genre: <genre1>, <genre2>]`
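
For illustration, a training sample with the genre header prepended might look like the sketch below. The genres and excerpt text are invented; only the `[Genre: ...]` pattern is taken from the description above.

```python
# Hypothetical sketch of the genre header described above being prepended
# to an ebook excerpt; the genres and text are invented for illustration.
genres = ["science fiction", "adventure"]
header = f"[Genre: {', '.join(genres)}]"
excerpt = "Chapter One\nThe shuttle dropped out of warp above a silent, grey moon."
sample = header + "\n\n" + excerpt
print(sample)
```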

## How to use

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:

```python
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/Mistral-7B-Holodeck-1')
>>> generator("Welcome Captain Janeway, I apologize for the delay.", do_sample=True, min_length=50)
[{'generated_text': 'Welcome Captain Janeway, I apologize for the delay."\nIt\'s all right," Janeway said. "I\'m certain that you\'re doing your best to keep me informed of what\'s going on."'}]
```
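
If you need more control than the pipeline helper offers, a minimal sketch using `AutoModelForCausalLM` and `AutoTokenizer` is shown below. The half-precision and `device_map="auto"` settings assume a CUDA GPU with enough memory and the `accelerate` package installed, and the genre-tagged prompt is only an illustration of the training-data format described above.

```python
# Minimal sketch: load the model directly instead of using the pipeline helper.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("KoboldAI/Mistral-7B-Holodeck-1")
model = AutoModelForCausalLM.from_pretrained(
    "KoboldAI/Mistral-7B-Holodeck-1",
    torch_dtype=torch.float16,  # assumes a GPU with enough memory for fp16 weights
    device_map="auto",          # requires the accelerate package
)

# Hypothetical prompt using the genre header format from the training data.
prompt = "[Genre: science fiction]\nWelcome Captain Janeway, I apologize for the delay."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, do_sample=True, max_new_tokens=100, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```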

## Limitations and Biases

Based on known problems with NLP technology, potential relevant factors include biases relating to gender, profession, race, and religion.