---
base_model:
  - openai-community/gpt2
language:
  - en
  - ta
license: mit
tags:
  - gpt2
  - text-generation
  - QnQ
datasets:
  - varshil27/1mg-train-data-LLama2-formatted
  - karthikqnq/1mgdataset
  - anjandash/java-8m-methods-v2
metrics:
  - accuracy
---

QnQGPT Model

QnQGPT is a custom text-generation model built on the GPT-2 architecture, covering English and Tamil.

Model Details

  • Model Type: GPT-2
  • Base Model: openai-community/gpt2
  • Training Data: [Describe your training data]
  • Use Cases: [Describe intended use cases]
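
To confirm the architecture programmatically, the model configuration can be inspected directly. A minimal sketch, assuming the repository id used in the Usage section below:

from transformers import AutoConfig

# Load the configuration from the Hub (repository id taken from the
# Usage example below) and print the key architecture fields.
config = AutoConfig.from_pretrained("karthikqnq/qnqgpt")
print(config.model_type)                      # expected: "gpt2"
print(config.n_layer, config.n_head, config.n_embd)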

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("karthikqnq/qnqgpt")
tokenizer = AutoTokenizer.from_pretrained("karthikqnq/qnqgpt")

# Generate a short continuation of the prompt
text = "Hello, how are"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=50,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
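
The text-generation pipeline is a more compact alternative that handles tokenization and decoding internally. A minimal sketch; the sampling parameters are illustrative, not values the model was tuned with:

from transformers import pipeline

# Wrap the model and tokenizer in a single text-generation pipeline
generator = pipeline("text-generation", model="karthikqnq/qnqgpt")

# Sampling settings below are illustrative, not recommended defaults
outputs = generator(
    "Hello, how are",
    max_new_tokens=50,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(outputs[0]["generated_text"])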

Training Details

[Add your training details here]
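
Until the actual procedure is documented, the snippet below is only a minimal sketch of a standard causal language modeling fine-tune with the Hugging Face Trainer, using one of the datasets listed in the card metadata. The "text" column name, hyperparameters, and output directory are assumptions for illustration, not a record of the real training run.

from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# One of the datasets listed in the card metadata; the "text" column
# and every hyperparameter below are assumptions, not documented values.
dataset = load_dataset("karthikqnq/1mgdataset", split="train")

tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token
model = AutoModelForCausalLM.from_pretrained("openai-community/gpt2")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="qnqgpt2-finetuned",
        per_device_train_batch_size=8,
        num_train_epochs=3,
        learning_rate=5e-5,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()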

Limitations

[Add model limitations here]

License

This model is released under the MIT License.