Model Card


Example Usage

from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

tokenizer = AutoTokenizer.from_pretrained('fineinstructions/query_templatizer_full', revision=None) # Load tokenizer
tokenizer.padding_side = 'left'
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # Fall back to EOS if the tokenizer defines no pad token
model = AutoModelForCausalLM.from_pretrained('fineinstructions/query_templatizer_full', revision=None) # Load model
pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, pad_token_id=tokenizer.pad_token_id, return_full_text=False)

inputs = ['ok now can you give 3 very speculative ideas on how to achieve unidirectional movement that results in more energy than input using magnets and/or ZPF extraction, as simple setups?']
print(pipe(inputs, max_length=131072, do_sample=False))
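With a list of prompts, the pipeline returns one list of generations per prompt, and since return_full_text=False is set, each entry contains only the newly generated text. A minimal sketch of extracting that text (this relies on the standard transformers text-generation pipeline output format, not anything specific to this model):

outputs = pipe(inputs, max_length=131072, do_sample=False)
for prompt, generations in zip(inputs, outputs):  # One list of generations per input prompt
    generated_text = generations[0]['generated_text']  # Single greedy generation per prompt
    print(generated_text)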

This model was trained on a synthetic dataset generated with DataDreamer 🤖💤. The synthetic dataset card and model card can be found here. The training arguments can be found here.

Model size: 1.24B parameters (Safetensors) · Tensor types: F32, BF16, I8
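To reduce memory usage, the model can also be loaded in half precision; a minimal sketch, assuming the standard transformers torch_dtype argument (the BF16 tensor type listed above suggests this is supported, but that is an assumption, not something stated in this card):

import torch
from transformers import AutoModelForCausalLM

# Load the weights in bfloat16 (roughly half the memory of float32); assumes bfloat16-capable hardware
model = AutoModelForCausalLM.from_pretrained('fineinstructions/query_templatizer_full', torch_dtype=torch.bfloat16)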
