# stablelm-zephyr-3B-localmentor-GGUF

- Model creator: remyxai
- Original model: stablelm-zephyr-3B_localmentor
- GGUF quantization: llama.cpp, commit `fadde6713506d9e6c124f5680ab8c7abebe31837`
## Description

Fine-tuned with low-rank adapters on 25K conversational turns discussing tech and startups, drawn from over 800 podcast episodes.
- Developed by: Remyx.AI
- License: apache-2.0
- Finetuned from model: stablelm-zephyr-3b
- Repository: https://github.com/remyxai/LocalMentor
## Prompt Template

Following `tokenizer_config.json`, the prompt template is Zephyr:
```
<|system|>
{system_prompt}</s>
<|user|>
{prompt}</s>
<|assistant|>
```
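As a minimal sketch, the template above can be filled in programmatically before passing the result to llama.cpp. The helper name and the example system/user strings below are illustrative assumptions, not part of the model card:

```python
# Build a Zephyr-style prompt string matching the template above.
# The function name and example strings are illustrative, not from
# the original model card.
def build_zephyr_prompt(system_prompt: str, prompt: str) -> str:
    return (
        f"<|system|>\n{system_prompt}</s>\n"
        f"<|user|>\n{prompt}</s>\n"
        f"<|assistant|>\n"
    )


if __name__ == "__main__":
    text = build_zephyr_prompt(
        "You are a helpful startup mentor.",
        "How do I validate a product idea?",
    )
    print(text)
```

The resulting string can be supplied as the raw prompt (e.g. via `llama-cli -p "..."`), since GGUF inference does not apply the chat template for you when you pass plain text.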
## Model tree for mgonzs13/stablelm-zephyr-3B-localmentor-GGUF

- Base model: remyxai/stablelm-zephyr-3B_localmentor