---
language:
- en
license: mit
tags:
- nlp
- code
- mlx
license_link: https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE
pipeline_tag: text-generation
---

![Alt text](https://cdn.discordapp.com/attachments/989904887330521099/1203045175794868354/ChatGPT_Image.webp?ex=65cfaa21&is=65bd3521&hm=0861aae13f3b3960954b9f5938aa74bdee2afb3daee706064b47ee3fb75c6e8c&)

# phi-2-dpo-7k

This is a fine-tuned version of [`microsoft/phi-2`](https://huggingface.co/microsoft/phi-2), trained on a cocktail of 7k chat interactions from Argilla's latest DPO datasets: orca pairs, ultrafeedback ratings, and capybara-dpo.

Refer to the [original model card](https://huggingface.co/microsoft/phi-2) for more details on the model.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Load the model and tokenizer from the Hugging Face Hub
model, tokenizer = load("mlx-community/phi-2-dpo-7k")

response = generate(
    model,
    tokenizer,
    prompt="Similarity between pizza and quantum mechanics, in few words, use rhymes\nOutput: ",
    max_tokens=100,
    verbose=True,
)
```
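
The base phi-2 model documents an `Instruct: <prompt>\nOutput:` QA prompt format. The sketch below assumes this fine-tune keeps that format (the card itself does not specify a chat template), and reuses the same `load`/`generate` calls as above:

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/phi-2-dpo-7k")

# Assumption: the fine-tune keeps phi-2's "Instruct: ... / Output:" prompt style
prompt = "Instruct: Summarize DPO fine-tuning in two sentences.\nOutput:"

# generate returns the completion as a string; verbose=True also streams it to stdout
response = generate(model, tokenizer, prompt=prompt, max_tokens=100, verbose=True)
print(response)
```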