---
base_model: unsloth/phi-4-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- trl
- phi
- text-generation
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** Ishika08
- **License:** apache-2.0
- **Finetuned from model:** unsloth/phi-4-unsloth-bnb-4bit

This phi model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
## How to Use the Model for Inference
You can run inference with the model via the Hugging Face Inference API by following the steps below:
### 1. Install Required Libraries
Ensure that the `requests` library is installed (and `huggingface_hub` if you want to use the `InferenceClient` approach shown later):
```bash
pip install requests huggingface_hub
```
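Rather than hardcoding your Hugging Face access token in the snippets below, you can read it from an environment variable at runtime. This is a small convenience sketch; the variable name `HF_TOKEN` is just a convention, not something the API requires:

```python
import os

# Read the access token from the environment (set it beforehand in your shell,
# e.g. `export HF_TOKEN=...`) so it never has to be hardcoded in the script.
HF_TOKEN = os.environ["HF_TOKEN"]
```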
### 2. Query the Model via the Hugging Face Inference API
```python
import requests

# API URL for the model hosted on Hugging Face
API_URL = "https://api-inference.huggingface.co/models/Ishika08/phi-4_fine-tuned_mdl"

# Set up your Hugging Face API token
HF_TOKEN = "YOUR_HF_TOKEN"  # replace with your access token (or read it from the environment as shown above)
HEADERS = {"Authorization": f"Bearer {HF_TOKEN}"}

# The input you want to pass to the model
payload = {
    "inputs": "What is the capital of France? Tell me some of the tourist places in bullet points."
}

# Make the request to the API
response = requests.post(API_URL, headers=HEADERS, json=payload)

# Print the response from the model
print(response.json())
```

Example output:
```json
{
    "generated_text": "Paris is the capital of France. Some of the famous tourist places include:\n- Eiffel Tower\n- Louvre Museum\n- Notre-Dame Cathedral\n- Sacré-Cœur Basilica"
}
```
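The Inference API also accepts a `parameters` object alongside `inputs`, which lets you control text generation. The sketch below is illustrative; the specific values (`max_new_tokens`, `temperature`) are not tuned for this model:

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/Ishika08/phi-4_fine-tuned_mdl"
HEADERS = {"Authorization": "Bearer YOUR_HF_TOKEN"}  # replace with your access token

# Generation settings go in the "parameters" field of the payload
payload = {
    "inputs": "What is the capital of France? Tell me some of the tourist places in bullet points.",
    "parameters": {
        "max_new_tokens": 200,      # cap the length of the generated text
        "temperature": 0.7,         # lower values give more deterministic output
        "return_full_text": False,  # return only the newly generated text, not the prompt
    },
}

response = requests.post(API_URL, headers=HEADERS, json=payload)
print(response.json())
```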
### 3. Use the `InferenceClient` from `huggingface_hub`
```python
from huggingface_hub import InferenceClient

# Initialize the client with the model name and your Hugging Face token
client = InferenceClient(model="Ishika08/phi-4_fine-tuned_mdl", token="YOUR_HF_TOKEN")

# Perform inference (text generation in this case)
response = client.text_generation("What is the capital of France? Tell me about Eiffel Tower history in bullet points.")

# Print the response from the model
print(response)
```
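`InferenceClient.text_generation` also accepts decoding options directly as keyword arguments. Again, the values below are only illustrative:

```python
from huggingface_hub import InferenceClient

client = InferenceClient(model="Ishika08/phi-4_fine-tuned_mdl", token="YOUR_HF_TOKEN")

# Keyword arguments control decoding; these values are examples, not tuned settings
response = client.text_generation(
    "What is the capital of France? Tell me about Eiffel Tower history in bullet points.",
    max_new_tokens=200,  # cap the length of the generated text
    temperature=0.7,     # lower values give more deterministic output
)
print(response)
```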
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)