Update README.md
README.md CHANGED
@@ -31,7 +31,7 @@ Try the 3.8B model here: [Playground](https://huggingface.co/spaces/numind/NuExt
⚠️ We recommend using NuExtract with a temperature at or very close to 0. Some inference frameworks, such as Ollama, use a default of 0.7, which is not well suited to pure extraction tasks.
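With the openvino-genai runtime, for example, staying at temperature 0 amounts to forcing greedy decoding. A minimal sketch, assuming the int8 IR has already been downloaded locally (directory name and prompt are placeholders):

```
import openvino_genai as ov_genai

# Load the int8 OpenVINO IR from a local directory (name is a placeholder)
pipe = ov_genai.LLMPipeline("NuExtract-ov-int8", "CPU")

config = ov_genai.GenerationConfig()
config.max_new_tokens = 512
config.do_sample = False  # greedy decoding, i.e. effectively temperature 0

print(pipe.generate("<NuExtract template + input text>", config))
```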
-## This is the OpenVINO
+## This is the OpenVINO IR format of the model, quantized in int8
The model was created with the Optimum-Intel library CLI command.
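For reference, an int8 export with that CLI typically looks like the following; the model id and output directory are placeholders, not necessarily the exact command used for this repository:

```
# --weight-format int8 requests 8-bit weight compression during export
optimum-cli export openvino --model <source_model_id> --weight-format int8 <output_dir>
```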
#### Dependencies required to create the model
There is an open clash in dependency versions between optimum-intel and openvino-genai.
@@ -45,6 +45,8 @@ So for the model conversion the only dependency you need is
pip install -U "openvino>=2024.3.0" "openvino-genai"
pip install "torch>=2.1" "nncf>=2.7" "transformers>=4.40.0" "onnx<1.16.2" "optimum>=1.16.1" "accelerate" "datasets>=2.14.6" "git+https://github.com/huggingface/optimum-intel.git" --extra-index-url https://download.pytorch.org/whl/cpu
```
+The instructions are from the amazing [OpenVINO notebooks](https://docs.openvino.ai/2024/notebooks/llm-question-answering-with-output.html#prerequisites)<br>
+A vanilla pip install will create clashes among dependencies/versions.<br>
This command will install, among others:
```
tokenizers==0.20.3