FM-1976 committed
Commit 6f7e7a4 · verified · Parent: 3156f53

Update README.md

Files changed (1):
  1. README.md +6 -2
README.md CHANGED
@@ -33,9 +33,11 @@ There is an open clash in dependencies versions between optimum-intel and openv
 
 
 So for the model conversion the only dependency you need is
-⚠️ TO BE FIXED
+
+
 ```
-pip install optimum-intel[openvino]
+pip install -U "openvino>=2024.3.0" "openvino-genai"
+pip install "torch>=2.1" "nncf>=2.7" "transformers>=4.40.0" "onnx<1.16.2" "optimum>=1.16.1" "accelerate" "datasets>=2.14.6" "git+https://github.com/huggingface/optimum-intel.git" --extra-index-url https://download.pytorch.org/whl/cpu
 ```
 This command will install, among others:
 ```
@@ -54,6 +56,8 @@ After the previous step you are enabled to run the following command (considerin
 ```bash
 optimum-cli export openvino --model NuExtract-1.5-tiny --task text-generation-with-past --trust-remote-code --weight-format int8 ov_NuExtract-1.5-tiny
 ```
+This will start the conversion and produce the following messages, without any fatal error:
+
 
 
 #### Dependencies required to run the model with `openvino-genai`
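
For context on the section this commit touches, a minimal sketch of running the exported model with `openvino-genai` is shown below. It assumes the `ov_NuExtract-1.5-tiny` directory produced by the `optimum-cli` command in the diff and the `openvino-genai` package from the updated install line; `LLMPipeline` is the standard openvino-genai Python API, while the NuExtract-style prompt, the `"CPU"` device choice, and the `max_new_tokens` value are illustrative assumptions, not part of this commit.

```python
# Sketch only: assumes the export step above has produced ./ov_NuExtract-1.5-tiny
# and that openvino-genai has been installed as shown in the updated README.
import openvino_genai as ov_genai

# Load the INT8 OpenVINO IR exported by optimum-cli and run it on the CPU device.
pipe = ov_genai.LLMPipeline("ov_NuExtract-1.5-tiny", "CPU")

# Hypothetical NuExtract-style prompt: an extraction template followed by the input text.
prompt = (
    "<|input|>\n"
    "### Template:\n"
    '{"name": "", "role": ""}\n'
    "### Text:\n"
    "Jane Doe is the lead engineer on the project.\n"
    "<|output|>\n"
)

# Generate with a small token budget; adjust for real documents.
print(pipe.generate(prompt, max_new_tokens=128))
```

Passing `"GPU"` instead of `"CPU"` would target an Intel GPU, provided the corresponding OpenVINO device plugin is available.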