I am trying to accelerate an NLP pipeline using HuggingFace transformers and ONNX Runtime. I ran into the following error: InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Got invalid dimensions for input: input_ids for the following indices.
I would appreciate it if you could show me how to run the model via ONNX Runtime.
from pathlib import Path

import numpy as np
from transformers import BertTokenizerFast
from transformers.convert_graph_to_onnx import convert
from onnxruntime import ExecutionMode, InferenceSession, SessionOptions
# convert the HuggingFace model to ONNX
tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
convert(framework="tf", model="bert-base-cased", output=Path("bert-base-cased.onnx"), tokenizer=tokenizer, opset=11)
# create the InferenceSession
options = SessionOptions()
options.intra_op_num_threads = 1
options.execution_mode = ExecutionMode.ORT_SEQUENTIAL
session = InferenceSession("bert-base-cased.onnx", options)
session.disable_fallback()
# tokenize the input to feed to the model
model_inputs = tokenizer("My Name is BERT", return_tensors="pt", return_token_type_ids=False)
inputs_onnx = {k: v.cpu().detach().numpy().astype(np.int32) for k, v in model_inputs.items() if k == 'input_ids'}
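For reference, the dict comprehension above just produces a single-entry feed dict. A numpy-only sketch (with made-up token ids standing in for the real tokenizer output, which I have not reproduced here) shows the shape being fed to the session:

```python
import numpy as np

# Hypothetical token ids for "My Name is BERT" (8 tokens including
# [CLS] and [SEP]); the real values come from the tokenizer.
token_ids = [[101, 1422, 10208, 1110, 139, 17775, 1942, 102]]

# Mimic the feed dict built above: keep only input_ids, cast to int32.
inputs_onnx = {"input_ids": np.array(token_ids).astype(np.int32)}

print(inputs_onnx["input_ids"].shape)  # (1, 8): batch of 1, sequence of 8
```

So the session receives a (1, 8) tensor, which matches the "Got: 8" in the error below.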
#run
output, pooled = session.run(None, inputs_onnx)
#Error
---------------------------------------------------------------------------
InvalidArgument Traceback (most recent call last)
/var/folders/g6/dqc__5mn6gg8q67n6zdmqsxc0000gn/T/ipykernel_7924/281882707.py in <module>
----> 1 output, pooled = session.run(None, inputs_onnx)
/opt/anaconda3/envs/pico_tf_torch/lib/python3.7/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py in run(self, output_names, input_feed, run_options)
186 output_names = [output.name for output in self._outputs_meta]
187 try:
--> 188 return self._sess.run(output_names, input_feed, run_options)
189 except C.EPFail as err:
190 if self._enable_fallback:
InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Got invalid dimensions for input: input_ids for the following indices
index: 1 Got: 8 Expected: 5
Please fix either the inputs or the model.
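One thing I considered (I am not sure it is the right fix): since the exported graph seems to expect a static sequence length of 5, the feed could be padded or truncated to that length before calling `session.run`. A numpy-only sketch, assuming a pad token id of 0 and a hypothetical helper `fit_to_length` that I wrote for this:

```python
import numpy as np

def fit_to_length(ids: np.ndarray, target_len: int, pad_id: int = 0) -> np.ndarray:
    """Pad (with pad_id) or truncate a (batch, seq) id array to target_len."""
    batch, seq = ids.shape
    if seq >= target_len:
        return ids[:, :target_len]
    padding = np.full((batch, target_len - seq), pad_id, dtype=ids.dtype)
    return np.concatenate([ids, padding], axis=1)

ids = np.arange(8, dtype=np.int32).reshape(1, 8)   # stand-in for input_ids
print(fit_to_length(ids, 5).shape)   # (1, 5): truncated
print(fit_to_length(ids, 12).shape)  # (1, 12): padded with zeros
```

Truncating to 5 would of course drop tokens, so if the static shape is the real problem I would rather re-export the model with a dynamic sequence axis; this sketch is just to confirm my understanding of the dimension check.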