Error while deserializing header: HeaderTooLarge
#12 opened by PirateWolf
Hello, I'm trying to run Revision for SDXL in a Google Colab instance. I'm using the following notebook on a T4 GPU instance with high-RAM (Colab Pro): https://colab.research.google.com/github/comfyanonymous/ComfyUI/blob/master/notebooks/comfyui_colab.ipynb
Here's a screenshot of my workflow:
It doesn't work as expected as I'm getting the following error:
loading new
!!! Exception during processing !!!
Traceback (most recent call last):
  File "/content/drive/MyDrive/ComfyUI/execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "/content/drive/MyDrive/ComfyUI/execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "/content/drive/MyDrive/ComfyUI/execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "/content/drive/MyDrive/ComfyUI/nodes.py", line 727, in load_clip
    clip_vision = comfy.clip_vision.load(clip_path)
  File "/content/drive/MyDrive/ComfyUI/comfy/clip_vision.py", line 78, in load
    sd = load_torch_file(ckpt_path)
  File "/content/drive/MyDrive/ComfyUI/comfy/utils.py", line 11, in load_torch_file
    sd = safetensors.torch.load_file(ckpt, device=device.type)
  File "/usr/local/lib/python3.10/dist-packages/safetensors/torch.py", line 309, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
Solved: the downloaded model was somehow corrupted. I re-downloaded it and now it works.
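For anyone landing here later: a safetensors file starts with an 8-byte little-endian integer giving the length of the JSON header that follows it. A truncated download, or an HTML error page saved under a .safetensors name, produces a nonsense length, which is one way this surfaces as HeaderTooLarge. Here is a minimal sketch to sanity-check a file before loading it; the filename is a placeholder for whatever model you downloaded:

import json
import os
import struct

def check_safetensors(path):
    """Rough integrity check for a .safetensors file.

    The format begins with an 8-byte little-endian unsigned integer
    holding the byte length of the JSON header that follows.
    """
    size = os.path.getsize(path)
    if size < 8:
        return "file too small to be a safetensors file"
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))
        if header_len > size - 8:
            # A claimed header length bigger than the file itself is a
            # classic sign of a truncated or corrupted download.
            return f"header length {header_len} exceeds file size {size} (corrupt download?)"
        try:
            json.loads(f.read(header_len))
        except ValueError:
            return "header is not valid JSON (corrupt download?)"
    return "header looks OK"

# Placeholder filename; point this at your downloaded model.
print(check_safetensors("clip_vision_g.safetensors"))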
PirateWolf changed discussion status to closed
I am having the same issue while loading an adapter in PEFT.
How do I checksum my model? My download wasn't interrupted, but I have the same question. I am a beginner, looking for help.
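One way to checksum a download is to compute its SHA-256 and compare it against the hash shown for that file on the model page on the Hub. A minimal sketch; the filename is a placeholder for your adapter file:

import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in chunks so large checkpoints don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Placeholder filename; compare the output to the hash on the file's Hub page.
print(sha256_of("adapter_model.safetensors"))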
Hi, how can I help?