Error "scb10x/typhoon-2-llama31-8b-instruct-beta-v1"

#3
by xJohn - opened

Hi,
I get an error when loading the main model:
"OSError: scb10x/typhoon-2-llama31-8b-instruct-beta-v1 is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'".
How can I solve this?
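For reference, a minimal sketch that reproduces the same OSError: the identifier referenced in the error no longer resolves to a repo on the Hub, so any attempt to fetch it fails.

```python
from transformers import AutoModel

# The pre-release identifier below does not exist on the Hub,
# so resolving it raises the OSError shown above.
AutoModel.from_pretrained("scb10x/typhoon-2-llama31-8b-instruct-beta-v1")
```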

I tried changing from "scb10x/typhoon-2-llama31-8b-instruct-beta-v1" to "scb10x/llama3.1-typhoon2-8b-instruct" and it worked.
https://huggingface.co/scb10x/llama3.1-typhoon2-8b-instruct

Changed self.llama_base_model in configuration_typhoon2audio.py
Changed llama_base_model in config.json
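A minimal sketch of the same workaround done in code instead of editing the files, assuming the main repo is scb10x/llama3.1-typhoon2-audio-8b-instruct and that its config exposes the llama_base_model field mentioned above (adjust the names to your setup):

```python
from transformers import AutoConfig, AutoModel

REPO = "scb10x/llama3.1-typhoon2-audio-8b-instruct"  # assumed main repo name

# Load the remote config, point its base-LLM field at the released model,
# then load the main model with the patched config.
config = AutoConfig.from_pretrained(REPO, trust_remote_code=True)
config.llama_base_model = "scb10x/llama3.1-typhoon2-8b-instruct"

model = AutoModel.from_pretrained(REPO, config=config, trust_remote_code=True)
```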

@tzsoulcap Did you manage to quantize this model successfully?
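In case it helps, a hedged sketch of a 4-bit load via bitsandbytes; I have not verified that the custom audio layers quantize cleanly, and the repo name is assumed:

```python
import torch
from transformers import AutoModel, BitsAndBytesConfig

# Hypothetical 4-bit quantized load; untested with the custom Typhoon2-Audio code.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModel.from_pretrained(
    "scb10x/llama3.1-typhoon2-audio-8b-instruct",  # assumed repo name
    quantization_config=bnb_config,
    trust_remote_code=True,
)
```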

SCB 10X org

Thank you for pointing that out. We mistakenly specified a pre-release model version. It's now fixed in the latest commit.

kunato changed discussion status to closed
