Hi. Is it possible to run someone else’s model when its page shows “Hosted inference API: The Inference API has been disabled for this model”? Can I use “Inference Endpoints” or “PRO Spaces” for this instead? Thank you.
Here is an example: tiiuae/falcon-40b · Hugging Face
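For context, if deploying the model to a dedicated Inference Endpoint does turn out to work, I understand the endpoint would be queried roughly like this. This is only a sketch using the standard library; the endpoint URL and token below are placeholders, not real values.

```python
import json
import urllib.request

# Hypothetical values: replace with your deployed endpoint's URL and your HF token.
ENDPOINT_URL = "https://YOUR-ENDPOINT.endpoints.huggingface.cloud"
HF_TOKEN = "hf_xxx"

def build_request(prompt: str, max_new_tokens: int = 64) -> urllib.request.Request:
    """Build a POST request for a text-generation Inference Endpoint."""
    payload = json.dumps({
        "inputs": prompt,
        "parameters": {"max_new_tokens": max_new_tokens},
    }).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {HF_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def query(prompt: str):
    """Send the request and return the decoded JSON response."""
    req = build_request(prompt)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

So my question is mainly whether a model whose shared Inference API is disabled can still be deployed this way, or served from a PRO Space.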