classification probability
hi,
is it possible to output a probability for "safe" and "unsafe", in order to understand how confident the classification as safe is?
Thanks,
Gerald
cc @jfchi
Can you try the approach listed here for obtaining log probabilities: https://discuss.huggingface.co/t/announcement-generation-get-probabilities-for-generated-output/30075
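For reference, here is a minimal sketch of that approach, assuming a causal-LM style classifier that generates "safe"/"unsafe" as text (the model id below is a placeholder for this repo's checkpoint, not the official name):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model id -- replace with the checkpoint you are actually using.
model_id = "org/safety-classifier"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "..."  # your formatted prompt/conversation here
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Ask generate() to return the per-step scores alongside the sequences.
outputs = model.generate(
    **inputs,
    max_new_tokens=10,
    output_scores=True,
    return_dict_in_generate=True,
)

# Log-probabilities of each generated token given its preceding context.
transition_scores = model.compute_transition_scores(
    outputs.sequences, outputs.scores, normalize_logits=True
)

# Scores are log-probs, so exponentiate to get probabilities.
generated_tokens = outputs.sequences[:, inputs["input_ids"].shape[1]:]
for token, score in zip(generated_tokens[0], transition_scores[0]):
    print(f"{tokenizer.decode(token)!r}: p = {score.exp().item():.4f}")
```

Exponentiating the normalized score gives the probability the model assigned to the token it actually generated; to compare "safe" vs "unsafe" directly, you could instead softmax `outputs.scores[0]` and look up both label token ids.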
hi @litesaber, are you referring to compute_transition_scores? How do we get the final probabilities from it? Could you please give an example?
Yeah, I agree. Is there an example of how we could get the probability score using a Hugging Face Inference Endpoint?
@litesaber I tried creating my own handler.py to compute the scores, but I haven't been able to get it working when deploying my own fork of this repo with the custom handler. Do you have an example of how we can get those scores when deploying a Dedicated Inference Endpoint?
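Not an official answer, but here is a rough sketch of what such a handler.py could look like, combining the Inference Endpoints custom handler interface (EndpointHandler) with the compute_transition_scores approach above; the response schema is just my own assumption:

```python
# handler.py -- sketch of a custom handler for a Dedicated Inference Endpoint.
from typing import Any, Dict, List

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


class EndpointHandler:
    def __init__(self, path: str = ""):
        # `path` points to the repository the endpoint was deployed from.
        self.tokenizer = AutoTokenizer.from_pretrained(path)
        self.model = AutoModelForCausalLM.from_pretrained(
            path, torch_dtype=torch.float16, device_map="auto"
        )

    def __call__(self, data: Dict[str, Any]) -> List[Dict[str, Any]]:
        prompt = data["inputs"]
        inputs = self.tokenizer(prompt, return_tensors="pt").to(self.model.device)

        outputs = self.model.generate(
            **inputs,
            max_new_tokens=10,
            output_scores=True,
            return_dict_in_generate=True,
        )
        transition_scores = self.model.compute_transition_scores(
            outputs.sequences, outputs.scores, normalize_logits=True
        )

        generated = outputs.sequences[:, inputs["input_ids"].shape[1]:]
        text = self.tokenizer.decode(generated[0], skip_special_tokens=True)
        token_probs = [
            {"token": self.tokenizer.decode(tok), "probability": score.exp().item()}
            for tok, score in zip(generated[0], transition_scores[0])
        ]
        # Illustrative response format -- shape it however your client expects.
        return [{"generated_text": text, "token_probabilities": token_probs}]
```

As far as I know, the handler.py (plus any extra requirements) has to sit at the root of the forked repo for the endpoint to pick it up.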