Update README.md
README.md
CHANGED
@@ -60,6 +60,8 @@ model-index:
 ---
 
 # Kotoba-Whisper (v2.0)
 
+[**faster-whisper weight**](https://huggingface.co/kotoba-tech/kotoba-whisper-v2.0-faster), [**whisper.cpp weight**](https://huggingface.co/kotoba-tech/kotoba-whisper-v2.0-ggml), [**pipeline with stable-ts/punctuation**](https://huggingface.co/kotoba-tech/kotoba-whisper-v2.1)
+
 _Kotoba-Whisper_ is a collection of distilled [Whisper](https://arxiv.org/abs/2212.04356) models for Japanese ASR, developed through the collaboration between
 [Asahi Ushio](https://asahiushio.com) and [Kotoba Technologies](https://twitter.com/kotoba_tech).
 Following the original work of distil-whisper ([Robust Knowledge Distillation via Large-Scale Pseudo Labelling](https://arxiv.org/abs/2311.00430)),