ElanMT
This model is a pretrained checkpoint intended for fine-tuning on a large dataset. For general use cases, ElanMT-BT-en-ja is strongly recommended instead.
Model Details
This is a translation model based on the Marian MT 6-layer encoder-decoder transformer architecture with a SentencePiece tokenizer.
- Developed by: ELAN MITSUA Project / Abstract Engine
- Model type: Translation
- Source Language: English
- Target Language: Japanese
- License: CC BY-SA 4.0
Usage
Training Data
Training Procedure
Evaluation
Disclaimer
Translated results may be incorrect, harmful, or biased. The model was developed to investigate the performance achievable with only a relatively small, licensed corpus, and it is not suitable for use cases that require high translation accuracy. Under Section 5 of the CC BY-SA 4.0 license, ELAN MITSUA Project / Abstract Engine is not responsible for any direct or indirect loss caused by use of the model.