---
license: apache-2.0
datasets:
- asset
- wi_locness
- GEM/wiki_auto_asset_turk
- discofuse
- zaemyung/IteraTeR_plus
language:
- en
metrics:
- sari
- bleu
- accuracy
---
# Model Card for CoEdIT-XL
This model was obtained by fine-tuning google/flan-t5-xl on the CoEdIT dataset.
Paper: CoEdIT: Text Editing by Task-Specific Instruction Tuning

Authors: Vipul Raheja, Dhruv Kumar, Ryan Koo, Dongyeop Kang
## Model Details

### Model Description
- Language(s) (NLP): English
- Finetuned from model: google/flan-t5-xl
### Model Sources
- Repository: https://github.com/vipulraheja/coedit
- Paper: CoEdIT: Text Editing by Task-Specific Instruction Tuning
## How to use

We make the models presented in our paper available:
| Model | Number of parameters |
| --- | --- |
| CoEdIT-large | 770M |
| CoEdIT-xl | 3B |
| CoEdIT-xxl | 11B |
## Uses

### Direct Use
[More Information Needed]
### Downstream Use
[More Information Needed]
### Out-of-Scope Use
[More Information Needed]
## Bias, Risks, and Limitations
[More Information Needed]
### Recommendations

Users (both direct and downstream) should be made aware of the model's risks, biases, and limitations. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
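A minimal sketch using Hugging Face `transformers`. It assumes the checkpoint is published under the hypothetical model ID `grammarly/coedit-xl`; substitute the ID of the checkpoint you actually use. CoEdIT is instruction-tuned, so the edit instruction is prepended to the input text before generation:

```python
# Minimal sketch: run an editing instruction through CoEdIT-XL with transformers.
# "grammarly/coedit-xl" is an assumed model ID; replace it with your checkpoint.
from transformers import AutoTokenizer, T5ForConditionalGeneration

MODEL_ID = "grammarly/coedit-xl"


def edit_text(instruction: str, text: str, max_length: int = 256) -> str:
    """Prepend the task instruction to the input text and generate the edited version."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = T5ForConditionalGeneration.from_pretrained(MODEL_ID)
    # CoEdIT-style prompt: "<instruction>: <text to edit>"
    input_ids = tokenizer(f"{instruction}: {text}", return_tensors="pt").input_ids
    outputs = model.generate(input_ids, max_length=max_length)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(edit_text(
        "Fix grammatical errors in this sentence",
        "When I grow up, I start to understand what he said is quite right.",
    ))
```

The instruction phrasing (e.g. "Fix grammatical errors in this sentence", "Paraphrase this sentence") selects the editing task; see the CoEdIT repository for the task instructions used during training.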
## Software
https://github.com/vipulraheja/coedit
## Citation

**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]