---
license: unknown
datasets:
- KorQuAD/squad_kor_v1
language:
- ko
base_model:
- CurtisJeon/klue-roberta-large-korquad_v1_qa
pipeline_tag: question-answering
---
# KLUE RoBERTa Large KorQuAD v1 QA - Fine-tuned
์ด ๋ชจ๋ธ์€ [CurtisJeon/klue-roberta-large-korquad_v1_qa](https://huggingface.co/CurtisJeon/klue-roberta-large-korquad_v1_qa)๋ฅผ ๊ธฐ๋ฐ˜์œผ๋กœ ํ•˜์—ฌ ์ถ”๊ฐ€ ๋ฐ์ดํ„ฐ๋กœ fine-tuningํ•œ ํ•œ๊ตญ์–ด ์งˆ์˜์‘๋‹ต(QA) ๋ชจ๋ธ์ž…๋‹ˆ๋‹ค.
## ๋ชจ๋ธ ์ •๋ณด
- ๊ธฐ๋ณธ ๋ชจ๋ธ: KLUE RoBERTa Large
- ํƒœ์Šคํฌ: ์งˆ์˜์‘๋‹ต (Question Answering)
- ์–ธ์–ด: ํ•œ๊ตญ์–ด
- ํ›ˆ๋ จ ๋ฐ์ดํ„ฐ: KorQuAD v1 + [์ž์ฒด ๋ฐ์ดํ„ฐ]
## ๋ชจ๋ธ ๊ตฌ์กฐ
- RobertaForQuestionAnswering ์•„ํ‚คํ…์ฒ˜ ์‚ฌ์šฉ + CNN ๋ ˆ์ด์–ด(without a dropout)
- 24๊ฐœ์˜ hidden layers
- 1024 hidden size
- 16 attention heads
- ์ด ํŒŒ๋ผ๋ฏธํ„ฐ: ์•ฝ 355M
## ์‚ฌ์šฉ ๋ฐฉ๋ฒ•
์ด ๋ชจ๋ธ์€ Hugging Face Transformers ๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ์‰ฝ๊ฒŒ ๋กœ๋“œํ•˜๊ณ  ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค:
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_name = "HANTAEK/klue-roberta-large-korquad-v1-qa-finetuned"
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
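
Once the model has produced start/end logits for a question–context pair, decoding the answer span amounts to picking the highest-scoring valid `(start, end)` pair. The sketch below shows that decoding step in isolation; the dummy logits and tokens stand in for real `model(**inputs)` outputs, and `extract_answer` is an illustrative helper, not part of this model's API:

```python
import torch

def extract_answer(start_logits, end_logits, tokens, max_answer_len=30):
    # Score every (start, end) pair as start_logit + end_logit,
    # keeping only pairs with start <= end and a bounded answer length.
    scores = start_logits[:, None] + end_logits[None, :]
    valid = torch.triu(torch.ones_like(scores, dtype=torch.bool))
    valid &= ~torch.triu(torch.ones_like(scores, dtype=torch.bool),
                         diagonal=max_answer_len)
    scores = scores.masked_fill(~valid, float("-inf"))
    start, end = divmod(int(scores.argmax()), scores.size(1))
    return " ".join(tokens[start:end + 1])

# Dummy logits standing in for real model outputs
tokens = ["서울은", "대한민국의", "수도", "입니다"]
start_logits = torch.tensor([0.1, 0.2, 3.0, 0.1])
end_logits = torch.tensor([0.1, 0.2, 3.5, 0.3])
print(extract_answer(start_logits, end_logits, tokens))  # → 수도
```

For end-to-end inference without writing this decoding by hand, the Transformers `pipeline("question-answering", ...)` helper performs the same span extraction internally.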