Model Card for fasoo/llama3.1-financial-dedup

ํ•ด๋‹น ๋ชจ๋ธ์€ ๊ธˆ์œต ๋„๋ฉ”์ธ ๋ฐ์ดํ„ฐ 83๋งŒ๊ฐœ์˜ ์ƒ˜ํ”Œ์— ๋Œ€ํ•˜์—ฌ ์ถ”๊ฐ€ํ•™์Šตํ•œ ๋ชจ๋ธ์ž…๋‹ˆ๋‹ค.

Training started from the Llama 3.1 8B base model; no additional instruction tuning was performed.

๋˜ํ•œ, ์ด ๋ชจ๋ธ์€ ๊ณผํ•™๊ธฐ์ˆ ์ •๋ณดํ†ต์‹ ๋ถ€ยท๊ด‘์ฃผ๊ด‘์—ญ์‹œ๊ฐ€ ๊ณต๋™ ์ง€์›ํ•œ โ€˜์ธ๊ณต์ง€๋Šฅ ์ค‘์‹ฌ ์‚ฐ์—…์œตํ•ฉ ์ง‘์ ๋‹จ์ง€ ์กฐ์„ฑ์‚ฌ์—…โ€™์œผ๋กœ ์ง€์›์„ ๋ฐ›์•„ ๊ฐœ๋ฐœ๋˜์—ˆ์Šต๋‹ˆ๋‹ค.

Model Details

Model Description

This is the model card of a ๐Ÿค— Transformers model that has been pushed to the Hub. This model card was generated automatically.

  • Developed by: fasoo
  • Model type: Causal Language Model
  • Language(s) (NLP): English, Korean
  • License: [More Information Needed]
  • Finetuned from model: Llama 3.1 8B

Model Sources

  • Paper: [More Information Needed]

Uses

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

# Download the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("fasoo/llama3.1-financial-dedup")
model = AutoModelForCausalLM.from_pretrained("fasoo/llama3.1-financial-dedup")
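
Because no instruction tuning was applied, the model is best queried with completion-style prompts rather than chat templates. The snippet below is a minimal generation sketch; the prompt text and decoding parameters are illustrative assumptions, not settings recommended by the model authors.

# Completion-style generation example (prompt and decoding settings are illustrative)
prompt = "The central bank raised its policy rate because"  # hypothetical financial prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))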

Environmental Impact

  • Hardware Type: NVIDIA H100 80GB HBM3
  • Compute Region: Gwangju, South Korea