Dovakiins / qwerrwe (branch: main)
tests/e2e/test_dpo.py · Commit History
add support for rpo_alpha (#1681) · c996881 (unverified) · winglian · committed on Jun 4, 2024
re-enable DPO for tests in modal ci (#1374) · 1f151c0 (unverified) · winglian · committed on Jun 3, 2024
Add KTO support (#1640) · 22ae21a (unverified) · benredmond, winglian · committed on May 20, 2024
Add ORPO example and e2e test (#1572) · 98c25e1 (unverified) · tokestermw · committed on Apr 27, 2024
run tests again on Modal (#1289) [skip ci] · 0001862 (unverified) · winglian · committed on Feb 29, 2024
DPO cleanup (#1126) · 7523d1f (unverified) · winglian, plaguss (HF staff) · committed on Jan 23, 2024