yichaodu/DiffusionDPO-alignment-internvl-1.5
Tags: Text-to-Image, Diffusers, Safetensors, stable-diffusion, stable-diffusion-diffusers, DPO, DiffusionDPO
arXiv: 2407.04842
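The Diffusers and stable-diffusion tags above suggest this checkpoint loads with the standard `diffusers` pipeline API. A minimal sketch, assuming the usual Stable Diffusion layout; the repo id is taken from this page, while the prompt, dtype, and device choice are illustrative assumptions:

```python
# Hedged sketch: load the DPO-aligned checkpoint via diffusers.
# Only MODEL_ID comes from this page; the rest is an assumption.

MODEL_ID = "yichaodu/DiffusionDPO-alignment-internvl-1.5"

def load_pipeline(device: str = "cuda"):
    """Load the checkpoint (downloads weights on first call)."""
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        MODEL_ID, torch_dtype=torch.float16
    )
    return pipe.to(device)

if __name__ == "__main__":
    pipe = load_pipeline()
    image = pipe("a photo of an astronaut riding a horse").images[0]
    image.save("sample.png")
```

The heavy imports sit inside `load_pipeline` so the module can be imported without `torch` or `diffusers` installed.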
Revision e2f07e8, 1 contributor, 2 commits
Latest commit by yichaodu: "Upload README.md with huggingface_hub" (e2f07e8, verified, 7 months ago)
.gitattributes (1.52 kB): initial commit, 7 months ago
README.md (980 Bytes): Upload README.md with huggingface_hub, 7 months ago