Collections
Collections including paper arxiv:2311.08401
- Unleashing the Power of Pre-trained Language Models for Offline Reinforcement Learning
  Paper • 2310.20587 • Published • 17
- SELF: Language-Driven Self-Evolution for Large Language Model
  Paper • 2310.00533 • Published • 2
- QLoRA: Efficient Finetuning of Quantized LLMs
  Paper • 2305.14314 • Published • 48
- QA-LoRA: Quantization-Aware Low-Rank Adaptation of Large Language Models
  Paper • 2309.14717 • Published • 44

- ChatAnything: Facetime Chat with LLM-Enhanced Personas
  Paper • 2311.06772 • Published • 35
- Fine-tuning Language Models for Factuality
  Paper • 2311.08401 • Published • 29
- A Survey on Language Models for Code
  Paper • 2311.07989 • Published • 22
- Instruction-Following Evaluation for Large Language Models
  Paper • 2311.07911 • Published • 20

- Fine-tuning Language Models for Factuality
  Paper • 2311.08401 • Published • 29
- Automatically Correcting Large Language Models: Surveying the landscape of diverse self-correction strategies
  Paper • 2308.03188 • Published • 2
- Trusted Source Alignment in Large Language Models
  Paper • 2311.06697 • Published • 11
- Long-form factuality in large language models
  Paper • 2403.18802 • Published • 25

- Reward-Augmented Decoding: Efficient Controlled Text Generation With a Unidirectional Reward Model
  Paper • 2310.09520 • Published • 11
- When can transformers reason with abstract symbols?
  Paper • 2310.09753 • Published • 3
- Improving Large Language Model Fine-tuning for Solving Math Problems
  Paper • 2310.10047 • Published • 6
- LLaVA-Interactive: An All-in-One Demo for Image Chat, Segmentation, Generation and Editing
  Paper • 2311.00571 • Published • 41

- PockEngine: Sparse and Efficient Fine-tuning in a Pocket
  Paper • 2310.17752 • Published • 12
- S-LoRA: Serving Thousands of Concurrent LoRA Adapters
  Paper • 2311.03285 • Published • 30
- Parameter-Efficient Orthogonal Finetuning via Butterfly Factorization
  Paper • 2311.06243 • Published • 18
- Fine-tuning Language Models for Factuality
  Paper • 2311.08401 • Published • 29