Falcon 2: An 11B parameter pretrained language model and VLM, trained on over 5000B tokens and 11 languages. May 24, 2024
Welcome FalconMamba: The first strong attention-free 7B model. Aug 12, 2024