BioQwen: A Small-Parameter, High-Performance Bilingual Model for Biomedical Multi-Tasks
This repository hosts the quantized weights of the BioQwen 1.8B version, produced with the MLC-LLM project. The BioQwen.apk available via this link automatically downloads these files from this repository, so there is generally no need to download or use them separately.
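For local experimentation outside the APK, MLC-format weights can typically be loaded with the MLC-LLM Python API. The sketch below follows MLC-LLM's quick-start pattern; the repository ID and the example prompt are placeholders and assumptions, not part of this repo's documentation.

```python
from mlc_llm import MLCEngine

# Hypothetical repo ID; replace with the actual Hugging Face repo path of
# these quantized BioQwen 1.8B weights.
model = "HF://your-username/BioQwen-1.8B-q4f16_1-MLC"

# Create the engine; on first use MLC-LLM downloads and prepares the weights.
engine = MLCEngine(model)

# OpenAI-style streaming chat completion (example prompt only).
for response in engine.chat.completions.create(
    messages=[{"role": "user", "content": "What are the common symptoms of anemia?"}],
    model=model,
    stream=True,
):
    for choice in response.choices:
        print(choice.delta.content or "", end="", flush=True)
print()

engine.terminate()
```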