# capreolus/bert-base-msmarco

## Model description
BERT-Base model (`google/bert_uncased_L-12_H-768_A-12`) fine-tuned on the MS MARCO passage classification task. It is intended to be used as a `ForSequenceClassification` model; see the Capreolus BERT-MaxP implementation for a usage example.
This corresponds to the BERT-Base model used to initialize BERT-MaxP and PARADE variants in PARADE: Passage Representation Aggregation for Document Reranking by Li et al. It was converted from the released TFv1 checkpoint. Please cite the PARADE paper if you use these weights.
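Below is a minimal loading sketch, assuming the checkpoint works with the standard `transformers` `AutoModelForSequenceClassification` API as stated above. The query, passage, and the choice of which logit to treat as the relevance score are illustrative assumptions, not part of the released model card; refer to the Capreolus BERT-MaxP implementation for the authoritative usage.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the fine-tuned checkpoint from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("capreolus/bert-base-msmarco")
model = AutoModelForSequenceClassification.from_pretrained("capreolus/bert-base-msmarco")
model.eval()

# Score a query-passage pair (example strings are hypothetical)
query = "what is the capital of france"
passage = "Paris is the capital and most populous city of France."
inputs = tokenizer(query, passage, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels)

# Here we assume the last label corresponds to "relevant" and use its logit as the score
score = logits[0, -1].item()
print(score)
```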