distilbert-base-uncased fine-tuned on the MS MARCO Document Reranking task.

## Usage
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained('brutusxu/distilbert-base-cross-encoder-first-p')
model = AutoModelForSequenceClassification.from_pretrained('brutusxu/distilbert-base-cross-encoder-first-p')

query = 'I love New York'
document = 'I like New York'

# build the input in the format the model was trained on
text = '<P>' + query + tokenizer.sep_token + '<Q>' + document
tokenized_input = tokenizer(text, return_tensors='pt')

# the relevance score is the model's logit output
ranking_score = model(**tokenized_input).logits
```
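In document reranking, the score above is computed for each candidate document and the candidates are sorted by it. A minimal sketch of that step, where the hypothetical `score_fn` stands in for a call to the model (any callable mapping a query-document pair to a float works):

```python
def rerank(query, documents, score_fn):
    """Return documents sorted by descending relevance to the query.

    score_fn(query, document) -> float is assumed to wrap the
    cross-encoder forward pass shown above.
    """
    scored = [(score_fn(query, doc), doc) for doc in documents]
    # higher score = more relevant, so sort in descending order
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored]
```

Applied to the top-100 BM25 candidates for a query, this produces the reranked list that the metrics below are computed over.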
## Performance

Reranking the top-100 documents retrieved by BM25 on MS MARCO Document Reranking:

| Metric   | Score |
|----------|-------|
| MRR@10   | 0.373 |
| MRR@100  | 0.381 |
| nDCG@10  | 0.442 |
| nDCG@100 | 0.475 |