EfficientCodeBERT: A CodeBERT-Based Student Model for Vulnerability Detection

EfficientCodeBERT is a fine-tuned and distilled student version of CodeBERT designed for detecting vulnerabilities in source code. The reduced architecture trades model size for efficiency while retaining competitive accuracy. At roughly 35 million parameters, this lightweight model performs binary classification of code as vulnerable or not vulnerable.
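A minimal inference sketch is shown below. The Hub repository id and the label mapping (1 = vulnerable) are assumptions for illustration and are not specified by this card; substitute the actual checkpoint path before running.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical repository id; replace with the actual checkpoint path.
model_id = "your-org/EfficientCodeBERT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Example C snippet with a classic buffer-overflow pattern.
code_snippet = "char buf[8]; strcpy(buf, user_input);"
inputs = tokenizer(code_snippet, truncation=True, max_length=128, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1)
label = probs.argmax(dim=-1).item()  # assumed mapping: 1 = vulnerable, 0 = not vulnerable
print(label, probs.tolist())
```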

Model Details:

  • Base Model: microsoft/codebert-base
  • Architecture: 384 hidden size, 8 layers, 6 attention heads
  • Max Sequence Length: 128
  • Dataset: DiverseVul
  • Task: Vulnerability detection (binary classification)
  • Checkpoint: safetensors, 37M parameters, F32 tensors
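For reference, the architecture listed above maps onto a RobertaConfig roughly as sketched below. The vocabulary size, intermediate size, and position-embedding count are assumptions based on microsoft/codebert-base conventions rather than values stated in this card.

```python
from transformers import RobertaConfig, RobertaForSequenceClassification

# Sketch of the student configuration implied by the model details above.
config = RobertaConfig(
    vocab_size=50265,              # assumed: inherited from microsoft/codebert-base
    hidden_size=384,
    num_hidden_layers=8,
    num_attention_heads=6,         # 384 / 6 = 64 dims per head
    intermediate_size=1536,        # assumed: 4 x hidden_size
    max_position_embeddings=130,   # assumed: 128 tokens + RoBERTa's 2-position offset
    num_labels=2,                  # binary: vulnerable vs. not vulnerable
)

# Randomly initialized student; the published checkpoint would be loaded instead.
student = RobertaForSequenceClassification(config)
print(sum(p.numel() for p in student.parameters()))  # rough parameter count
```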