---
license: cc-by-nc-4.0
language:
- en
---

# UniNER-7B-definition

**Description**: A UniNER-7B model trained from LLaMA-7B using the [Pile-NER-definition data](https://huggingface.co/datasets/Universal-NER/Pile-NER-definition), without any human-labeled data. The data was collected by prompting gpt-3.5-turbo-0301 to label entities in passages and define each entity type in a short sentence. The data collection prompt is as follows:

<div style="background-color: #f6f8fa; padding: 20px; border-radius: 10px; border: 1px solid #e1e4e8; box-shadow: 0 2px 5px rgba(0,0,0,0.1);">
<strong>Instruction:</strong><br/>
Given a paragraph, your task is to extract all entities and concepts,
and define their type using a short sentence. The output should be in the following format:
[("entity", "definition of entity type in a short sentence"), ... ]
</div>
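
For illustration, here is a minimal sketch of how passages could be annotated with the prompt above using the OpenAI Python client. The official data-collection scripts live in the [repo](https://github.com/universal-ner/universal-ner); the message layout, model snapshot, parsing, and helper name below are assumptions.

```python
# Illustrative sketch only: the official data-collection scripts are in the
# universal-ner repo; message layout, model snapshot, and parsing are assumptions.
import ast
from openai import OpenAI

INSTRUCTION = (
    "Given a paragraph, your task is to extract all entities and concepts, "
    "and define their type using a short sentence. The output should be in the "
    'following format: [("entity", "definition of entity type in a short sentence"), ... ]'
)

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def annotate_passage(passage: str, model: str = "gpt-3.5-turbo") -> list:
    """Ask the chat model to label a passage; return (entity, definition) pairs."""
    response = client.chat.completions.create(
        model=model,  # the paper used gpt-3.5-turbo-0301, which has since been retired
        messages=[
            {"role": "system", "content": INSTRUCTION},
            {"role": "user", "content": passage},
        ],
        temperature=0,
    )
    text = response.choices[0].message.content
    # The prompt requests a Python-style list of tuples; parse it defensively.
    try:
        return ast.literal_eval(text)
    except (ValueError, SyntaxError):
        return []
```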

See our [paper](https://arxiv.org/abs/2308.03279) for more details, and our [repo](https://github.com/universal-ner/universal-ner) for instructions on using the model.

## Comparison with [UniNER-7B-type](https://huggingface.co/Universal-NER/UniNER-7B-type)
The UniNER-7B-type model, trained on Pile-NER-type, excels in recognizing common and short NER tags (e.g., person, location) and performs better on NER datasets. On the other hand, UniNER-7B-definition demonstrates superior capabilities in understanding short-sentence definitions of entity types. Additionally, it exhibits enhanced robustness against variations in type paraphrasing.

## Inference
The template for inference instances is as follows:
<div style="background-color: #f6f8fa; padding: 20px; border-radius: 10px; border: 1px solid #e1e4e8; box-shadow: 0 2px 5px rgba(0,0,0,0.1);">
<strong>Prompting template:</strong><br/>
A virtual assistant answers questions from a user based on the provided text.<br/>
USER: Text: <span style="color: #d73a49;">{Fill the input text here}</span><br/>
ASSISTANT: I’ve read this text.<br/>
USER: What describes <span style="color: #d73a49;">{Fill the entity type here}</span> in the text?<br/>
ASSISTANT: <span style="color: #0366d6;">(model's predictions in JSON format)</span><br/>
</div>

**Note**: Inference is performed on one entity type at a time. To extract multiple entity types, create a separate inference instance for each type, as in the sketch below.
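
Below is a minimal inference sketch using the Hugging Face `transformers` library. It assumes the conversational template can be rendered as plain text with newline-separated turns; the exact prompt-construction code ships with the [repo](https://github.com/universal-ner/universal-ner), so treat the formatting, generation settings, and helper names here as approximations.

```python
# Minimal sketch: the prompt formatting approximates the template above; the
# official prompt-building code in the universal-ner repo is authoritative.
import json

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Universal-NER/UniNER-7B-definition"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.float16, device_map="auto"
)


def build_prompt(text: str, entity_type: str) -> str:
    """Render the conversation template for a single (text, entity type) query."""
    return (
        "A virtual assistant answers questions from a user based on the provided text.\n"
        f"USER: Text: {text}\n"
        "ASSISTANT: I've read this text.\n"
        f"USER: What describes {entity_type} in the text?\n"
        "ASSISTANT:"
    )


def extract_entities(text: str, entity_type: str) -> list:
    """Query the model for one entity type and parse its JSON-formatted answer."""
    prompt = build_prompt(text, entity_type)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=False)
    completion = tokenizer.decode(
        output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    try:
        return json.loads(completion.strip())
    except json.JSONDecodeError:
        return []


# One entity type per query: loop over types to extract several of them.
text = "Alice flew from Paris to Seattle to join Acme Corp."
for entity_type in ["person", "location", "organization"]:
    print(entity_type, extract_entities(text, entity_type))
```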

## License

This model and its associated data are released under the [CC BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0/) license and are intended primarily for research purposes.

## Citation

```bibtex
@article{zhou2023universalner,
      title={UniversalNER: Targeted Distillation from Large Language Models for Open Named Entity Recognition}, 
      author={Wenxuan Zhou and Sheng Zhang and Yu Gu and Muhao Chen and Hoifung Poon},
      year={2023},
      eprint={2308.03279},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```