---
license: other
language:
- en
pretty_name: toxic-comments
size_categories:
- 10K<n<100K
task_categories:
- text-classification
tags:
- toxic
- hate
---
# Toxic-comments (Teeny-Tiny Castle)
This dataset is part of a tutorial tied to the [Teeny-Tiny Castle](https://github.com/Nkluge-correa/TeenyTinyCastle), an open-source repository containing educational tools for AI Ethics and Safety research.
## How to Use
```python
from datasets import load_dataset
# Load the training split of the dataset from the Hugging Face Hub
dataset = load_dataset("AiresPucrs/toxic_content", split="train")
```
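Once loaded, you can quickly check the split's size, columns, and a sample record. This is a minimal sketch; the exact column names are not fixed by this card, so inspect `dataset.column_names` to see what the CSV actually provides.

```python
# Show the dataset's features and number of rows
print(dataset)

# List the available columns (e.g., the text field and its toxicity label)
print(dataset.column_names)

# Look at the first record
print(dataset[0])
```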