---
license: other
language:
- en
pretty_name: toxic-comments
size_categories:
- 10K<n<100K
task_categories:
- text-classification
tags:
- toxic
- hate
---
# Toxic-comments (Teeny-Tiny Castle)

This dataset is part of a tutorial tied to the [Teeny-Tiny Castle](https://github.com/Nkluge-correa/TeenyTinyCastle), an open-source repository containing educational tools for AI Ethics and Safety research.
## How to Use | |
```python | |
from datasets import load_dataset | |
dataset = load_dataset("AiresPucrs/toxic_content", split="train")
```
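
After loading, a quick way to see what the split contains is to print its features and a sample row; the exact column names depend on the dataset's CSV, so inspecting the schema first is a reasonable sketch of how to get started:

```python
# Inspect the column names and types declared for the split
print(dataset.features)

# Print the first example to see what a row looks like
print(dataset[0])
```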