---
language:
- en
pipeline_tag: text2text-generation
library_name: transformers
tags:
- style-transfer
- formality-transfer
---
# Text Style Transfer using CycleGANs

This repository contains the models from the paper "Self-supervised Text Style Transfer using Cycle-Consistent Adversarial Networks" (ACM TIST 2024).\
The work introduces a novel approach to Text Style Transfer using CycleGANs with sequence-level supervision and Transformer architectures.

## Available Models

### Formality transfer
#### GYAFC dataset (Family & Relationships)

| model | checkpoint |
|:----------:|:------------------------------------------------------:|
| BART base | [informal-to-formal](https://huggingface.co/ggallipoli/bart-base_inf2for_family), [formal-to-informal](https://huggingface.co/ggallipoli/bart-base_for2inf_family) |
| BART large | [informal-to-formal](https://huggingface.co/ggallipoli/bart-large_inf2for_family), [formal-to-informal](https://huggingface.co/ggallipoli/bart-large_for2inf_family) |
| T5 small | [informal-to-formal](https://huggingface.co/ggallipoli/t5-small_inf2for_family), [formal-to-informal](https://huggingface.co/ggallipoli/t5-small_for2inf_family) |
| T5 base | [informal-to-formal](https://huggingface.co/ggallipoli/t5-base_inf2for_family), [formal-to-informal](https://huggingface.co/ggallipoli/t5-base_for2inf_family) |
| T5 large | [informal-to-formal](https://huggingface.co/ggallipoli/t5-large_inf2for_family), [formal-to-informal](https://huggingface.co/ggallipoli/t5-large_for2inf_family) |
| BERT base | [style classifier](https://huggingface.co/ggallipoli/formality_classifier_gyafc_family) |

#### GYAFC dataset (Entertainment & Music)

| model | checkpoint |
|:----------:|:------------------------------------------------------:|
| BART base | [informal-to-formal](https://huggingface.co/ggallipoli/bart-base_inf2for_music), [formal-to-informal](https://huggingface.co/ggallipoli/bart-base_for2inf_music) |
| BART large | [informal-to-formal](https://huggingface.co/ggallipoli/bart-large_inf2for_music), [formal-to-informal](https://huggingface.co/ggallipoli/bart-large_for2inf_music) |
| T5 small | [informal-to-formal](https://huggingface.co/ggallipoli/t5-small_inf2for_music), [formal-to-informal](https://huggingface.co/ggallipoli/t5-small_for2inf_music) |
| T5 base | [informal-to-formal](https://huggingface.co/ggallipoli/t5-base_inf2for_music), [formal-to-informal](https://huggingface.co/ggallipoli/t5-base_for2inf_music) |
| T5 large | [informal-to-formal](https://huggingface.co/ggallipoli/t5-large_inf2for_music), [formal-to-informal](https://huggingface.co/ggallipoli/t5-large_for2inf_music) |
| BERT base | [style classifier](https://huggingface.co/ggallipoli/formality_classifier_gyafc_music) |

### Sentiment transfer
#### Yelp dataset

| model | checkpoint |
|:----------:|:------------------------------------------------------:|
| BART base | [negative-to-positive](https://huggingface.co/ggallipoli/bart-base_neg2pos), [positive-to-negative](https://huggingface.co/ggallipoli/bart-base_pos2neg) |
| BART large | [negative-to-positive](https://huggingface.co/ggallipoli/bart-large_neg2pos), [positive-to-negative](https://huggingface.co/ggallipoli/bart-large_pos2neg) |
| T5 small | [negative-to-positive](https://huggingface.co/ggallipoli/t5-small_neg2pos), [positive-to-negative](https://huggingface.co/ggallipoli/t5-small_pos2neg) |
| T5 base | [negative-to-positive](https://huggingface.co/ggallipoli/t5-base_neg2pos), [positive-to-negative](https://huggingface.co/ggallipoli/t5-base_pos2neg) |
| T5 large | [negative-to-positive](https://huggingface.co/ggallipoli/t5-large_neg2pos), [positive-to-negative](https://huggingface.co/ggallipoli/t5-large_pos2neg) |
| BERT base | [style classifier](https://huggingface.co/ggallipoli/sentiment_classifier_yelp) |

## Model Description

The models implement a CycleGAN architecture for Text Style Transfer that:
- Applies self-supervision directly at the sequence level
- Maintains content while transferring style attributes
- Employs pre-trained style classifiers to guide generation
- Uses Transformer-based generators and discriminators

The models achieve state-of-the-art results on both formality and sentiment transfer tasks.
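
As a rough illustration of the sequence-level cycle-consistency idea described above, the following is a simplified sketch (not the authors' training code; see the repository linked below for the actual implementation). It assumes two generic BART generators and shows a single informal-to-formal cycle step, where the backward generator is trained to reconstruct the original sentence from the transferred one:

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
g_ab = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-base")  # informal -> formal generator
g_ba = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-base")  # formal -> informal generator

informal = ["gotta go, see ya later"]
inputs = tokenizer(informal, return_tensors="pt")

# A -> B: produce a (pseudo-)formal version at the sequence level, without gradients.
with torch.no_grad():
    transferred_ids = g_ab.generate(**inputs, max_new_tokens=64)
transferred = tokenizer.batch_decode(transferred_ids, skip_special_tokens=True)

# B -> A cycle reconstruction: train the backward generator to recover the original text.
cycle_inputs = tokenizer(transferred, return_tensors="pt")
labels = tokenizer(informal, return_tensors="pt").input_ids
cycle_loss = g_ba(**cycle_inputs, labels=labels).loss  # cross-entropy against the original sentence
cycle_loss.backward()  # in the full method, classifier-guided and adversarial losses are combined with this
```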

## Usage

Both generators and style classifiers can be used with the Hugging Face 🤗 transformers library.

Each generator model can be loaded as:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model = AutoModelForSeq2SeqLM.from_pretrained("[GENERATOR_MODEL]")
tokenizer = AutoTokenizer.from_pretrained("[GENERATOR_MODEL]")
```
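
For example, an informal sentence can be rewritten in a formal style with the BART base checkpoint from the tables above (a minimal sketch; the input sentence and decoding settings are illustrative):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "ggallipoli/bart-base_inf2for_family"  # informal -> formal, GYAFC (Family & Relationships)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

inputs = tokenizer("gotta say, that was a pretty cool movie", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=5)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```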

The style classifiers can be loaded as:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

classifier = AutoModelForSequenceClassification.from_pretrained("[CLASSIFIER_MODEL]")
tokenizer = AutoTokenizer.from_pretrained("[CLASSIFIER_MODEL]")
```
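
A loaded classifier can then score the style of a sentence, for instance with the GYAFC (Family & Relationships) formality classifier from the tables above (a minimal sketch; check the checkpoint's `id2label` mapping to interpret the classes):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

classifier_name = "ggallipoli/formality_classifier_gyafc_family"
classifier = AutoModelForSequenceClassification.from_pretrained(classifier_name)
tokenizer = AutoTokenizer.from_pretrained(classifier_name)

inputs = tokenizer("I would be delighted to attend the meeting.", return_tensors="pt")
with torch.no_grad():
    logits = classifier(**inputs).logits
probs = torch.softmax(logits, dim=-1)
print(probs)  # per-class probabilities; the label-to-style mapping is given by the checkpoint's config
```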

## Citation

For more details, you can refer to the [paper](https://dl.acm.org/doi/10.1145/3678179).

```bibtex
@article{10.1145/3678179,
author = {La Quatra, Moreno and Gallipoli, Giuseppe and Cagliero, Luca},
title = {Self-supervised Text Style Transfer Using Cycle-Consistent Adversarial Networks},
year = {2024},
issue_date = {October 2024},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
volume = {15},
number = {5},
issn = {2157-6904},
url = {https://doi.org/10.1145/3678179},
doi = {10.1145/3678179},
journal = {ACM Trans. Intell. Syst. Technol.},
month = nov,
articleno = {110},
numpages = {38},
keywords = {Text Style Transfer, Sentiment transfer, Formality transfer, Cycle-consistent Generative Adversarial Networks, Transformers}
}
```

## Code

The full implementation is available at: https://github.com/gallipoligiuseppe/TST-CycleGAN.

## License

This work is licensed under the <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>.