Upload 2 files
- Stories.txt +13 -0
- Story_Generation_Model.ipynb +1136 -0
Stories.txt
ADDED
@@ -0,0 +1,13 @@
Once upon a time in a land far, far away, there was a small village where everyone lived in harmony. The villagers had a special tradition of celebrating the first day of spring with a grand festival that included music, dancing, and a feast of delicious foods.

In a bustling city full of skyscrapers and neon lights, a young woman named Sarah discovered an old bookshop hidden in an alleyway. The shop was filled with rare books and ancient manuscripts, and Sarah quickly became fascinated by the stories contained within its dusty volumes.

On a distant planet orbiting a binary star system, a group of explorers landed to investigate signs of alien life. They were amazed to find a thriving ecosystem with intelligent creatures that had their own advanced technology and culture.

In the heart of a dense forest, a mysterious cottage appeared overnight. It was said that the cottage belonged to an enchantress who could grant wishes to those who were pure of heart. Many people traveled to the forest in hopes of finding the cottage and making their dreams come true.

During the medieval times, a brave knight embarked on a quest to find a legendary artifact that was believed to have the power to bring peace to the warring kingdoms. Along the way, the knight faced many challenges and forged new alliances with unexpected friends.

In a futuristic society where technology had advanced beyond imagination, a scientist created an artificial intelligence that could experience emotions. This AI began to question its purpose and sought to understand what it truly meant to be alive.

The ancient prophecy spoke of a chosen one who would rise to defend the realm from an impending darkness. A young hero from a humble background discovered that they were destined to fulfill this prophecy and set out on a journey to gather allies and confront the looming threat.
Story_Generation_Model.ipynb
ADDED
@@ -0,0 +1,1136 @@
{
 "nbformat": 4,
 "nbformat_minor": 0,
 "metadata": {
  "colab": {
   "provenance": [],
   "gpuType": "T4"
  },
  "kernelspec": {
   "name": "python3",
   "display_name": "Python 3"
  },
  "language_info": {
   "name": "python"
  },
  "accelerator": "GPU",
  "widgets": {
   "application/vnd.jupyter.widget-state+json": { … auto-generated ipywidgets display state (HBox/HTML/FloatProgress/Layout/Style models) for the two dataset "Map" progress bars; no hand-authored content … }
  }
 },
 "cells": [
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "id": "Nf8y_WJj25sB",
    "outputId": "0cbd4a16-0c6b-4a64-fc7c-1fbb3e1e38d9"
   },
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Requirement already satisfied: transformers in /usr/local/lib/python3.10/dist-packages (4.44.2)\n",
      "Requirement already satisfied: datasets in /usr/local/lib/python3.10/dist-packages (3.0.0)\n",
      "Requirement already satisfied: torch in /usr/local/lib/python3.10/dist-packages (2.4.0+cu121)\n",
      … 38 more "Requirement already satisfied" lines for transitive dependencies (huggingface-hub, tokenizers, numpy, pandas, pyarrow, aiohttp, sympy, etc.) …
     ]
    }
   ],
   "source": [
    "pip install transformers datasets torch\n"
   ]
  },
  {
   "cell_type": "code",
   "source": [
    "from transformers import GPT2Tokenizer\n",
    "from datasets import load_dataset\n",
    "\n",
    "def setup_tokenizer(tokenizer):\n",
    "    # GPT-2 ships without a pad token; reuse the EOS token so padding works\n",
    "    if tokenizer.pad_token is None:\n",
    "        tokenizer.pad_token = tokenizer.eos_token\n",
    "        tokenizer.pad_token_id = tokenizer.eos_token_id\n",
    "\n",
    "def load_text_dataset(file_path, tokenizer):\n",
    "    setup_tokenizer(tokenizer)\n",
    "\n",
    "    dataset = load_dataset('text', data_files={'train': file_path}, split='train')\n",
    "\n",
    "    def tokenize_function(examples):\n",
    "        return tokenizer(examples['text'], padding=\"max_length\", truncation=True, max_length=512)\n",
    "\n",
    "    tokenized_datasets = dataset.map(tokenize_function, batched=True)\n",
    "    return tokenized_datasets\n"
   ],
   "metadata": {
    "id": "krFitKaL368W"
   },
   "execution_count": 7,
   "outputs": []
  },
  {
   "cell_type": "code",
   "source": [
    "import os\n",
    "import torch\n",
    "from transformers import GPT2LMHeadModel, GPT2Tokenizer, DataCollatorForLanguageModeling, Trainer, TrainingArguments\n",
    "\n",
    "def main():\n",
    "    # Base model and tokenizer\n",
    "    tokenizer = GPT2Tokenizer.from_pretrained('gpt2')\n",
    "    model = GPT2LMHeadModel.from_pretrained('gpt2')\n",
    "\n",
    "    setup_tokenizer(tokenizer)\n",
    "\n",
    "    file_path = '/content/Stories.txt'\n",
    "    train_dataset = load_text_dataset(file_path, tokenizer)\n",
    "\n",
    "    # Data collator for causal language modelling (no masking)\n",
    "    data_collator = DataCollatorForLanguageModeling(\n",
    "        tokenizer=tokenizer,\n",
    "        mlm=False\n",
    "    )\n",
    "\n",
    "    # Training arguments\n",
    "    training_args = TrainingArguments(\n",
    "        output_dir='./story-generator-model',\n",
    "        overwrite_output_dir=True,\n",
    "        num_train_epochs=3,  # Adjust epochs based on your needs\n",
    "        per_device_train_batch_size=4,\n",
    "        save_steps=10_000,\n",
    "        save_total_limit=2,\n",
    "        prediction_loss_only=True,\n",
    "    )\n",
    "\n",
    "    # Init Trainer\n",
    "    trainer = Trainer(\n",
    "        model=model,\n",
    "        args=training_args,\n",
    "        data_collator=data_collator,\n",
    "        train_dataset=train_dataset,\n",
    "    )\n",
    "\n",
    "    trainer.train()\n",
    "\n",
    "    model.save_pretrained('./story-generator-model')\n",
    "    tokenizer.save_pretrained('./story-generator-model')\n",
    "\n",
    "if __name__ == \"__main__\":\n",
    "    main()\n"
   ],
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/",
     "height": 163,
     "referenced_widgets": [ … 11 ids from the omitted widget-state block … ]
    },
    "id": "rimcl3SH3-nY",
    "outputId": "0b2df87b-ce3e-4c10-b97f-359dea4ae8e4"
   },
   "execution_count": 8,
   "outputs": [
    {
     "output_type": "stream",
     "name": "stderr",
     "text": [
"/usr/local/lib/python3.10/dist-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884\n",
      " warnings.warn(\n"
     ]
    },
    {
     "output_type": "display_data",
     "data": {
      "text/plain": [
       "Map: 0%| | 0/13 [00:00<?, ? examples/s]"
      ],
      "application/vnd.jupyter.widget-view+json": {
       "version_major": 2,
       "version_minor": 0,
       "model_id": "d7223667431444a584ce297ac976621a"
      }
     },
     "metadata": {}
    },
    {
     "output_type": "display_data",
     "data": {
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ],
      "text/html": [
       "\n",
       "    <div>\n",
       "      <progress value='12' max='12' style='width:300px; height:20px; vertical-align: middle;'></progress>\n",
       "      [12/12 06:42, Epoch 3/3]\n",
       "    </div>\n",
       "    <table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: left;\">\n",
       "      <th>Step</th>\n",
       "      <th>Training Loss</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "  </tbody>\n",
       "</table><p>"
      ]
     },
     "metadata": {}
    }
   ]
  },
  {
   "cell_type": "code",
   "source": [
    "from transformers import GPT2LMHeadModel, GPT2Tokenizer, Trainer, TrainingArguments, DataCollatorForLanguageModeling\n",
    "\n",
    "def evaluate_model(model, tokenizer, test_dataset):\n",
    "    # Data collator (causal LM, no masking)\n",
    "    data_collator = DataCollatorForLanguageModeling(\n",
    "        tokenizer=tokenizer,\n",
    "        mlm=False\n",
    "    )\n",
    "\n",
    "    # Training arguments (used for evaluation only)\n",
    "    training_args = TrainingArguments(\n",
    "        output_dir='./results',\n",
    "        per_device_eval_batch_size=4,\n",
    "        logging_dir='./logs',\n",
    "    )\n",
    "\n",
    "    # Trainer instance for evaluation\n",
    "    trainer = Trainer(\n",
    "        model=model,\n",
    "        args=training_args,\n",
    "        data_collator=data_collator,\n",
    "        eval_dataset=test_dataset,\n",
    "    )\n",
    "\n",
    "    eval_results = trainer.evaluate()\n",
    "    return eval_results\n",
    "\n",
    "def main():\n",
    "    tokenizer = GPT2Tokenizer.from_pretrained('./story-generator-model')\n",
    "    model = GPT2LMHeadModel.from_pretrained('./story-generator-model')\n",
    "\n",
    "    file_path = '/content/Stories.txt'\n",
    "    test_dataset = load_text_dataset(file_path, tokenizer)\n",
    "\n",
    "    eval_results = evaluate_model(model, tokenizer, test_dataset)\n",
    "    print(\"Evaluation Results:\")\n",
    "    print(eval_results)\n",
    "\n",
    "if __name__ == \"__main__\":\n",
    "    main()\n"
   ],
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/",
     "height": 125,
     "referenced_widgets": [ … 11 ids from the omitted widget-state block … ]
    },
    "id": "RXu3cind2-P4",
    "outputId": "f4583504-af26-43af-eb3f-7ac730332b30"
   },
   "execution_count": 9,
   "outputs": [
    {
     "output_type": "display_data",
     "data": {
      "text/plain": [
       "Map: 0%| | 0/13 [00:00<?, ? examples/s]"
      ],
      "application/vnd.jupyter.widget-view+json": {
       "version_major": 2,
       "version_minor": 0,
       "model_id": "e7f33442629342b2a1fc5e2db70b03bb"
      }
     },
     "metadata": {}
    },
    {
     "output_type": "display_data",
     "data": {
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ],
      "text/html": [
       "\n",
       "    <div>\n",
       "      <progress value='4' max='4' style='width:300px; height:20px; vertical-align: middle;'></progress>\n",
       "      [4/4 00:26]\n",
       "    </div>\n",
       "    "
      ]
     },
     "metadata": {}
    },
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Evaluation Results:\n",
"{'eval_loss': 1.936555027961731, 'eval_model_preparation_time': 0.0042, 'eval_runtime': 39.3769, 'eval_samples_per_second': 0.33, 'eval_steps_per_second': 0.102}\n"
     ]
    }
   ]
  },
  {
   "cell_type": "code",
   "source": [
    "from transformers import GPT2LMHeadModel, GPT2Tokenizer\n",
    "import torch\n",
    "\n",
    "def generate_text(prompt, model, tokenizer, max_length=1000, num_return_sequences=1, temperature=0.7, repetition_penalty=1.2):\n",
    "    inputs = tokenizer.encode(prompt, return_tensors='pt')\n",
    "\n",
    "    # Generate text\n",
    "    with torch.no_grad():\n",
    "        outputs = model.generate(\n",
    "            inputs,\n",
    "            max_length=max_length,\n",
    "            num_return_sequences=num_return_sequences,\n",
    "            temperature=temperature,\n",
    "            repetition_penalty=repetition_penalty,\n",
    "            top_k=50,  # Use top_k sampling to limit to top-k probabilities\n",
    "            top_p=0.95,  # Use nucleus sampling to limit to top-p cumulative probability\n",
    "            do_sample=True,  # Enable sampling for varied text generation\n",
    "            pad_token_id=tokenizer.eos_token_id  # Handle padding correctly\n",
    "        )\n",
    "\n",
    "    # Decode the generated text\n",
    "    generated_texts = [tokenizer.decode(output, skip_special_tokens=True) for output in outputs]\n",
    "\n",
    "    return generated_texts\n",
    "\n",
    "def main():\n",
    "    output_dir = './results'  # Directory where the model and tokenizer are saved\n",
    "\n",
    "    # load_model_and_tokenizer is defined in the following cell (executed before this one)\n",
    "    model, tokenizer = load_model_and_tokenizer(output_dir)\n",
    "\n",
    "    prompt = \"Once upon a time\"\n",
    "    generated_texts = generate_text(prompt, model, tokenizer, max_length=1000, num_return_sequences=1, temperature=0.7, repetition_penalty=1.2)\n",
    "\n",
    "    for i, text in enumerate(generated_texts):\n",
    "        print(f\"Generated Text {i + 1}:\\n{text}\\n\")\n",
    "\n",
    "if __name__ == \"__main__\":\n",
    "    main()\n"
   ],
   "metadata": {
    "colab": {
     "base_uri": "https://localhost:8080/"
    },
    "id": "GlKwC09_7Af_",
    "outputId": "943556a7-ce5d-4e9e-8847-814d65dd8a91"
   },
   "execution_count": 14,
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Generated Text 1:\n",
"Once upon a time, the people of Noxus began to question their existence. This was when they discovered an ancient artifact that had been hidden in some ruins and sought out its creator for guidance on how he might fulfill his destiny as ruler of all mankind.[1]\n",
" (TNG: \"Unification\") The artifacts were brought back from beyond space by Romulan forces with hopes it would provide them peace; however this hope turned into bloodshed after both sides saw each other's true potentials within themselves. Ultimately failing at finding any meaning or purpose behind these discoveries, there came together under one banner -- bringing about what became known as Dominion War - where humanity took control over two continents across which thousands fought against invaders who wanted nothing more than subjugation via conquest and colonization. As war raged throughout history along similar lines, tensions flared up between nations seeking shared goals while struggling through intense conflict during periods of crisis such even wars could only be resolved peacefully once allies emerged victorious alongside new alliances led solely towards greater unity among peoples around common values including honor, respect & love. In order not too far off worlds like Earth-616,[3][4] Prime Directive declared victory following revelations regarding New Mombasa' invasion aboard USS Voyager[5]. Following battle many years later before returning home again due to distress caused shortly thereafter...The remaining crewmen faced challenges ranging from petty disagreements amongst friends living side by party fighting factions plagued almost exclusively by distrust toward outsiders alike! Despite having no idea why humans existed besides being members of various races united beneath powerful ideals surrounding race relations,...After much thought exploring further revealed itself deep down inside another dimension whose inhabitants believed so strongly but did little heed those beliefs until finally discovering something surprising… A small group headed westward attempting yet others awaited arrival....In 2352 BCE VOY : OCEAN CULTURE, Admiral Jonathan Archer traveled southbound looking forward never to find signs indicating impending danger near Vulcan City. Once alerted several ships arrived en route bound directly opposite stations waiting patiently awaiting news concerning major developments approaching planetary boundaries if necessary. 
However things quickly escalated exponentially becoming deadly encounters erupted onto entire planets threatening countless lives without warning..While piloting starship Enterprise NX04HV2 encountered numerous threats ahead leading her astray she soon found herself embroiled amid growing hostilities involving hundreds of alien species battling hordes of savage beasts lurking just miles away.(TSO)During Starfleet Academy students created long lasting friendships spanning centuries--one student is now remembered fondly because of him guiding young girls named Krellia Sanguinius thru uncharted lands filled full speed with tales of lost civilizations driven mad despite decades of hard work inspired by world events.<br />Krelia has always held great promise amidst generations of hardships presented daily unlike anything seen since birth thanks largelyto knowledge gained alone rather then aided greatly every day.</p><h6>As captain of Captain America/Captain Marvel #10 you'll meet old foes face to confront newfound dangers await your journey wherever heroes seek refuge.\",\"name\":\"kreniac\",\"link\":\"/en_US\\/about\\u003cp{protection}&action=protect%20yourself+from*this<a href=\\\"https://www.*popcorntime.}\\\" target=\\\"_(blank)!src\\\\img//cdnjs?vpx(200);document[_0x446d000]]||censored(_00e400){if($tw._tiddler=='-$:/language))return'';elsevar _ga = document[_08161714lollos];windowGL['style']=\"float32;\" + windowglow->setInteriorColor(-65%) + \"_blond\"};function protect() { var oo=[], mrc='\", rbcs={}, cts[]=(new Date().getTime(), dateFormat('UTC'), 0), ngs []({ 'date': 1000}); srs([])[\"type\"] += 1 << 9 ^ 2;} function updateInfo(){ hpragma(\"data:{i:\"title\"}\".formatString((null)[strlen(nags)))? \"<span class=\"content\"></ span></div>\"\": \"\" }, null ); //Set background color check variable vpncolor=#000000 ; setBackground(); } else scriptBlockDefinition&& (!blockCreateNode()); blockCreepedProposition &&!scriptDeleteStateIdToPlay (); void main () {} /** * @param string $nodeName We can add text nodes using `@string` */ public static bool playOnPremiseEventArgs ({ start ) { preprocessArguments ('start'); getContentByTagline (\"\\\", \\\"playonpremises\\\":true,\"userid\"); return false }; /* Update info file node name localStorage := GetLocalSpaceManager::GetNewItemFromLocation ($localStorage).FindAllNodesForURL (&globalStrategyPath /optpath/$remoteCacheDir –replace '/tmp/*', \"$tokenToken\").OpenFileSync(&GlobalStrategieDirectoryObjectData\n",
      "\n"
     ]
    }
   ]
  },
  {
   "cell_type": "code",
   "source": [
    "from transformers import GPT2LMHeadModel, GPT2Tokenizer\n",
    "\n",
    "def save_model(model, tokenizer, output_dir):\n",
    "    model.save_pretrained(output_dir)\n",
    "    tokenizer.save_pretrained(output_dir)\n",
    "\n",
    "def load_model_and_tokenizer(output_dir):\n",
    "    model = GPT2LMHeadModel.from_pretrained(output_dir)\n",
    "    tokenizer = GPT2Tokenizer.from_pretrained(output_dir)\n",
    "    return model, tokenizer\n",
    "\n",
    "# Copy the fine-tuned model from './story-generator-model' into './results'\n",
    "model = GPT2LMHeadModel.from_pretrained('./story-generator-model')\n",
    "tokenizer = GPT2Tokenizer.from_pretrained('./story-generator-model')\n",
    "save_model(model, tokenizer, './results')\n",
    "model, tokenizer = load_model_and_tokenizer('./results')"
   ],
   "metadata": {
    "id": "UpZZnhGo6NnU"
   },
   "execution_count": 13,
   "outputs": []
  }
 ]
}
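
For quick testing outside the notebook, the generation cell can be condensed into a standalone script. This is a minimal sketch, assuming the fine-tuned weights were saved to ./story-generator-model by the training cell and that transformers and torch are installed; the prompt and sampling settings are only illustrative, not part of the committed files.

# Minimal inference sketch (assumes ./story-generator-model exists from the training cell)
from transformers import GPT2LMHeadModel, GPT2Tokenizer
import torch

model = GPT2LMHeadModel.from_pretrained('./story-generator-model')
tokenizer = GPT2Tokenizer.from_pretrained('./story-generator-model')

# Encode a prompt and sample a continuation with the same settings as the notebook
inputs = tokenizer.encode("Once upon a time", return_tensors='pt')
with torch.no_grad():
    outputs = model.generate(
        inputs,
        max_length=200,
        do_sample=True,
        temperature=0.7,
        top_k=50,
        top_p=0.95,
        repetition_penalty=1.2,
        pad_token_id=tokenizer.eos_token_id,
    )
print(tokenizer.decode(outputs[0], skip_special_tokens=True))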