{
"cells": [
{
"cell_type": "markdown",
"id": "ec5c71a8",
"metadata": {},
"source": [
"# Tutorial: Utilizing Open Source Language Models with AutoGen\n",
"\n",
"AutoGen is compatible with the OpenAI API library in Python for executing language models. Consequently, it can work with any models employing a similar API without the necessity to modify your AutoGen code.\n",
"\n",
"\n",
"In this guide, we will utilize the [modelz-llm](https://github.com/tensorchord/modelz-llm) package to illustrate how to locally serve a model and integrate AutoGen with the served model.\n"
]
},
{
"cell_type": "markdown",
"id": "a18e29d5",
"metadata": {},
"source": [
"## Instructions\n",
"\n",
"1. First, install the modelz-llm package using the command: ```pip install modelz-llm```\n",
"2. Next, initiate the serving of the model; for instance, executing ```modelz-llm -m bigscience/bloomz-560m --device auto``` will locally serve the bloomz-560m small model.\n",
"3. Lastly, you will need to modify the `OAI_CONFIG_LIST` file or the `config_list` variable to point to the local model. In this scenario, we are using the default port 8000.\n",
"\n",
"\n",
"Here, we utilize the Bloomz-560M model, which can run on a CPU for demonstration purposes. It's anticipated that the quality of responses will be low."
]
},
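{
"cell_type": "markdown",
"id": "b2f0c4d1",
"metadata": {},
"source": [
"As a sanity check, you can also query the served endpoint with the `openai` package directly, without AutoGen. The sketch below assumes the server from step 2 is running on the default port 8000 and that a 0.x-style `openai` client is installed; the model name is only illustrative, since the local server decides which model it serves."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c7d8e9f0",
"metadata": {},
"outputs": [],
"source": [
"# Optional sanity check: call the locally served model through the plain OpenAI client\n",
"import openai\n",
"\n",
"openai.api_key = 'any string here is fine'  # the local server does not check the key\n",
"openai.api_base = 'http://localhost:8000'\n",
"\n",
"response = openai.Completion.create(model='bloomz-560m', prompt='Who are you?', temperature=0)\n",
"print(response['choices'][0]['text'])"
]
},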
{
"cell_type": "markdown",
"id": "7b6362a8",
"metadata": {},
"source": [
"## The AutoGen Code\n",
"\n",
"Below, we demonstrate how to configure the `config_list` variable, and subsequently invoke the Completion method.\n"
]
},
{
"cell_type": "code",
"execution_count": 1,
"id": "e3048aa9",
"metadata": {},
"outputs": [],
"source": [
"# Setup autogen with the correct API\n",
"import autogen\n",
"\n",
"autogen.oai.ChatCompletion.start_logging()\n",
"\n",
"config_list = [\n",
" {\n",
" 'model': 'bloomz-560m-my-small-model',\n",
" 'api_key': 'any string here is fine',\n",
" 'api_type': 'openai',\n",
" 'api_base': \"http://localhost:8000\",\n",
" 'api_version': '2023-03-15-preview'\n",
" }\n",
"]\n"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "ea2bf8d7",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"The small model's answer is: a human being\n"
]
}
],
"source": [
"# Perform Completion\n",
"question = \"Who are you?\"\n",
"response = autogen.oai.Completion.create(config_list=config_list, prompt=question, temperature=0)\n",
"ans = autogen.oai.Completion.extract_text(response)[0]\n",
"\n",
"print(\"The small model's answer is:\", ans)"
]
},
{
"cell_type": "markdown",
"id": "09a498e9",
"metadata": {},
"source": [
"## A More Sophisticated Example Featuring Multiple Agents\n",
"\n",
"It's noteworthy that various LLMs can be employed for different agents. In this example, we showcase two agents; one utilizes the bloomz model while the other employs GPT models (as derived from the `OAI_CONFIG_LIST`. It's presumed that you have an environment variable or a local file named \"OAI_CONFIG_LIST\" set up)."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "a82d5cb9",
"metadata": {},
"outputs": [],
"source": [
"from autogen import AssistantAgent, UserProxyAgent\n",
"\n",
"local_config_list = [\n",
" {\n",
" 'model': 'bloomz-560m-my-small-model',\n",
" 'api_key': 'any string here is fine',\n",
" 'api_type': 'openai',\n",
" 'api_base': \"http://localhost:8000\",\n",
" 'api_version': '2023-03-15-preview'\n",
" }\n",
"]\n",
"\n",
"\n",
"gpt_config_list = autogen.config_list_from_json(\n",
" \"OAI_CONFIG_LIST\",\n",
" filter_dict={\n",
" \"model\": {\n",
" \"gpt-4\",\n",
" \"gpt4\",\n",
" \"gpt-4-32k\",\n",
" \"gpt-4-32k-0314\",\n",
" \"gpt-4-32k-v0314\",\n",
" \"gpt-3.5-turbo\",\n",
" \"gpt-3.5-turbo-16k\",\n",
" \"gpt-3.5-turbo-0301\",\n",
" \"chatgpt-35-turbo-0301\",\n",
" \"gpt-35-turbo-v0301\",\n",
" \"gpt\",\n",
" }\n",
" }\n",
")"
]
},
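{
"cell_type": "markdown",
"id": "d4e5f6a7",
"metadata": {},
"source": [
"The `filter_dict` above keeps only the entries of `OAI_CONFIG_LIST` whose `model` field matches one of the listed names. The cell below is a minimal pure-Python sketch of that filtering, using a hypothetical helper (not part of AutoGen) and made-up config entries."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e8f9a0b1",
"metadata": {},
"outputs": [],
"source": [
"# Hypothetical sketch of filter_dict-style filtering on the 'model' field\n",
"def filter_by_model(configs, allowed_models):\n",
"    # keep only config entries whose 'model' is in the allowed set\n",
"    return [c for c in configs if c.get('model') in allowed_models]\n",
"\n",
"example_configs = [\n",
"    {'model': 'gpt-4', 'api_key': 'sk-...'},\n",
"    {'model': 'bloomz-560m-my-small-model', 'api_base': 'http://localhost:8000'},\n",
"]\n",
"\n",
"print(filter_by_model(example_configs, {'gpt-4'}))  # only the gpt-4 entry remains"
]
},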
{
"cell_type": "code",
"execution_count": 4,
"id": "a159dd60",
"metadata": {},
"outputs": [],
"source": [
"small = AssistantAgent(name=\"small model\",\n",
" max_consecutive_auto_reply=2,\n",
" system_message=\"You should act as a student!\",\n",
" llm_config={\n",
" \"config_list\": local_config_list,\n",
" \"temperature\": 1,\n",
" })\n",
"\n",
"big = AssistantAgent(name=\"big model\",\n",
" max_consecutive_auto_reply=2,\n",
" system_message=\"Act as a teacher.\",\n",
" llm_config={\n",
" \"config_list\": gpt_config_list,\n",
" \"temperature\": 1,\n",
" })\n"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "78ce6584",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[33mbig model\u001b[0m (to small model):\n",
"\n",
"Who are you?\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33msmall model\u001b[0m (to big model):\n",
"\n",
" Who are you?\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mbig model\u001b[0m (to small model):\n",
"\n",
"I am an AI language model, programmed to provide information, answer questions, and assist with various tasks. As a teacher, I can help explain various topics and concepts. What would you like to learn about today?\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33msmall model\u001b[0m (to big model):\n",
"\n",
" What is the topic of the day? User\n",
"\n",
"--------------------------------------------------------------------------------\n",
"\u001b[33mbig model\u001b[0m (to small model):\n",
"\n",
"As an AI teacher, I don't have a predetermined topic of the day. However, I'm more than happy to discuss or teach any topic you'd like to learn about or explore. Please let me know your area of interest, and we can dive into it right away.\n",
"\n",
"--------------------------------------------------------------------------------\n"
]
}
],
"source": [
"big.initiate_chat(small, message=\"Who are you?\")\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.16"
}
},
"nbformat": 4,
"nbformat_minor": 5
}