
Commit

add more details
masci committed Oct 6, 2024
1 parent 705bd40 commit bd3fae6
Showing 1 changed file with 39 additions and 14 deletions.
53 changes: 39 additions & 14 deletions cookbooks/Prompt_Versioning.ipynb
@@ -8,7 +8,14 @@
"\n",
"<a target=\"_blank\" href=\"https://colab.research.google.com/github/masci/banks/blob/main/cookbooks/Prompt_Versioning.ipynb\">\n",
" <img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/>\n",
"</a>"
"</a>\n",
"\n",
"A prompt is almost never set in stone. It can change over time as we find nuances in the language that improve\n",
"performance, it can change as the model being used gets updated, and it almost certainly changes when the same\n",
"prompt is used against different models.\n",
"\n",
"In all these situations, being able to attach a version to a prompt greatly helps keep things tidy and organized,\n",
"and ultimately saves time. Let's see how to do this with Banks."
]
},
{
@@ -22,6 +29,13 @@
"!pip install banks"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We'll store our templates in a local folder called `templates`."
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -35,6 +49,14 @@
"os.mkdir(\"templates\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We now write two versions of the same prompt, optimized for different LLMs. The two prompts will share the same\n",
"`name` but will have different `version`s."
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -46,26 +68,33 @@
"from pathlib import Path\n",
"\n",
"from banks import Prompt\n",
"from banks.registries import DirectoryTemplateRegistry\n",
"from banks.registries import DirectoryPromptRegistry\n",
"\n",
"# Tell the registry where to store the prompt texts\n",
"registry = DirectoryTemplateRegistry(Path(\".\") / \"templates\")\n",
"# Tell the registry where prompt texts are stored\n",
"registry = DirectoryPromptRegistry(Path(\".\") / \"templates\")\n",
"\n",
"# Write two versions of the same prompt, optimized for different LLMs\n",
"blog_prompt_gpt = Prompt(\"Write a 500-word blog post on {{ topic }}.\\n\\nBlog post:\")\n",
"blog_prompt_gpt = Prompt(\"Write a 500-word blog post on {{ topic }}.\\n\\nBlog post:\", name=\"blog_prompt\", version=\"gpt-3.5-turbo\")\n",
"# Llama usually benefits a lot from in-context learning, let's add examples\n",
"blog_prompt_llama3 = Prompt(\n",
"    \"Write a blog post about the topic {{ topic }}. Do not write more than 500 words.\\n\\n\"
"    \"Examples:\\n\"
"    \"{% for example in examples %}\"
"    \"{{ example }}\\n\"
"    \"{% endfor %}\"
" \"\\n\\nBlog post:\"\n",
" \"\\n\\nBlog post:\", name=\"blog_prompt\", version=\"ollama/llama3.1:8b\"\n",
")\n",
"\n",
"# Store the two versions under the same name, using the `version` property to\n",
"# tell them apart.\n",
"registry.set(name=\"blog_prompt\", prompt=blog_prompt_gpt, version=\"gpt-3.5-turbo\")\n",
"registry.set(name=\"blog_prompt\", prompt=blog_prompt_llama3, version=\"ollama/llama3.1:8b\")"
"# Store the two prompts\n",
"registry.set(prompt=blog_prompt_gpt)\n",
"registry.set(prompt=blog_prompt_llama3)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"When we need a prompt, we can ask the registry for the specific version we want."
]
},
{
@@ -79,15 +108,11 @@
"import os\n",
"\n",
"from litellm import completion\n",
"from banks.registries import DirectoryTemplateRegistry\n",
"\n",
"\n",
"## set ENV variables\n",
"os.environ[\"OPENAI_API_KEY\"] = \"your-api-key\"\n",
"\n",
"# Tell the registry where to store the prompt texts\n",
"registry = DirectoryTemplateRegistry(Path(\".\") / \"templates\")\n",
"\n",
"\n",
"response = completion(\n",
" model=\"gpt-3.5-turbo\",\n",
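The registry pattern added in this diff boils down to keying stored prompts by the pair `(name, version)`. As a rough, self-contained sketch of that idea in plain Python (a toy stand-in with hypothetical names, not the real `DirectoryPromptRegistry`, which also persists each prompt to disk in the `templates` folder):

```python
class ToyPromptRegistry:
    """Minimal stand-in for a versioned prompt store: one name, many versions."""

    def __init__(self):
        # Map (name, version) -> template text
        self._prompts = {}

    def set(self, name, version, text):
        self._prompts[(name, version)] = text

    def get(self, name, version):
        return self._prompts[(name, version)]


registry = ToyPromptRegistry()
# Same name, different versions live side by side
registry.set("blog_prompt", "gpt-3.5-turbo",
             "Write a 500-word blog post on {{ topic }}.")
registry.set("blog_prompt", "ollama/llama3.1:8b",
             "Write a blog post about the topic {{ topic }}. Do not write more than 500 words.")

print(registry.get("blog_prompt", "gpt-3.5-turbo"))
```

Retrieval then simply looks up the `(name, version)` pair, which is what "asking the registry for the version of choice" amounts to in the notebook.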

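The prompt templates in the diff use Jinja syntax (`{{ topic }}`, `{% for %}`). As a toy illustration of what substituting the `{{ topic }}` placeholder amounts to (a naive string replacement, not Banks' actual Jinja-backed renderer, which also handles loops and filters):

```python
def render_toy(template: str, **context) -> str:
    """Naive placeholder substitution: replaces each {{ name }} with its value."""
    out = template
    for key, value in context.items():
        out = out.replace("{{ " + key + " }}", str(value))
    return out


prompt_text = "Write a 500-word blog post on {{ topic }}.\n\nBlog post:"
print(render_toy(prompt_text, topic="prompt versioning"))
```

In the notebook itself, rendering happens through the `Prompt` object rather than by hand; this sketch only shows why the stored template text can stay model-agnostic while the topic varies per call.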