Okay, though I have started coding a first Transformer model (likely to do translation), I have decided to once again take a “side-trip” and look at coding an agentic chatbot. At some point that will likely include Retrieval Augmented Generation (RAG).

But first I am going to look at something I did not previously appreciate: prompt templates. A goodly number of the examples I saw had little more than something like ("system", "You are a helpful assistant") in the template. Seemed rather meaningless to me. Then I saw one that gave me hope, and another example showed additional potential for prompt templates.

Prompt Templates

LangChain provides a couple of different template classes in langchain_core.prompts. But for now I am only interested in ChatPromptTemplate; we are, after all, coding a chatbot.

When working on the earlier chatbot, I was often manually adding “Please keep your response brief.” to each input to the chatbot. Then I saw an example where someone was using a prompt template to add that request to every prompt by default. No extra typing needed!

I am starting with basic chatbot code with an in-memory checkpointer (not a database to save chat history or any of the related code) and a simple command line interface (not a Streamlit web interface). I won’t walk through the full code for the basic chatbot; I am sure you can sort that out. And I am still using MistralAI (still free).
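That said, for anyone who wants a starting point, here is a minimal sketch of the sort of thing I mean, assuming LangGraph’s MessagesState and MemorySaver plus the langchain_mistralai package. The model name and the command line loop are illustrative, not an exact copy of my file.

from dotenv import load_dotenv
from langchain_mistralai import ChatMistralAI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import START, MessagesState, StateGraph

load_dotenv()

# model name is illustrative; use whichever Mistral model you have access to
model = ChatMistralAI(model="mistral-large-latest")

def call_model(state: MessagesState):
  return {"messages": model.invoke(state["messages"])}

# single-node graph with an in-memory checkpointer to retain chat history
builder = StateGraph(MessagesState)
builder.add_node("model", call_model)
builder.add_edge(START, "model")
graph = builder.compile(checkpointer=MemorySaver())

# simple command line loop; typing "x" ends the session
config = {"configurable": {"thread_id": "cli_chat"}}
while True:
  user_input = input("User: ")
  if user_input.lower() == "x":
    print("See you later!")
    break
  result = graph.invoke({"messages": [("user", user_input)]}, config)
  print(result["messages"][-1].content, "\n")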

Example—Not Brief

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py
User: What is the best noodle for a soup?
Choosing the best noodle for a soup depends on the type of soup you're making, as different noodles have different textures, thicknesses, and cooking times. Here are some popular options:

1. **Egg Noodles**: These are great for chicken noodle soup or beef noodle soup. They are thick and hearty, and can hold up well in broth.

2. **Rice Noodles**: These are perfect for Asian-inspired soups like pho or laksa. They come in various widths and can be very thin (vermicelli) or wider (like pad Thai noodles).

3. **Udon Noodles**: These are thick, chewy Japanese noodles made from wheat flour. They are excellent in hearty soups and stews.

4. **Soba Noodles**: These are thin Japanese noodles made from buckwheat flour. They have a slightly nutty flavor and are great in lighter broths.

5. **Ramen Noodles**: These are thin, curly noodles made from wheat flour. They are perfect for ramen soup and other Japanese-style soups.

6. **Pasta (like Ditalini or Orzo)**: Small pasta shapes are great for Italian-style soups like minestrone or pasta e fagioli.

7. **Glass Noodles (Cellophane Noodles)**: These are thin, transparent noodles made from mung bean or sweet potato starch. They are great in clear broths and Asian-style soups.

When choosing a noodle, consider the flavor and consistency of your soup, as well as the cooking time of the noodle. Some noodles can be cooked directly in the soup, while others may need to be cooked separately to prevent them from becoming too soft or breaking apart.

User:

Add Chat Prompt

So I will start by importing the necessary package and instantiating a prompt.

... ...
from langchain_core.prompts import ChatPromptTemplate
... ...
prompt = ChatPromptTemplate(
  [("system", "Keep all your responses as brief as possible, to a maximum of 4 sentences."),
   ("placeholder", "{messages}")
  ]
)

Note: the “placeholder” entry expects a list of messages. In this case, it will be passed the message history from the graph state, i.e. state["messages"].
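A quick way to see what that placeholder does is to invoke the prompt on its own, something like the following (a throwaway check, not part of the chatbot).

from langchain_core.messages import HumanMessage

# the placeholder is replaced by whatever list is passed in under "messages"
filled = prompt.invoke({"messages": [HumanMessage("What is the best noodle for a soup?")]})
print(filled.to_messages())
# prints something like:
# [SystemMessage(content='Keep all your responses as brief as possible, ...'),
#  HumanMessage(content='What is the best noodle for a soup?')]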

Then we need to modify our call_model function to use the prompt template.

def call_model(state: MessagesState):
  updated_messages = model.invoke(prompt.invoke(state))
  return {"messages": updated_messages}

And, the AI response now looks like the following.

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py
User: What is the best noodle for a soup?
The best noodle for soup depends on the type of soup you're preparing. For chicken noodle soup, classic egg noodles are often preferred. For Asian-style soups, wheat noodles or rice noodles work well. Consider the recipe and your personal preference when choosing a noodle.

User:

Very likely in this case the longer response is more useful, but in some cases a shorter response is all that would be needed.

Chaining

In the preceding code, I passed the output of prompt.invoke() to model.invoke(). LangChain provides another mechanism for accomplishing the same thing: chaining.

One point about LangChain Expression Language is that any two runnables can be “chained” together into sequences. The output of the previous runnable’s .invoke() call is passed as input to the next runnable. This can be done using the pipe operator (|), or the more explicit .pipe() method, which does the same thing.
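In other words, the two lines below build the same chained runnable; which one to use is just a matter of taste.

# both lines produce the same runnable sequence:
# apply the prompt, then pass its output to the model
chain = prompt | model
chain = prompt.pipe(model)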

How to chain runnables

Let’s rework the above code to get that working. Pretty straightforward.

def call_model(state: MessagesState):
  chain = prompt | model
  updated_messages = chain.invoke(state)
  return {"messages": updated_messages}

I also modified the prompt template.

prompt = ChatPromptTemplate(
  [("system", "Keep all your responses as brief as possible, to a maximum of 6 points or sentences."),
   ("placeholder", "{messages}")
  ]
)

And the output with the above code modifications now looks something like the following.

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py
User: What is the best noodle for a soup?
- Egg noodles: Versatile, great for chicken noodle soup.
- Rice noodles: Ideal for Southeast Asian soups, like pho.
- Wheat noodles: Perfect for Japanese-style soups, such as udon.
- Ramen noodles: Excellent in rich, hearty broths.
- Soba noodles: Good for light, dashi-based soups.
- Vermicelli: Thin, quick-cooking, suits well in miso soup.

User:

But another example use of prompt templates caught my attention.

Translation Bot

By modifying the system prompt we can easily get the chatbot to do translation. At first I will hard-code the languages, then look at user-selectable options. The following change to the prompt was all that was required.

prompt = ChatPromptTemplate(
  [("system", "Translate the input from English to German"),
   ("placeholder", "{messages}")
  ]
)

And, running the chatbot produced the following output for the sample inputs.

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py
User: Good morning.
The translation of "Good morning" into German is:

"Guten Morgen."

Here's a breakdown:
- Guten = Good
- Morgen = Morning

User: Have a lovely evening.
The translation of "Have a lovely evening" into German is:

"Haben Sie einen schönen Abend."

Here's a breakdown:
- Haben Sie = Have (formal)
- einen = a
- schönen = lovely
- Abend = evening

If you want to be more casual, you can use:

"Hab einen schönen Abend."

Here's the breakdown for the casual version:
- Hab = Have
- einen = a
- schönen = lovely
- Abend = evening

User: What is you plan for the day?
The translation of "What is your plan for the day?" into German is:

"Was ist dein Plan für den Tag?"

Here's a breakdown:
- Was = What
- ist = is
- dein = your
- Plan = plan
- für = for
- den = the
- Tag = day

If you are addressing someone formally, you would use:

"Was ist Ihr Plan für den Tag?"

Here's the breakdown for the formal version:
- Was = What
- ist = is
- Ihr = your (formal)
- Plan = plan
- für = for
- den = the
- Tag = day

User:

That output is perhaps a bit noisy. So let’s see if we can fix that by modifying the prompt template.

prompt = ChatPromptTemplate(
  [("system", "Translate the input from English to German"),
   ("system", "Keep all your responses as brief as possible, to a maximum of 6 points or sentences."),
   ("placeholder", "{messages}")
  ]
)

And,…

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py
User: Good morning.
Guten Morgen.

User: Have a lovely evening.
Schönen Abend noch.

* Literally: "Nice evening still"
* Other ways to say it:
        + "Ich wünsche dir einen schönen Abend" (I wish you a nice evening)
        + "Genieße deinen Abend" (Enjoy your evening)
* "Abend" is evening.
* Adding "noch" implies "still to come".

User: What is you plan for the day?
Was ist dein Plan für den Tag?

* "Was" means "what".
* "ist" means "is".
* "dein" means "your".
* "für" means "for".
* "den" means "the".
* "Tag" means "day".

User:

Add User Control

Let’s add some user control over the way the chatbot works. I want to allow the user to specify the input and/or output languages. If the output language is different from the input language, we ask the LLM to do translation; if not, it will just respond to the user input. We will also allow the user to control the “size” of the response, likely by specifying a maximum number of points or sentences.

I was looking at using a different command line interface library, specifically click. But for now I didn’t feel like installing another package, so I am going to stick with Python’s built-in argparse.

So, just after the call to load_dotenv, I added the following, along with some temporary test code.

# argparse is imported at the top of the file along with the other imports
# create a parser object
parser = argparse.ArgumentParser(description="A partially configurable chatbot")
# add cli arguments
parser.add_argument("-li", "--lang_in", type=str, help="input language for chatbot", default="English")
parser.add_argument("-lo", "--lang_out", type=str, help="output language for chatbot", default="English")
parser.add_argument("-mp", "--max_points", type=int, help="maximum number of LLM response points/sentences", default=0)
# get any cli args
args = parser.parse_args()

p_list = []
if args.lang_in != args.lang_out:
  p_list.append(("system", f"Translate the input from {args.lang_in} to {args.lang_out}"))
if args.max_points > 0:
  p_list.append(("system", f"Keep all your responses as brief as possible, to a maximum of {args.max_points} points or sentences."))
p_list.append(("placeholder", "{messages}"))
prompt = ChatPromptTemplate(p_list)

if True:
  print(f"\n{args}")
  print(f"\n{p_list}\n")
  prompt.pretty_print()
  print()
  exit(0)

A series of tests produced the following output in the terminal.

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py

Namespace(lang_in='English', lang_out='English', max_points=0)

[('placeholder', '{messages}')]

{messages}

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py -lo German

Namespace(lang_in='English', lang_out='German', max_points=0)

[('system', 'Translate the input from English to German'), ('placeholder', '{messages}')]

================================ System Message ================================

Translate the input from English to German

============================= Messages Placeholder =============================
{messages}

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py -lo German -li French

Namespace(lang_in='French', lang_out='German', max_points=0)

[('system', 'Translate the input from French to German'), ('placeholder', '{messages}')]

================================ System Message ================================

Translate the input from French to German

============================= Messages Placeholder =============================

{messages}

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py -mp 5

Namespace(lang_in='English', lang_out='English', max_points=5)

[('system', 'Keep all your responses as brief as possible, to a maximum of 5 points or sentences.'), ('placeholder', '{messages}')]

================================ System Message ================================

Keep all your responses as brief as possible, to a maximum of 5 points or sentences.

============================= Messages Placeholder =============================

{messages}

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py -lo German -li French -mp 6

Namespace(lang_in='French', lang_out='German', max_points=6)

[('system', 'Translate the input from French to German'), ('system', 'Keep all your responses as brief as possible, to a maximum of 6 points or sentences.'), ('placeholder', '{messages}')]

================================ System Message ================================

Translate the input from French to German

================================ System Message ================================

Keep all your responses as brief as possible, to a maximum of 6 points or sentences.

============================= Messages Placeholder =============================

{messages}

Okay, let’s remove the test code and early termination and try a few more tests. I was having problems with MistralAI, so I opened an OpenAI account and spent a couple of bucks to give myself some playing room. I modified my chatbot code accordingly.
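The change was essentially a swap of the model class, something along the following lines. This assumes the langchain_openai package is installed and an OpenAI key in the .env file; the model name is illustrative.

# was: from langchain_mistralai import ChatMistralAI
from langchain_openai import ChatOpenAI

# model name is illustrative; any chat-capable OpenAI model will do
model = ChatOpenAI(model="gpt-4o-mini")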

(agnt-3.12) PS R:\learn\ds_agent> python chat_bot_2.py -lo German
User: Bon jour!
AI: Guten Tag!

User: What time is it?
AI: Wie spät ist es?

User: Do not translate, but answer the question. What time is it Vancouver, B.C., Canada?
AI: I'm sorry, but I can't provide real-time information such as the current time. You can easily check the current time in Vancouver, B.C., Canada, by searching online or using a clock.

User: Do not translate, but answer the question. Who was the 10th prime minister of Canada?
AI: The 10th Prime Minister of Canada was John Diefenbaker, who served from 1957 to 1963.

User: Which way is the post office?
AI: Welche Richtung ist die Post?

User: Fini
AI: Fini

User: Done
AI: Fertig

User: Buonasera!
AI: Guten Abend!

User: Good evening!
AI: Guten Abend!

User: x
AI: See you later!

Looks like I may always need to specify the input language. I am not bothering to test the response-limiting system message, since that worked earlier.

Done, Fertig, Fini

Well, I think that is it for this one. Based on things so far, I think I am going to enjoy this side-trip. Next time I will look at providing a means for the chatbot to obtain information newer than the data it was trained on.

Hope you are enjoying your time coding—whatever it may be you are coding.

Resources