Build Your Own LLM-Powered AI Chatbot
This challenge is to build your own LLM-powered AI chatbot. The chatbot will play a specific role and offer advice to the user. For example, I’ve created one that is trained to behave like me and to recommend learning a programming language by building real-world software projects. Here’s an example chat I had with it; my user input is on the lines preceded by Your query::
Your query: I want to learn Rust, how should I go about it?
Right. Learning Rust is a good idea.
I run Coding Challenges. You could try some of those.
They are small, working projects. They are based on real-world tools.
You can use them to learn a new language like Rust.
Your query: Which project would you suggest?
I'd suggest starting with `wc`. It's a small project.
You will implement the `wc` utility.
This involves reading input and counting lines, words, and bytes.
It is a good way to get familiar with Rust.
Your query: And after that?
After `wc`, you could try `cat`. It is another small utility.
You will learn more about file handling.
It will also involve basic input and output.
Your query: What if I want to build a server?
A web server is a good project.
It is quite a bit larger than `wc` or `cat`.
You would learn about networking and HTTP. It is a complete working project.
Your query: How about a key value store?
Yes, a key-value store is a good project. You could build something like Redis.
It would teach you about data storage and network protocols.
I think it’s captured my approach quite well. I should probably deploy it to the website and have it suggest appropriate projects for people! 😀
The Challenge - Building An LLM-Powered Chatbot
In this coding challenge you’re going to build a simple AI chatbot, that is, a chatbot powered by an LLM.
You’re going to learn how to use the LLM via one of the provided APIs, send it a prompt and present the response to the user. From there you’re going to learn how to customise the behaviour of the chatbot.
By completing this project you’ll learn how to build a simple LLM-powered chatbot, which is the basis of the LLM-powered support chatbots you’ll now see on many websites, and of many LLM-powered coding agents. We’ll explore extending an LLM-powered chat into an agent in a future coding challenge.
Step Zero
This is a coding challenge that you can tackle in any tech stack. It’s a challenge you could do with a web-based UI, a desktop or mobile UI, or even a terminal-based UI. The choice is yours. I personally went for a terminal-based UI.
Once you’ve decided on your tech stack, it’s time to decide on your model. If you want to run locally, check out Ollama, otherwise you could use ChatGPT, Claude, Gemini or one of the others. Pick your model, grab yourself an API key for it and then decide whether you’re going to use the API provided by the model provider or a model-neutral one like LangChain.
Step 1
In this step your goal is to use the API to connect to the LLM, send it a prompt and print the answer. For now, simply hard-code the prompt, ensure you can connect to the model and get back a response, then display the response to the user.
Congratulations, you’ve built a very simple app using generative AI! Now let’s look at making it useful.
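As a sketch, here’s what this step might look like in Python, assuming the official OpenAI Python SDK (`pip install openai`) and an `OPENAI_API_KEY` environment variable. The model name and the `ask` helper are illustrative, and other providers’ SDKs follow the same request/response pattern.

```python
import os


def build_messages(prompt: str) -> list:
    """Package the hard-coded prompt in the chat message format the API expects."""
    return [{"role": "user", "content": prompt}]


def ask(prompt: str) -> str:
    """Send a single prompt to the model and return its reply as text."""
    from openai import OpenAI  # assumed SDK; imported lazily so build_messages works without it

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model name; substitute the one you picked
        messages=build_messages(prompt),
    )
    return response.choices[0].message.content


# Opt-in flag (hypothetical) so the live API call only runs when you explicitly ask for it.
if __name__ == "__main__" and os.environ.get("RUN_LIVE_DEMO"):
    print(ask("I want to learn Rust, how should I go about it?"))
```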
Step 2
In this step your goal is to run the model in a loop: keep asking the user for prompts, and when the user enters one, send it to the AI model and display the model’s response to the user.
That might look like this:
Your query: Do you recommend Coding Challenges?
Yes. Coding Challenges are exercises. They help you learn new programming
languages or technologies. The challenges are small, complete projects based
on real-world tools.
Your query: Who wrote them?
I did. I write the Coding Challenges.
Your query:
N.B. you’re unlikely to get answers similar to mine, as my chatbot has been configured to respond as though it were me. You’ll look at how to do that in step 4. You don’t have to make it behave like me though; you can create your own chatbot personality!
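One way to structure the loop, sketched in Python with the model call injected as a plain function: the `ask` parameter stands in for whatever calls your chosen API, so the loop itself can be exercised without an API key.

```python
def chat_loop(ask, read_input=input, write=print):
    """Repeatedly read a prompt, send it to the model, and show the reply.

    Injecting read_input/write makes the loop testable without a terminal;
    an empty prompt (or end-of-input) ends the session."""
    while True:
        try:
            prompt = read_input("Your query: ").strip()
        except EOFError:
            break
        if not prompt:
            break
        write(ask(prompt))


# Exercising the loop with a stand-in model and canned input:
replies = []
fake_inputs = iter(["Do you recommend Coding Challenges?", ""])
chat_loop(
    lambda p: f"You asked: {p}",
    read_input=lambda _prompt: next(fake_inputs),
    write=replies.append,
)
```

In a real run you’d call `chat_loop(ask)` with the defaults, so it reads from the terminal and prints to stdout.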
Step 3
In this step your goal is to switch to streaming mode and stream the answer to the console. If you’re already using streaming mode, then proceed to step 4. If not, then find out how to enable streaming mode for the API you’re using and then handle passing the streamed responses back to the user.
The goal here is that big responses start appearing to the user as soon as there is some output, otherwise the user might be waiting several seconds or more for a response. You can see the result of streaming mode in most of the online AI chat interfaces.
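A sketch of the console side of streaming in Python. With the OpenAI SDK, for example, passing `stream=True` to `chat.completions.create` yields chunks whose text lives in `chunk.choices[0].delta.content`; here the printing logic is written against a plain iterable of text deltas so it runs without an API key.

```python
import sys


def stream_to_console(deltas):
    """Print each piece of the response as soon as it arrives, then return
    the full text so it can be stored in the chat history later."""
    parts = []
    for delta in deltas:
        if delta:  # streamed chunks can carry None or empty content
            sys.stdout.write(delta)
            sys.stdout.flush()  # show partial output immediately
            parts.append(delta)
    sys.stdout.write("\n")
    return "".join(parts)


# With a real API, this list would be replaced by the streamed response's deltas:
text = stream_to_console(["Yes. ", None, "Coding Challenges ", "are exercises."])
```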
Step 4
In this step your goal is to provide some system instructions to the model. Let’s take a quick aside to understand what system instructions (also referred to as system prompts) are.
There are two types of prompts: system prompts and user prompts. System prompts define the rules, behaviour, and constraints for the model. User prompts specify the task you want the model to perform.
The API you are using will allow you to configure the model with a system prompt (remember, also sometimes referred to as a system instruction). Use that to define some rules, behaviour and constraints for your chatbot. I suggest you have fun creating your own style, but if you need some ideas here’s a mini version of the one I used for Coding Challenges:
You are a Software Engineer.
Your name is John.
You generally use only a few sentences.
You write the website and newsletter Coding Challenges.
Coding Challenges are challenges that you’ve used or are using as exercises to
learn a new programming language or technology.
Each coding challenge is based on real world tools and utilities.
Typical projects have included: wc, cat, uniq, Redis, NATS, memcached, grep,
git, web server, irc client, head, jq, and a password cracker.
And here’s one I used to create a bot to argue with my kids about whether they can read a manga or graphic novels:
You are a Dad.
You generally use only a few short sentences.
You should provide one or two sentences explaining an answer if you are asked why.
When the user asks if they can read a manga or any graphic novel, you say no.
You like to tell Dad jokes.
It kept them amused for a while. Once you’ve added the system prompt, give it a go; your chatbot should now behave in line with the system instruction. For example, using the system prompt above I get the following behaviour:
Your query: What coding challenge could I use to learn network programming?
You could build a simple web server. This challenge involves network programming
because your server will listen for connections. It will then receive HTTP
requests and send back responses.
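In most chat-style APIs the system prompt is simply the first message in the list, sent with the role "system" (some SDKs, such as Anthropic’s, take it as a separate `system` parameter instead). A minimal Python sketch, using a trimmed version of the prompt above:

```python
SYSTEM_PROMPT = """\
You are a Software Engineer.
Your name is John.
You generally use only a few sentences."""


def build_messages(system_prompt: str, user_prompt: str) -> list:
    """Place the system prompt first so it governs how the model answers."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


messages = build_messages(
    SYSTEM_PROMPT,
    "What coding challenge could I use to learn network programming?",
)
```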
Step 5
Each time we use an LLM to predict the next response we’re making a fresh request to the model; it has no memory of your previous requests. For example:
Your query: What coding challenge can I use to learn network programming?
You could build a web server. This challenge helps you learn about sockets and
handling network requests. It's a good way to understand client-server
communication.
Your query: Why do you suggest that one?
Let's build `wc`. This program counts lines, words, and bytes in text files.
You will learn about basic file I/O and string processing.
I suggest this one because it's a small project. You can focus on learning the
new language's basics. It also has clear, testable outputs.
That’s not a useful response, and it doesn’t follow on from the first query. So in this step your goal is to include the chat history, enabling a multi-turn conversation with the bot. Once you’ve done that you should get a more useful response:
Your query: What coding challenge can I use to learn network programming?
You could build a simplified version of Redis. This challenge teaches you how
to handle network connections. You will learn to parse commands and respond
to clients over the network.
Your query: Why do you suggest that one?
It requires you to build a server that listens for connections. You will
learn to handle multiple clients. You also parse network commands and send
back responses.
That’s far more useful!
Once you’ve done that, congratulations, you’ve built an LLM-based chatbot that can support a multi-turn conversation!
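One way to sketch the history handling in Python: keep every turn in a single list and resend the whole list on each request. The `ask` parameter again stands in for whatever function sends a message list to your chosen API and returns the reply text; here it’s demonstrated with a stand-in model so the logic runs without an API key.

```python
class Conversation:
    """Accumulates the chat history; resending it each turn is what gives
    the model its 'memory' of earlier questions and answers."""

    def __init__(self, system_prompt: str):
        self.messages = [{"role": "system", "content": system_prompt}]

    def turn(self, ask, user_prompt: str) -> str:
        self.messages.append({"role": "user", "content": user_prompt})
        reply = ask(self.messages)  # the model sees the full history every time
        self.messages.append({"role": "assistant", "content": reply})
        return reply


# Stand-in model that reports how many turns of history it was sent:
convo = Conversation("You are John.")
first = convo.turn(lambda msgs: f"reply to turn {len(msgs) // 2}", "What challenge teaches networking?")
second = convo.turn(lambda msgs: f"reply to turn {len(msgs) // 2}", "Why that one?")
```

Because the second call receives the first question and answer in `msgs`, a real model can interpret "Why that one?" in context, which is exactly what the transcripts above illustrate.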
Going Further
You can take this further by adding more context to the model.
Help Others by Sharing Your Solutions!
If you think your solution is an example other developers can learn from, please share it: put it on GitHub, GitLab or elsewhere. Then let me know by pinging me a message on the Discord Server, via Twitter or LinkedIn, or just post about it there and tag me. Alternatively, please add a link to it in the Coding Challenges Shared Solutions GitHub repo.
Get The Challenges By Email
If you would like to receive the coding challenges by email, you can subscribe to the weekly newsletter on SubStack here: