I added memory to my chatbot, and what I expected to be a minor upgrade turned into a bit of a "wait, this is actually cool" moment.

Version 1 was straightforward: ask a question, get an answer, start fresh next time. Useful, but cold. Like a vending machine that also talks.

Version 2 remembers you. Not in some fancy way: it's literally reading and writing to a .txt file. But because LangChain feeds that history back into the model at every turn, the conversation has continuity. You can say "remember what I told you earlier?" and it actually can.

Building it made me realize: memory isn't just a feature. It's the thing that makes an AI feel like it's actually *with* you instead of just responding *to* you.

The stack is still simple: Python, LangChain, Ollama running Llama 3.2 locally. No external APIs, no data leaving my machine.

Where I want to take it:
- Smarter memory with a vector database
- Distinguishing between what to remember long-term vs. short-term
- A proper UI so it doesn't live only in a terminal

It's still early. But it's starting to feel like I'm building something, not just tinkering.

Code link is in the comments. 👇

#AI #MachineLearning #Python #LangChain #Chatbot #BuildInPublic #GenAI
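For anyone curious what "reading and writing to a .txt file" looks like in practice, here's a minimal sketch of the pattern. This is not my actual code: the model call is stubbed out (in the real project LangChain hands the prompt to Llama 3.2 via Ollama), and the file name and prompt format are made up for illustration.

```python
from pathlib import Path

HISTORY_FILE = Path("memory.txt")  # hypothetical file name

def fake_llm(prompt: str) -> str:
    # Stand-in for the real call to Llama 3.2 via Ollama + LangChain.
    return f"(model reply to a prompt of {len(prompt)} chars)"

def chat(user_message: str) -> str:
    # 1. Read everything remembered so far.
    history = HISTORY_FILE.read_text() if HISTORY_FILE.exists() else ""
    # 2. Feed the full history back in at every turn; this is what
    #    gives the conversation its continuity.
    prompt = f"{history}User: {user_message}\nAssistant:"
    reply = fake_llm(prompt)
    # 3. Append the new turn so the next call remembers it.
    with HISTORY_FILE.open("a") as f:
        f.write(f"User: {user_message}\nAssistant: {reply}\n")
    return reply
```

The whole trick is step 2: nothing about the model changes between turns; the prompt just keeps carrying the past along.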

It's so impressive. Go ahead and keep growing and boosting your skills.

Cool project, well done bro 👏🏻

Adding small features can turn out to be so helpful and really cool. Keep building!

Interesting, brother! I'm working on something similar.
