Hi everyone, I’m trying to build a flow where the AI drives a multi-step interview with the user, but I'm hitting a wall with context.
The Issue: I found that standard direct API integrations are stateless; they don't naturally remember the previous turn of the conversation. I tested both a direct ChatGPT setup and a direct Claude setup, and each one treats every message as an independent event.
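For reference, this is the kind of manual history-passing I understand is the usual workaround: keep the whole conversation in a list and resend it on every turn. A minimal sketch in plain Python; `call_model` is a hypothetical stand-in for the actual chat endpoint, not a real SDK call, though real chat APIs do accept a `messages` list shaped like this.

```python
def call_model(messages):
    # Placeholder: a real implementation would POST `messages` to the
    # chat API. Here we just report how many messages the model would see.
    return f"(model reply, aware of {len(messages)} prior messages)"

# The system prompt pins the interview behavior for every later turn.
history = [{"role": "system", "content": "You run a step-by-step interview."}]

def ask(user_text):
    history.append({"role": "user", "content": user_text})
    reply = call_model(history)  # full history goes out on every turn
    history.append({"role": "assistant", "content": reply})
    return reply

ask("Question 1: what's your name?")
ask("Question 2: what's your goal?")
# By turn two the model receives the system prompt plus all earlier turns,
# which is what makes the flow "context-aware" despite the stateless API.
```

It works, but it means I'm hand-rolling the state storage, which is exactly the infrastructure question below.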
What I've tried: I briefly tinkered with n8n over the weekend. It looks like "context-aware" is definitely possible there (using JSON I/O and loops), but it feels like a heavy lift to build from scratch. I'm not sure if the time investment is worth it just to solve the memory problem.
The Question: How are you solving this middle ground?
- Is n8n (or Make) the standard way to go, and should I just bite the bullet and build the infrastructure?
- Or is there a more efficient way or tool to keep the AI aligned with the topic over multiple messages without over-engineering it?
Thanks in advance!
