

nocodedude (Up-and-Comer)

Enable Second-Level Granularity in Conditional Filters

Hi ManyChat team and fellow builders,

I'd like to propose adding second-level filtering in Conditional blocks (and anywhere we compare timestamps). Right now the smallest time unit we can work with is one minute. For most automations that's fine, but when you're trying to build responsive, conversational experiences it becomes a real bottleneck.

Why seconds matter

- Smart Delay + Conditional check loops: a common pattern is to set a short Smart Delay, check the time of the last user message, and decide whether to bundle new input or push a follow-up. With minute-only precision we're forced to wait at least 60 seconds before the Condition can evaluate, which feels like an eternity in live chat.
- Message grouping & spam prevention: many builders try to merge multiple quick user replies into a single logical block, or throttle highly active users so we don't spam agents. We currently can't distinguish someone who answered 3 seconds ago from someone who answered 58 seconds ago.

Proposed change

Allow seconds as a valid unit anywhere we compare or add time (e.g., {{last_input}} is less than 15 seconds ago). If UI clutter is a concern, hide "seconds" behind an "Advanced" toggle; builders who need it will find it.

Impact

- Faster user experience → conversations feel natural, which improves retention and conversion.
- Cleaner flows → no more hacky workarounds (multiple branches, external APIs, or unnecessary tags); see the sketch after this post for the kind of workaround this means.
- Competitive edge → other bot platforms already allow sub-minute delays; adding this keeps ManyChat best-in-class.

If you've run into the same limitation, or have ideas on implementation, please add your voice below. The more real-world examples we share, the easier it'll be for the product team to prioritize.

Thanks for considering, and happy building! 🚀
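To make the "external APIs" workaround above concrete, here is a minimal sketch (not a ManyChat feature) of the kind of helper endpoint an External Request step might call today in order to branch on seconds rather than minutes. The route, the last_input_ts field name, and the 15-second threshold are all illustrative assumptions.

```python
# Hypothetical workaround sketch: a tiny endpoint a flow could call with the
# subscriber's last-input timestamp, returning second-level elapsed time to branch on.
# Nothing here is a ManyChat API; field names and the threshold are assumptions.
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)


@app.post("/seconds-since-last-input")
def seconds_since_last_input():
    payload = request.get_json(force=True)
    # Expect an ISO 8601 timestamp with offset, e.g. "2024-05-01T12:00:03+00:00".
    last_input = datetime.fromisoformat(payload["last_input_ts"])
    elapsed = (datetime.now(timezone.utc) - last_input).total_seconds()
    return jsonify({
        "seconds_since_last_input": round(elapsed, 1),
        # The flow can branch on this boolean instead of a minute-only Condition.
        "within_15_seconds": elapsed < 15,
    })


if __name__ == "__main__":
    app.run(port=8080)
```

Native second-level units in the Condition block would make this whole round trip unnecessary.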

Vadim Ciobanu (Channel Explorer)

Feature Suggestion for ManyChat: Tracking Last Live Chat Agent Input

I appreciate the powerful automation tools ManyChat provides, especially the ability to access the last_user_input variable. This functionality enables seamless integration with external tools, such as an AI assistant, by transmitting the user's last message.

However, in real-world scenarios, live chat agents frequently step into conversations between users and the chatbot. Currently, there is no built-in way to distinguish between messages sent by the bot and those sent by a live chat agent. This presents a challenge when integrating with AI assistants, as they may lose critical conversation context when automation resumes.

Feature Request

I propose adding two new variables (also requested here):

- last_agent_input – captures the last message sent by a live chat agent.
- last_agent_input_timestamp – stores the date and time of the last message from a live chat agent.

Benefits

- Enhanced AI Training & Adaptation: AI assistants, such as GPT-based integrations, could learn from agent responses, which are often superior to bot-generated messages. This would help improve chatbot responses over time.
- Maintaining Conversation Context: when automation is paused while an agent handles a conversation, the AI assistant should still receive and process the agent's messages. This ensures the AI retains full context once automation resumes.
- Seamless AI-Assisted Live Support: by forwarding agent messages to an AI assistant, businesses can refine responses, analyze customer interactions, and improve chatbot accuracy based on real human interactions. (A sketch of what such forwarding could look like follows below.)

By implementing these variables, ManyChat would empower users to build smarter, more context-aware AI integrations, ultimately improving the quality of automated conversations.

Thank you for considering this feature request. I believe it would be a valuable addition to ManyChat's capabilities, helping businesses provide a more seamless and intelligent customer experience.
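To illustrate the intended use, here is a minimal sketch of how the proposed last_agent_input and last_agent_input_timestamp fields could be forwarded to an external AI assistant so it keeps context while automation is paused. The endpoint URL, the payload shape, and the assumption that these fields would arrive as plain strings are all hypothetical; the variables themselves do not exist yet.

```python
# Illustrative sketch only: forwarding the *proposed* agent fields to an AI assistant.
# The endpoint, payload shape, and field names are assumptions, not existing ManyChat features.
from datetime import datetime, timezone
from typing import Optional

import requests

ASSISTANT_URL = "https://example.com/assistant/context"  # placeholder endpoint


def forward_context(last_user_input: str,
                    last_agent_input: Optional[str],
                    last_agent_input_timestamp: Optional[str]) -> None:
    """Send the user's latest message, plus the live agent's reply if there was one."""
    messages = [{"role": "user", "content": last_user_input}]
    if last_agent_input:
        # Include the human agent's reply so the assistant keeps the thread
        # of the conversation while automation was paused.
        messages.append({
            "role": "assistant",
            "content": last_agent_input,
            "sent_at": last_agent_input_timestamp,
        })
    requests.post(ASSISTANT_URL, json={"messages": messages}, timeout=10)


if __name__ == "__main__":
    forward_context(
        last_user_input="Can I change my delivery address?",
        last_agent_input="Sure, I've updated it to the one on your account.",
        last_agent_input_timestamp=datetime.now(timezone.utc).isoformat(),
    )
```

With the two variables available in the flow builder, this kind of hand-off could be wired up with a single External Request step.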