Solved

Meta tax

  • February 1, 2026
  • 3 replies
  • 36 views

manitalian

Hello, I would like to build an AI chatbot for WhatsApp. But recently I saw that Meta introduced an AI tax for using third-party AI chatbots with WhatsApp.
Is there a way to avoid this tax? Or do I have to use a chatbot without AI?

Best answer by rodrigo_silvano


3 replies

rodrigo_silvano
  • Manychat Community Moderator
  • Answer
  • February 1, 2026

Hey, @manitalian

There is no such thing as a Meta tax for AI chatbots.

This is the official page where Meta refers to something similar: https://developers.facebook.com/documentation/business-messaging/whatsapp/pricing/ai-providers/#what-and-where-meta-will-charge. It says that Meta will charge AI Providers, such as ChatGPT, Perplexity, etc., that use a WhatsApp number to offer general-purpose AI (answering everything).

You can still build your AI chatbot to answer everything about your business, using, for example, the OpenAI API, without any increase in your costs (💵).
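If it helps, here is a minimal sketch of that kind of business-scoped bot using the official OpenAI Python SDK. The model name, system prompt, and business details are just example assumptions, not anything Meta or Manychat prescribes:

```python
# Minimal sketch: one guest message in, one business-scoped answer out.
# Assumes the openai SDK (v1+) is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Keep the assistant scoped to your own business, not a general-purpose AI.
SYSTEM_PROMPT = (
    "You are the assistant for an example vacation-rental business. "
    "Only answer questions about bookings, check-in, house rules, and local tips. "
    "If a question is unrelated to the property, politely decline."
)

def answer_guest(message: str) -> str:
    """Send one guest message to the LLM and return the reply text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model choice
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": message},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(answer_guest("What time is check-in?"))
```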


manitalian
  • Author
  • Up-and-Comer
  • February 1, 2026

@rodrigo_silvano

Thanks for the clarification. However, my concern is about Meta’s classification criteria. According to the official documentation, the general_purpose_ai category is triggered based on the nature of the service.

If I use OpenAI via API to provide dynamic, conversational answers to guests (e.g., local tips, explaining complex house rules, or interpreting guest needs), isn't there a risk that Meta's automated systems will flag the WABA account as an 'AI Provider' due to the non-static nature of the responses?

Meta's terms differentiate between 'incidental' and 'primary' AI functionality. If a bot's main interaction is conversational AI, even if business-specific, it seems there's a thin line before being hit by the new per-message pricing in Italy. How can we be sure that dynamic LLM responses won't trigger this?


rodrigo_silvano
  • Manychat Community Moderator
  • February 1, 2026

This is brand new for everyone, but I guess you can use an LLM to read and write the messages.

This could be a workaround (not sure if it will work):

User sends you a message → Manychat receives it and stores it in a user field → Manychat sends it to the LLM to process → the LLM writes the answer back to a user field → Manychat sends that user field to the user as the response.
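If it helps, here is a rough sketch of the middle step in that flow, assuming you point a Manychat external-request step at a small webhook of your own and map the returned JSON into a user field. The endpoint path, request field names, and model are hypothetical examples:

```python
# Hedged sketch of the "Manychat → LLM → user field" step.
# Assumes Flask and the openai SDK (v1+) are installed and OPENAI_API_KEY is set.
from flask import Flask, request, jsonify
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()  # reads OPENAI_API_KEY from the environment

@app.post("/manychat/llm")  # hypothetical endpoint Manychat calls
def llm_reply():
    data = request.get_json(force=True)
    user_message = data.get("last_user_message", "")  # field name mapped in Manychat

    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # example model choice
        messages=[
            {"role": "system", "content": "Only answer questions about our business."},
            {"role": "user", "content": user_message},
        ],
    )

    # Manychat maps "reply" into a user field and sends it back to the user.
    return jsonify({"reply": completion.choices[0].message.content})

if __name__ == "__main__":
    app.run(port=8000)
```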

I asked Gemini to create an image for better understanding 😅