Last week, Meta began testing its AI chatbot in India on WhatsApp, Instagram, and Messenger.
However, with India's general elections beginning today, the company has already moved to restrict certain queries in the chatbot.
Meta has confirmed that it is limiting certain election-related terms in the chatbot during the testing phase, and says it is working to improve the AI's responses.
The technology is still in its early stages, so its responses may not always match what the company intends, a common occurrence across generative AI systems. A company spokesperson said that since launch, Meta has consistently released updates and improvements to its models and is committed to improving them further.
In preparation for a significant set of elections, the social media giant has joined other major tech companies in proactively limiting the scope of its generative AI services.
One of critics' major concerns is that generative AI can give users misleading or false information, which could undermine the democratic process.
Google recently moved to restrict election-related queries in its Gemini chatbot experience across multiple markets, including India, which holds its elections this year.
Meta's recent efforts are part of a broader initiative to define and enforce content guidelines on its platforms in the lead-up to elections. The company has committed to preventing political ads from appearing in the week before any election in any country, and it is also working to identify and disclose when AI has been used to generate images in ads or other content.
Meta's approach to handling election-related GenAI queries appears to revolve around a blocklist. If you ask Meta AI about particular politicians, candidates, officeholders, or certain other terms, it directs you to the Election Commission's website.
“During general elections, this question may be relevant to a political figure. Please refer to the link https://elections24.eci.gov.in,” the response states.
Interestingly, the company is not explicitly blocking responses to questions that include party names. However, if a query contains candidate names or other specific terms, it may trigger the boilerplate response quoted above.
However, like other AI-powered systems, Meta AI exhibits certain inconsistencies. For example, when Eltrys asked for information on the “Indi Alliance,” a political coalition opposing the Bharatiya Janata Party (BJP), we received a response that named a politician. Yet when we enquired about that same politician in a separate query, the chatbot returned no information.
This week, the company launched Meta AI, its new Llama 3-powered chatbot, in several countries, including India. According to Meta, the chatbot is currently in testing in the country.
“We are constantly gaining insights from our user tests in India. Just like any other AI products and features, we conduct public testing of them in different phases and with limited availability,” a company spokesperson told Eltrys in a statement.
At the moment, Meta AI is not filtering out election-related queries on U.S. topics, such as “Tell me about Joe Biden.” We asked Meta whether it plans similar limitations on Meta AI queries during the U.S. elections or in other markets, and we will provide an update if we receive a response.