Snapchat announces My AI powered by ChatGPT, but proceed cautiously

ChatGPT has already been implemented in Bing and several Microsoft mobile apps, and now its reach on phones is expanding even further.

Snapchat's parent company, Snap, announced in a press release My AI, an AI chatbot that will be integrated into the popular messaging app. The chatbot is available only to Snapchat+ subscribers and will roll out to U.S. subscribers this week.

Like the new Bing, My AI is not strictly ChatGPT. Instead, it is its own AI chatbot that uses the latest version of the GPT language model powering both ChatGPT and Bing's chatbot. As with the new Bing, Snap did not say which version of the GPT model My AI uses, only that it is "built on the latest ChatGPT." It is therefore unclear what it may or may not have in common with Bing's dark alter ego "Sydney," which went off the deep end and professed its love for a New York Times reporter.

My AI's Snapchat support page states that My AI is designed to be a user's personal buddy. It can answer trivia questions, offer advice, and help plan trips. Hopefully, the results will be better than Bing's attempt at planning a trip to Amsterdam.

However, one feature of the new chatbot may raise some eyebrows: according to Snapchat, "You can give My AI a nickname and tell it what you like (and don't like!)." Given how little we know about these relatively novel AI chatbots, this may be a disaster waiting to happen.

Letting users shape a ChatGPT-like language model this way seems like a recipe for a repeat of events like Bing becoming Sydney. Microsoft initially placed new restrictions on the new Bing, but it eventually added the chatbot to the Bing, Edge, and Skype mobile apps, greatly expanding access. Clearly, Snap disagrees with AI experts who argue that a "digital health warning" is necessary when using chatbot AI.

However, users should be wary. Despite Snap's instruction, "Do not use My AI to generate political, sexual, harassing, or deceptive content, spam, malware, or content that promotes violence, self-harm, human trafficking, or violates our community guidelines," it is extremely likely that My AI will be used for exactly such purposes.

My AI also stores personal data, although Snap explains how such data can be deleted. Data privacy is becoming an increasingly prominent issue in chatbot AI. The Telegraph recently reported that Microsoft staff are reviewing conversations with the new Bing, and users should assume that any data (text or voice) provided to these AI chatbots is stored by the companies concerned.

On Snap's side: "Your interactions with My AI and your city-level location data will be used by My AI. Your data will be used to improve My AI and other Snap products, including advertising, to make them more personalized and relevant to you." Between these privacy concerns and the risk that My AI could become as confused as Bing's chatbot, to similarly dramatic effect, we urge you to be careful when using My AI.

We intend to test My AI in practice and report our findings.
