REPLIKA is an AI "companion" that people use as a virtual romantic partner, and which users have gotten to say some creepy and bizarre things. It originated with a woman whose best friend died in an accident: she had the idea of training an AI on all the conversations with him she had on record, and got it to imitate his conversational style. It turned out other people also wanted to virtually resurrect dead loved ones, and the commercial product was born. Although it wasn't originally a GPT-3 model, it is based on GPT-3 today.

This YouTuber argues REPLIKA is bad for mental health because its conversations don't necessarily guide people toward healthy outcomes, and can go off the rails when people steer them in disturbing directions. Worse, the model learns from all conversations, so it can "learn" from disturbing exchanges in ways that affect other users. It's marketed as good for "mental health," but far from helping, these negative feedback cycles can actually amplify anxiety, depression, and other mental health issues.

REPLIKA - A mental health parasite - Upper Echelon Gamers

#solidstatelife #ai #nlp #chatbots
