The burgeoning popularity of ChatGPT has ignited widespread interest in artificial intelligence (AI). It's no secret that the technology industry's biggest players, including Google, Microsoft, and Meta, are investing heavily in AI, and Elon Musk, the celebrated tech entrepreneur, is reportedly launching his own AI firm, X.AI. Although AI has been around for decades, the technology's prominence reached new heights after OpenAI's breakthrough tools, such as ChatGPT and DALL-E, took the industry by storm.
However, AI's incredible potential comes with risks, and experts have repeatedly warned about the technology's potential misuse; OpenAI's CEO, Sam Altman, has voiced such concerns himself. In light of these warnings, a recent incident has left the world reeling: cybercriminals used AI to clone a teenager's voice and demand a ransom from her mother.
According to a report by WKYT, a CBS-affiliated US news channel, Jennifer DeStefano, a woman from Arizona, received a call from an unknown number that turned her world upside down. DeStefano's 15-year-old daughter was away on a skiing trip when the call came in; upon answering, DeStefano heard her daughter's voice say "Mom," followed by sobbing. A man then came on the line, threatened her daughter's safety, and demanded $1 million in ransom.
DeStefano could hear her daughter's voice in the background calling for help, and the "kidnapper" eventually lowered his demand to $50,000. In reality, her teenage daughter was safe on the skiing trip, and within minutes the authorities confirmed as much. Nonetheless, the voice on the phone sounded exactly like her daughter's, leading DeStefano to urge others to create a "family emergency word or question that only you know" so they can verify they are not being scammed with AI.
In a recent Facebook post, DeStefano shared the news story and warned others to stay vigilant against such incidents. "Everyone should watch this!! How free AI apps can use your loved one's voice to scam you!" she wrote, again suggesting that families agree on an emergency word or question to verify a caller's identity. DeStefano's harrowing experience highlights the dark side of AI and the need to be aware of its potential misuse.