Artificial Intelligence is steadily becoming a great ally for users, or at least for those who know how to take advantage of the hundreds of apps and platforms powered by this technology. The most popular of all is undoubtedly ChatGPT, which has the advantage of being a generalist chatbot: it can be used for practically any purpose we can think of, especially since the arrival of the hundreds of integrated GPTs.
But not everything is positive, since the rise of AI has also partly contributed to the growth of cybercrime and misinformation. And while it is true that these problems existed long before, the arrival of this technology has pushed them to levels we could scarcely have imagined.
AI facilitates and simplifies tasks to such a degree that cybercriminals have taken advantage of it as well. As OpenAI itself has disclosed in a report, groups from China, Russia, and Iran have been using ChatGPT to create false and deceptive content with the intention of spreading it in countries such as the US and Canada.
These groups used the chatbot to write posts, translate them into various languages, and create software that helped them publish automatically on social networks. Among the most notable for the impact of its campaigns is the Russian group Doppelganger, which translated and modified articles and headlines in English, French, German, Italian, and Polish in order to spread them on social networks.
While it is true that these misinformation campaigns have existed for many years and have generally had little impact, OpenAI warns that AI has changed both the quality and the quantity of the content these networks publish: “We have seen them generate text in a greater volume and with fewer errors than these operations have traditionally handled.”
This means that “disinformation campaigns that for years had no impact can suddenly explode if no one watches them.” OpenAI has also raised the possibility that chatbots themselves could act as disinformation agents, although this is technically quite complicated.