Víctor Deutsch, professor of cybersecurity in the Software Development Engineering Degree at IMMUNE.
In recent days, a great deal of content has been circulating on social networks related to EPIK, an application that uses artificial intelligence to create images of our faces in various versions of a 1990s American high school yearbook. The technology itself is not especially disruptive, since applications for manipulating and creating this kind of image already existed; the novelty is that the company has made it available to the public for just $6.99. At such an affordable price, downloads have soared into the millions, flooding social networks, where celebrities and influencers such as Chanel, Laura Escanes and Lola Indigo have joined in under the hashtag #YearBookChallenge.
Photos of celebrities generated by the EPIK application
EPIK's popularity has once again opened up the debate about the security of AI-powered applications that capture our data and images, especially after the case in Almendralejo, in which a group of teenagers fell victim to deepfake pornography created and disseminated by their schoolmates.
In the case of EPIK, the terms of use grant, without restriction, the transfer of users' data (geolocation, mobile device usage information and images) to third parties for three years or until the profile is closed. Without accepting these conditions, it is not possible to use the tool.
EPIK's profits are likely to come not so much from current subscription fees as from the future value it aims to create.
On the one hand, there is a technological race between companies to develop applications that create digital content ever closer to reality, until it is practically indistinguishable from it. In other words, to generate "virtual worlds" that produce user experiences eliciting the same stimuli as a real situation. Although the idea of having a "metaverse" in the short term seems to have been exaggerated, the industry is still working in that direction. And what does the industry need to improve its applications? Lots and lots of spontaneous, diverse images with which to enrich its machine learning "engines" and gain precision step by step. With this launch, EPIK has recruited millions of "altruistic" volunteers for its tests.
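To make that point concrete, here is a minimal, purely illustrative sketch of why a stream of fresh, varied user photos is so valuable: each new batch broadens the training distribution a model learns from. The folder name user_photos/ and the toy PyTorch network are assumptions for the example only; nothing about EPIK's real pipeline is public.

```python
# Illustrative only: a toy fine-tuning pass over newly collected
# user images. "user_photos/" and the tiny CNN are hypothetical.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((64, 64)),
    transforms.ToTensor(),
])
# Every batch of spontaneous, diverse photos widens the training
# distribution -- this is what "gaining precision step by step" means.
dataset = datasets.ImageFolder("user_photos/", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

model = nn.Sequential(          # stand-in for a production face model
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, len(dataset.classes)),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:   # one fine-tuning epoch over the new data
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```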
On the other hand, legal experts warn that the terms of service of EPIK, a Korean company, make it easy to share subscribers' private data with other business partners. This opens the door for that data to be traded for purposes we cannot anticipate. Some will probably be legal, with records anonymised or aggregated; but the data can also be used to sharpen algorithms for targeted advertising or the delivery of unsolicited content.
Behind every cheap or free service we can always find the trade-off between privacy and functionality. It is important to bear this in mind when using such tools.
These highly accessible applications make it possible for mere amateurs (with no need to belong to a large criminal organisation) to have at hand the tools for fraud or unethical campaigns. Without going any further, in recent days profiles with manipulated images have been appearing on social networks such as LinkedIn, and they are very difficult to detect unless you apply image filters, recognise certain common traits, or notice that the image is simply "too good" to be real.
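As a rough illustration of what such a first-pass check might look like, the sketch below flags images that lack camera metadata or whose EXIF "Software" tag mentions a known generator. This is a weak heuristic of my own choosing, not an established detector: metadata can be stripped or forged, the list of generator names is a hypothetical placeholder, and a real investigation would combine several stronger signals.

```python
# Heuristic sketch, not a reliable deepfake detector: AI-generated
# profile photos often lack the camera EXIF metadata real photographs
# carry, or expose a telltale "Software" tag. Requires Pillow.
from PIL import Image
from PIL.ExifTags import TAGS

# Hypothetical, incomplete list of generator names; extend as needed.
SUSPICIOUS_SOFTWARE = ("stable diffusion", "midjourney", "dall", "epik")

def red_flags(path: str) -> list[str]:
    """Return human-readable red flags for the image at `path`."""
    flags = []
    exif = Image.open(path).getexif()
    if len(exif) == 0:
        flags.append("no EXIF metadata (common in generated or scrubbed images)")
    for tag_id, value in exif.items():
        tag = TAGS.get(tag_id, str(tag_id))
        if tag == "Software" and any(s in str(value).lower()
                                     for s in SUSPICIOUS_SOFTWARE):
            flags.append(f"Software tag names a known generator: {value!r}")
    return flags

if __name__ == "__main__":
    import sys
    for flag in red_flags(sys.argv[1]):
        print("!", flag)
```

Absence of red flags proves nothing; the point is only that automated screening is now as cheap as the generation tools themselves.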
What are these profiles after? Attracting attention in a context where we are flooded with content, or generating some kind of affinity through clothing, physical traits or the activity the photo depicts. All the more reason to expect the accessibility of this technology to have an impact on networks such as TikTok, Reddit and the like.
What can be done? In general, apply the same caution as with any unsolicited online contact: a healthy scepticism that leads us to verify the identity and intentions of the other party. Be friendly, yes, but be careful with the data we disclose, and always be ready to report abuse when we detect it, just as we would in the "physical world".