
FaceApp is a Gigantic Cybersecurity Risk — From Fun to Dangerous in an Instant

In Security by Shelly Kramer


You’ve no doubt seen or heard of FaceApp. It’s gone viral on social media, and ordinary people and celebrities alike are sharing photos from the app showing what they look like today and what they’ll look like in old age. LeBron James, the Jo Bros, Tom Holland, and just about everyone else is on the FaceApp bandwagon. The only problem? FaceApp poses a gigantic cybersecurity risk. Read more at MarketWatch.


Analyst Take: On the surface, FaceApp sounds kind of fun. In fact, if it feels like FaceApp is everywhere, you’re right — it is. FaceApp is currently the top-ranked app in the iOS App Store in more than 100 countries, and more than 100 million people have downloaded it from the Google Play store.

The thing is, FaceApp uses artificial intelligence to transform your photos. It is, essentially, facial recognition software powered by artificial intelligence. So what are you giving up when you share personal images through the app? You’re feeding those images into a giant AI-powered facial recognition database.

The Dangers of Seemingly ‘Fun’ Apps [and Quizzes] — and the Russia Connection

Remember that nifty This is Your Digital Life personality quiz on Facebook that asked for all kinds of personal information, and that people loved so much (and shared the heck out of)? Well, that innocuous-seeming personality quiz opened the door to Cambridge Analytica. You know, Cambridge Analytica, the Russian-linked operation that scammed some 50 million US Facebook users out of their data and then used that data to drive targeted ad campaigns widely considered to have influenced the 2016 US Presidential election.

With FaceApp, there’s also a Russian connection. FaceApp’s parent company is the Russia-based Wireless Lab. And as with Cambridge Analytica, the goal here is data. Data and access.

And there’s no overlooking that Russia connection. Much as in China, what happens in Russia rarely happens without the government’s knowledge. Or involvement.

The Cybersecurity Risks of FaceApp

Concerns about cybersecurity are at an all-time high, and with good reason. The problem with an app like FaceApp is that you’re feeding your personal data into it and also granting it access to the data already on your device.

And in order to get FaceApp to work, you’ve got to give the app access to your photos, all of them, and the app likely also has access to other things on your device, like Siri if you’re an iOS user, and possibly your browser search history. When you tap a photo in your library that you want FaceApp to age, the app actually uploads that photo to FaceApp’s servers, and the aging effects are crunched by AI there, off your device. Here’s the scary part:

FaceApp doesn’t tell you that your photo has been uploaded to its cloud, and the app doesn’t make clear in its user agreement whether the company retains your original photo or what the company is allowed to do with it.
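For readers who want to picture what “crunched by AI there, off your device” means in practice, here is a minimal Swift sketch of that kind of flow. The endpoint, function name, and parameters are hypothetical stand-ins; FaceApp’s actual client code and server API aren’t public, so treat this as an illustration of the general pattern, not FaceApp’s implementation.

```swift
import UIKit

// Minimal sketch, assuming a hypothetical endpoint and API shape.
// It illustrates the pattern described above: the app gets broad
// photo-library access, then sends a selected photo off the device
// so the "aging" filter can be applied server-side.
func ageSelectedPhotoOffDevice(_ image: UIImage) {
    // 1. By this point the app has already requested photo-library access,
    //    e.g. via PHPhotoLibrary.requestAuthorization { status in ... },
    //    which in 2019-era iOS granted access to the entire library,
    //    not just the one photo being edited.
    guard let jpegData = image.jpegData(compressionQuality: 0.9) else { return }

    // 2. The selected photo leaves the device here. Everything after this
    //    point happens on infrastructure the user does not control.
    var request = URLRequest(url: URL(string: "https://api.example-photo-service.com/v1/age-filter")!)
    request.httpMethod = "POST"
    request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")

    URLSession.shared.uploadTask(with: request, from: jpegData) { data, _, error in
        // 3. The "aged" result comes back as a new image, but whether the
        //    original upload is retained server-side is governed entirely
        //    by the provider's terms, not by anything in this client code.
        guard error == nil, let data = data, let aged = UIImage(data: data) else { return }
        DispatchQueue.main.async {
            // ...display `aged` to the user
            _ = aged
        }
    }.resume()
}
```

The point of the sketch is simply that once the photo is in the request body, what happens to it (retention, reuse, training data) is decided by the terms of service on the other end, which is exactly what the excerpts below spell out.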

Here’s a deeper look at some excerpts from those Terms & Conditions specifically related to User Content:

You grant FaceApp a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content, and any name, username, or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you.

The consent FaceApp users are granting for the use of that content is a little tricky, too:

You grant FaceApp consent to use the User Content, regardless of whether it includes an individual’s name, likeness, voice, or persona, sufficient to indicate the individual’s identity. By using the Services, you agree that the User Content may be used for commercial purposes.

Have I convinced you that using FaceApp is not only insane, but a potential cybersecurity risk as well? Want to delete the app (and the content you uploaded)? Good luck. From the T&C:

User content removed from the Services may continue to be stored by FaceApp, including without limitation, in order to comply with certain legal obligations.

I’ve seen several other articles on this topic, some debating whether the content used by FaceApp is stored in the cloud in Russia or here in the US. It doesn’t matter. Once you open the door to adding your face to a facial recognition database owned by a Russia-based company, you can assume the Russian government is not only aware of it, but that chances are good there’s an ulterior motive at work, one with no upside whatsoever for FaceApp users.

If you’re using the FaceApp app, stop. Full stop.

If your kids are using the app, have them stop and delete it immediately. If your kids’ friends are using it, get them to stop, too. Talk with other parents about this.

If you have employees using FaceApp, make it a priority to inform them of the dangers associated with the app and how it could expose them, both now and in the future. Encourage them to ask questions if they want more information.

To be fair, FaceApp was asked to comment on this by the team at TechCrunch. You can see some of founder Yaroslav Goncharov’s responses here.

Tech geeks like us don’t have all the answers, but most of us who are immersed in this space are flat-out concerned about the potential ramifications of FaceApp and the dangers it might pose. And really, what’s the upside? Your photos might end up on a billboard in a foreign country somewhere, or they might be used to train artificial intelligence. Or both. Or they might be used in even more personal, and personally dangerous, ways. It’s not really worth it for a few seconds of amusement, or amazement, or whatever it is, at what you might look like 50 years from now, is it?

The original version of this article was first published on Futurum Research.

Shelly Kramer is a Principal Analyst and Founding Partner at Futurum Research. A serial entrepreneur with a technology centric focus, she has worked alongside some of the world’s largest brands to embrace disruption and spur innovation, understand and address the realities of the connected customer, and help navigate the process of digital transformation. She brings 20 years' experience as a brand strategist to her work at Futurum, and has deep experience helping global companies with marketing challenges, GTM strategies, messaging development, and driving strategy and digital transformation for B2B brands across multiple verticals. Shelly's coverage areas include Collaboration/CX/SaaS, platforms, ESG, and Cybersecurity, as well as topics and trends related to the Future of Work, the transformation of the workplace and how people and technology are driving that transformation. A transplanted New Yorker, she has learned to love life in the Midwest, and has firsthand experience that some of the most innovative minds and most successful companies in the world also happen to live in “flyover country.”
