FaceApp Is a Viral Hit, but Is It Safe?


Another day, another social media challenge with undoubtedly murky consequences. Are you one of the 150 million people who recently downloaded FaceApp for the “#FaceApp Challenge”?
You just wanted to show the world what you’re going to look like once you’re old and gray, but now we have some unfortunate news: you may have inadvertently handed your images to malicious actors … who can do whatever they want with your images… forever.

What Is FaceApp?

FaceApp initially blew up in 2017 and is currently enjoying revived virality thanks to the challenge. The app uses neural networks to simulate what you’ll look like as you age, think adding wrinkles and discoloring your teeth, and the challenge encourages you to share the resulting image (a simplified sketch of how such a filter works follows below).
Seems like all fun and games, right? Well, the moment you upload your selfie to the app, you’re forking over your face and your data to shadowy figures who MIGHT use it for untold nefarious purposes.
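FaceApp has not published its model, but aging filters of this kind are commonly built as conditional image-to-image generators: a network takes the selfie plus a target-age signal and outputs a transformed photo. The toy PyTorch sketch below illustrates only that general idea; the layer sizes, the single-channel age conditioning, and the `AgingGenerator` name are assumptions for demonstration, not FaceApp’s actual architecture.

```python
# Toy sketch of a conditional image-to-image "aging" generator.
# Illustrative only; NOT FaceApp's real network.
import torch
import torch.nn as nn

class AgingGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: compress the selfie (plus an age channel) into a feature map.
        self.encoder = nn.Sequential(
            nn.Conv2d(3 + 1, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
        )
        # Decoder: reconstruct a same-sized image with "aged" features.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, kernel_size=4, stride=2, padding=1),
            nn.Tanh(),  # pixel values in [-1, 1]
        )

    def forward(self, image, age):
        # Broadcast the scalar age (e.g. 0.0 = young, 1.0 = old) into an extra input channel.
        b, _, h, w = image.shape
        age_map = age.view(b, 1, 1, 1).expand(b, 1, h, w)
        x = torch.cat([image, age_map], dim=1)
        return self.decoder(self.encoder(x))

if __name__ == "__main__":
    model = AgingGenerator()
    selfie = torch.randn(1, 3, 128, 128)        # stand-in for a normalized selfie
    aged = model(selfie, torch.tensor([1.0]))   # condition on "old"
    print(aged.shape)                           # torch.Size([1, 3, 128, 128])
```

In a real product, a generator like this would be trained on large collections of face photos, which is exactly why the provenance and ownership of those photo datasets matters.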

Wireless Lab, the company behind FaceApp, has remarkably expansive Terms of Service that raise a number of privacy concerns. Section 5 of the Terms of Service grants FaceApp “a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you.”

Surprisingly, this kind of content ownership is pretty common for apps; FaceApp’s TOS, however, is especially controversial.

FaceApp’s privacy policy gives it the ability to collect information sent by your device, including the websites you visit, add-ons, and other data that helps the app “improve its service.” That means FaceApp has wide-ranging access to your device, your photos, and more, even though the company says it has no intention of misusing your data.
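Part of what makes a photo upload so revealing is the metadata that travels with the file. As a rough illustration (using the Pillow library; “selfie.jpg” is a placeholder path, and the exact tags depend on the device), here is the kind of EXIF data a selfie can carry before any face analysis even happens:

```python
# Illustration: inspect the EXIF metadata embedded in a photo.
# Requires the Pillow library; "selfie.jpg" is a placeholder path.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

image = Image.open("selfie.jpg")
exif = image.getexif()

# Print the top-level tags (camera model, software, capture time, ...).
for tag_id, value in exif.items():
    print(f"{TAGS.get(tag_id, tag_id)}: {value}")

# GPS coordinates, if present, live in a nested block (tag 34853 = GPSInfo).
gps = exif.get_ifd(34853)
for tag_id, value in gps.items():
    print(f"GPS {GPSTAGS.get(tag_id, tag_id)}: {value}")
```

Device model, capture time, and sometimes GPS coordinates are common entries here, so “just a selfie” can amount to far more than a face.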

But there’s an even more problematic wrinkle to the access issue: FaceApp happens to be based in Russia.

Who Is Behind FaceApp?

Wireless Lab is based in St. Petersburg, Russia, and headed by Yaroslav Goncharov, an ex-employee of Yandex. Given that Russia and Russian companies meddled in the United States’ 2016 elections and continue to wage an information war, the security and privacy communities are understandably raising alarms about the amount of access you grant when you use FaceApp. While there’s no known direct link to the Russian government, what if there were? And what implications could that have?

Should I care about giving my image to a Russian company?
“There is a very real possibility that applications like these are simply honeypots designed to get you to give up information about yourself,” says Brandy Boudria, VP of Technology for AI company Hypergiant.

“You send them well-lit pictures of your face,” he continues. “Now they have your name and enough vital details to build a personality profile, and they would have no trouble triangulating and verifying additional information from other sources like LinkedIn, which would then give them your education, your work history…the sky’s the limit.”

Current conversations about facial recognition and deepfake software highlight the dangers of individual companies owning giant databases of personal data, particularly data sets of human faces that can power facial recognition technology.

Recently, Moscow-based researchers detailed the development of a machine learning model that can use a handful of images, or even a single picture, to create deepfakes. Meanwhile, a recent article in the New York Times noted that “documents released last Sunday revealed that Immigration and Customs Enforcement [ICE] officials used automated facial recognition technology to scan motorists’ photos to identify undocumented immigrants. The FBI also spent a decade using such systems to compare driver’s license and visa photos against the faces of suspected criminals, according to a Government Accountability Office report last month.”

Now that another company has access to your personal data and a powerful database of your likeness, that data can be weaponized by any actor with an interest in causing harm through a cyber-attack or an information campaign. As we look to the future, to the upcoming 2020 election and to the increasingly connected nature of our day-to-day lives, this is a very real and valid concern.

“There are obvious political concerns with sharing personal identifying data, especially given how Russia has weaponized information against democracies, often in manipulated and false forms, and bent local business to its will,” says New America Senior Fellow and LikeWar author Peter Singer. “But there are also some major privacy issues that would be there even beyond the Russia angle. Like most of social media, most of the users are just thinking about the fun aspects, not how it might be monetized and weaponized.”

Update: On July 17, U.S. Senator Chuck Schumer sent a letter asking the FBI and FTC to investigate FaceApp, according to a Reuters report. The app, Schumer said, could pose “national security and privacy risks for millions of U.S. citizens.”

What can I do to safeguard myself?

For starters, don’t take FaceApp’s apathetic approach to personal security. We all know it’s easy to breeze past privacy policies, but the sooner you start asking questions and paying attention, the sooner you’ll start to safeguard your data. “Consumers should use the latest versions of iOS to help manage these risks,” says Dan Guido, CEO of Trail of Bits. “iOS 13, coming out this fall, alerts users when apps collect their location data or activate Bluetooth in the background.”

When app stores take a strong stance on protecting personal information, they lift some of the security burden off individual users. Rather than leaving a single person to fend off all the nefarious actors, a focus on minimum security standards, from Google and Apple down to individual users, builds more resilience into the app store ecosystem and makes it safer for everyone.