If you used Facebook between 2010 and November 2021, unlocked a smartphone with your face, walked into a secure bank or office building, or walked the streets of a city dotted with surveillance cameras, chances are a photo or video of your face has been stored, analyzed, and used to create a set of unique identifiers that help various algorithms recognize you and act on you.
Your data is then used for a wide range of applications, from unlocking your phone and tagging photos on your favorite social network to authentication schemes, including those run by law enforcement, security services, other government agencies, and even private businesses. Beyond those organizations, your photos can also fall into the hands of hackers and artificial intelligence researchers.
As you can see, once scanned, your face, your unique and immutable identifier, is stored and shared everywhere, without you having much say in the matter.
Needless to say, this is detrimental to privacy, which is your inalienable human right:
Article 12 of the United Nations Universal Declaration of Human Rights states: “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence. … Everyone has the right to the protection of the law against such interference or attacks.”
I am sure you will agree that cameras recording your every move is not entirely in the spirit of that statement. But government institutions are unwilling to abandon such a powerful monitoring tool.
In China, for example, Huawei has tested a facial recognition system that can trigger a “Uyghur alarm” when it detects members of the Uyghurs, an oppressed minority group. (Chinese authorities have arbitrarily detained up to 1 million Uyghurs and other minorities in up to 400 facilities in Xinjiang, the largest internment of an ethnic and religious minority since World War II.) Such a system would allow the Chinese government to track and persecute the Uyghurs at will.
Setting aside the example of China, even if we allow the use of facial recognition in law enforcement, one question remains: is the tool itself reliable enough for such applications? Unfortunately, the answer is no.
Darker-skinned people have been, and continue to be, a major challenge for these algorithms. Because dark skin reflects less light and produces less contrast under non-optimized conditions, some photos used for facial recognition do not provide enough data points for the algorithm, leading to misidentification.
In their 2018 research paper, Gender Shades, Joy Buolamwini and Timnit Gebru tested facial recognition-based gender classification algorithms built (among others) by Microsoft, IBM, and Amazon. According to the paper, the programs recorded substantial error rates when detecting, identifying, and verifying dark-skinned faces, with error rates for dark-skinned women reaching between 20% and 34%.
An independent evaluation by the National Institute of Standards and Technology (NIST) went even deeper: it covered up to 189 facial recognition programs and reached the same conclusion. The programs were least accurate when analyzing the faces of dark-skinned women.
By themselves, the high error rates should have been reason enough to prohibit the use of such unreliable technology. But facial recognition programs are still used, largely due to greed and disregard for human rights. Some researchers ignore the real danger behind the problem and, instead of pushing to exclude this technology from the legal system, suggest optimizing camera settings to account for darker skin tones and implementing “consent” as a prerequisite for creating facial recognition data sets.
By now we have all seen that, even within the limits imposed by law, consent can be coerced. If you must provide your photo to go to work, buy food, or access medical care, you have no choice but to provide it. Furthermore, improving this technology only serves governments and the large companies that make substantial profits selling licenses for their software.
Someone else is taking advantage of your data too: hackers and hacker groups.
In 2019, at the annual Black Hat hacker convention, hackers bypassed Apple’s iPhone Face ID authentication system in just two minutes.
In February 2020, Clearview AI, a company that scrapes the internet and collects billions of photos for use in facial recognition technology, had its entire client list stolen. This breach likely played a crucial role in further hacking attempts against the company and its clients, most of whom are law enforcement agencies and banks.
In 2020, a McAfee cybersecurity team demonstrated a flaw in facial recognition systems. Using a specially manipulated photo, they fooled a system similar to those used for facial recognition passport checks at airports into accepting that the person in the passport was the same as the one captured by the system’s camera. This could allow someone on a banned persons list, for example, to board a plane.
In March 2021, a criminal group used photos purchased on the online black market to fool a Chinese government-run facial recognition system, stealing $76.2 million.
The list goes on.
Remember: in all these attacks, the ultimate victim is you. Even if your private data doesn’t end up being used for phishing or identity theft, it is being sold to the highest bidder on the dark web for other nefarious purposes.
What you can do
If you are like me, you are already wondering: what can we do? What is being done? Quite a lot, in fact.
The Electronic Frontier Foundation (EFF) is pushing for a government ban on the use of facial recognition. Fight for the Future, a nonprofit advocacy group, has launched an online campaign, Ban Facial Recognition, where you can find out who in Congress supports or opposes this technology, tweet or share information, and help advance the cause locally. One of the most notable victories so far is BIPA, the Illinois Biometric Information Privacy Act.
BIPA is a step in the right direction: it requires consent before companies can save a photo of a person’s face for facial recognition, and those photos, once taken, must be deleted after a specified period of time. If companies do not follow these rules, the injured party has a private right of action, meaning that private individuals (that is, you) have the right to sue.
Germany’s new government seems to understand the seriousness of the situation and is considering banning facial recognition and restricting the use of mass surveillance tools.
Even Meta Platforms, the company formerly known as Facebook, cracked under the pressure and heavily restricted the use of its facial recognition feature in November 2021. It can now be used only in special cases, such as verifying identities and unlocking hacked accounts. The full list is longer and unspecified, which shows that the company is unwilling to give up its facial recognition toy DeepFace, Facebook’s algorithm trained on a billion face scans. More importantly, Meta hasn’t ruled out returning to facial recognition in the future.
As you can see, the battle for privacy and human rights continues and is likely to intensify. I hope you now understand what is at stake and know the issues.
What comes next is up to you.