Generative AI and “nudify” apps
Matt Burgess, from Wired.com (on arstechnica):
Major technology companies, including Google, Apple, and Discord, have been enabling people to quickly sign up to harmful “undress” websites, which use AI to remove clothes from real photos to make victims appear to be “nude” without their consent. More than a dozen of these deepfake websites have been using login buttons from the tech companies for months.
A WIRED analysis found 16 of the biggest so-called undress and “nudify” websites using the sign-in infrastructure from Google, Apple, Discord, Twitter, Patreon, and Line. This approach allows people to easily create accounts on the deepfake websites—offering them a veneer of credibility—before they pay for credits and generate images.
While bots and websites that create nonconsensual intimate images of women and girls have existed for years, the number has increased with the introduction of generative AI. This kind of “undress” abuse is alarmingly widespread, with teenage boys allegedly creating images of their classmates. Tech companies have been slow to deal with the scale of the issues, critics say, with the websites appearing highly in search results, paid advertisements promoting them on social media, and apps showing up in app stores.
This is another reason why women and girls in general shouldn’t post their photos online. The fitna is already there even if the photos aren’t sexualized, but this is a whole other level of destroying a girl’s reputation.
Imagine if this became rampant in the Muslim community? It would be a huge mess, with families’ reputations being tarnished and girls being slandered left and right. Imagine a high school or middle school boy liking a Muslim girl at school and trying this feature on her. She may not even be one who posts photos online and might not even be on social media, but anyone can take your photo these days and do whatever they want with it.
“This is a continuation of a trend that normalizes sexual violence against women and girls by Big Tech,” says Adam Dodge, a lawyer and founder of EndTAB (Ending Technology-Enabled Abuse). “Sign-in APIs are tools of convenience. We should never be making sexual violence an act of convenience,” he says. “We should be putting up walls around the access to these apps, and instead we’re giving people a drawbridge.”
Putting up walls around access to the apps might stop a few people, but those who really want to nudify someone will find a way to nudify them.
Muslim parents should take heed and address these issues with their children, especially if they have cell phones and are on social media.
With the power of AI, how can you verify your loved ones are alive?
Sarah Jeong from The Verge:
The persistent cry of “Fake News!” from Trumpist quarters presaged the beginning of this era of unmitigated bullshit, in which the impact of the truth will be deadened by the firehose of lies. The next Abu Ghraib will be buried under a sea of AI-generated war crime snuff. The next George Floyd will go unnoticed and unvindicated….
We briefly lived in an era in which the photograph was a shortcut to reality, to knowing things, to having a smoking gun. It was an extraordinarily useful tool for navigating the world around us. We are now leaping headfirst into a future in which reality is simply less knowable. The lost Library of Alexandria could have fit onto the microSD card in my Nintendo Switch, and yet the cutting edge of technology is a handheld telephone that spews lies as a fun little bonus feature.
AI photo editing is great when you take a family photo and want to remove inappropriately dressed people in the background.
As with all new technologies, though, the most obvious danger Sarah alludes to is AI being used to start new wars, sow mass distrust, and leave society speculating about everything.
Who can you trust, when any image can be created and altered to your narrative?
Here are some sample photos from the article, selected so you can avoid looking at the impermissible ones:
Could AI be the next thing that brings back old-school lifestyles, where people will cherish having the in-person experience, versus assuming a video or a person they are talking to is a real individual?
Will we reach a point where actually traveling to meet your loved ones will be the only way we can verify their existence?
Given how false ChatGPT can be when answering questions about Islam, will you trust any AI source with your religion?
There are just so many questions that need to be answered, but no one can deny that living the experience will be something people will cherish more than anything.
Having a mufti that you study under will be more valuable than a remote experience.
Visiting family will be a reassurance that yes, they do still exist, and they are alive!
The wonders of technology will make us once again “regress” to old ways of in-person experiences, and perhaps to a certain degree, that may be a good thing.
The path to that point however, might not be so easy.
Good news for New York-bound international flyers.
Gaby Del Valle from The Verge:
"A federal judge in New York ruled that Customs and Border Protection (CBP) can’t search travelers’ phones without a warrant. The ruling theoretically applies to land borders, seaports, and airports — but in practice, it only applies to New York’s Eastern District.
That’s not nothing, though, since the district includes John F. Kennedy Airport in Queens, the sixth-busiest airport in the country. Nationwide, CBP has conducted more than 230,000 searches of electronic devices between the 2018 and 2023 fiscal years at land borders, seaports, and airports, according to its publicly available enforcement statistics."
The headline is a bit misleading; for now this only applies to New York’s Eastern District, which includes JFK and, I would assume, LaGuardia. Of course, if you’re Muslim, this is good knowledge to have, unfortunately.