https://www.nytimes.com/2024/01/25/nyregion/vince-mcmahon-wwe-lawsuit.html

The NYT article suggests that this was part of a pattern of behavior (nobody posting regularly here should be surprised).

The legal action raises new questions about the investigation conducted in 2022 by a special committee of W.W.E.’s board of directors into Mr. McMahon’s conduct. The investigators found that Mr. McMahon had spent $14.6 million between 2006 and 2022 on payments to women who had accused him of sexual misconduct and that the payments should have been recorded as business expenses. Further investigation by the company found that Mr. McMahon had made additional payments totaling $5 million to two other women.
Well this Vince McMahon segment has aged like milk. pic.twitter.com/Q3tJaFNebj
— Bryan Jarvis (@Brainman02) January 27, 2024
Oh yeah ... it's obvious there is a lot more in the past. Those payments were already known. It's just astounding that it kept going until very recently.
I think this probably goes well here. Though it could go in the AI thread as well.

https://www.bbc.com/news/world-us-canada-68123671

Social media platform X has blocked searches for Taylor Swift after explicit AI-generated images of the singer began circulating on the site. In a statement to the BBC, X's head of business operations Joe Benarroch said it was a "temporary action" to prioritise safety.

When searching for Swift on the site, a message appears that says: "Something went wrong. Try reloading."

Fake graphic images of the singer appeared on the site earlier this week. Some went viral and were viewed millions of times, prompting alarm from US officials and fans of the singer. Posts and accounts sharing the fake images were flagged by her fans, who populated the platform with real images and videos of her, using the words "protect Taylor Swift".

And while Swift is getting all the notice, this is not new. This is the WaPo, and probably paywalled, so I'll provide others.

https://www.washingtonpost.com/technology/2023/11/05/ai-deepfake-porn-teens-women-impact/

Artificial intelligence is fueling an unprecedented boom this year in fake pornographic images and videos. It’s enabled by a rise in cheap and easy-to-use AI tools that can “undress” people in photographs — analyzing what their naked bodies would look like and imposing it into an image — or seamlessly swap a face into a pornographic video.

On the top 10 websites that host AI-generated porn photos, fake nudes have ballooned by more than 290 percent since 2018, according to Genevieve Oh, an industry analyst. These sites feature celebrities and political figures such as New York Rep. Alexandria Ocasio-Cortez alongside ordinary teenage girls, whose likenesses have been seized by bad actors to incite shame, extort money or live out private fantasies.

Victims have little recourse. There’s no federal law governing deepfake porn, and only a handful of states have enacted regulations.
President Biden’s AI executive order issued Monday recommends, but does not require, companies to label AI-generated photos, videos and audio to indicate computer-generated work.

As hinted at below, some of the deepfakes are of minors.

https://www.stltoday.com/news/natio...cle_de967702-b5cc-578a-9ede-4f7a9b842ce1.html

The circulation of explicit and pornographic pictures of megastar Taylor Swift this week shined a light on artificial intelligence’s ability to create convincingly real, damaging — and fake — images. But the concept is far from new: People have weaponized this type of technology against women and girls for years. And with the rise and increased access to AI tools, experts say it’s about to get a whole lot worse, for everyone from school-age children to adults.

Already, some high school students across the world, from New Jersey to Spain, have reported their faces were manipulated by AI and shared online by classmates. Meanwhile, a young well-known female Twitch streamer discovered her likeness was being used in a fake, explicit pornographic video that spread quickly throughout the gaming community.

https://www.cnn.com/2023/11/04/us/new-jersey-high-school-deepfake-porn/index.html

A student at a New Jersey high school is calling for federal legislation to address AI-generated pornographic images after she says photos of her and other female classmates were manipulated and possibly shared online over the summer. Westfield High School student Francesca Mani, 14, and her mother, Dorota, have expressed frustration over what they say is a lack of legal recourse in place to protect victims of AI-generated pornography.

“In this situation, there was some boys or a boy — that’s to be determined — who created, without the consent of the girls, inappropriate images,” Dorota said, speaking with CNN’s Michael Smerconish Saturday.
Francesca, who said she was among more than 30 female students at Westfield High School whose photos were manipulated and possibly shared publicly, is demanding accountability from the school and local, state, and government officials. School administrators initially became aware of the incident on October 20 when students informed them the images were created and possibly shared over the summer.

“There was a great deal of concern about who had images created of them and if they were shared,” Westfield Principal Mary Asfendis wrote in a letter to students and parents sent on October 20. “At this time, we believe that any created images have been deleted and are not being circulated. This is a very serious incident.”
This is sickening….. https://www.washingtonpost.com/dc-m...ns-police-child-sexual-abuse-rodney-vicknair/
Anybody see this about Sean Combs/Puff Daddy/Diddy?

https://www.latimes.com/entertainme...ddy-lawsuit-lil-rod-sexual-assault-love-album

Now, sending heavily armed, military-style police to do the searches of his homes in Miami and LA (on Monday, or maybe yesterday) is waaaay over the top. But the continuing allegations are really, really serious. And I'm reading bits here and there, and this is going to be bad, not just for Diddy, but for a lot of people.
Yeah, but in the case of people like Swift, there are millions of images of her out there that were not taken or posted by her. And, given the nature of her profession, she needs to have those photos out there. I mean, if you think this issue with Swift is bad, take a look at India. This thing in India has been going on for years, not just a few months. And when you get into telling people to stop sharing photos of themselves, that gets into victim blaming. "She's so hot. I should make a porn video of her with AI."
The internet rumor is that he was like Epstein, but for celebs? In any case, I think there has been a notion for at least 3-4 years now that Diddy was among the next to fall. So is this another thing that has been an open secret among the rumor mill of people in the know/insiders, and it was just a question of when law enforcement was going to get involved?
Are there laws yet in place to deal with this new technology? There will eventually be GamerGate/Libertarian types who will argue that it's their intellectual property because it is an AI creation. I bet the legal system hasn't caught up yet.

Ghislaine Maxwell
You hope so???? I'm hoping he had fewer clients, not more. Probably fewer victims that way.

It turned up shit, all right. It turned up Trump. They just deliberately didn't connect the dots.
I have not come here to defend Trump. I have come here to broaden your view in regard to Epstein’s alleged accomplices. The fact of your selective outrage is well established.