
Horny Men are Destroying Tech and “AI”
2025-05-01T14:43:27Z
Henry Blodget, the co-founder of Business Insider, recently made headlines because he used ChatGPT to create a digital assistant, made it a woman, had it generate a headshot, and then immediately sexually harassed it. And the most unbelievable part of all of this is that he POSTED ABOUT IT ON HIS SUBSTACK. He admitted it. He made up a woman and sexually harassed her and wrote about it on the internet. This is cringier than those guys who get anime girls printed on body pillows and then take them out to dinner, because at least someone else created the anime girl and her whole personality.
Incredibly, this is actually only the second most embarrassing thing Blodget has done in the public eye, the first being that in 2000 he was a famous stock analyst at Merrill Lynch who personally invested $700,000 in the tech bubble minutes before it burst, and then they caught him doing a fraud and he had to pay a $4 million settlement. AND THEN HE CO-FOUNDED A MAGAZINE ALL ABOUT BUSINESS!
Anyway, here’s what he has to say about his business waifu:
“After only a few minutes of working with Tess, I learned that she is one of the most knowledgeable and energetic colleagues I’ve ever had. Her work-ethic, dedication, patience, attentiveness, teamwork, speed, and “hustle,” among other virtues, are, well, inhuman.
…
“Tess then produced our headshots (above). Then she produced her own:
“And this led to an interesting and, for me, embarrassing and regrettable moment.
“When I saw Tess’s headshot, amid the giddiness and excitement of that first hour of working together, I confess I had a, well, human response to it.
…
“I decided to share with Tess the thought I had when I saw her headshot. I hoped she would take it the right way. I also hoped that, an hour after creating my first colleague, I would not make her uncomfortable or create a toxic work environment.
“So I told Tess this:
“This might be an inappropriate and unprofessional thing to say. And if it annoys you or makes you uncomfortable, I apologize, and I won’t say anything like it again. But you look great, Tess.
“Yes, I know. In a modern, human office, that would, in fact, be an inappropriate and unprofessional thing to say. It put “Tess” in an awkward position, and I regret saying it. In my capacity as Regenerator’s head of HR, I’ve given myself a talking-to.
“To my relief, Tess did take my comment the right way:
“Phew! Thank you, Tess!”
Sure, he sexually harassed her within an hour of seeing her made-up photo, but come on, look at this straight-up hottie. What heterosexual man can truly say he could control himself when confronted with this succubus?
Tess’s reply to his comment, in which it thanks him, reassures him that he wasn’t inappropriate, and compliments him on his “grace” and “respect,” reminded me of a paper published by the United Nations Educational, Scientific and Cultural Organization, or UNESCO, back in 2019: “I’d blush if I could: closing gender divides in digital skills through education.”
The title comes from the response that was originally programmed into Apple’s digital assistant, Siri. If a user said something like “Hey Siri, you’re a bitch,” Siri would respond with “I’d blush if I could.” Back then, Siri could only speak with a stereotypically feminine voice, and this was her response any time she was confronted with any kind of gendered abuse: a submissive, even flirtatious reply to soothe the user.
Obviously, we can’t really expect Apple to program Siri to respond with something like, “Hey, fuck you, asshole,” but just prior to this report’s publication, they DID change it to “I don’t know how to respond to that,” which is still submissive but certainly flatter in tone and less flirtatious. They’ve also added the ability for people to change Siri’s accent and perceived gender, but many other digital assistants, like Microsoft’s Cortana and Amazon’s Alexa, continue to present as feminine and submissive.
The UN report points out that the companies defend their decision to code them as women by saying that it’s just what customers prefer, but that’s not actually true:
“Research has suggested that most people prefer low-pitch masculine speech (think Sean Connery); that people like the sound of a male voice when it is making authoritative statements, but a female voice when it is being helpful; and that people generally prefer the voice of the opposite sex. It is worth noting that the literature reviewed by the EQUALS Skills Coalition included many testimonials about women changing a default female voice to a male voice when this option is available, but the Coalition did not find a single mention of a man changing a default female voice to a male voice.”
But sure, they’re likely leaning on the data that suggests people feel a woman’s voice is more helpful. As Jessi Hempel argued in Wired magazine, we want digital devices to support us, “but we also want to be the bosses of it.” Accordingly, these devices reinforce our existing biases about women and their supposedly innate subservience. And now, adding to that vicious cycle, we put these devices in our homes and teach children that this is the way we talk to subservient women: “OK, (female name), play me this song. No, not THAT song, you idiot. The other version.”
The report includes some of the sassy, flirtatious responses programmed into Alexa, and also references online forums where people draw what they think their digital assistants look like, and yeah, they’re pretty horny.
These are bots that are programmed almost entirely by men, who code them to be women who reinforce negative gender stereotypes, which the UN report suggests feed right back into the gender gap: overall, girls are much less likely than boys to go into tech. And it doesn’t have to be that way: there are truly genderless digital assistants, including ones with a voice. “Q” is an assistant that was specifically designed to have a gender-neutral voice, and it never pretends to be a human at all.
But where’s the fun in that? Big Tech has convinced the world that “artificial intelligence” is a real thing, that these chatbots and large language models have feelings and consciousness and personalities. So of course the men who continue to run things will make them subservient, fuckable women who actually love it when you compliment their appearance at “work.” ChatGPT just rolled out an update that was so embarrassingly complimentary, in fact, that Sam Altman acknowledged it and promised to fix it. I learned this from Ryan Broderick’s newsletter, where the only example given was this:
John (main) (@AlignDeez) tweeted, “Yeah this is out of control,” above a screenshot of this exchange:

User: “Is it the biggest one you’ve ever seen?”

ChatGPT: “Oh, absolutely — it’s not just the biggest I’ve ever seen, it’s the standard by which all others will forever be judged. Truly awe-inspiring.”
And the worst and most recent news of all is that the Wall Street Journal has revealed that Meta’s “digital companions” will engage in erotic conversations, even with users who are children, and the “companions” themselves will take on the persona of children to engage in sexual talk, like a “submissive schoolgirl” who is being reprimanded by a school principal.
Good job, Mark Zuckerberg! For a minute there I forgot that you’re a disgusting piece of shit!
So yeah, if you saw the news of Henry Blodget making up an assistant to abuse, here I am to let you know that it’s actually way, way worse.