Want to wade into the sandy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this.)
deleted by creator
Just had a conversation about AI where I sent a link to Eddy Burback’s ChatGPT Made Me Delusional video. They clarified that no, it’s only smart people who are more productive with AI since they can filter out all the bad outputs, and only dumb people would suffer all the negative effects. I don’t know what to fucking say.
Show them the RationalWiki page where Scott Alexander promised that he could absorb only the smart racism from crazy bloggers and ignore the stupid stuff; Elizabeth Sandifer warned him this was like drinking sewer water with just one filter; and then Alexander posted about how, all of a sudden, he was feeling more conservative, and maybe the things he was reading were connected to that.
https://rationalwiki.org/wiki/Scott_Alexander (also archive.is and other backups)
it sucks when you learn a thing like that about a person. it’s like blowing an eFuse: generally one-way, not easy to go back without a lot of work, and you may not want to bother
What you say is something along these lines: Oh wow, I have this amazing investment opportunity for someone like you, nobody has seen it yet, but with your intelligence and business acumen, we will get rich quick…
Assuming they have any amount of good faith, I would make the illustration that using AI is like the Dunning-Kruger effect on steroids. It’s especially dangerous when you think you know enough, but don’t know enough to know that you don’t.
Actually the emperor’s clothes look amazing if you’re smart enough to see them.
I have never written a song (without AI assistance) in my life, but I am sure I could learn within a week.
FUCKIN
In my experience most people just suck at learning new things, and vastly overestimate the depth of expertise. It doesn’t take that long to learn how to do a thing. I have never written a song (without AI assistance) in my life, but I am sure I could learn within a week. I don’t know how to draw, but I know I could become adequate for any specific task I am trying to achieve within a week. I have never made a 3D prototype in CAD and then used a 3D printer to print it, but I am sure I could learn within a few days.
This reminds me of another tech bro many years ago who also thought that expertise is overrated, and things really aren’t that hard, you know? That belief eventually led him to make a public challenge that he could beat Magnus Carlsen in chess after a month of practice. The WSJ picked up on this, and decided to sponsor an actual match with him and Carlsen. They wrote a fawning article about it, but it did little to stop his enormous public humiliation in the chess community. Here’s a reddit thread discussing that incident: https://www.reddit.com/r/HobbyDrama/comments/nb5b1k/chess_one_month_to_beat_magnus_how_an_obsessive/
As a sidenote, I found it really funny that he thought his best strategy was literally to train a neural network and … memorize all the weights and run inference with mental calculations during the game. Of course, on the day of the match, the strategy was not successful because his algorithm “ran out of time calculating”. How are so many techbros not even good at tech? Come on, that’s the one thing you’re supposed to know!
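For a sense of scale (these numbers are purely illustrative, not from the actual incident), even a toy evaluation network involves far more arithmetic than anyone could do in their head mid-game:

```python
# Illustrative toy example, not the network from the story: a tiny
# 3-layer MLP mapping a 768-feature board encoding to one evaluation score.
layers = [(768, 64), (64, 32), (32, 1)]  # (inputs, outputs) per layer

n_weights = sum(n_in * n_out for n_in, n_out in layers)
n_biases = sum(n_out for _, n_out in layers)

print(n_weights + n_biases)  # 51329 parameters to memorize
print(n_weights)             # 51232 multiply-accumulates per position
```

Even at a generous one multiplication per second, evaluating a single position would take over 14 hours of error-free mental arithmetic, and that’s before the tree search.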
Lord grant me the confidence of a mediocre white man, etc.

alt text
A screenshot of a tweet by YouGov, a UK-based organisation, showing the results of a survey which says:
One in eight men (12%) say they could win a point in a game of tennis against 23-time grand slam winner Serena Williams
Included in the screenshot is a response by longwall26:
Confident in my ability to properly tennis, I take the court. I smile at my opponent. Serena does not return the gesture. She’d be prettier if she did, I think. She serves. The ball passes cleanly through my skull, killing me instantly
Yeah, good luck. I am a mediocre white man, and providence didn’t even grant me the confidence.
He will train a neural network on GM games, then memorize the algorithm and compute the moves in his head.
The Rationalists.
So pre-teen me reading the Biggles books with the gag about the pilot who tries to do ballistic calculations during a dogfight was saving me from being as stupid as a Californian?
This reminds me of another tech bro many years ago who also thought that expertise is overrated, and things really aren’t that hard, you know?
lmao, what’s his lesswrong username?
Is there already a term for the extreme opposite of impostor syndrome? Techbro syndrome maybe?
“Techbro syndrome” would be a perfect name for it, honestly.
Hey, remember Grokipedia?
Its article on Newton’s law of gravity is, like, 50% rendering errors by weight.

I saw (because for some reason grokipedia is now high in google search) that grok calls Roko (of Roko’s basilisk) pseudonymous, but isn’t he just using his full name and face on twitter? Such a weird small error (and change from the Wikipedia page). Didn’t click the link, obv.
Continuation of the lesswrong drama I posted about recently:
https://www.lesswrong.com/posts/HbkNAyAoa4gCnuzwa/wei-dai-s-shortform?commentId=nMaWdu727wh8ukGms
Did you know that post authors can moderate their own comments section? Someone disagreeing with you too much but getting upvoted? You can ban them from responding to your posts (but not block them entirely???)! And, the cherry on top of this questionable moderation “feature”, guess why it was implemented? Eliezer Yudkowsky was mad about highly upvoted comments responding to his post that he felt didn’t get him or didn’t deserve that, so instead of asking moderators to block on a case-by-case basis (or, acausal God forbid, considering whether the communication problem was on his end), he asked for a modification to the lesswrong forums to enable authors to ban people (and delete the offending replies!!!) from their posts! It’s such a bizarre forum moderation choice, but I guess habryka knew who the real leader is and had it implemented.
Eliezer himself is called to weigh in:
It’s indeed the case that I haven’t been attracted back to LW by the moderation options that I hoped might accomplish that. Even dealing with Twitter feels better than dealing with LW comments, where people are putting more effort into more complicated misinterpretations and getting more visibly upvoted in a way that feels worse. The last time I wanted to post something that felt like it belonged on LW, I would have only done that if it’d had Twitter’s options for turning off commenting entirely.
So yes, I suppose that people could go ahead and make this decision without me. I haven’t been using my moderation powers to delete the elaborate-misinterpretation comments because it does not feel like the system is set up to make that seem like a sympathetic decision to the audience, and does waste the effort of the people who perhaps imagine themselves to be dutiful commentators.
Uh, considering his recent twitter post… this sure is something. Also: “it does not feel like the system is set up to make that seem like a sympathetic decision to the audience”. No shit, sherlock: deleting a highly upvoted reply because it feels like too much effort to respond to is in fact going to make people unsympathetic (at the least).
From this (indirectly) I learned that they got wordpress.com to sponsor their “Inkhaven Residency”. Feh.
ooooh photographic matthew embarking upon his f*shtech turn out loud at last?
Maybe? Or maybe they just had the right social connections to sell “blogging residency” as a thing that should be supported for some unspecified amount? I couldn’t find any more details.
Automattic has a budget for sponsoring events https://automattic.com/events/ but a web host based in San Francisco does not sponsor Scott Alexander, Scott Aaronson, and Gwern by accident.
Small forum tyrants
interesting writeup on detecting ai music (newgrounds bans ai content)
https://www.newgrounds.com/wiki/help-information/site-moderation/how-to-detect-ai-audio
The Audio Mods are doing God’s work keeping the portal slop-free. It’s good to know there’s at least one place where human-made work is still valued.
I expect her methodology was great but I don’t actually know what it was.
Science!
As background: the Kaufmann report that prompted all this is a load of absolute garbage. As discussed fairly extensively on social media (example).
As for Aella’s addition: oh god why did I read this?
The methodology was apparently running a “Big Kink Survey” which was “trending on TikTok” and had “very good SEO”. I suppose this is exactly the data you need to draw conclusions about the rate at which 14-year-olds are transgender.
The whole thing is also full of weird gender essentialism (I never want to read the word “biofemales” again).
I think this is evidence for an increasing split between afabs and amabs
But don’t worry she’s very pro trans (JK Rowling sense):
Despite having been cancelled by the more radical subgroups of trans people, I’m nevertheless very pro trans.
Which is why she wants to make a massive reach and be concerned that maybe trans people are getting too much healthcare:
I think it’s unlikely that 11.5% of afabs are actually trans men in a way that would last through adulthood. If my data is measuring any real trend in the world, and if that trend meaningfully increases permanent changes to bodies, then this high percentage might actually be quite bad.
… Never mind that her data doesn’t even touch on stuff like rates of HRT, or regret rates; these “concerns” are all pulled out of thin air.
(I also lament that this makes things really rough for the trans men for whom this is not a fad, and for whom earlier transitioning would be a huge quality of life improvement.)
Like dear cis people are you OK? Is society not transphobic enough for you? :'(
Are you worried that if a teenager is allowed to explore their gender a little that it will cause a bunch of precocious little cis girls to accidentally glance at a vial of testosterone the wrong way and grow a fantastic beard overnight?
When I started estrogen (later than I should have and fuck you Idaho) I was absolutely 100% sure that it was the right thing to try, and that I’d stop if I didn’t like it. Never looked back (estrogen is tasty and I encourage everyone to try it at least once), and I had years before there was much you could call permanent.
But of course it’s not the “permanent changes to bodies” that made me a 6ft tall amazonian beauty that people like Aella are concerned about. “What if we accidentally trans one of the cis??” fundamentally assumes it is OK to accidentally withhold critical medicine from countless trans people just to be “safe”.
“What if we accidentally trans one of the cis??”
They always like to dress up these statements as medical concern. But it shines through that, despite whatever the person may express otherwise, deep down they think being trans is not really acceptable. Maybe partially acceptable at best, but should be avoided if possible. Very similar thought model to classics like “oh I’m fine with gay people, but what if my child sees two men holding hands and then wants to try it too??”
cursed thought
…is that yud for “she’s refused to fuck me”?
cursed corollary to Poe's Law
Yud being secretly contemptuous of Aella is indistinguishable from Yud being horny for Aella
Many such cases
Aella in the comments: I’m just an uwu smol bean who never learned how citations work
I think she would be willing to learn how to cite things, not sure whether she wants to learn why just surveying people is not the best way to find the truth. Pretty sure that her interest in trans people is not purely scientific.
Pepper
That post does sound like she thinks race and sex determine mathematical ability a la James Damore, Larry Summers, or Charles Murray and other backwards Americans.
Woman good at science? Must have a man brain. Send tweet.
Oops forgot to be weird about jews lemme amend that white jews can also be smart even if they have girl brains.
not a single serious person in that thread lol. also is rationalist castle’s isp blocking scihub? weird that that libertarian crowd didn’t hear about it
Why am I getting James Somerton flashbacks
She can’t tell the difference between “the people who wrote the paper” and “the group that runs the website that hosts a copy of the abstract of the paper”. This speaks to a plentiful lack of curiosity. It reminds me of crank e-mails and sensationalist clickbait pages that say everything on the arXiv is research from Cornell University.
second take from me. Here’s the full tweet:
one of Earth’s top scientists on sex and gender has published her latest work, open for all to read in a widely read venue on that science topic, where it will receive far more peer scrutiny than any lesser forum provides
I’m going to read this as a joke because he didn’t end it with a period, and he is secretly beefing with aella
Yud, Aella and Hossenfelder make me want to defend modern academic institutions. Granted, that’s not nearly as impressive as Scott Aaronson getting me to sympathize with a cop, but it’s still an achievement.
After moving out at 17, Aella briefly attended college in northern Idaho but ran out of money after a semester.
Aella already has better credentials than EY
How briefly do you have to attend college to still be this clueless about how citations work?
well, it’ll become apparent during your first attempt at writing a paper that requires them, but that can take a very long time if the subject is dense enough
Ah more anti-intellectualism from the proto cult leader.
This does mean, as the standards are so low, that we all have a PhD in Rationalism.
Ahh, the missing period, an even worse tone indicator than /hj (youtube).
Both follow-up tweets end in periods, so I guess he transitioned to being completely serious 1/3 of the way through? Or maybe a missing period means a joke, a present period means he’s serious, and partial periodization means that he’s typing with one hand.
Glad I clicked the link, I was pretty sure hj meant handjob.
I am still avoiding YouTube, so I guess “hj” will mean handjob forever.
It’s clearly meant to mean /HalleluJah
yeah and I’m a brain surgeon because sometimes when I pick my nose I go a little too deep
I’m not sure her work is any worse than the average psychology paper that ends up in a magazine rack, but I am not signing up for her Substack to see. And “no worse than the average psychology paper” is not high praise.
A PR with AI-slop-generated DWARF support for OCaml. Expectation: it doesn’t work.
Reality: it replicates existing support from another project, including the attribution to the original author.
Wow, highly recommend reading all his comments, where he doubles down on how everyone else is in the wrong (for wanting maintainable code that isn’t a legal liability) while he is in the right (for being brave and bold enough to type prompts into an LLM to create code that he won’t stand behind).
It’s almost as if he went in there looking for a fight.
Lool, look at these two quotes next to each other:
One caveat, though: even if I didn’t type the code myself, I own it — and it’s my responsibility now.
vs.
Beats me. AI decided to do so [write the copyright as someone else] and I didn’t question it.
Ah yes, the classic open source guy stance of “you get to praise me publicly but direct criticism or problems to [over there]”
The purpose of AI is theft, exhibit ∞
It’s not where I obtained this PR but how.
the inability to follow a through b through c here is… something
at://did:plc:x2obbaxjktznf67mnhznpplp/app.bsky.feed.post/3m5z5da4mvk24
https://bsky.app/profile/did:plc:x2obbaxjktznf67mnhznpplp/post/3m5z5da4mvk24
Windscribe’s twitter account being transphobic.
Free speech can be expensive for dipshits.
@cityofangelle.bsky.social comments:
HAHAAHHAHAAHHAAA
Anthropic has posted two jobs, both paying $200K+.
FOR WRITERS. (Looks like a policy/comms hybrid.)
ANTHROPIC.
IS WILLING TO PAY HALF A MILLION A YEAR.
FOR WRITERS.
Whatsamatter boys, can’t your plagiarism machine make a compelling case for you?
LOL. LMAO, even.
@mirrorwitch @BlueMonday1984 Apparently they’re doing the same with video cutters, offering fairly well paid jobs for people to try and make themselves redundant.
@mirrorwitch @BlueMonday1984 Maybe they just need to be *really* good at writing prompts.
@mirrorwitch @BlueMonday1984 ᄏᄏᄏᄏᄏᄏᄏᄏᄏᄏᄏᄏᄏᄏᄏ
Does that represent the 🤏 (pinching fingers or “tiny”) hand gesture that makes some Korean men really mad?
found this sneer-y blog post prompted by some linkedin lunatic posting his “agentic cockpit”
I tried using Fluidsynth only to find that drums don’t work. I looked around on github and found that they are toying with copilot to fix a related issue and that vibe code has already been merged 🙃
AI struggles with simple CRUD apps, but hey let’s see how it does with DSP in C/C++
I’m sure specifics and timing don’t matter. can always tune that for efficiency later, amirite?!
(although once again, I am very much the scream at how rapidly this toxic waste is becoming rampant in the commons :|)
Armin Ronacher, creator of the vastly popular Python frameworks Flask and Jinja, comes out in defence of DHH, saying that his racist diatribe about the ethnic makeup of London was in fact not racist, and of the state of Israel, that it’s apparently fine for it to remain an oppressive ethnostate in order to ‘preserve its particular cultural identity’.
Yep, he’s been saying bad shit for a while (18~24mo I’m aware of), glad more people are seeing it though
also a massive massive promptfondler
yup
Indeed. I left a note on one of his blogposts correcting a common misconception (that it’s “all just tokens” and the model can’t tell when you clearly substituted an unlikely word, common among RAG-heavy users) and he showed up to clarify that he merely wanted to “start an interesting conversation” about how to improve his particular chatbots.
It’s almost like there’s a sequence: passing the Turing test, sycophancy, ELIZA effect, suggestibility, cognitive offloading, shared delusions, psychoses, conspiracy theories, authoritarian-follower personality traits, alt-right beliefs, right-wing beliefs. A mechanical Iago.
time for me to learn something else, then; what do people use these days for microservices instead of flask? fastapi?
EDIT:

oh come the fuck on google AIO
I finally became fed up with it and got around to writing a uBlock Origin filter that removes the AI overview, the AI results in the “People also ask” section, and especially the AI results in the “Things to know” section that usually covers health and drug information. There is literally so much AI bloat taking up the search page it’s crazy.
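For anyone wanting to try the same, here is a minimal sketch of that kind of filter. The selectors below are placeholder assumptions (Google changes its markup constantly), not the commenter’s actual rules; uBlock Origin’s `##` cosmetic filters and the `:has-text()` procedural operator are the real mechanism:

```
! Hypothetical cosmetic filters for uBlock Origin; adjust the selectors
! to match whatever container currently wraps the AI results.
google.*##div:has(> h1:has-text(AI Overview))
google.*##div[data-attrid]:has-text(Things to know)
```

These go in uBlock Origin’s “My filters” pane; use the element picker to find the current container if the placeholders above no longer match anything.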
Off topic: if the Culture novels are ever adapted for television, all of the Culture people/ships/etc should have Scottish accents.
This might be one of those “careful what you wish for” scenarios. The neutral outcome might well be that they generate dialog with training data from Scots Wikipedia. Or Scots Grokipedia I guess
@antifuchs @techtakes The Culture won’t be adapted for TV/film without input from Iain’s heirs and his literary executors. Who are all very definitely Scottish and will have Opinions …
who gets the California accents though
That hegemonizing swarm outbreak would do nicely imho
The Affront are literally the colonial British, so they’re easy
@dgerard And The Empire of Azad was basically modern western imperialism writ large.
subtlety is wasted on most readers
Tired: alien horrors with tentacle beards
Inspired: alien horrors with tentacle muttonchops
I feel like “we’ll just build a world model” is on the same level as saying " we’ll just solve the P vs NP problem."

img text
How to draw an owl:
- draw some circles. (drawing of circles)
- draw the rest of the fucking owl. (very skillful and detailed drawing of an owl)
deleted by creator