Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this…)
Great piece by Jacob Silverman about the growing shittiness of the day-to-day internet experience
Can we find a way back to an internet that puts people in lucid conversation with one another, where books are published after they are written, where anger and insanity aren’t the dominant modes of thought and the defining editorial values are more meaningful than a chumbox of clickbait nonsense? I’m not sure.
New piece from Brian Merchant: The fury at ‘America’s Most Powerful’
The piece primarily focuses on a parody of the “Iraqi Most Wanted” playing cards made for the invasion of Iraq, featuring the faces and home addresses of various tech billionaires (well, the “art” decks do; the “merch” decks feature their publicly listed office addresses instead), and uses that to talk about the boiling rage against the elites that has become a defining feature of the current American political climate.
TIL Richard fucking Hanania is a Rationalist, at least according to this excrescence from LW
The left-wing monoculture catastrophically damaged institutional integrity when public-health officials lied during the pandemic and when bureaucrats used threats and intimidation to censor speech on Facebook and Twitter and elsewhere—in the long-term this could move the country toward the draconian censorship regimes, restrictions on political opposition, and unresponsiveness to public opinion that we see today in England, France, and Germany.[1]
Yeah I’m sure trying to dictate to Harvard who they can hire and what courses they can teach is not leading to a “draconian censorship regime”
[1] to be clear this is attributed to Richard Ngo, not Hanania
To summarize that blog post and the three “Rationalist cases for Trump” that it points to: “We made up a Trump to like and a bunch of Democrats to get mad at”.
When exactly did that “left-wing monoculture” flex its muscles? During Trump 1.0? During the first two years of the Biden administration, when leftists and progressives were criticizing that administration every day for not doing enough on, well, anything? During the second half of the Biden administration, when Republicans controlled the House and leftist criticism of the administration, um, did not grow quiet?
I also want to know what this “draconian censorship regime” is in Europe, because it isn’t like they’re falling over themselves to take care of trans people or immigrants. Unless he’s supporting the freedom to blatantly lie in order to incite violence against minorities.
You still can’t be an outright Nazi without people getting mad at you. Obviously unacceptable.
“Lied during the pandemic”? Wait, what? Did LW do a 180 on covid, or is this a “they gave advice without knowing everything, which was later shown to be wrong” thing? Because that is quite a dumb thing to say, and calling that left-wing is nuts. And lastly, did they memoryhole that Scott lied about the pandemic? He told people to stop smoking because it supposedly helped against covid, not because he had proof, just because he thought it would be good if fewer people smoked.
E: oops wrong person.
E2: 100 upvotes, the conspiracy-weird far-right people have taken over.
Yeah, the whole “they lied!” nonsense is deeply frustrating. People simultaneously want experts to be responsive and provide information immediately but have no tolerance for “as best we now know” or “given the current circumstances” advice. You can’t simultaneously get the most recent cutting-edge information and only get what’s been long-settled and validated.
Thing is, you can legit criticize Fauci for his bad communication on masking in the early days of COVID! But it’s not gonna land b/c the anti-mask/vax/etc crowd also hate Fauci for the most deranged reasons
Yeah, but there is a gulf of difference between “he communicated badly” and “he lied”.
JWZ link, but the meat is a 404 Media article; don’t wanna bypass their paywall
Hello fellow kids! Doing crimes is TIGHT!
American police departments near the United States-Mexico border are paying hundreds of thousands of dollars for an unproven and secretive technology that uses AI-generated online personas designed to interact with and collect intelligence on “college protesters,” “radicalized” political activists, and suspected drug and human traffickers […]
This shit is going to drive so many amateur pedo/etc hunters (who already are not the people who are good at being normal) into absolute crazy conspiracy theory land. Esp when they can’t get the fake pimps (yeah, pimps really; these cops seem to be like the military dude in the intro of Idiocracy) arrested or banned.
E: on that note, saw somebody explain that their previous slightly hysterical ‘facebook is claiming searching for the concentration camps is a search for csam’ post was made in error, as it was the word ‘mega’ (and another word) which seemed to trigger the warning. And below that, people were still making up some crazy ‘palantir is going to get everyone jailed for clicking a csam link’ conspiracy bullshit. I worry for the sanity of a lot of online people. People are so afraid they are blueanoning themselves.
Posting this as a stub because of reasons
- I don’t want to do the research on this
- This was pulled from the front page of reddit
- Connecting these two things is largely tabloid speculation, but funny enough for a stub.
X Is Ditching DMs Hours After News of Musk Messaging Women About Impregnating Them Breaks
you will never guess which company hired oswald mosley’s grandson
https://www.politicshome.com/news/article/palantir-boss-interview-keir-starmer-gets-ai
TIL there was a right-wing antisemitic US general called Moseley
https://en.wikipedia.org/wiki/George_Van_Horn_Moseley
Nominative determinism?
If Palantir represents a veritable unknown for the public, so too could its UK chief, the grandson of Oswald Mosley and nephew of former F1 president Max Mosley.
Ah, yes, noted grandfather and nothing else, Oswald Mosley. Definitely not the founder of the British Union of Fascists, please don’t look into that. His son was in the racecar business, isn’t that lovely?
And Max Mosley definitely didn’t share his father’s views
Ok that is it, I’m changing my mind. All the rationalist genetics stuff is real and true, as there is a fascism gene.
jesus christ:
transcript
Kyle Langford, a 20-something Nick Fuentes acolyte, is running for governor of California on a platform of deporting all male undocumented immigrants and then giving all the females one year to marry a “Californian incel” to avoid deportation.
Eminently punchable face
“In hindsight,” CFAR co-founder Anna Salamon told NBC, “we were creating conditions for a cult.” If only there had been people warning about this; sadly, such a club didn’t exist. Anyway, I’m sure this will lead to them reflecting and changing things.
Source “Two Vegan Lovers, an AI ‘Cult,’ and a Trail of Dead Bodies”, Archive
Yudkowsky had a gift for making hyperniche concepts accessible.
No, he didn’t. His “explanations” are turgid and useless even when they’re not just mathematically wrong. They create the feeling of understanding for some readers — those who want their self-image of smartness validated, who imagine that LessWrong is the cool kids’ table, and who aren’t actually tested on how much they’ve learned.
Over the course of thousands of pages, rationalist Harry uses logic and decision theory to save the world and defeat Voldemort.
No, he uses his fucking Time Turner.
Snyder seemed to be trying to break through to Yudkowsky with an appeal to his self-importance
OK, zero notes there.
With the collapse of the US empire and hegemony in progress, I wonder if they actually did something with this or if it is all another thought experiment.
E: wow, that user’s post history is something. Drops that article 3 years ago, then silence till some weird comment (directly addressing Yud, claiming they had some big breakthrough about the mind state of the Zizians, of course, and it is all speculation and way too verbose).
hey “way too verbose” is a free space
Yes, you are right, and it wasn’t even that bad; the footnote was only half the length of a ‘who built this old roman wall?’ footnote.
I got it from a comment here, apparently some pie in the sky charity needs more money
Ah, a food-related charity, that is important. Wonder what they did with the millions? Ah, release papers. That is …
Planning for securing food in a nuclear winter? What a great wheeze. If your advice isn’t any good, nobody can tell until there’s a nuclear winter, and if there is a nuclear winter they won’t exactly be able to ask for their money back because they’ll be too busy dying of radiation sickness.
Remember the apocalypse slop buckets the rightwing griftosphere kept trying to sell?
I do indeed. A bucket always seemed like pretty poor protection from the end of the world, even if it was full of purified water and high-protein MREs and whatever else. I suppose you could put it on your head and make like Ned Kelly.
ICYI I posted a new vid/pod this week and an accompanying thread about: Everything Is Work Now — Tech products have a work/life balance problem
404 media: I Tested The AI That Calls Your Elderly Parents If You Can’t Be Bothered
It’s a service that makes an AI voice chatbot call your parents daily, so you don’t have to, and then it even sends you a notification to your phone with an AI summary of what your parent told the AI.
I really didn’t think that people can come up with new AI-based ideas anymore that would astonish me, but there, I was wrong, they did it. This is so cold and fundamentally alienating to me, it reminds me of that recently much-quoted Miyazaki phrase, “an insult to life itself”.
It really sucks so much how many coders embrace it. At my work, there is the looming introduction of code LLMs very soon, and I’m anxious to learn how many of my colleagues will happily use it, and the consequences it will have for me to deal with the results (and generally, how it will make me feel to work in an environment where these tools are embraced). I was hoping that the corporate bureaucracy would be slow enough that the AI bubble collapses before it’s allowed to use the tools, but unfortunately management put a lot of pressure behind it and it all went faster than expected :(
not every programmer posts to social media…
I’d say a lot of the better ones don’t, at least not regularly (in my exp). Did hear from one of those that they had a problem with new hires: some of them have very random output quality-wise, until they get fired for using LLMs for everything.
Our company is currently looking for a new programmer and we’ve interviewed a few so far. I don’t want to generalize, but it really seems that a non-negligible part of the younger ones at least tries to use LLMs to make up for a lack of experience, and that really shows.
I normally don’t like doing programming challenges during an interview because they have little to no real-world connections, but I’ve been throwing small questions around lately just to see what people do, and how they approach them, and there’s a subset of people who will say, “I would ask ChatGPT now” in those scenarios.
I haven’t met a vibe-coder in real life yet, but I’m afraid it’s only a matter of time.
I haven’t worked in industry for a while now but from your accounts it seems like… nothing’s changed?
Sturgeon’s law very much applies to software engineers. I’m sorry but the vast majority of people in my junior cohort I wouldn’t hire to replace my lightbulb. Of course they’re all in on LLMs. They’ll be doing what they were doing best, generating tons of awful code they copied from somewhere else that the adults in the room will have to clean up later, just the generation and copying is now paid at a $100 monthly subscription.
Like seriously, it doesn’t matter even a tiny bit that the code got generated by a bullshit machine when the code is Node.JS anyway. If you’re building a giant penis out of cow dung, it doesn’t matter who your construction crew is or how good they are. And the industry is like 90% building giant penises that never come to fruition anyway.
Fair points, but I’d still take cleaning up someone’s own bad Node.JS code over cleaning up LLM Node.JS slop, because the optimist in me hopes that the human who wrote bad code can at least learn something and become better over time. After all, we all started out writing garbage; I know I have.
On the other hand, I guess I should find a job where I don’t have to touch web development with a ten-foot pole because it’s probably not getting better.
New sneer from Tante, this time aimed at the entire field of CS:
This is really weird because every single person in academia I talked to about non-CS stuff is either a perfectly median centrist social-democrat or literally a member of the local communist party, with zero variation in between those two.
Like it’s either you’re a young idealist that still believes the world can be better, or you’re 40 with three kids and a mortgage that just wants the government to be relatively stable and not fuck shit up for you.
I know zero Americans though, so maybe there’s a skew there.
Not sure why anyone thought CS as a community can “save us”. It’s just as likely to be red/black-pilled (gold/black-pilled?) as any other heavily male, tech-adjacent community. The idea that nerds should be politically liberal because they were bullied in ’80s high-school comedies is ludicrous.
No, I’m sure this time we can identify the person or people who are divinely anointed to exercise absolute power over everyone.
His name is Scott.
article about the tactics that felon employs against the women bearing his children
features some notable sentences from his fixer, too. the sort of shit that just barely doesn’t qualify as him threatening to top off your kneecaps
Do note how this legion-of-kid-slaves shit shows that he neither believes he will go to Mars nor that AI/robots can automate things.
You don’t make new children to save civilization in 18+ years if you think that in 2 years you will go to Mars / build robots that automate everything / build the robotgod.
oh yeah he’s way too much of a coward to go to mars. besides, no-one there he could dominate or hurt
Eurgh!
Musk seems like he’s 1cm away from Jeffrey Epstein.
idk when i’ll have better opportunity to post this
Isn’t it more grammatically correct to say “Jeffreys Epstein”?
I thought we all knew the proper collective noun was a creepful of epsteins
previous stubsack guest star Cursor rejoins the show, using a shitty liarsynth to automatically tell users broken behaviour is expected (cw: orange site), followed by people mass-killing their subscriptions
Earlier today Cursor, the magical AI-powered IDE, started kicking users off when they logged in from multiple machines. Like, you’d be working on your desktop, switch to your laptop, and all of a sudden you’re forcibly logged out. No warning, no notification, just gone.
Naturally, people thought this was a new policy.
So they asked support.
And here’s where it gets batshit: Cursor has a support email, so users emailed them to find out. The support person told everyone this was “expected behavior” under their new login policy.
One problem. There was no support team, it was an AI designed to ‘mimic human responses’
haven’t gotten into the replies to look for sneers yet but I bet there will be some
Cursor’s cofounder jumped in to try and quell the outrage. He is not pulling it off.
That feels like a fitting ironic fate: a company selling AI slopcode generation loses a bunch of users by believing its own bullshit and using an LLM as customer support. Hopefully that story gets repeated a few dozen times across other businesses and the business majors stop pushing LLM usage.
Edit… looking at the orange site comments… some unironically cited Anthropic “research” marketing hype, which (correctly) shows “Chain-of-Thought” is often bullshit unrelated to the final answer (but it’s Anthropic, so they label it as deception and unfaithfulness instead of the entire approach being bullshit in general).