It’s wild that they want to replace the lowest levels of actual work with AI, while the management positions that might actually be a better fit for AI replacement are perfectly safe.
An AI, even as shit as they are, would probably be better at being a CEO than at being a customer support rep.
How do they make profits if we can’t afford anything?
I suppose most bullshit jobs can easily be replaced with the likes of ChatGPT, but for most jobs that actually need to get things done, it will only create mass unemployment if the corporations are willing to tank the quality of their products. That doesn’t sound like a sustainable business decision, though.
The main threat is to junior professionals. AI won’t replace “actual” jobs, but it certainly has the potential to reduce them by giving one person the ability to do the work of multiple people. This is almost certain to primarily affect junior positions whose grunt-work is offloaded to AI, thus reducing the number of qualified senior professionals in the future.
Any job that is actually productive rather than doing stuff that no one wants or needs is an “actual” job, and I just don’t see how you can replace so many people with LLMs without severely affecting the quality of the work, even if it’s grunt work done by juniors.
There are tasks that are necessary but tedious. These are tasks that either consume the time of experienced professionals, or are offloaded to inexperienced professionals when payroll allows.
Tedious tasks are perfect candidates for automation, especially when the result is much easier to verify than to find. This frees the experienced professional to do interesting work.
Tedious tasks are also great for assigning to juniors to help them learn.
I’ll often use tedious tasks as a testing ground for new technologies as well. Tedium leads to innovation.
Exactly. Fewer juniors means fewer seniors in the future.
“Automation” and “AI” are currently very different things. There are certainly applications where machine learning is very effective, sometimes even better than senior professionals, but LLMs are mostly bullshit machines.
They generate a lot less bullshit when deliberately trained on a specific dataset, and they’re only getting better with time.
Because it’s not about quality or efficiency. They only see it as a cost-cutting measure.
The next quarterly report is literally the only thing in the world that matters.
And finally, when they employ zero people and no one has any money to buy their products, they will have hit infinite profits. Seriously though, it doesn’t look like we’re doing UBI any time soon, and I doubt the ultra-rich would kill off everyone on purpose, because half the point seems to be having people they can feel superior to. Are we just going to wipe ourselves out in a weird paperclip-maximizer scenario, where we pile up a ton of shit no one can actually buy but that has to keep being made to make a line go up?
Unless we rise up and stop the people in charge, yes. Their end goal is having everything belong to them. Nothing else matters to these people.
- Yes
- Some of these people are dumb enough to get off on being superior to a chatbot. We may yet be fucked.
- ???
- Profit
Surely AI must be making a hell of a profit right now then? Right?
Cool. Cool cool cool. So what’s the plan for all the unemployed people? Just let them starve and die so you can buy their houses? And rent them to whom?
This is the case with most technologies. The technology is world-changing, but unregulated capitalism makes it worse than what it replaces. For example: crypto, electric cars.