#0004 The schizo-fication of the online world, Our darkest hour approaches, AI is already smarter than you, you're just too stupid to see it
Welcome to Constant Flux, a weekly lens taking a systemic view on the polycrisis.
In this issue we learn how online reality fractures into endless symbolic meanings while AI rapidly reshapes the systems we rely on. On top of this, human intelligence struggles (no shit) to keep pace.
Dive in and see if you can find some clarity from this chaos.
The schizo-fication of the online world
The article describes a cultural shift in how people experience the internet. Instead of simply struggling to focus, people are overwhelmed by too many signals, constantly seeing patterns and meaning in everything they encounter online... leading to a state where reality feels fractured and uncertain, yet saturated with symbolism.
Schizoposting thrives on this—mundane things become loaded with significance. Why do certain colours keep appearing in pop culture? Is this meme actually part of a secret message? Why does this TikTok feel like a prophecy?
It's a lot of noise out there and it's making you schizo!
Our darkest hour approaches
Shapiro (but not that Shapiro) paints a picture of a system on the edge. One where AI is reshaping everything fast. Real fast. Human roles are being hollowed out, not physically like in the industrial age, but mentally.
It’s not just that we’re being outpaced. It’s that the whole structure we depend on is cracking.
Everyone's reacting, there's no doubt about that: governments, companies, communities, just not in sync. It's like they're on different timelines.
AI is already smarter than you, you're just too stupid to see it
A second hit from Shapiro in the same newsletter. Call it a double-whammy.
This one’s about intelligence. Not the kind we think we have but the kind we can’t even recognize. Shapiro posits that AI is already smarter than us in many ways. But most people miss it because it doesn’t look the way we expect.
And here's the thing:
If you can’t see the change, you can’t prepare for it. You can’t coordinate around it. You can’t even talk about it.
The first piece was about speed and overwhelm. This one's about perception and denial. Together, they describe a system incapable of seeing itself slipping.
So what would help?
We should probably worry less about being right and more about not being blind while everything changes.
DOGE Has Deployed Its GSAi Custom Chatbot for 1,500 Federal Workers
DOGE is rolling out a chatbot, “GSAi,” to 1,500 federal employees. The goal: streamline work, cut costs, and boost output. Classic automation pitch...
But doesn't this also risk eroding the human capacity to coordinate and adapt? Take initiatives like this too far and you cut out the trust and context organizations need to respond to real change. Fast isn't always smart...
I think the real test isn't whether AI can do human tasks; RPAs have already proven that. It's whether institutions using AI can still think.
But if the goal is to cut people and eventually plug in Grok, then sure. It’s pure Muskian logic.
Ripples
America’s religious decline may have paused: new Pew research
Maybe this pause is about a search for meaning? After falling headlong into uncertainty, I wonder if some of us will look to religion again. But maybe it's not belief, it's just that nothing else holds.
The US Army Is Using ‘CamoGPT’ to Purge DEI From Training Materials
This isn't only a policy shift; it's political logic getting wired into the "operational machinery," at a speed and scale not possible even a few years ago.
Arc Institute’s new AI can read and write the code of life
Life? Just another dataset to optimize.
Software engineering job openings hit five-year low?
Are you paying attention yet? This AI shift targets well-paid, creative work, unlike past automation, which began with repetitive jobs. Or maybe Jevons paradox strikes and we'll end up needing even more programmers.