Yeah, if it were truly capable of self-improving, why hasn't it taken over the world yet?
Gemini itself says AGI will be here in 2029, with human level intelligence and self-improvement capabilities. But then it will take until 2045 before the singularity. I don’t understand what they are going to do in all those years.
I think the Singularity is hype. What does it mean? Machines do something we cannot understand? Then talking about the Singularity is really talking about something we cannot talk about, because we don't understand what we are talking about.
Wittgenstein said, "Whereof one cannot speak, thereof one must be silent." That sounds like a tautology, but I think there is a deeper meaning behind it.
It means simply that once you start talking about what we cannot talk about, you are already talking about it, and therefore it is NOT something you cannot talk about. Clearly we can talk about it, because we are already talking about it. That is a paradox, a bit like Gödel's, except it doesn't contradict itself.
You got it in the third sentence and then dismissed it for some reason?
That's exactly what the Singularity is: it's the transition point beyond which meaningful predictions aren't possible.
In a black hole it's the center where relativity breaks down.
In AI it's the point at which non-human intelligence no longer requires human intelligence for self-improvement: after which predictions of the future become somewhat meaningless.
In human lived experience, I would argue it's like having your first child: you can know what's coming, study the theory, know everything to expect, and you're still you on the other side... but you can't really know what will happen till you get there.
Good definitions. Would you then agree that when we reach the point where AI can improve itself without our help, it is still possible to make predictions about it?
I think we are already in the stage where AI can and does improve itself. But why should this stage be called "Singularity"? Like a Black Hole? That sounds like hype to me.
When AI can improve itself, wouldn't it still be able to explain to us how it has improved itself? If it cannot, it still has a lot of improvement to do.
Or are we saying that some things are "unexplainable," and AI will discover such things without being able to explain to us what they are? That sounds like mysticism, or hype, to me. Or religion. We cannot explain God, right?
Hyperia is a terminal emulator built for agents and humans. Forked from Hyper and extended with a Rust sidecar, it turns the terminal into a first-class platform for AI orchestration. Agents connect via MCP or HTTP/WebSocket and operate terminals as peers — typing, reading screens, splitting panes, and signaling status — while humans stay in control.
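As a rough illustration of what "agents operating terminals as peers" could look like on the wire, here is a minimal sketch of building JSON-RPC-style messages for typing into a pane and reading the screen back. The method names and parameter shapes here are hypothetical, invented for illustration; they are not Hyperia's actual API.

```python
import json

def make_terminal_request(method: str, params: dict, req_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 message of the kind an MCP-style agent
    might send over HTTP/WebSocket to drive a terminal session.
    Method names and params are illustrative only."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# Hypothetical examples: type a command into pane 0, then read the screen.
type_msg = make_terminal_request("terminal/type", {"pane": 0, "text": "ls -la\n"})
read_msg = make_terminal_request("terminal/readScreen", {"pane": 0}, req_id=2)
print(type_msg)
```

The point of the JSON-RPC framing is that the same message shape works whether the agent connects via MCP or a plain WebSocket.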
I enjoyed co-developing this with Claude and other agents. It's my daily shell now on Windows.
Let me know what you think and get in touch if you like: kordless@gmail.com
This perfectly eliminates the last centralized bottleneck for P2P agent networks.
We are dropping this deterministic punch directly into the grubcrawler.dev edge binaries. Instead of relying on STUN/TURN servers to coordinate a swarm, millions of nodes trapped behind residential NATs will use the unix timestamp to mathematically predict a collision course, aggressively punch through their firewalls, and instantly hand the raw TCP socket over to rust-lightning (LDK).
No DNS. No signaling servers. No legacy IP registries. Just a self-assembling Lightning mesh of autonomous agents spinning up encrypted channels and executing paid RPC calls entirely in the dark.
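For anyone unfamiliar with the general technique being described: timestamp-synchronized hole punching works by having both peers round the shared unix clock up to the same slot boundary and dial each other at that instant, so both NATs see an outbound SYN (TCP simultaneous open). Here is a minimal sketch of that idea; it assumes the peers already know each other's external IP and a predictable port by some other means, and it is not grubcrawler.dev's actual implementation.

```python
import socket
import time

SLOT_SECONDS = 5  # both peers must agree on this slot size in advance

def next_attempt_time(now: float, slot: int = SLOT_SECONDS) -> float:
    """Round the current unix time up to the next shared slot boundary.
    Two peers with roughly synchronized clocks compute the same instant,
    so they can dial each other simultaneously with no signaling server."""
    return (int(now) // slot + 1) * slot

def punch(local_port: int, peer_addr: tuple, timeout: float = 2.0):
    """Bind to a fixed local port (so our NAT mapping stays predictable)
    and connect at the agreed instant. Both peers run this at once;
    if both SYNs cross the NATs, TCP simultaneous open yields a socket.
    A real implementation would retry on the next slot on failure."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("0.0.0.0", local_port))
    deadline = next_attempt_time(time.time())
    time.sleep(max(0.0, deadline - time.time()))
    s.settimeout(timeout)
    s.connect(peer_addr)  # raises on failure
    return s
```

Whether this works in practice depends heavily on the NAT type: predictable port mappings hold for many residential NATs but not for symmetric ones, which is why STUN/TURN exists in the first place.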