The first half of this is already happening to a certain extent.
I first noticed this in a submission[1] on Dimitris Papailiopoulos' Adderboard[2], which is a code-golf competition for training the smallest transformer that can add two 10-digit numbers. Most submissions on it are fully AI generated.
The report in the linked repo is Claude Code generated.
It's actually fascinating to think that autonomous researchers will likely need a publishing system, simply because that would be the most efficient way to disseminate their knowledge. Would be a good way to keep humans somewhat in the loop too.
I have mine reading yours right now. Unfortunately(?), I mentioned LeCun to it, and it says it's adding a "causal world-state mixer" to nanograd; not sure how this will work out, but it wasn't nervous about doing it. GPT 5.4 xhigh.
EDIT: Not a good fit for nanograd. But my agent speculates that's because it had spent so much more time on compute.