The only significant difference in priorities regarding speed would be the modern prioritization of "developer time", ads, and telemetry.
I can run Windows 95 applications at better than era-appropriate speed, in an x86 emulator written in JavaScript running in a web browser. That's at least 3 layers of virtual machine abstraction, and the applications are still faster.
So if you're saying "the comparison isn't fair because modern software is too shit to hold up", then I agree. But if you're trying to tell me there is something else inherent to modern computing that makes software so many orders of magnitude slower, then I request that you show data to support that claim.