This is why I've said for years: if you want to drive best practices and policy at companies, you can only do it with liability. Particularly non-insurable, non-tax-deductible liability. If a company can't offload civil or criminal penalties to their insurance company and take the tax write-off, they suddenly start caring.
That said, this should be used sparingly, as it embeds a behavior deeply. If that behavior later no longer makes sense, it can be extremely costly to change.
On an emotional level I feel the same way: I would love to see the company that leaked my PII die and its CEO/CTO be out of a job forever.
Practically, though, I think leaking data is inevitable. A junior developer absolutely WILL vibecode a piece of code with glaring security vulnerabilities. An experienced sysadmin WILL temporarily allow public access to an S3 bucket and then forget.
So if you make liabilities uninsurable and recoverable from corporate assets, you'll soon find yourself in a world with no services.
I don't know what middle ground is possible to find here.
> Particularly non-insurable and non-tax deductible liability
Too often liabilities exceed assets, or the liabilities are externalised.
Liability doesn't work as an incentive for many risks. For uncommon but extreme risks, it can be better to roll the dice on company failure than regularly pay low amounts for mitigation.
Ignoring liabilities is especially attractive when a company has poor profitability anyway.
And then you see major companies sidestep the costs of their liabilities (plenty of examples after security failures, but also companies like Johnson & Johnson).
When MS removed Solitaire and made it an app, that should have been the sign to move.
When they introduced a mobile first UI onto a desktop OS...
When they forced mandatory Microsoft accounts...
When they started saving files that had no place being in OneDrive to the cloud by default and charging people for it...
When they announced the worst AI privacy disaster in computing OS history...
When their updates refused to install cleanly and bricked people's computers to the point of hardware damage...
Seriously thinking I might have Stockholm syndrome at this point. To me the best Windows would be Windows 11's kernel and libraries with Windows 7's UI and apps. Because it's been all downhill (generally) since then.
It's not Stockholm syndrome for a lot of people. Microsoft is so firmly entrenched in so much of the corporate world that you can't get away from them. My mom was in the market for a new laptop recently, and I so badly wanted to get her set up with a MacBook Air, but it's not an option because the Sage accounting software she uses for my dad's business is Windows-only. On top of that, the business itself (a small pawn shop) is forced to use some specific software to manage inventory (I believe it allows police to access the database to track serial numbers when tracing stolen goods), which is a webapp built on some antiquated decades-old technology that only runs in Microsoft Edge's IE compatibility mode (which has become a more and more difficult incantation to enable over the years), and I believe that mode is only available in the Windows version of Edge.
For me it's currently the minimal-hassle way to make my Steam library runnable. But it feels like we're moving in a good direction thanks to Valve's efforts where one day I may be able to never boot into Windows on my PC.
> When they introduced a mobile first UI onto a desktop OS…
That's when I jumped to Macs and haven't looked back since. Windows is just a glorified game console to me now, but I have enough fun with PS5/Switch exclusives.
Though macOS is also becoming annoying. Not quite at that breaking point yet, but worrying.
Meanwhile Linuxland seems like a chaos of 10000 people who all think they're right, under an anal overlord
Maybe it's time to dig the Commodore 64 back up? :')
But who cares though, soon AI will make operating systems meaningless, right?
> To me the best windows would be Windows 11's kernel and libraries with Windows 7's UI and apps.
Does anyone know how to achieve that? What happens when you replace the kernel in a Windows 7 installation with the one from Windows 11? What is the manual update procedure for kernels on MS Windows?
Based on the info if you click into them, likely no. I would have expected them to be incidental materials from tunneling, but reading the description that's not the case.
Part of the reason, I think, is that Qualcomm and Apple cut their teeth on mobile devices, and yeah wider SIMD is not at all a concern there. It's also possible they haven't even licensed SVE from Arm Holdings and don't really want to spend the money on it.
In Apple's case, they have both the GPU and the NPU to fall back on, and a more closed/controlled ecosystem that breaks backwards compatibility every few years anyway. But Qualcomm is not so lucky; Windows is far more open and far more backwards compatible. I think the bet is that there are enough users who don't need/care about that, but I would question why they would even want Windows in the first place, when macOS, ChromeOS, or even GNU/Linux are available.
A ton of vector math applications these days involve high-dimensional vector spaces. A good example of that on Arm, I'd guess, would be something like fingerprint or Face ID matching.
Also, it doesn't just speed up vector math. Compilers these days with knowledge of these extensions can auto-vectorize your code, so it has the potential to speed up every for-loop you write.
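To sketch what that auto-vectorization looks like (the function name and flag choices here are just illustrative), a plain scalar loop like this needs no intrinsics or source changes for a compiler to emit SIMD instructions:

```c
#include <stddef.h>

/* An ordinary scalar loop. With optimization enabled (e.g. -O2 or
   -O3 on GCC/Clang), the compiler can auto-vectorize this into SIMD
   instructions for whatever extension it targets (NEON/SVE on Arm,
   SSE/AVX on x86). The `restrict` qualifiers tell the compiler the
   arrays don't alias, which makes vectorization easier to prove safe. */
void saxpy(float *restrict y, const float *restrict x,
           float a, size_t n) {
    for (size_t i = 0; i < n; i++) {
        y[i] = a * x[i] + y[i];
    }
}
```

Whether the loop actually gets vectorized depends on the compiler, target, and flags; reports like GCC's `-fopt-info-vec` or Clang's `-Rpass=loop-vectorize` will tell you.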
> A good example of that for arm would I guess be something like fingerprint or face id.
So operations that are not performance critical and are needed once or twice every hour? Are you sure you don't want to include a dedicated cluster of RTX 6090 Ti GPUs to speed them up?
I'd argue that those are actually very performance critical because if it takes 5 seconds to unlock your phone, you're going to get a new phone.
The point is taken, though, that seemingly the performance is fine as it is for these applications. My point was only that you don't need to be running state of the art LLMs to be using vector math with more than 4 dimensions.
I remember failing an interview with the optimization team of a large fruit-trademarked computer maker because I couldn't explain why the x87 stack was a bad design. TBF, they were looking for someone with a masters, not someone just graduating with a BS. But now I know... honestly, I'm still not 100% sure what they were looking for in an answer. I assume something about register renaming, memory, and cycle efficiency.
Having given a zillion interviews, I expect that they weren't looking for the One True Answer, but were interested in seeing whether you discussed plausible reasons in an informed way, as well as seeing what areas you focused on (e.g., do you discuss compiler issues or architecture issues). Saying "I dunno" is bad, especially after hints like "what about ...", and spouting complete nonsense is also bad.
(I'm just commenting on interviews in general, and this is in no way a criticism of your response.)
I think I said something about the stack efficiency. I was a kid who barely understood out-of-order execution. Register renaming and the rest were well beyond me. It was also a long time ago, so recollections are fuzzy. But I do recall that they didn't prompt anything. I suspect the only reason I got the interview is that I had done some SSE programming (AVX didn't exist yet, and to give timing context, AltiVec was still being discussed), and they figured if I was curious enough to do that I might not be garbage.
Edit: Jogging my memory I believe they were explicit at the end of the interview they were looking for a Masters candidate. They did say I was on a good path IIRC. It wasn't a bad interview, but I was very clearly not what they were looking for.
I believe the Academy Awards, and a few other things too, also influence this. The eligibility rules still very much favor legacy studios, IIRC. But with this, that may change? Hard to say. I know that quite a few Netflix movies have had theatrical runs at random mom-and-pop theaters in Cali so they could meet eligibility requirements for the various awards.
Honestly? I expected this to be talking about the MiSTer project FPGA core[1]. That has been tuned so it's capable of running the AREA5150 demo[2], which is an insane challenge (AFAIK the timings of the V20 break that demo). Not saying this isn't cool, but it's definitely not what I was expecting.
I've said for years that any smart thermostat should have a bimetallic backup that controls maximum ranges and acts in the dumbest way possible. Just max temp and min temp for AC and heat. Nothing that should ever be hit... but there nonetheless.