This reminds me of Harvey Cragon's intro to computer architecture textbook...
When it introduces Harvard vs. von Neumann architectures, it doesn't invent some dumb RISC computer to illustrate the difference... No... it makes you learn the actual von Neumann machine! Also Konrad Zuse's Z machine.
Cragon's argument is that students won't learn the concept of engineering trade-offs if they're presented with a clean "textbook" architecture.
I hated MIX for various reasons; it's sort of in between simple and kludgy.
[0] Cragon was a professor at the University of Texas at Austin ca. 1980, and also the architect of TI's ASC in the 1960s.
AI is important, but we don't know what skills will be relevant in 10+ years to harness AI (I can't imagine prompt engineering will look much the same). Anyway, would a typical teacher be ahead of the curve on what pedagogical tack to take here even if it were appropriate?
The best thing to do is to set kids up to learn the most important thing - how to teach oneself. If a kid can read about something, understand what was important from the reading, write about it, and then know where to go next, they will be well served in the AI world.
100 Gbps is going to be for mesh networks supporting clusters (say, four Mac Studios) - not for LAN-type networks (unless it's in an actual datacenter).
Yeah, nothing about Apple is server-side, and IMHO that's what training is. To be serious about it as a company, you need all sorts of other tools (crawlers, etc.) helping with training, so it basically has to be in the datacenter at any reasonable scale anyway. And that's just not where Apple lives. We saw with Swift that they couldn't focus on the server side enough to make it a serious language there, and they've consistently declined to enter that area over the years because it's outside their wheelhouse.
It has an HDMI port, and its USB-C ports also support display out. But I believe most who buy it intend to use it headless. The machine runs Ubuntu 24.04 and has a slightly customised GNOME (green accents and an NVIDIA logo in GDM) as its desktop.
They were trying to compete with Sun and IBM in the server space (SPARC and POWER) and thought that they needed a totally pro architecture (which Itanium was). The baggage of 32-bit x86 would have just slowed it all down. However, having an x86-64 in the middle would have confused customers.
I think back then it was all about massive databases - that was where the big money was, and x86 wasn't really set up for the top-end load patterns of databases (or OLAP data lakes).
In the end, Intel did cannibalize themselves. It wasn’t too long after the Itanium launch that Intel was publicly presenting a roadmap that had Xeons as the appealing mass-market server product.
Yeah, they actually survived quite well. Who knows how much they put into Itanium, but in the end they did pull the plug, and Xeons dominated the market for years.
They even had a chance with mobile chips using Atom, but ARM was too compelling, and I think Apple was sick of the Intel dependency, so when there was an opportunity in the mobile space to not be so deeply tied to Intel, they took it.
I think the difference was that replacing Itaniums with Xeons on the roadmap didn't seriously hurt margins (probably helped!)
The problem with mobile was that it fundamentally required low-margin products, and Intel never realized (or realized way too late) that that was a kind of business they should want to be in.
> and thought that they needed a totally pro architecture (which Itanium was).
Was it, though? They made a new CPU from scratch, promising to replace Alpha, PA-RISC, and MIPS, but the first release was a flop.
The only "win" of Itanium that I see, is that it eliminated some competitors in low and medium end server market: MIPS and PA-RISC, with SPARC being on life support.
The deep and close relationship between Compaq and Intel meant that it also killed off Alpha, which, unlike MIPS and PA-RISC, wasn't going out by itself. (Itanium was explicitly meant to be the PA-RISC replacement - in fact it started as one - while SGI had issues with MIPS. SPARC was reeling from the radioactive cache scandal at the time but wasn't in as bad a condition as MIPS, AFAIK.)
I never used them, but my understanding is that the performance was solid. In a market with incumbents, though, you don't just need to be as good as them; you need to be significantly better or significantly cheaper. My sense was that it met expectations but that it wasn't enough for people to switch over.
Merced (the first-generation Itanium) had hilariously bad performance, and its built-in "x86 support" was even slower.
The later HP-designed cores were much faster and omitted x86 hardware support, replacing it with software emulation where needed, but ultimately IA-64 rarely ran with good performance, as far as I know.
Pretty sure it was Itanium that finally turned "Sufficiently Smart Compiler" into the curse phrase it is understood as today, and it definitely popularized it.
I remember not believing my friend when he said that the OS and the games were inside the computer and didn't need to be loaded up via a floppy disk. That was my first time seeing a hard drive.
My very first hard drive was for an Apple //gs. It was the size of a shoebox and held a whopping 5 megabytes. I was an Egghead employee at the time, and after my employee discount it was something like $600-$700.
There was tremendous resistance to setting up that school, both because of where the money came from and, even more so, because of the possibility that the school would not actually be academic but more 'professional' instead. I can't comment on the former, as it seems mostly just xenophobic, but maybe there are other angles there. The latter is definitely a concern, though.
Not xenophobic. Oxford happily takes money from people from anywhere all the time. It might be things such as his involvement in the Al Yamamah arms deal.
It was probably a cascade of people, but the question is whether we'll all realize Apple was right, or whether they just implemented it wrong, or whether it will just take a year or two to get things dialed in (while still being prepared for an AR/VR world) and then we'll forget it ever happened.
People had the same reaction to iOS 7. They cleaned up some of the excesses over the next few years, and now the same basic concept is what people want Apple to RETURN to. They'll be fine.
I’d still want Apple to return to an iOS 6-like design. Not the super-skeuomorphic stuff, but the regular UI with discernible controls clearly separated from content.
It's a leadership failure. They obviously have a UI/UX department, and those people want to be considered productive. Hence, they need to force a major redesign every now and then. Without a Steve Jobs-like leader, those things will happen due to the fundamental laws of corporate bureaucracy.
You remember the funny turn of phrase instead of how bad the reception was on your iPhone 4 and how it ruined the experience of owning it. Because it wasn't that big of a deal in the end.
There are many things which are worse and cannot be configured. I can't get my battery life back, I can't get back a version of Apple Maps that doesn't crash on launch, and I can't get my framerate back. I can't even get a refund for this $1200 phone.