I looked at the site and it's cool that he's not using HF to etch metal. Instead he's using an ammonia and hydrogen peroxide solution, which is a lot safer to handle. That always seemed to be the roadblock to home silicon fabrication: you needed some very toxic stuff to etch metal.
Freaking cool! 1um might seem like a lot by today's industry standards if we think of modern processors and peripherals, but it's still more than adequate for analog stuff (opamps, regulators, matched transistor pairs, VCOs and VCAs, simple glue logic, etc.).
Mask size or fabrication geometry size? I'm guessing you mean the latter. It depends on the generation of these chips (moving to a smaller process) but I know the 6502 used a 10 um process at introduction, and I think the 8080 used a 5 or 6 um process. Don't know about the Z80.
For comparison, the 80386 was fabricated with a 1um process.
I don't know, the Z80 data sheet I briefly skimmed doesn't even seem to mention that detail.
However, if this technology can offer what was available in the 70s, and their site says it can be used for analog ICs too, it would seem logical to use it to remake older obsolete chips (how many 6522s and SIDs were fried?), along with newer versions of older analog chips.
Not a backyard foundry, but this guy is attempting to make a "tangible" RISC-V CPU using "as few programmable chips as possible." He wants to use MSI and LSI chips, "so things like buffers, flip flops, and so on."
What do you mean by "printing"?
From the article: There are 66 individual fabrication steps to make this chip and it takes approximately 12 hours for a full run.
Automating all 66 fabrication steps into a fully integrated device is not exactly trivial.
The most complex parts of phones and computers (e.g., the CPU) are integrated circuits (ICs)[0], which are (nowadays) billions of nanometer-sized transistors on top of a piece of silicon. The way these are made is arguably the most complex, high precision manufacturing process in the world, and is usually done in multi-billion-dollar "fabs" (fabrication plant) by huge companies (e.g., Intel).
Even the most basic IC fabrication, like in the article, absolutely requires maybe ~5-10 complex tools (furnace, sputterer, etc; $1000-$10k each at current eBay prices and very low quality, if you know how to rebuild/fix all of them) and a host of supporting equipment (fume hood for seriously dangerous[1] chemical work, etc). To get reasonable results, you also need to understand the device physics and then test multiple times to get the process right. I have some serious respect for Sam Zeloof of the article for getting this to work: it's at least an order of magnitude more difficult than other home manufacturing (3D printing, woodworking, welding, sewing...), even if you're already an industry expert. And his device used 6 transistors; you'd need to get the transistor manufacturing reliability up significantly to make a useful microprocessor (instead of small analog circuits), which probably starts at several thousand transistors [2].
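To see why transistor count matters so much for reliability, here's a rough, illustrative model (my own assumption, not from the article): if each transistor works independently with some probability, the chip works only if all of them do, so yield falls off exponentially with transistor count.

```python
# Rough, illustrative yield model (an assumption for this sketch, not from
# the article): if each transistor works independently with probability p,
# a chip with n transistors works with probability p ** n.

def chip_yield(p_per_transistor: float, n_transistors: int) -> float:
    """Fraction of chips expected to work under an independent-failure model."""
    return p_per_transistor ** n_transistors

# A 95%-per-transistor process is workable for a 6-transistor circuit...
print(f"6 transistors @ 95%: {chip_yield(0.95, 6):.2%}")       # ~73%
# ...but hopeless for even a small few-thousand-transistor CPU.
print(f"3000 transistors @ 95%: {chip_yield(0.95, 3000):.0e}")
# Getting a usable yield at 3000 transistors needs per-transistor
# reliability around 99.98% or better:
print(f"3000 transistors @ 99.98%: {chip_yield(0.9998, 3000):.2%}")
```

Real yields are dominated by spatially correlated defects rather than independent failures, so this overstates the difficulty somewhat, but the basic point stands: going from 6 transistors to thousands is a qualitative jump in process control.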
If (GP post) you want an automated device that makes an IC for you given a digital design file, well, hm. The closest things we have today are companies that manage and run the equipment for you (fabless semiconductor companies (Qualcomm, AMD...[3]) give their chip designs to, e.g., TSMC to manufacture). Academic researchers often send parts in together to reduce costs[4], in which case you could get tens of identical (reasonably simple) chips for several thousand dollars. Someone linked to [5], which looks like an attempt at a more open, hobbyist-friendly version of the same thing. I did run across [6] once, which _does_ seem to be attempting to make an easier-to-use, very small, automated system. I've no idea what their status is.
A desktop device as simple to use as a 3D printer is barely even on the conceptual possibility level at the moment, and then only when people start talking sci-fi self-assembly and molecular nanomanufacturing and a century of R&D.
I've been doing good work in computing since before most of the audience on Hacker News was born, and I'm still at it, e.g., with an ambitious Web site.
From all my experience, far and away, the biggest bottleneck and ball and chain on progress in computing is bad technical writing: bad documentation, poor ability to describe technical work. One of the biggest problems here is undefined jargon, and within that, one of the biggest problems is undefined acronyms.
In particular, even with my decades of experience, there is no good way for me to guess that the acronym "IC" abbreviated "integrated circuits". For an audience as broad as all of Hacker News, there is NO way to know.
In particular, writing
"Homemade Integrated Circuits"
is plenty short so that there is no good reason to write
"Homemade IC"
That acronym is undefined and obscure.
Technical writing in computing is awash in undefined and obscure acronyms, and that is part of the bottleneck and ball and chain on progress.
The situation in technical writing broadly, e.g., in math, science, engineering, is clear: When jargon is used for the first time, define it or at least give a link to a definition. With the Internet, giving links to definitions is especially easy and convenient.
For the sake of progress in computing, I urge computing and the Hacker News audience (i) to minimize use of obscure acronyms and (ii) on the first use of jargon always define or link to a definition. This advice is rock solid and just technical writing 101.
In the meanwhile, indeed, I'd be more interested in homemade ice cream than homemade integrated circuits -- even for the Hacker News audience, the acronym IC more likely abbreviates "ice cream" than "integrated circuits".
> In particular, even with my decades of experience, there is no good way for me to guess that the acronym "IC" abbreviated "integrated circuits".
IC to mean integrated circuits is a very common acronym, and right in the center of the HN focus.
I’ve literally never once seen “IC” used in any context to refer to any of the strained alternatives you propose.
It really feels like you had a canned rant that you've been waiting to find a place to use, and you decided to strain to make it seem like it fit the first thing that you felt like you might get away with stretching it to cover.
> even for the Hacker News audience, the acronym IC more likely abbreviates "ice cream" than "integrated circuits".
I disagree that this is true, either in general or in the specific context.
"Individual contributor" is another common "IC", although following "homemade" it would be unlikely enough that I didn't notice it as a candidate interpretation until stumbling across this thread and considering the acronym independently.
> IC to mean integrated circuits is a very common acronym, and right in the center of the HN focus.
Wrong. IC is for hardware. The center of Hacker News is software, NOT hardware. Commonly the audience here works with software for hours a day but goes for weeks without ever seeing an IC or hardly even thinking about one.
You are just trying to pick a fight with me.
Again, once again, over again, yet again,
Homemade IC
is more likely to mean ice cream than integrated circuit.
So, to pick a fight, you pick IC out of the context to say that generally IC means integrated circuit more than ice cream -- true but trivial, beside the point. Again, the title was
Homemade IC
and THERE there is no telling what the heck IC meant, and even ice cream, even at Hacker News, is more likely.
My point about the worst bottleneck in computing is rock solid and very important for computing and the Hacker News audience and fully appropriate. Your "rant" is insulting and provocative.
Your point that
Homemade IC
clearly meant integrated circuit is absurd, just deliberately insulting. You are just trying to pick a fight.
Resist all you want: It remains, computing has a severe bottleneck -- bad technical writing with undefined jargon and acronyms. And Hacker News titles make WAY too heavy use of acronyms. Disagree, fight, resist, object, all you want -- you are still wrong.
Here you are doing the usual for an angry person with weak arguments -- you are attacking the person instead of the ideas. That
Homemade IC
is obscure jargon is true beyond any question. So, you accuse me of a "rant": My original response was short. Then I got attacked.
Sorry, but you seem much more like the one trying to pick a fight at all cost here with your ridiculous insistence on ice cream and doubling down with these giant replies.
You could have just asked for the title to be clarified to avoid a potentially unknown acronym.
Sure I COULD have done lots of other things, but it is insulting and patronizing for you to suggest I SHOULD have done what you suggest.
I did NOTHING wrong.
My point about undefined jargon is rock solid. My point was clear enough in my original post. My responses are only to explain with grossly excessive clarity to defend myself against people who want to attack for whatever reason.
There are some very thin skinned, hostile people on Hacker News.
And my early statement is literally true: I'd be more interested in homemade ice cream than homemade integrated circuits and justifiably so.
All my points are just dirt simple, grade school stuff, explained over and over with outrageously overwhelming clarity, yet some people want to fight for whatever reasons. After my short, clear, obvious post, I just defended myself.
The people attacking me, usually personally instead of my ideas, are finding NOTHING wrong with what I stated but are embarrassing themselves.
It should be enough to be correct, and I am fully correct.
I bet 99% of the people who come to HN absolutely know that IC means Integrated Circuit. It's very common, it's up there with stuff like CPU and RAM and D&D and RMS for being things that don't need to be explained here. I wouldn't expect someone to have to explain what ASIC stands for on HN, much less IC. If you see something you don't understand, congratulations, you're about to learn something new today! Look it up on Google et voila! Not something to complain about.
You are straining to miss the point. Again, yet again, over again, once again, the question was what does
Homemade IC
mean? It might mean
Homemade Ice Cream
You want to say that IC usually abbreviates "integrated circuit" and nearly never "ice cream", and that is true but nearly irrelevant, since the issue is what "IC" means in
Homemade IC
There
Homemade Integrated Circuit
is tough to swallow because making integrated circuits usually takes $billions and lots of highly dangerous chemicals, all super tough to do at "home". So,
Homemade Ice Cream
is actually MORE likely, even at Hacker News.
You can see this. It's grade school stuff. You are just having fun arguing against the obvious and are embarrassing yourself.
Why? Because a huge fraction, no doubt a huge majority, of the Hacker News audience concentrates on software.
In more detail, a lot of the audience uses laptop computers, WiFi hubs, smartphones, etc. and never sees an integrated circuit, not even in its plastic box on a circuit board.
I plugged together my most recent computer so saw the motherboard with its many integrated circuit packages, handled the integrated circuit of the processor, an AMD FX-8350, installed several adapter cards with their visible integrated circuit packages, etc. but still could not be sure about the meaning of "IC" in the title. Besides, making an integrated circuit at home is a rare and strange, also possibly interesting, thing.
Again, computing needs to work really hard to avoid use of undefined jargon.
Again, literally, even for the Hacker News audience,
Homemade Ice Cream
is more likely than
Homemade Integrated Circuits
Again, once again, over again, yet again, one more time, the biggest bottleneck, a real ball and chain, on progress in computing is bad technical writing, and undefined jargon is one of the worst parts. Again, ..., in essentially all the more important technical writing it is 101 level rock solidly standard always, no question, to expand acronyms.
Right, there are some exceptions -- HTTP, HTML, URL, but the full list is short, and IC is not on it. Neither are JS, ASIC, CSS, ACL, OO, JSON, RSA, LDAP, CMIS/P, SMTP, SNMP, ASP.NET, ADO.NET, and some hundreds more.
There used to be CICS, IMS, MVS, VTAM, ISAM, VNET, SNA, IPL, RACF, CP67/CMS, VM, DB2, etc. which was priesthood jargon for some years but gone now.
This is just a rock solid technical writing lesson 101. Accept it or not as you wish.
Jargon is an insider thing, and everyone else gets irritated.
Of course I came across IC some tens of thousands of times, from silicon, germanium, 1 micron down to 3 nanometers, Moore's law, Dennard scaling, static random access memory (SRAM), dynamic random access memory (DRAM), scanning-tunneling electron microscopy, vacuum deposition, extreme ultraviolet (EUV) light sources and optics, etc. Still
Homemade IC
was obscure. Since I've also heard of TSMC (Taiwan Semiconductor Manufacturing Company or some such) and the $billions for making ICs, that an IC could be "homemade" was strange context.
Believe me or not, but I say again, once again, over again, yet again, computing desperately needs to avoid undefined jargon and acronyms, and this case is a good example.
A guy in HK is making a 1um fab with his own process and open tooling.