
There's a very lively startup scene in Berlin, but no-one in that scene seems to know that the world's first digital computer was built in Berlin.

There's an obscure and easy to miss sign (https://en.wikipedia.org/wiki/Konrad_Zuse#/media/File:2007-0...) on the house he lived in while working on it, and another sign at the technical university, but that's about it. No big festivities in his name, no commemorations, nothing.

Just imagine if the first digital computer had been built somewhere in the US or in the UK. I'm sure we'd hear about it until the end of time. Strange how that works.



The Deutsches Technikmuseum commemorates his contributions too. It was one of the things I marveled at in that geek paradise of a museum. Because otherwise, the only time I ever heard of Zuse was in a few paragraphs of the cursory and customary "History of Programming Languages" that opened my PL class at uni.

(Claims below are completely opinions, very unsubstantiated, yours might vary, doesn't matter, just my musings. Take it as a light-hearted commentary, not some hill I'd be willing to die on.)

So what amuses me further is how I find the question of the earliest computer largely political, for lack of a better term. Most of the world would say it's Alan Turing and his work on breaking Enigma; some would claim Charles Babbage. Growing up in SE Asia, I had elementary-school books that would even claim the abacus as the earliest "computing device" (I disagree on principle). And of course, Berlin, Germany would say it is Konrad Zuse.

It seems to me this is because there is a very loose definition of "achieving computing". I won't get into Turing vs Zuse, but consider the arguments I remember for the abacus:

- it aids computation, was used as, basically, a calculator

- yes, you can still make mistakes, but hey, garbage in, garbage out!

- interpreting the configurations of the device to mean something is not so different from interpreting a computer's state through the configuration of lights on your monitor

Of course, if you qualify the debate as achieving electronic computing, then we have a bit more to chew on. But I imagine you could hammer arguments to go either way even with the "electronic" qualifier.


I suspect counting on your fingers predates the abacus. Granted, the abacus is much more powerful, and it is crafted instead of naturally occurring, but the principle of operation seems very similar.


Boasting about national achievements is considered bad form in Germany (considered a trait of people who've got nothing else to show).


*taught to be bad form. You know why[1]. Which only made Germany the hidden champion of hidden champion companies.

[1]https://www.youtube.com/watch?v=IoLIU2NI66w


I do not understand your comment and I am doubly confused after watching the linked video.


Just because Germans are taught not to be proud of their history using the above method (a perfectly sound argument) doesn't mean they aren't. It's just quieter and more subtle.

Apologies for the confusing video, but you're watching a master of dark humor.


Many people I know in Berlin know that one of the first mechanical calculators is in the Technikmuseum Berlin, but those people don't know about concepts like Turing equivalence, so they call it a computer.

Many great inventions have a nationalist folklore surrounding them, often ascribing a development to some national hero while ignoring the process of development as a whole, or parallel developments in other countries (e.g. the car, the airplane, the telephone).

On the other hand, Zuse is indeed obscure in Germany overall. Maybe it has to do with the fact that he wasn't really commercially successful; he did not build some computing empire out of his inventions. He wasn't a Silicon Valley-style entrepreneur. Something to consider for the engineers among us: it's not enough to build/invent a thing.



I knew that there was a proof, but never read the paper before. Now that I did, I think that proof is garbage tbh.

For Turing completeness, a fixed, constant-sized program needs to be able to handle arbitrary inputs and use arbitrary amounts of memory. It's important to distinguish the model from the implementation here. Theoretically, a language like Python is Turing complete. (And BTW you don't need infinite precision integers or anything fancy: you can simply simulate the tape with an object graph.) Practically, every Python program running on a real computer will run out of memory eventually.
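The object-graph tape mentioned above is easy to sketch. This is a minimal illustration of the idea, not anything from the paper; the `Cell` class and its method names are my own:

```python
class Cell:
    """One tape cell. Neighbours are created lazily, so the tape is
    unbounded in principle, limited only by the machine's real memory."""

    def __init__(self, symbol=0):
        self.symbol = symbol
        self._left = None
        self._right = None

    def left(self):
        # Grow the tape leftward on first access.
        if self._left is None:
            self._left = Cell()
            self._left._right = self
        return self._left

    def right(self):
        # Grow the tape rightward on first access.
        if self._right is None:
            self._right = Cell()
            self._right._left = self
        return self._right


# Walk right, write a symbol, and read it back through the graph.
head = Cell()
head.right().symbol = 1
print(head.right().symbol)          # 1
print(head.right().left() is head)  # True
```

No unbounded integers needed: the program text is fixed, and the tape grows one cell at a time as the simulated head moves.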

Anyway, the translated Z3 programs in this paper contain one case statement for every possible memory location. This means that for a given program size, you're limited to O(1) space. That's only as powerful as a finite state machine, not a Turing machine.

I see nothing in the paper that convincingly addresses this limitation. There are some arguments to the effect that practical CPUs have limited address space as well. Which is true, but that argument only shows that those aren't Turing complete either, not that the Z3 magically is.

Maybe there's a subtle point that I overlooked, but at the moment, I'm simply not convinced.


Ultimately there is no real computer that is Turing complete, because all real computers are finite. Ultimately, you can represent any real computer with a (very large) DFA.

The two things missing from what would normally be considered a universal computer are conditionals and indirect addressing. The Z3 is also limited in that it executes a specific finite set of instructions in linear order, with no branching.

In the proof, they get around the finite length of a program by literally creating a loop -- they glue one end of the paper tape of the program to the other so that the computer can keep executing the same instruction stream forever. Then they get around the lack of indirect addressing by accessing every memory location in every loop. I agree that it's very much a stretch to call the Z3 Turing complete.
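A toy sketch of that "touch every location on every pass" pattern, to make the trick concrete. This is my own illustration, not the paper's actual encoding, and the 0/1 write-mask trick is an assumption about one way to avoid branches:

```python
# A straight-line program body with no branching, repeated forever
# (the glued paper-tape loop). "Indirect addressing" is faked by
# visiting EVERY cell each pass and using a 0/1 mask to decide which
# write actually takes effect.
MEM_SIZE = 4
mem = [0] * MEM_SIZE
ptr = 0  # which cell the "current" write should land on


def one_pass(mem, ptr):
    target = ptr
    for addr in range(MEM_SIZE):      # one case per memory location
        hit = int(addr == target)     # 0 or 1; pure arithmetic mask
        # Either keep the old value (hit == 0) or write the new one.
        mem[addr] = mem[addr] * (1 - hit) + (mem[addr] + 1) * hit
    return mem, (ptr + 1) % MEM_SIZE


for _ in range(MEM_SIZE):  # stand-in for the endless glued-tape loop
    mem, ptr = one_pass(mem, ptr)
print(mem)  # each cell incremented exactly once: [1, 1, 1, 1]
```

The cost is plain: every simulated step does work proportional to the whole address space, and the address space is fixed by the program text, which is exactly the O(1)-space objection raised above.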


Wouldn't a DFA for a computer have to have 2^n states for n bits of memory, so its size would be exponential in the memory size of the computer? Even for a pedestrian 640K of memory (about 5.2 million bits), that's roughly 10^1,600,000 states. Sounds a lot like trying to simulate a non-deterministic Turing machine on a deterministic one: the state space explodes exponentially, making it intractable.
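A quick back-of-the-envelope for that state count (plain Python, stdlib only):

```python
import math

# A machine with n bits of state has 2**n distinct configurations,
# so a DFA modelling it needs up to 2**n states.
bits = 640 * 1024 * 8  # 640 KB of memory, in bits
print(bits)            # 5242880

# Number of decimal digits in 2**bits, computed via logarithms
# (actually building the integer's string would be impractical).
digits = int(bits * math.log10(2)) + 1
print(digits)          # about 1.6 million digits
```

So the DFA has on the order of 10^1,600,000 states: astronomically large, but "only" singly exponential in the memory size.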


Z4/ETH-Edition has conditional jumps, which makes it much less contentious.


I clicked through the first link because I wanted to see what was "funny" about it. Turns out, the way you simulate a Turing machine on the Z3 is very similar to the execution model of AWS step functions.

https://aws.amazon.com/step-functions/


Historians say: the "argument that Konrad Zuse's 1943 Z3 computer was universal was an impressive party trick, but diverged entirely from the way the machine was designed, how it was actually used, or indeed from anything that would have made sense in the 1940s... To us, the real lesson of his analysis is that the Z3 could have been Turing complete with only minor design changes, but that it wasn't because the concept and its benefits were not yet widely understood."

I agree with that analysis and think Turing-completeness is overrated.


It's an interesting what-if to consider what would have happened if Zuse had had support from the Allies post-war in the same way that Volkswagen did.

Surprised, in a way, that Zuse wasn't snapped up by Operation Paperclip.


I think that's because he didn't have a lasting impact on Berlin (or Germany) besides being the first.

Zuse built the first computers in Berlin, but they were only used in the wartime government ministries, AFAIK. After the war, he sold his first computer (which I would consider commercially available) to ETH Zurich in Switzerland. He founded two companies, but neither was successful compared to, e.g., Intel. He would be really famous and well known if he had managed to create a computing giant. He never had a university position, so his papers were not well known compared to Turing's, and there weren't a lot of them. He didn't even publish his PhD thesis, in which he invented the first programming language and created an elaborate demo in which he programmed the first chess engine (also AFAIK). I think in science being the first is overrated and not that important; you have to publish and popularise your idea to really have an impact (Schmidhuber would disagree). So his intellectual legacy was merely that he was the first, but not really influential. And his commercial success was also limited.

I think it's tragic; it was an enormous wasted potential for Germany. It might be that he never was the great entrepreneur with the skills to build a tech giant, but he was undoubtedly a genius who invented the future. He could have had an enormous impact if he had been equipped with a big lab and funding, or if he had taken the academic route. But Germany wasn't (and still isn't) able to support geniuses who don't complete the usual strict path to professorship (study -> PhD -> habilitation/post-doc).


It probably has to do with World War 2. It wasn't regular Germany that it was built in; it was a Germany controlled by an insane lunatic.


Zuse is one of the more underappreciated figures in the foundational periods of computer science due to this fact.

I find it quite interesting that being on the losing side of that war meant he had to make do with limited resources, which positively influenced his design choices and as such shaped his perspective on what computers can and should be.

Neil Gershenfeld at the MIT Center for Bits and Atoms has often said that 'computer science is one of the worst things to happen to either computing or science. The canon of computer science that's currently taught prematurely froze a model of computation based on 1950s technology'

I think you can see a sort of stunted alternate path of computation and the connection it has to the physical world in the kinds of projects that Zuse participated in post WWII:

https://www.youtube.com/watch?v=odwgpKRnWM8

https://www.youtube.com/watch?v=Pxv4vDMOFCo

For various reasons Turing captures the contemporary pop-culture imagination, but I have a feeling that historians in 50-100 years will look back at Zuse as equally influential, if not more so.

I bet von Neumann would get a chuckle out of how influential his throwaway architecture ended up being to our society, and would agree with Gershenfeld about how unfortunate it is that its success prematurely froze the model of computation.


I suspect that von Neumann's and Turing's work will remain the foundation.


I think that's also because Zuse was always relatively obscure during the war, he never was a "nazi celebrity" like Wernher von Braun. Zuse's computer experiments were not considered "vital to the war effort", funding got denied so it always remained a "basement project". And after the war computers were hardly the most important thing to work on in Germany either. Also AFAIK Zuse himself didn't consider his inventions all that important and earth-shattering, at least before and during the war, he "just" needed a tool to help with computations, so he built one.


I'm not sure that the then German government was fully supportive of Zuse's work. From the Wikipedia article:

A request by his co-worker Helmut Schreyer—who had helped Zuse build the Z3 prototype in 1938[21]—for government funding for an electronic successor to the Z3 was denied as "strategically unimportant".

https://en.wikipedia.org/wiki/Konrad_Zuse


You can find an interview with him about IBM Deep Blue here:

https://research.tilburguniversity.edu/en/publications/an-in...


Thanks for that! He lived long enough to see Windows 95 and the early internet - wonder what he made of all of that.


There are some cool Zuse computers and reconstructions at the Deutsches Technikmuseum. The Z1 is a marvel. Well worth a visit: https://technikmuseum.berlin/ausstellungen/dauerausstellunge...


Well, there's the Zuse Institute Berlin (ZIB) in Berlin, named after him, and quite prominent on the campus of FU Berlin.

Further, there was a conference/show to celebrate Zuse's 100th birthday, organized by the Tagesspiegel newspaper. This event was repeated for the last two years, I think.


There is also a sign on the U-Bahn (I think it is the U5?) mentioning that Konrad Zuse built the first computer.


Also, there's a bust of him in the Spreebogen (Berlin-Moabit).


Isn't SUSE Linux named after Konrad Zuse?


As far as I know it's just an abbreviation that, by accident, is close to "Zuse", but has nothing to do with it (someone might want to correct me here if I'm wrong)

It was developed by Germans, so SuSE means: Software und System-Entwicklung (or english: Software and systems development)


I always thought that this was not really a coincidence, but a clever pun on Zuse's name.


I'm German, and it never occurred to me as a pun. SuSE started out with a hippy-ish corporate style in the 90s so I guess they didn't intend to associate with old school computers but I may be wrong. I remember Siemens/Nixdorf financing/advertising the Konrad-Zuse-Museum, though.


Very unlikely. “Suse” is the short form of “Susanne” in German, similar to “Suzie” for “Suzanne” in English. For that reason Germans wouldn’t perceive it as a pun on “Zuse”, and also because in German pronunciation the soft “s” in “Suse” is phonetically very different from the hard “ts” in “Zuse”.


Knowing Germany and the German love for any kind of wordplay and puns, you're probably correct.


Germans don't care much about those things, because there is so much history everywhere. A small invention like a computer doesn't have much relevance in that environment.

Additionally, Zuse collaborated with and was supported by the Nazi regime, even though he himself was not a Nazi, nor did he directly support their crimes. But this stain diminished education about his work for a long time.


As was the Volkswagen, and much, much more directly.



