Hacker News | mrob's comments

Technically correct only if you accept "vague set of traditions" as a valid definition for "constitution". This both contradicts common usage and enables tyranny, so I recommend rejecting it.

The UK constitution isn't a "vague set of traditions"; it is spread across a number of Acts of Parliament.

Where can I find the official list of which acts are part of the constitution? And what additional obstacles exist to changing those acts beyond the obstacles to changing non-constitutional acts of parliament? In common usage, a constitution is something that restricts changes to ordinary law. If a "constitution" is made entirely from ordinary law it cannot function as a constitution.

Before demanding more explanation on the spot, you know there's a Wikipedia page for that. It explains the components, how they are legitimated, and the mechanisms of the UK government that rely on them.

It's only a spike if it comes down. Every RAM chip is a lottery ticket with a plausible chance of giving one lucky winner fabulous prizes like absolute dictatorship of the entire world and physical immortality. What else are the billionaires going to spend their money on? Arms races can absorb unlimited resources.

The growing prevalence of so-called "supply-chain attacks" (a bad name because it implies a commercial relationship that doesn't usually exist) shows that "common sense antivirus" isn't working so well even among the technically inclined.

Because whoever wins the AI race (assuming they don't overshoot and trigger the hard takeoff scenario) becomes a living god. Everybody else becomes their slave, to be killed or exploited as they please. It's a risky gamble, but in the eyes of the participants the upside justifies it. If they don't go all in they're still exposed to all the downside risk but have no chance of winning.

I don't expect hardware prices to go down unless the third option (economic collapse) happens before somebody triggers the dystopia/extinction option.


Just to add some slight nuance, but it's an important distinction:

They aren't all necessarily racing to be "god", some are racing to make sure someone else is not "god".

If it weren't for Altman releasing ChatGPT, it's very likely that we would have markedly less powerful LLMs at our disposal right now. DeepMind and Anthropic were taking incredibly safe and conservative approaches towards transformers, but OAI broke the silent truce and forced a race.


Unlike the tree, nobody can take your idea away from you. You retain possession of your idea even if somebody copies it. It sounds insane to me to think you should get permanent control over other people's communication just because you had the idea before them.

Christopher Strachey wrote a version of draughts/checkers for the Manchester Mark 1 that was fully functional in 1952. This is IMO the first video game. Earlier candidates use single-purpose display hardware, which disqualifies them from being "video".

https://en.wikipedia.org/wiki/Christopher_Strachey


If the Wikipedia image is accurate, it's technically not "displaying" the board, it's just in RAM. The RAM just happens to be visible. But you get into a lot of technicalities when talking about the "first video game", so it's up to interpretation. There was the "Cathode-ray tube amusement device" in 1947 that, by some interpretations, could also be the "first video game": https://en.wikipedia.org/wiki/Cathode-ray_tube_amusement_dev...

I think it is at least safe to say that Pong isn't the first.


That's a pretty weird distinction to make.

I remember back in the 80s writing a CGA text-mode game (they were quite in vogue at the time), and (as I assume most programmers did) I used the video memory directly as the source of truth about the current state of the level.

OP's distinction, that video is a raster-based signal fed into a regular TV-like device rather than vector-based output or hard-wired lights, seems sensible. How that video signal is generated is kind of irrelevant.


The Manchester Mark 1 had a teleprinter as its output, and used a Williams tube as RAM (https://en.wikipedia.org/wiki/Williams_tube). If the image on Wikipedia is accurate, the checkers game only "displayed" itself incidentally, on the Williams tube, rather than actually outputting to the teleprinter. In your game, it would be like writing the current level to internal RAM rather than to the actual video memory. The Williams tube isn't really a TV-like device. It stores data on a CRT, but that CRT isn't visible to the user in general operation, as the read plate covers the "screen". Again, "first video game" is up to a lot of interpretation.

Also, saying that vector based video makes it not a video game is a little strange, given how common vector graphics were in arcades (eg Asteroids, Tempest, Missile Command) and the Vectrex


Not necessarily: you can just take an additional CRT and wire it in parallel to one of the Williams tube CRTs to see what's on the screen.

That's how the Manchester Baby did it (visible in the center of the image here): https://upload.wikimedia.org/wikipedia/commons/6/6f/Manchest...


I'm not necessarily making the point that vector-graphics-based games aren't video games, just arguing against the parent comment's claim that it wasn't a video game because the image was only stored in RAM.

I agree with the assertion that this was a video game because it was using a raster-based CRT for the display, even though the primary purpose of that display was data storage, not display.


I don't know why being vector-based would disqualify this from being a video game.

https://www.arcade-museum.com/Videogame/star-wars

It's from 1983, which disqualifies it from being early, of course.


I don't think it's necessary for video RAM to be separate from code RAM. The BBC Micro game "Revs" runs code from the video RAM and sets the palette to make it look like blue sky.

https://en.wikipedia.org/wiki/Revs_(video_game)

CRT Amusement Device is IMO disqualified for not using any form of computer.


The CRT Amusement Device uses a video display and has game-like elements; you could argue that makes it a "video game" (as opposed to a "computer game").

Computer games can have no video at all, with the "display" being sent over a serial output to either a terminal or printed paper.

Nethack/Slashem, text adventures, Sokoban, Trek... can be printed one sheet at a time and be totally playable. With Slashem it might be a big waste of paper, but with text adventures you can just reuse the output (obviously) and avoid tons of further typing, because you already have the whole scrollback printed in your hands.


Consumer choice only works when there's a free market. Computer systems are encumbered by copyright and patent monopolies, so there's no free market. I can't buy a third-party Macbook. Because these monopolies are granted by the state it's reasonable for the state to correct any market failures they cause with regulation.

In this context, "general purpose" means "Turing complete" in the informal sense of handwaving away the requirement for infinite storage space.


What you say changes nothing.

The earlier relay computers were Turing complete.

For ENIAC it also does not make sense to claim that it was Turing complete. Such a claim can be made for a computer controlled by a program memory, where you have a defined instruction set, and that instruction set may be complete or not. If you may arbitrarily rewire the execution units, any computer is Turing complete.

The earlier ABC electronic computer was built for a special purpose, the solution of systems of linear algebraic equations, just as ENIAC was built only for a special purpose, the computation of artillery tables.

By rewiring the ABC electronic computer you could have also computed anything, so you can say that it was Turing complete, if rewiring is allowed.

The only difference is that rewiring was simpler in ENIAC, because it had been planned to be easy, so there were special panels where you could alter the connections.

Neither ABC nor ENIAC had enough memory to be truly general-purpose, and by the end of the war it was recognized that this was the main limitation for extending the domain of applications, so the ENIAC team proposed ultrasonic delay lines as the solution for a big memory (inspired by the use of delay lines as an analog memory in radars), while von Neumann proposed the use of a cathode ray tube of the kind used in video cameras (iconoscope; this was implemented first in the Manchester computers).

Because ENIAC was not really designed as general-purpose, its designers originally did not think about high-capacity memories. On the other hand, John Vincent Atanasoff, the main designer of the ABC computer, wrote a very insightful document about the requirements for memories in digital computers, years before ENIAC, where he analyzed all the known possibilities and where he invented the concept of DRAM, but implemented with discrete capacitors. Later, the proposal of von Neumann was also to use a DRAM, but to use a cheaper and more compact iconoscope CRT instead of discrete capacitors.

While the ABC computer was not general-purpose as built, the document written by Atanasoff in 1940, “Computing Machine for the Solution of Large Systems of Linear Algebraic Equations”, demonstrated a much better understanding of the concept of a general-purpose electronic digital computer than the designers of ENIAC would demonstrate before the end of 1944 / beginning of 1945, when they realized that a bigger memory is needed to make a computer suitable for any other applications, i.e. for really making it "general purpose".


The Z3 was only general purpose by accident, and this was only discovered in 1997 (published 1998). [0] It's only of theoretical interest because the technique required is too inefficient for real-world applications.

ENIAC is notable because it was the first intentionally general purpose computer to be built.

[0] https://www.inf.fu-berlin.de/inst/ag-ki/rojas_home/documents...
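The trick that makes a machine without conditional branches Turing complete in principle is to evaluate both arms of every conditional and select between them arithmetically, which is also why it is so inefficient in practice. A minimal illustrative sketch in Python (my own toy example; the Z3 was programmed via punched tape and fixed-point hardware, nothing like this code):

```python
# Branch-free conditional selection: "if c then a else b" computed
# with arithmetic only. Both a and b must already be evaluated,
# which is the source of the technique's inefficiency.
def select(c, a, b):
    # c must be 0 or 1; returns a when c == 1, b when c == 0
    return c * a + (1 - c) * b

# Example: absolute value without taking a branch on the result.
# (Deriving the sign bit itself without branching is left to the
# hardware; here Python's comparison stands in for it.)
def abs_branchfree(x):
    is_neg = 1 if x < 0 else 0
    return select(is_neg, -x, x)
```

A program then runs every instruction unconditionally, using such selects to mask out the effects of the "not taken" path, so runtime grows with the size of all paths combined.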


I do not think that it is right at all to say "intentionally general purpose computer".

ENIAC was built for a special purpose, the computation of artillery tables.

It was a bespoke computer built for a single customer: the United States Army's Ballistic Research Laboratory.

This is why it has been designed as the digital electronic equivalent of the analog mechanical computers that were previously used by the Army and why it does not resemble at all what is now meant by "general-purpose computer".

The computers of Aiken and Zuse were really intentionally general-purpose, their designers did not have in mind any specific computation, which is why they were controlled by a program memory, not by a wiring diagram.

What you claim about Z3 being general purpose by accident does not refer to the intention of its designer, but only to the fact that its instruction set was actually powerful enough by accident, because at that early time it was not understood which kinds of instructions are necessary for completeness.

All the claims made now about ENIAC being general-purpose are retroactive. Only after the war ended and the concept of a digital computer became well understood, the ENIAC was repurposed to also do other tasks than originally planned.

The first truly general-purpose electronic digital computers that were intentionally designed to be so were those designed based on the von Neumann report.

Before the completion of the first of those, there were general-purpose hybrid electronic-electromechanical digital computers, IBM SSEC being the most important of them, which solved a lot of scientific and technical problems, before electronic computers became available.


A counterargument is that Mauchly was actually interested in using computers for weather modeling, and I'm sure that influenced the design of ENIAC. He could only get ENIAC funded if it was valuable to the war effort. I've read quite a lot about that machine and I'm not aware of any architectural features that were specific to ballistics calculations. This is unlike the British Colossus, another early computer, which was specifically designed for code breaking and wasn't general purpose.

As for the objection that it wasn’t stored program, I was interested to learn that it was converted to stored program operation after only two years or so of operation, using the constant table switches as the program store. But the Manchester Baby, which used the same memory for code and data was more significant in the history of stored program machines.

On the general question of “first computer”, I think the answer is whatever machine you want it to be if you heap enough conditional adjectives on it.


> Mauchly was actually interested in using computers for weather modeling and I’m sure that influenced the design of ENIAC

True. Mauchly was a physics professor interested in meteorology, and he knew that predicting the weather and calculating an artillery shell's flight are mathematically the same type of problem, which was important for getting funding. In the fifties, ENIAC was even used to calculate weather forecasts (see https://ams.confex.com/ams/2020Annual/webprogram/Manuscript/...). So these were just two related special problems, and it would be a stretch to interpret this as an intention to build a general-purpose computer. The latter had to wait until the sixties.


> The Z3 was only general purpose by accident ... ENIAC [..] was the first intentionally general purpose computer

That's a pretty academic take. Neither Eckert, nor Mauchly, nor Zuse knew about Alan Turing’s 1936 paper when they designed their machines. The classification of ENIAC (and the Z3) as a "universal Turing machine" is entirely a retroactive reinterpretation by later computer scientists. John von Neumann knew the paper and was aware of its significance, but he only joined the ENIAC project when the design was complete. By that time, Eckert and Mauchly were already well aware of ENIAC's biggest flaw (the massive effort to reprogram the machine), and in fact they came up with the stored-program concept which von Neumann later formalized. ENIAC’s funding and primary justification were for the very specific purpose of calculating artillery firing tables for the military. The machine was built for that purpose, which included the features that retroactively led to the mentioned classification.


Still feels like history written by the victors (of WW2 and computing, eventually) in this case. If you want to be mathematically precise, it's been proven to be Turing-complete. If you want to use common sense (IMO better), it was one of the most significant leaps in automated computation and simply didn't need to do more for its intended applications. For conditional branches to make sense, you also need a fast temporary storage array (since it would be awfully slow running directly off tape like a Turing machine), and to realize that all that effort makes sense, you first need to play with a computer for a while and discover the new possibilities.


>It starts by calling all people who listen to anything other than classical music “illiterate”!

It does not. The only reference to literacy is the following:

"The use of the word “song” for instrumental music — that is, music that is not sung and hence is not a song — is borderline illiterate."

That is entirely reasonable and correct.

