Hacker News | marcosdumay's comments

I don't think you can make any statistical argument from accident data from a single year.

Yes, there is data out there covering everything. But those events are so rare that you can almost never differentiate a normal year from an abnormal one.


You absolutely can see a difference. [0] The term of art is "Runway Incursions", and the stats definitely show our airports are working at the limits of safety.

[0]: https://www.buckycountry.com/2025/09/22/runway-close-calls-u...


That's a 7-year graph, where Category A incursions change by 0.7σ and total incursions are basically flat.

What statistical conclusion are you taking from it?


Category A and B incursions increased by 2.8σ. Further, it was 7 years of increases in a row. Either factor on its own would indicate a process out of statistical control.
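
The "out of statistical control" claim can be sanity-checked with a back-of-the-envelope sketch, assuming Poisson-distributed yearly counts. The numbers below are made up for illustration, not the FAA's actual figures:

```python
import math

# Hypothetical yearly counts of serious (Category A/B) runway
# incursions -- illustrative only, not the real data.
counts = [12, 13, 15, 16, 19, 22, 26]

# Under a Poisson null, the variance of a yearly count equals its
# mean, so sigma for the baseline year is sqrt(mean).
baseline = counts[0]
z = (counts[-1] - baseline) / math.sqrt(baseline)
print(f"change in sigmas: {z:.1f}")

# Second, independent signal: the probability that 7 exchangeable
# years come out strictly increasing purely by chance is 1/7!.
p_monotone = 1 / math.factorial(len(counts))
print(f"P(strictly increasing run): {p_monotone:.5f}")
```

Under that null, a strictly increasing run of 7 exchangeable years has probability 1/7! ≈ 0.0002, so either signal alone is hard to dismiss as noise.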

> how many web servers abort request processing if the connection drops?

I don't think I have ever seen a published web service whose error log wasn't full of broken-pipe messages. So, AFAIK, all of them.


You only get a broken pipe when you write, which is often after you've already done most of the work.
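
That asymmetry is easy to demonstrate with a local socket pair standing in for a client/server connection (on real TCP the timing can differ: the first write after a disconnect may still succeed before the RST arrives):

```python
import socket

# A connected pair of sockets stands in for a web server and a
# client that drops the connection mid-request.
server, client = socket.socketpair()
client.close()  # the "client" disconnects

# Reading from the dead connection is silent: you just get EOF.
assert server.recv(1024) == b""

# Only a *write* surfaces the broken pipe -- which is why a server
# that doesn't write until the response is ready has already done
# all the work before it can possibly notice the disconnect.
try:
    server.sendall(b"HTTP/1.1 200 OK\r\n")
    raised = False
except BrokenPipeError:
    raised = True
print("broken pipe on write:", raised)
```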

> I think programming languages have a tendency to pick up cute features that give you a little dopamine kick when you use them, but that aren't actually good for the health of a substantial codebase.

That's not the case with Haskell.

Haskell has a tendency to pick up features that have deep theoretical reasoning and "mathematical beauty". Of course, that doesn't always correlate with codebase health very well either, and there's a segment of the community that is very vocal about dropping features because of that.

Anyway, the case here is that a superficial kind of mathematical beauty seems to conflict with a deeper case of it.


I always felt monads were an utterly disgusting, though quite practical, hack. They never felt like mathematical beauty to me, but like a trick to fool the optimizer into not sequencing events out of order.

By "is worth it" you mean it's worth the work?

Because it's very little extra work.

If you want to know if it's a good syntax, AFAIK it's the only way to do a semicolon-less language that doesn't break all the time.


Looks at 11 languages, ignores Haskell or anything really different...

Once I learned Haskell, everything else started to look pretty much identical. Java, C, C++, Smalltalk... At least Lisp looks a little bit different.

or Raku

Those are functional languages that generally don't use statements, so it makes sense to leave them out of a discussion about statement separators. If you think more people should use functional languages and so avoid the semicolon problem altogether, you could argue that.

Functional hardly matters. Haskell has plenty of indentation, which is, by the way, interchangeable with `{ ... }`; one can use either at one's own pleasure, and it's needed for many things.

Also, famously, `do { x ; y ; z }` is just syntactic sugar for `x >> y >> z` in Haskell, where `>>` is a normal pure operator.


Yet the author ends with a half-baked clone of the Haskell syntax.

And that gives US people the right to go there and murder a few thousand extra people?

What it gave the US was an added incentive to take down what is unarguably one of the world's most evil and dangerous regimes.

Would you attack the US because they "murdered" thousands of Germans to take down Hitler in WW2?


If you want to point at evil and dangerous regimes, I have a list, and Iran wouldn't even be in the top 3...

Obviously your list is different from mine.

Throughput in congestion is determined mostly by how quickly drivers react to the opportunity to move and how many points of attrition are in a path. Both of which are impacted by the number of cars and how well they brake or accelerate, not by their size.

There's room to claim large cars cause attrition, but that's completely dependent on the local properties of the streets.


The footprint of the car matters. When cars get 5% longer, the same number of people in cars takes 5% more roadway, which adds up quickly, because the difference between smoothly-flowing traffic and jammed traffic is a fragile equilibrium dominated by breakpoints. Furthermore, heavier cars accelerate and decelerate slower than lighter cars, which has a compounding effect on decreasing overall throughput.
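
A back-of-the-envelope model of that effect, with illustrative numbers (reaction time, lengths, and braking decelerations below are assumptions, not measurements):

```python
# Single-lane throughput: vehicles/hour is speed divided by the
# space each vehicle occupies (its own length plus the gap implied
# by reaction time and stopping distance). All values illustrative.

def throughput(speed_ms, length_m, decel_ms2, reaction_s=1.5):
    gap = speed_ms * reaction_s + speed_ms**2 / (2 * decel_ms2)
    vehicles_per_second = speed_ms / (length_m + gap)
    return vehicles_per_second * 3600

v = 13.9  # ~50 km/h city traffic, in m/s

compact = throughput(v, length_m=4.2, decel_ms2=7.0)
# Longer and heavier, so it also brakes slightly worse:
large = throughput(v, length_m=5.5, decel_ms2=6.0)

print(f"compact: {compact:.0f} veh/h, large: {large:.0f} veh/h")
print(f"throughput loss: {100 * (1 - large / compact):.1f}%")
```

With these assumed numbers, the longer, worse-braking car costs several percent of lane throughput even before any congestion feedback effects kick in.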

That isn't true. Most of the space a car takes is empty, as you need long distances between cars.

No, the length you need between cars is variable and depends on the speed of traffic and the time it takes for a car to come to a stop. The longer a car is, the heavier it is (frames do not have negative weight), and the heavier it is, the longer the stopping distance is. Please don't bother commenting further on something you're so belligerently clueless about.

That larger cars cause diminished throughput is pretty solidly demonstrated through a variety of modeling and real-world traffic analysis.

https://www.researchgate.net/publication/365069344_How_the_r...


It would be great if people stopped dismissing the problem that WASM not being a first-class runtime for the web causes.

So, what policy do you change after you learn that economic activity always concentrates in a small part of the city? Do you go and outlaw a natural law?

1. Stop restricting what can be built, especially in the super valuable city centers

2. Shift taxes off of building (which punishes development) and onto the passive holding of land (which discourages idle land speculation)

In other words, change man's laws, not nature's.


You know that both of those will make the disparity larger, right?

You'd be aiming to build more centers and fewer outlying regions.

Do they publish the banned coordinates in a list too? Maybe they could put the reason on each line.
