C# is a great language with almost unlimited power and great ergonomics (as the article shows), but the .NET CLR (runtime) is a bit overcomplicated, with a distinct "Java smell", and packaging and distribution are still meh.
If they could make the developer experience similar to Go, it would rule the world...
You also have the option to do single file deployment where it self-extracts the runtime when you run it. It's not as nice but it works and maintains full compatibility.
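For anyone curious, the single-file self-extracting output is a couple of MSBuild properties away. A minimal sketch (the runtime identifier `linux-x64` is an assumption; substitute your target platform):

```shell
# Publish a self-contained, single-file binary: the runtime is bundled
# into the executable, and native libraries are extracted on first run.
dotnet publish -c Release -r linux-x64 \
  --self-contained true \
  -p:PublishSingleFile=true \
  -p:IncludeNativeLibrariesForSelfExtract=true
```

The output lands under `bin/Release/<tfm>/linux-x64/publish/` as one executable, at the cost of a larger file since the whole runtime rides along.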
I actually really like the CLR developer experience next to Java, ngl. I reach for C# in lieu of Java (less J2EE SingletonBeanFactoryManagerInstance slop), but F# in particular is pretty nice to use. Haskell has bad tooling; OCaml is getting better thanks to Jane Street (and if OxCaml gets wide adoption, unboxed types are a big perf win), but if nothing else, the lack of a Rider-esque debugger is just a big time sink.
I drank the Go kool-aid, then tried to do some high-performance things the Go way: it didn't work (channels are slow), and I got over it. Still think Go is great for web backends and the like, with a production-grade stdlib.
Maybe the content is great, but the AI writing style is really grating with its staccato sentences and faux-"profoundness". Can't bear it any more, stopped reading.
"You’re not checking logic. You’re checking shape.". Ugh.
Sorry for that, everyone. I did use AI to help me with structure and English. I thought I'd proofread and edited it enough to be readable, but apparently it still smells. I'll update the wording soon.
Or you can just write in your native language, and let us machine-translate it? Just a thought. We are, perhaps, letting ourselves be held back by norms that no longer bear any load.
That's a great idea, in fact. I'll try it out next time. Maybe even a mix, because I do sometimes want to be very specific about certain expressions and experiment with wordplay.
The mental model I'm using for online writing is that it's analogous to the spectrum of `pretending <-> acting`. The worst writing (AI or otherwise) looks, sounds, and feels like pretense, like a kid who tucks a towel into his shirt and runs around pretending to be a superhero. Meanwhile, acting, true acting, is invisible; it is a synonym for _being_[1].
That said, a lot of AI writing feels "procedural", in the sense that most corporate writing (whitepapers, press releases, etc.) feels procedural (i.e. the result of a constructed procedure). Before AI, the constructed procedure was basically that a piece of writing passed through a bunch of people (e.g. engineering -> management -> marketing -> website/email), and the output was a bland, forgettable pablum designed to (1) be SEO-friendly, (2) be spam-filter-friendly, (3) be easy to ingest, (4) look superficially trustworthy and authoritative (e.g. inflated page count, extra jargon, numbers, plots), and (5) look like it belongs to the "scene" or "industry" by imitating all the other corporate writing out there[2].
AI is interesting, in the same way that computers or the internet or an encyclopedia are interesting: how people choose to use it tells you a lot about them. All of those technologies can be used to compensate for a lack of skill (it helps one pretend), or they can be used to forge a skill (it helps one become).
One has to pretend before they can act (I guess? Feels intuitively correct to me). So perhaps AI (and the web, and the computer, and the encyclopedia) is only harmful to the extent that it does not nudge a person towards becoming[3]? And if so, that's a _cultural_ limitation, not a technological one.
[1]: I am not an actor, and so I might be wrong, but that is the impression I get from just watching and analyzing the acting in various films.
[2]: this becomes frustrating when you get criticized for producing something that "reads like $famousSomething", and then you get criticized again for producing something that "does not read like $typeOfFamousSomething".
[3]: No clue how you (plural -- let's bring back "yous") will convince your boss that you did not take the shortcut, because you were trying to "become more".
Maybe for resume cover letters and LinkedIn posts but I haven't met anyone with half decent taste who prefers AI writing, even well prompted, to skillful human writing. I'm not a stranger to using AI for writing tasks by any means but it's only ever a starting point that gets heavily rewritten by both myself and the model.
It's not hard to get them to copy a style: you just have to provide examples, and they will happily produce similar text, including grammatical and spelling mistakes. The trouble is with composition and novelty. Most of the big models have had all the interesting parts hidden behind a wall of RLHF. Local models are better, since you can use ones that are not indoctrinated as a "helpful assistant", and you can also control the system prompt, the temperature, and see the top-k alternate tokens, which lets you steer them in interesting ways.
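Those sampling knobs are easy to reason about with a toy example. A minimal sketch (pure stdlib; the logits are made up, not taken from any real model) of how temperature and top-k filtering reshape the next-token distribution:

```python
import math

def top_k_probs(logits, k, temperature=1.0):
    """Scale logits by temperature, keep the k largest, renormalize."""
    scaled = [l / temperature for l in logits]
    # Indices of the k largest scaled logits survive the cutoff
    top = sorted(range(len(scaled)), key=lambda i: scaled[i], reverse=True)[:k]
    # Softmax over the survivors only (subtract max for numerical stability)
    m = max(scaled[i] for i in top)
    exps = {i: math.exp(scaled[i] - m) for i in top}
    z = sum(exps.values())
    return {i: e / z for i, e in exps.items()}

# Toy next-token logits for a 5-token vocabulary
logits = [2.0, 1.0, 0.5, -1.0, -3.0]

# Low temperature sharpens the distribution toward the top token...
sharp = top_k_probs(logits, k=3, temperature=0.5)
# ...high temperature flattens it, making "surprising" tokens likelier.
flat = top_k_probs(logits, k=3, temperature=2.0)
print(sharp)
print(flat)
```

Dropping the temperature pushes nearly all the mass onto token 0, while raising it spreads the mass out; top-k simply zeroes everything outside the k most likely tokens before renormalizing.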
>Maybe for resume cover letters and LinkedIn posts but I haven't met anyone with half decent taste who prefers AI writing, even well prompted, to skillful human writing.
That attitude is one, maybe two generations away from extinction. Taste is created by the market, which caters to the young. When enough people have been born into a world in which AI generated culture and communication is the norm, that is what will define what good taste is. People like you (and I) will just come off like old people yelling at clouds.
We can already see this happening at the fringes. People have relationships with AI, they prefer AIs to real people, they use AI as a primary source of truth, they consider AI generated art to be superior to human work, they trust AI more than people. People identify as AI. AI is filling an emotional, sociological and creative space that an increasingly alienating and hostile society denies to people, for better or worse. Generative AI has only been a thing in popular culture for four years or so and it has already completely transformed human society and human sociology.
Barring a complete collapse of the AI bubble, which seems essentially impossible at this point given how invested our economies and governments are in it, that's just what normal is going to be in a decade or so.
Popular taste is guaranteed to be awful since it is driven by economics and fads. That's the type you point out as created by the market and catering to the young. It's a disposable product of consumption used to sell shoes and overpriced paintings.
I don't disagree that it will permeate everything, it already does. It'll just be written by an AI instead of people being paid to find the next style to cop. I don't think it will extinguish human writing, you'll just have AI writing that you feed to official or public channels and then real writing that goes in private or pseudonymous channels. Using AI writing among friends or an in group will still be a faux pas and cringe because it will have become the norm to be rebelled against.
Tangent, but... it must've picked up the faux profoundness on LinkedIn. I find those posts truly unreadable. It half-seriously makes me think letting anyone post anything was a bad move.
I can 100% guarantee you that if you have a computer with 8 GB of RAM, it wouldn't start swapping if you brought up a new process that needed 4 of those 8 GB, even though the operating system reports "8 GB of RAM" in use.
That's a blatant simplification, and does not match reality as far as I've seen.
The OS only has one large source of memory it uses "optimistically": the file/buffer cache. But that's tracked separately, so it doesn't show up in normal memory-usage numbers (unless you don't know how to read them).
The other source of "extra" memory usage is memory mapped executable files, which can be unmapped and then read back on demand. That's quite small though.
Everything else (mostly) is actual memory usage caused by actual drivers and programs (it can be swapped, but that's a major perf hit).
The major reason for Windows memory bloat is the hundreds of inefficient services from both Microsoft and hardware vendors that run at startup. The "optimization" pool (let's not call it that) is way smaller than that.
E.g. pre-loading an application is a pessimization if there's not enough memory: not only does it permanently eat away a portion of the total memory due to the intricacies of Windows working-set management, it will also need to be swapped out when actual demand for memory arises, competing with other disk access.
The only actual "optimization" in Windows is Superfetch, and that barely works these days.
110 °C is not that unusual in the Nordics (although way above average; it's for tougher sauna-goers). I've been in one. Not most people's cup of tea, though; the experience is comparable to the opposite of a long cold plunge.
A dry sauna sounds terminally boring. The point of Finnish saunas is that they are dry and hot, but you can adjust the pain...experience, I mean, by throwing water on the rocks at intervals of your choice.
It doesn't write anything extra to the browser history. How about actually checking before exaggerating? If you are bothered by a single wrong title with the right URL, well... I think something else is wrong.
You are also completely speculating on the intent. Less drama please.
That site/app doesn't have a single piece of information about who's running it, what the privacy policy is (besides some AI slop in the FAQ section) etc. etc. - and you're supposed to put business-critical information into it (according to its demo)?!