Hacker News | moron4hire's comments

About a decade ago, I got to spend a week at a residency centered on immersion and design, provided by the Fallingwater Institute. My group of about 12 people stayed at High Meadow, which is an educational complex the Institute operates and is also an award winning piece of architecture. My wife was insanely jealous. Visiting FLW works has been a minor travel hobby for us over the years.

Being only a short walk from Fallingwater, we spent some time there every day, including one day when we had the whole house to ourselves and had dinner on the terrace. We each tried to gain a sense of what it was like to live there, rather than just be a museum tourist. A couple of folks played card games sitting at the kitchen table all night. One person curled up with a book in one of the tall, narrow bay windows. I lay on the floor of the living room and stared at the ceiling, something I do at home sometimes. I thought lying on the floor would give me a feeling of ownership, of doing whatever I wanted with a place, because you couldn't do that as a regular tourist.

It... kinda worked. Not really. It was too surreal. I don't know if I'd ever be able to feel like Fallingwater was home.

My wife and kids and I visited Taliesin West last summer as part of a Grand Canyon trip. I had much the same feeling there, while listening to the tour talk about FLW and his apprentices living there, that I couldn't imagine it as a real living space. Also, I started getting real cultish vibes from the stories of some of the stuff the apprentices went through. Of course, Scottsdale, AZ wasn't any cooler back then than it is today and they built the place themselves, by hand, without any air conditioning. More than one apprentice's marriage ended in divorce over the place because their wives couldn't stand living in tents in the desert without power and running water during the construction years. I was also struck by how I would not expect anyone to even be allowed to bring a spouse on any similar apprenticeship in the modern day, but that's a different issue.

Between all of those experiences, hearing the stories of how much the wives of FLW's clients would fight with him over kitchens, my own career as a consultant (I can't imagine telling a client they couldn't have the kitchen they want), and other issues, in recent times I've lost some respect for FLW.

I don't think his Usonian concepts have had much impact on society. For one thing, most people don't even know Usonian is a word, as evidenced when I see them try to come up with a word for a North American who isn't Canadian or Mexican (USian has to win a prize for finding an even more awkward term than Usonian).

That leaves all of his contract work, which was frequently deeply flawed in construction. Some of that defectiveness was due to him experimenting with new construction techniques that eventually got perfected and are no longer so flawed, but there are still many core issues. I would come home from visiting his works and I would wrack my brain over how to employ his ideas of incorporating nature into living spaces before I finally remembered I live in Virginia: nature here is primarily composed of mosquitoes and this breathable water we call "air".

His designs are all-or-nothing; they must be employed as a unified whole. It doesn't look right as a single piece of furniture or a window treatment in an otherwise normal house. Putting a 50" flat screen in Fallingwater would ruin the place. Got walls at 90 degree angles to each other? Sorry about your luck! It ends up looking like wearing cargo shorts and a fedora. If you have a regular ass house like every other "impoverished" slob with a quarter acre lot in suburbia, FLW-style design does not work. I say "impoverished" because FLW-style designs are exclusively the purview of the ultra rich. To have a house that coordinated, that put together, takes "I make people work overtime for me and I don't even know their names" kind of money.

In 2024, I spent $750,000 on a 1200 sqft rancher built in 1962. Less than a decade before that, Kentuck Knob had been completed for about $96,000. My house may not be as pretty, but at least the roof doesn't leak and the stove can fit a cake.


> I don't think his Usonian concepts have had much impact on society.

The word Usonian has vanished, but the style's influence has not.

> In 2024, I spent $750,000 on a 1200 sqft rancher built in 1962.

The Jacobs First House [1] in Madison, WI was the first Usonian house; it is credited with many features that became common in the mid-century ranches of the 50's and 60's. Stewart Hicks has a good deep dive [2] into Wright's influence on 20th century architecture.

[1] https://en.wikipedia.org/wiki/Herbert_and_Katherine_Jacobs_F...

[2] https://www.youtube.com/watch?v=ZXyiK-zVKsE


I don't know, man. I mean, I know it's the standard Architecture School answer that Wright was influential. But I feel like that can only be said if you focus on superficial, outward appearance and completely ignore his design philosophy, why he designed the way he did.

The materials he chose were meant to make home ownership accessible to the common man. Your own link to the Jacobs House talks about Mr. Jacobs being "a young newspaper man". The $5000 cost in 1935 is like $120k today. Yeah, if 1500sqft houses cost $120k today, I could believe a journalist just starting out in their career could afford the mortgage on it.

Are houses L-shaped now? Yeah, sure. Are they accessible to the common person? Not at all. People are talking about the Little Golden Book version of Wright's philosophy.

Also, my house is not L-shaped. Jacobs got 25% more house for 1/6th the inflation adjusted price. He got a study and a shop; I'm performing my own manual labor in my back yard to build a gazebo that I hope will work as a shop for me. As a newspaper hack with a likely-unemployed wife he got a house near a lake. My electrical engineer with a master's degree wife working at a major military research institution and I got a drainage ditch that kinda looks like a stream when it rains really hard. And we're in our 40s. Everyone massively missed Wright's point.
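The ratios in that comparison check out; here is a quick sketch (the square footage and dollar figures, including the $120k inflation adjustment, are the thread's own, not official numbers):

```python
# Figures from the thread: Jacobs House ~1500 sqft for ~$120k
# (inflation-adjusted); the commenter's rancher: 1200 sqft for $750k.
jacobs_sqft, jacobs_cost = 1500, 120_000
rancher_sqft, rancher_cost = 1200, 750_000

more_house = jacobs_sqft / rancher_sqft - 1   # 0.25 -> "25% more house"
price_fraction = jacobs_cost / rancher_cost   # 0.16 -> roughly 1/6th the price

print(f"{more_house:.0%} more house at {price_fraction:.2f}x the price")
```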


Most scripting languages are designed to present a REPL (read-eval-print loop) in such a scenario.
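The loop itself is tiny; a toy sketch in Python (the `repl` helper and its expression-only `eval` are illustrative assumptions, not any particular language's actual implementation):

```python
def repl(inputs):
    """A toy read-eval-print loop over a list of expression strings."""
    results = []
    for line in inputs:     # "read"
        value = eval(line)  # "eval" (real REPLs also handle statements)
        print(value)        # "print"
        results.append(value)
    return results          # "loop" continues until input runs out

# Interactive use would read from stdin instead, e.g.:
# repl(iter(input, "quit"))
```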

"Disney’s then-CEO Bob Iger... was sold on Sora, too. He lauded Altman’s ability to “look around corners”..."

WTF is that supposed to mean? I'm sorry, maybe I'm being dense. I can't figure out what "look around corners" is supposed to mean. "Think outside the box," I guess? Why "look around corners?"

I mean, maybe I do get it. Altman has a weird face that looks like you can't predict where his eyes are based on where his head is. "Shifty," one might say. But I doubt that's what Iger meant.

It's dumb. It's dumb corporate speak. I'm so sick of this kind of stuff getting a pass. We used to bully people over using the word "synergy." Let's make America anti-corporate-weasel again.


I read it as being able to see the future, which is still bullshit par excellence. The future is just around the corner as it were, but us normal people cannot see it, on account of both it being the future, and around a corner.

To be very clear, I think it's completely stupid.


Really enjoyed the "karts and equipment will reach underground areas via giant green pipes" caption on the LHC tunnel diagram.

This one was good. It was pretty low-stakes and not anything that would impact anyone. For a while there, companies like Google were announcing products that sounded like a good idea, but turned out were just them trolling everyone over things people had been requesting for a long time.

Their heyday of good jokes was also when they hadn't produced any ads and seemed like an underdog. "Don't Be Evil" days.

Vibe-wise they all feel closer to Raytheon and I sure as fuck wouldn't want to see an attempt at a whimsical joke from Raytheon.


Real PenIsland.com vibes.

There is a lot more "yngmi" and "have fun being poor"-style attitude around here regarding LLM boosterism.

That attitude is particularly galling. Along with the "lock in now or become part of the permanent underclass".

The thing I can't stand is the absolute certainty of the boosters. It's almost religious.

AI is the future. AI will do this. AI will cause that. It is inevitable. Everything is obviously changing.

They leave no room left for debate. No openness to pushback. And no evidence or proof. It just is because it is, and if you don't believe it, you're simply wrong. We saw the same sort of attitudes with blockchain and NFTs.


Technojesus is going to save all the Claude Code users from never having learned how to invert a binary tree, amen64.

What would you take as evidence?

Any quality, useful (commercial) piece of software made with only AI code would suffice. Any successful AI-written indie game. Any successful AI-written library.

The proof should be in the pudding. Claude Code has been out for a year, and the boosters are saying they are orders of magnitude more productive, so we should have seen some kind of successful output within an order of magnitude of the time it would have taken to develop using traditional, non-LLM tools.

Where is the agent-coded Photoshop clone eating Adobe's lunch? Where is the agent-coded Quicken clone which puts the original out of business? Where is the agent-coded hit video game? All we are seeing so far are "Show HN" level projects.

The only thing LLMs seem to be speeding up is people's mouths.


How many years did it take for you to write a Photoshop clone eating Adobe's lunch?

If it's for work, why do you need GitHub at all?

To me, GitHub only makes sense as a social media site for code. If you are publishing to GitHub with no intent to be open in your code, development process, and contributor roster, then I don't see the point of being on GitHub at all.

Because it's not like their issue tracker is particularly good. It's not like their documentation support is particularly good. It's not like their search is particularly good. Its CI/CD system is bonkers. There are so many problems with using GitHub for its own sake that the only reason I can see to be there is for the network effects.

So, with that in mind, why not just set up a cheap VPS somewhere with a bare git repo? It'll be cheaper than GitHub, and you don't have to worry about the LLM mind virus taking over management of your VPS and injecting this kind of junk on you.


What do you use for code review and CI/CD then?

You can do it with forgejo, just have to self-host the runners

I am excited about its potential integration with jujutsu: https://codeberg.org/forgejo/discussions/issues/325

Very true. We have a private git repository running on a server that serves as our master. Works fine for us. We back up to GitHub, but it isn't used in any way in the dev workflow.

To me, this is a sign of just how much regular people do not want AI. This is worse than crypto and metaverse before it. Crypto, people could ignore and the dumb ape pictures helped you figure out who to avoid. Metaverse, some folks even still enjoyed VR and AR without the digital real estate bullshit. And neither got shoved down your throat in everyday, mundane things like writing a paper in Word or trying to deal with your auto mechanic.

But AI is causing such visceral reactions that it's bleeding into other areas. People are so averse to AI they don't mind a few false positives.


It's how people resisted CGI back in the day. What people dislike is low quality. There is a loud subset who are really against it on principle, just as there are people who insist on analog music, but regular people are much more practical; they just don't post about this all day on the internet.

perhaps one important detail is that cassette tape guys and Lucasfilm aren’t/weren’t demanding a complete and total restructuring of the economy and society

An excellent observation. When films became digital, the real backlash came when studios stopped distributing film prints for the old film projectors and every movie theater had to invest in very expensive DCP projectors. Some couldn't and were forced to shut down.

If I had lost my local movie theater because of digital film, I would have a really good reason to hate the technology, even though the blame is on the studios forcing that technology on everyone.


It is not. People resisted bad CGI. During the advent of CGI, people celebrated the masterpiece of The Matrix and even Titanic. They hated, however, The Scorpion King.

No, I don't think most people are really against AI Gen works "on principle". Or at least not in any interpretation of "on principle" that would allow for you to be dismissive of complaints in this way.

I think principles are important. Especially when it comes to art, principle might be all we have. Going back to the crypto example, NFTs were art that real people had made. In some cases, very good art. People railed against NFTs despite the quality of the art. That is being against something on principle. Comparatively, if my local grocery chains were owned by neonazis, I'd have a much harder time of standing on principle, given that doing so may have a negative impact on my ability to survive and prosper.

AI Gen works, on the other hand, most often do not come with readily available markings that they are AI Gen. What people are complaining about is the lack of quality in the work. If they accuse a poorly human-written article of being AI Gen, that's just a mistake. But the general case is a legitimate evaluation of the quality of the material and the conditions under which it was made and presented.

In my own case, while I certainly have plenty of "principled" reasons to dislike AI Gen works, I also dislike it because it's just garbage. Oh yeah, sure, it's impressive that a computer can spit out reasonable content at all. It would equally be impressive for a chimpanzee to start talking in full sentences. That doesn't mean I'm going to start going to the chimpanzee for dissertations on the human condition.


Not really. The scale is entirely different. I think less of someone as a person if they send me AI slop.

  > I think less of someone as a person if they send me AI slop.
n=1 but working on side projects for others, i could easily generate ai images (instead of using stock photos) for a client, but i resist because i also feel this but as the sender...

there is the fact that such images 'look ai' but even if it were perfect, idk somehow i feel cheap doing that.


Agreed. Even in low value stuff I’d so much rather use basic stock images, ms paint drawings or almost anything over AI images. Seeing them is almost like being near someone who stinks or is sick/coughing. It’s a very visceral reaction.

I think literally everyone could agree CGI has been detrimental to the quality of films.

"Literally everyone" can't even agree on whether Polio is bad.

I myself would disagree that CGI itself is a bad thing.


Not just in the obvious ways either, even good CGI has been detrimental to the film (and TV) making process.

I was watching some behind the scenes footage from something recently, and the thing that struck me most was just how they wouldn't bother with the location shoot now and just green-screen it all for the convenience.

Even good CGI is changing not just how films are made, but what kinds of films get shot and what kind of stories get told.

Regardless of the quality of the output, there's a creativeness in film-making that is lost as CGI gets better and cheaper to do.


it may be an unpopular opinion but i feel like that watching any of the marvel movies... its like its just a showcase for green screens and ridiculous rubber-band acrobatics cgi everywhere...

that kind of stuff might work in anime or cartoons, but live action just looks ridiculous to me for the most part.


I could maybe agree in the sense of "has had detrimental effects", but certainly not in the sense of "net detrimental".

Project Hail Mary is a great example of not relying on CGI.

Anecdata-- from me. I think cgi can be a net positive.

90% of the time, you wouldn't know CGI if you saw it. That's the 'good' CGI.

Same thing is true of AI output.


Not the same. The more effort you put into CGI the more invisible it becomes. But you can’t prompt your way out of hallucinations and other AI artifacts. AI is a completely different technology from CGI. There is no equivalence between them.

> But you can’t prompt your way out of hallucinations and other AI artifacts

That's not the case, and hasn't been for some time, but it sounds like your mind's made up.


Hallucinations have been solved?! That’s great news! Must have missed that.

> Hallucinations have been solved?!

Apparently not, because no one but you implied that they had been.

There are prompting strategies that improve the odds greatly, but like the GGP, you've made up your mind, so it's a waste of time to argue otherwise.


i think they are referring to statements that they have "solved" hallucinations and it won't be a problem anymore (which obviously hasn't happened yet anyway)

[1] https://news.ycombinator.com/item?id=44779198


My guess is that post-training has gotten a lot better in the last couple of years, and what people are attributing to better models is actually just traditional (non-LLM) models placed on top of the LLM, which makes it appear that the model has increased in quality (including seemingly fewer hallucinations).

If this is the case, it would be observable with different prompting strategies, when you find a prompt which puts more weight on the post-training models.


I guarantee you have encountered AI content and not realized it was AI. I assume you've heard of the survivorship bias?

I have and I hated it.

The story is that I was getting into a new genre of music, namely Japanese City pop from the 1980s. I was totally unfamiliar with the genre and started listening to it on YouTube. I found one playlist, which I listened to a lot, thinking, "wow, this is very formulaic, and the lyrics are very generic," but I kind of thought that was just how the genre went. I finally planned to use it during a small local event, but when I went to find out who the artists were, I embarrassingly found out it was all AI generated.

Thing is, in this instance I knew nothing of the source material. When I went to get actual songs, written by actual people, the difference was stark. I would be able to recognize AI generated City pop in an instant now, 8 months later. This experience kind of felt like I had been scammed. That my ignorance of the genre had been taken advantage of. It was not pleasant.


I had a very similar experience, looking for music to play during D&D sessions. Not paying close attention to the music, it seemed like it fit the bill. Once I started listening more closely, there were lots of issues that became readily apparent.

My dad has also started sharing with me links on Facebook to pop songs that have been re-arranged in different genres. This was a big area of fun for a number of folks in my family several years ago as we discovered YouTube artists like Chase Holfelder who put significant effort into making very high quality rearrangements. But I kept noticing these weird issues in the new songs.

I've gotten to where I can identify an AI generated song almost immediately: there's a weird, high-frequency hiss in the mix that sounds like heavy noise overcoming compression artifacts, even though the source it's coming from should be clean. There's a general lack of enthusiasm in the vocals and a boring, nonsensical progression to the lyrics on original arrangements. Sometimes, the person generating the song tries to hide that last issue by generating instrumentals only, or they use one of those try-too-hard-to-sound-badass Country Rock genres that are popular on TikTok to stick on top of clips from the TV show Yellowstone (WTF is with that?!), but then when I check the details, there's obviously AI cover art for artists I've never heard of. The accounts will be anthologies full of these artists that have never existed.

So, I know people keep parroting "a good artist can use any tool". But I've yet to see it. All this "democratizing art" (I didn't know anyone was gatekeeping it to begin with, and I certainly have not seen any lack of talent online in several years) doesn't seem to be producing results. It becomes pretty obvious very quickly that it's all just a pump and dump scheme to Get Them Clicks.


You don't understand. I mean content that even now, you don't know it is AI.

Obviously you think the AI content that you can identify is bad. But there is content you've encountered that you think is good and not AI content, that actually is AI generated.

That's the survivorship bias.


This sounds dangerously close to a No True Scotsman argument. Any example one could provide, you've teed it up nicely to claim that no, you didn't mean that one, obviously, because you could tell. No, it's some other thing that you haven't found yet. That's the passing-AI.

I think it is worse than a No True Scotsman. I think your parent actually made a category mistake here. Survivorship bias does not apply here. Whether or not I notice or even unknowingly enjoy AI generated content is not in the same category as how much I notice or enjoy CGI.

The difference is in the authorship. Actual work and skill go into CGI, people generally notice bad CGI, and it generally affects how you judge the art. Sometimes CGI is actually part of the art and you are supposed to notice it, and it is still good (think of how Cher used Autotune in "Believe"). There is no such equivalence with AI.

To further elaborate: bad CGI is often (but not always) used as a cost-cutting measure. Directors (or producers encouraging directors) use it when they want to save money on practical effects, or even to cover up mistakes that happened during shooting and avoid an expensive re-shoot. This can work OK if used sparingly and carefully; however, if it is done a lot and without the needed care, you will notice it, and you will judge the work for it. AI content is kind of like that, except that is kind of all that AI is. The author couldn't be bothered to do the work and just prompted an AI to do it for them.

To summarize: AI is not like CGI in general, it is much closer to a strict subset of CGI which only includes bad CGI.


No, there is a very loud minority of users who are very anti-AI, who hate on anything that is even remotely connected to AI and let everyone know with false claims. See the game Expedition 33 for example.

Especially true in gaming communities.

IMO it's a combination of long-running paranoia about cost-cutting and quality, and a sort of performative allegiance to artists working in the industry.


And E33 is also a good example that these users are a minority and effectively immaterial. They don't affect sales or the popular opinion.

People don't care about AI. They only care whether the product is good.


And yet, no game has problems selling due to these reactions. As a matter of fact, the vast majority of people can't even tell if AI has been used here or there unless told.

I reckon it's just drama paraded by gaming "journalists" and not much else. You will find people expressing concern on Reddit or Bluesky, but ultimately it doesn't matter.


Yeah, anything that has an MDI metaphor going on should be run fullscreen. Otherwise, what's the point? If the idea is to use the OS desktop space as the application window organizational space, then don't let people make apps that have different document panes.

This goes towards something that I've felt for a little while: at some point in time around the early 2000s, operating system vendors abdicated their responsibility to innovate on interaction metaphors.

What I mean is, things like tabbed interfaces got popularized by Web browsers, not operating systems. Google Chrome and Firefox had to go out of their way to render tabs; there was no support built into the OS.

The OS interfaces we have now are not appreciably different from what we had in the early 2000s. It seems absurd that there has been almost no progress in the last 25 years. What change there has been feels like it could have been accomplished in user-space, plus it doesn't get applied consistently across applications, thus making it feel like not a core part of the OS.

MacOS in particular was supposed to place an emphasis on the desktop environment being the space of window- and document-level manipulation, as exemplified by the fact that applications did not have their own menu bars. All application menu bars were integrated together at the top of the screen. Why should it be any different with any other UI organizational feature? Should not apps merely be a single window pane, accomplishing a single thing, and you combine multiple apps together to get something akin to an IDE out of them?

Well, I don't know if they should be. But they can't. Because OS vendors never provided a good means to do it. Even after signalling they wanted it.


I seem to remember Windows XP using tabs in a lot of its settings pages - and possibly earlier versions as well.

It did, but those were static tabs. It was pretty easy to create tabs as a form of sub-organization. But the treatment of tabs as documents was new-ish to Chrome/Firefox. Other applications treated multiple, concurrent document views as whole, resizable, sub windows inside of an "MDI" panel.

Look at how older versions of Word, Excel, and Visual Studio worked. The tool trays stayed consistent as you moved between document windows. The entire application is minimizable and quittable together as one.

Photoshop still uses this metaphor. In the early and mid-2000s, Photoshop on Windows had a window for the application separate from the documents, but on Apple OS9 and OSX, the only representation of the application itself was in the menu bar. Document windows and tooltray windows both floated in the same desktop space as every other window.

I haven't checked on the GNU Image Manipulation Program, but I seem to remember it retained the same "no application window, tooltrays and doc windows exist in the DE" metaphor for much longer than Photoshop.

There is also a difference in the way that Chrome renders tabs in the window title area. That's a part of the UI chrome that one would expect to be in the purview of the UI toolkit, but Google took it on themselves.


Virtual desktops in Unix predate Visual Studio. I'm pretty sure there was a concept of tabbed interfaces somewhere in the Amiga or BeOS or any other OS.

https://en.wikipedia.org/wiki/Tab_(interface)

Don Hopkins himself can enlighten us about it (NeWS) better than me or literally anyone in this thread, just wait.


What does that have to do with my criticism of the two most popular operating system that they failed to innovate or adapt in areas that showed obvious need?

I'm not sure if I understood correctly but i3 has tabbed windows and no window titles

Opera had tabs. Tabbed under Unix had tabs. Dillo had tabs. Tcl/Tk had damn tabs in 1997.

Thank you for the additional examples of how the major OS vendors failed to respond to clear need within the market.

KDE actually had it for many years, until GNOME pushed for CSDs, and with (at the time) CSD-only Wayland that feature disappeared.

