franciscop's comments | Hacker News

I did this for ~10 years and have absolutely no regrets; it was a lot of fun and the side projects gave me energy.

Nowadays it's hard though: between learning a new language, a girlfriend, and a demanding full-time job, I don't have a lot of time to tinker. I do feel a bit sad about this, but I've just accepted that it's life, and I can't imagine how impossible this would be with kids.

I did look at doing some basic housekeeping with LLMs (updating deps, standardizing testing across projects, etc.) and realized I have literally 200+ side projects, most of them websites/JS libraries/React libraries. I was a bit baffled; of course 80% of it is trash, but I was kind of amazed at how many things I've actually done.


There’s this special feeling when you can sit down later in the evening to tinker for a couple of hours, or read a challenging/inspiring book in peace.

But when I don't have the time and, frankly, the energy, I still try to do _some_ minutes of this kind of thing daily.

I feel like there's a big difference between 0 min and 15 min for anything (this also applies to exercise, meditation, etc.), and while it's great to have more time, there are diminishing returns beyond 30-45 min.


When I push myself to do these things, it loses all meaning. I do fun programming because it's fun; when I tried pushing myself like this, I virtually always ended up more tired (for a myriad of reasons). And if I need to push myself, I'd rather just learn more Japanese, or do some exercise, or something else. But when I have a 1-2 week holiday, I will for sure sneak a few full coding days in there, and that is liberating.

I agree with this. But there's a big difference between building a habit of doing some intellectual or creative activity regularly and pushing yourself to do a specific thing.

Before having built a more regular habit I was often in a sort of excitement-burnout loop. That doesn’t work well for me.


Do you think, e.g., the AI/LLM boom is all rent seeking? Do you think there's no positive value for the world in, e.g., the recently announced MacBook Neo, and that it's purely a monopolistic activity? Those are two clear recent examples of big players creating massive benefits for the world, and I'm okay if they capture X% of that value as company valuation.


Not at all. OpenAI / Anthropic are producing tons of surplus value right now! Not to mention how great the Chinese open source LLMs are. And Apple's hardware division has always been fine.

Apple's 30% tax on in-app payments is the ultimate rent-seeking example, though. Want to install your own apps? lol, you can't. And if the big AI companies follow in the footsteps of Google/Facebook, it's bad for everyone. Let's recognize it and prevent it from happening this time.


Strongly agree! I was answering my parent comment, which had a very negative view; it is indeed an amazing situation we're in!


How should we prevent it, considering the huge financial incentives they have to do exactly that?


Wait, what's a new laptop doing to move the needle, exactly? Genuinely curious.


It's an 8GB RAM 256GB SSD laptop with a lower spec'd 6-core chip for $599 USD. Seems overhyped to me, PCs have done that for a while, just not as elegantly. Admittedly it probably has far better battery life than a PC, so that's a genuine advantage.


In general, it's kind of the difference between having a sharp axe vs a dull axe.

Though in the particular case of the MacBook Neo, I'm not sure whether we're talking about sharper or duller. Depends on the metric you're using, I guess.


It doesn't cut a cake nearly as well as the old MacBook Air, so obviously duller.


The Neo is supposed to be the budget version. I think macOS is a decent computing platform for some engineering and creative endeavors -- if one more college kid gets access to it for cheaper, I say it's a positive.


The MacBook Neo is just another laptop. There are no "massive benefits for the world" in the context you're trying to put it in. And doesn't Apple take close to a third in 'rent' for anything on their platform?


The bar isn't "massive benefits for the world"; the MacBook Neo is great! If there were a new company that built MacBook Neos, that would be a great company. They'd build something real and sell it for more than it costs to make, no strings attached.

The problem with Apple comes down to the App Store, the forced 30%, and all the apps that just don't get built because of Apple. This is rent seeking, and this is evil.

Here's a good system for evaluating technologies: https://www.ranprieur.com/tech.html

If you don't want a MacBook Neo, don't buy one and it doesn't affect your life. But the App Store affects your life whether you own an iPhone or not. It affects the direction of the world. And that's where the rent seeking problem is.


This was one of the very few advantages of moving from Linux => macOS: at least most of the software was beautiful and consistent by default. I'm saddened to see that this is no longer true. I've been holding off on the Tahoe upgrade, and might just keep my MacBook Air M1 much longer than originally intended because of this.


I've started using Linux recently after not touching a desktop distro for 20-odd years, and I was surprised how good both Gnome and KDE look these days.

It certainly doesn't feel like there's a trillion-dollar-company difference between those two and Tahoe.


Same! I hadn't touched Linux since 2005-06. I've been trying out Niri and Noctalia, and I've been pleasantly surprised by how closely I'm able to match most of what I do on a Mac.

I haven't had this much fun with computers in years. It has certainly helped with my Tahoe grief.


Beautiful is nice, but the polished user experience was the ultimate argument:

- Raising the laptop's lid without the base sticking to it and dropping back onto the desk,

- A single-button click,

- Cmd+C to copy, and Ctrl+C to interrupt in the terminal,

But now you have to configure all of that: yes, activate right-click; yes, activate the three-finger click (wtf, three fingers); yes, activate swiping across desktops on the Magic Mouse. All those items were selling points, so they should have studied the best behavior and implemented it by default on every deployment. But that requires study, aesthetics, and a taste that only Steve Jobs had; otherwise everything becomes an option. That's right, I'm going to paraphrase Jobs' argument against 1990s Microsoft:

The problem with Apple is they have no taste.


What I find confusing and unhelpful is how macOS deals with windows. Say you have 4 Safari windows, 3 Excel windows, 5 Word document windows, and a bunch of terminals spread across a bunch of desktops. To me, I have clearly conceptualized different work streams into desktops.

Apple doesn't understand or respect that.

Firstly, alt-tab doesn't consider windows, it considers apps. So if you have multiple browser or Word windows open, you can't alt-tab between them. It's totally confusing. So I installed an app just to get the normal alt-tab behavior of other OSes, to alt-tab between windows (mine is called AltTab, and it's a bit buggy and slow; I think they all are).

Next, Apple does not respect the multiple-desktop boundary. If I click on the Safari icon in the dock, it will switch to some seemingly random Safari window on some other desktop. If I close any window, it will also run off to some other window of the same app on some other desktop (who came up with that behavior?). When I dismiss an Outlook notification, it will run off to another desktop to look at Outlook (actually, I think this one is Microsoft's fault, but Apple could probably do something about it).

The result is that while working, I have trouble staying on the desktop I'm working on; I'm constantly getting sent off to some other random desktop and have to find where I am and where I was.

There must be a better, more productive way to manage windows and desktops.

(Also what’s up with the autocorrect, I had to retype every instance of “I think” in this message, because it insists it should be “o think”)


> Firstly, alt-tab

I assume you mean cmd-tab.

> doesn't consider windows, it considers apps. So if you have multiple browser or Word windows open, you can't alt-tab between them. It's totally confusing.

You use cmd-tilde to switch between windows.

> So I installed an app just to get the normal alt-tab behavior of other OSes, to alt-tab between windows (mine is called AltTab, and it's a bit buggy and slow; I think they all are).

You don’t need an app.

> Next, Apple does not respect the multiple-desktop boundary. If I click on the Safari icon in the dock, it will switch to some seemingly random Safari window on some other desktop. If I close any window, it will also run off to some other window of the same app on some other desktop (who came up with that behavior?). When I dismiss an Outlook notification, it will run off to another desktop to look at Outlook (actually, I think this one is Microsoft's fault, but Apple could probably do something about it). The result is that while working, I have trouble staying on the desktop I'm working on; I'm constantly getting sent off to some other random desktop and have to find where I am and where I was. There must be a better, more productive way to manage windows and desktops.

This is a configurable setting.

> (Also what's up with the autocorrect, I had to retype every instance of “I think” in this message, because it insists it should be “o think”)

This is a configurable setting.


>> Next, Apple does not respect the multiple-desktop boundary...

> This is a configurable setting.

If you mean the "When switching to an application, switch to a Space with open windows for the application" setting, it only works partially. When clicking the dock icon, the behaviour depends on whether there are windows in your current Space (virtual desktop) or not. And don't get me started on where macOS decides new windows should go.


> You use cmd-tilde to switch between windows.

The approach is still pretty different in macOS compared to most other WM behaviors: cmd-tilde cycles windows within the currently focused application, and cmd-tab cycles through applications. In most other environments, alt-tab cycles windows across all open applications (and win-tab does something like cmd-tab on macOS, but somehow horribly).


But the worst part: cmd-tilde cycles, whereas alt-tab on every other OS is a stack of most recently used windows.


I am mildly shocked that after almost two decades of Mac use I never came across cmd+tilde. Thanks a lot!


I see this comment often, and I usually pipe up to say that if you don't have a US ANSI keyboard it can feel unintuitive. You can remap the hotkey to Option+Tab in those cases, which is easier to get used to.


Next, try Cmd+H to hide instead of minimizing, like in Windows land.


It doesn't go to the most recently used window, it CYCLES through the windows. To get to the most recently used one, you need to cycle through all the windows of the app. Who came up with that!?


> This is a configurable setting.

Give me pointers, please. I'm getting the same headaches every day: clicking on an icon in the dock or closing some window produces random results every time, across many, many apps.


> - A Cmd+C to copy and Ctrl+C for the interruption 7 in the terminal,

I really miss that in Linux. That said, some terminals implement a smart Ctrl+C that interrupts if no text is selected and copies otherwise. But the terminal I use (GNOME Console) does not, so I have to press Ctrl+Shift+C to copy text, and then I press that in the browser and everything explodes because it opens the developer tools. So annoying.
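The decision logic itself is tiny. Here is a rough sketch of what "smart Ctrl+C" amounts to; the Terminal interface below is hypothetical, made up purely for illustration, and not GNOME Console's actual API:

    // Hypothetical sketch of "smart Ctrl+C"; this Terminal interface is
    // invented for illustration, not a real terminal emulator API.
    interface Terminal {
      hasSelection(): boolean;  // is any text currently selected?
      copySelection(): void;    // copy the selection to the clipboard
      sendSigint(): void;       // send SIGINT to the foreground process
    }

    function onCtrlC(term: Terminal): void {
      if (term.hasSelection()) {
        term.copySelection();   // text selected: Ctrl+C means "copy"
      } else {
        term.sendSigint();      // nothing selected: Ctrl+C means "interrupt"
      }
    }

    // Quick check with a mock terminal:
    onCtrlC({
      hasSelection: () => true,
      copySelection: () => console.log("copied"),
      sendSigint: () => console.log("SIGINT"),
    }); // prints "copied"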


I also used Ctrl+Shift+C for a long time, but then I learned that the CUA keys for copying and pasting are Ctrl+Ins and Shift+Ins, which is far more convenient and works consistently, even in the old Windows terminal in a VM.


I am in the same boat. I would like to buy a new M5, but being forced to keep Tahoe is preventing me from getting one until they fix this clusterfuck.


Modern GNOME on Fedora feels like macOS in a good way: consistent design.


I hear KDE Plasma is nice this time of year. Computers should adapt to fit the user, not the other way around.


The book can say anything it wants; whether it's true and/or applicable in court later on is a very different matter. Spain's SGAE is a very powerful lobby, but it still needs to follow the law.

Edit: I haven't followed this area of the law in a while, but you could definitely copy, digitize, and scan documents for yourself and your friends (copia privada).


In Spain EULAs cannot infringe upon the law either.


"If it took all the fossil fuel on Earth" What do you mean? To TRAIN an LLM model it takes roughly the same amount of energy as to raise a person, so it's not even really expensive in energy costs.


* Max storage is 512GB instead of 4TB.

I currently have 1TB and I'm pretty happy with it, but I've had 256GB and 512GB in the past and I was not happy with those. This might be the only reason I would not consider this laptop.


This very clearly seems like a bug, either in their DMS script or in a DMS job they don't directly control, since CSV allows escaping commas (by simply quoting the field). Would love to see a bug report submitted upstream as part of the "fix".
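For reference, a minimal sketch of that quoting rule (the country name is just an illustrative value, not their actual data):

    // Minimal sketch of RFC 4180-style quoting: a field containing a comma,
    // quote, or newline is wrapped in double quotes, and embedded quotes
    // are doubled. Not their DMS fix, just the rule being described.
    function csvField(value: string): string {
      return /[",\n]/.test(value)
        ? '"' + value.replace(/"/g, '""') + '"'
        : value;
    }

    const row = ["KR", "Korea, Republic of"].map(csvField).join(",");
    console.log(row); // KR,"Korea, Republic of"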


CSV quoting is dialect-dependent. Honestly, you should never use CSV for anything if you can avoid it; it's inferior to TSV (or, better yet, JSON/JSONL) and has a tendency to appear to work while actually hiding bugs like this one.


Most CSV dialects have no problem with commas inside double-quoted fields.

The "dialect dependent" part is usually about escaping double quotes, newlines, and line continuations.

It's not a portable format, but it's not too bad (for this use) either, considering the country list is mostly static.


I'd go so far as to say any implementation that doesn't conform to RFC 4180 [1] is broken and should be fixed. The vast majority of implementations get this right; it's just that some of the ones that don't are so high-profile that it causes people to throw up their hands and give up.

[1]: https://datatracker.ietf.org/doc/html/rfc4180
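For what it's worth, the reader side of RFC 4180 is small too. A sketch of a single-line field parser (no multi-line fields; real code should use a library):

    // Sketch of an RFC 4180 parser for a single line; "" inside a quoted
    // field decodes to a literal quote. Use a real library in practice.
    function parseCsvLine(line: string): string[] {
      const fields: string[] = [];
      let cur = "";
      let inQuotes = false;
      for (let i = 0; i < line.length; i++) {
        const ch = line[i];
        if (inQuotes) {
          if (ch === '"' && line[i + 1] === '"') { cur += '"'; i++; } // escaped quote
          else if (ch === '"') inQuotes = false;                      // closing quote
          else cur += ch;
        } else if (ch === '"') inQuotes = true;                       // opening quote
        else if (ch === ",") { fields.push(cur); cur = ""; }          // field boundary
        else cur += ch;
      }
      fields.push(cur);
      return fields;
    }

    console.log(parseCsvLine('KR,"Korea, Republic of"'));
    // [ "KR", "Korea, Republic of" ]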


The blog post was 404ing for me, it seems to be this:

https://web.archive.org/web/20260211225255/https://crabby-ra...


I've seen too many times in real life people who make art and want to try to sell it fail to understand that once you switch from a hobby to a business, you need to spend at least 50% of your time on the business/marketing/logistics/etc. side of things, and hence they fail miserably. The best possible outcome I've seen is that they miraculously hit a nerve on the first try, become famous, and at some point realize they need to pay taxes, and do so in a decent timeframe.

So I found this article great at explaining those things, and also how it's not just "you" but "the part of you that people need to buy" that matters for making it into an actual business. I'll be sharing it a bunch; I'm so happy fnnch wrote this!


Or delegate that stuff and become a "sellout". Just don't get taken advantage of. Oh, and have actual talent. Or don't, doesn't really matter, if the salesperson has some of their own.


I've seen some discussions, and I'd say there are lots of people who are really against the hyped expectations from AI marketing materials, not necessarily against the AI itself. Things that people are against that would seem to be against AI, but are not directly against AI itself:

- Being forced to use AI at work

- Being told you need to be 2x, 5x or 10x more efficient now

- Seeing your coworkers fired

- Seeing hiring freezes because businesses think no more devs are needed

- Seeing business people make a mock UI with AI and boasting how programming is easy

- Seeing those people ask you to deliver in impossible timelines

- Frontend people hearing from backend how their job is useless now

- Backend people hearing from ML Engineers how their job is useless now

- etc

When I dig a bit into this "anti-AI" trend, I find it's usually one of those, and not actually opposition to the AI itself.


The most credible argument against AI is really the expense involved in querying frontier models. If you want to strengthen the case for AI-assisted coding, try to come up with ways of doing that effectively with a cheap "mini"-class model, or even something that runs locally. "You can spend $20k in tokens and have AI write a full C compiler in a week!" is not a very sensible argument for anything.


How much would it cost to pay a developer to do this?


It’s hard to say. The compiler is in a state that isn’t useful for anything at all and it’s 100k lines of code for something that could probably be 10k-20k.

But even assuming it was somehow a useful piece of software that you'd want to pay for, the creator set up a test harness that uses gcc as an oracle. So it has an oracle for every possible input and output. Plus, there are thousands of C compilers in its training set.
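That kind of oracle is just differential testing. A rough sketch of the loop, where "./mycc" and test.c are placeholder names I made up, not the project's actual harness:

    // Rough sketch of differential testing with gcc as the oracle: compile
    // and run the same program with both compilers and compare the output.
    // Assumes the program compiles and exits 0; execSync throws otherwise.
    import { execSync } from "node:child_process";

    function compileAndRun(compiler: string, source: string): string {
      execSync(`${compiler} ${source} -o /tmp/dt-out`); // compile
      return execSync("/tmp/dt-out").toString();        // run, capture stdout
    }

    const expected = compileAndRun("gcc", "test.c");    // the oracle's answer
    const actual = compileAndRun("./mycc", "test.c");   // the candidate compiler
    if (expected !== actual) {
      console.error("divergence on test.c");            // found a (likely mycc) bug
      process.exit(1);
    }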

If you are in a position where you are trying to reverse engineer an exact copy of something that already exists (maybe in another language) and you can’t just fork that thing then maybe a better version of this process could be useful. But that’s a very narrow use case.


The cost argument is a fallacy because, right now, either you have a trained human in the loop or the model inevitably creates a mess.

But regardless, services are extremely cheap right now, to the point where every single company involved in generative AI is losing billions. Let's see what happens when prices go up 10x.


Zero.

Because they'd tell you to stop being so stupid and run apt install gcc.


Because hardware costs never go down and energy efficiency never goes up over time?

Whatever the value/$ is now, do you really think it's going to stay constant?


If hardware industry news is any indication, hardware costs aren't going to be going down for GPUs, RAM, or much of anything over the next 3-5 years.


Maybe, but I seriously doubt that new DRAM and chip fabs aren't being planned and built right now to push supply and demand closer to equilibrium. NVIDIA and Samsung and whoever else would love to expand their market rather than wait for a competitor to expand it for them.


How long do you think it takes for those factories to go from nothing to making state-of-the-art chips at a scale that's large enough to influence the supply even by 1%?

There are plenty of them being built, yes. Some of them will even start shipping products soon enough. None of them are going to be producing at a scale large enough to matter any time soon; certainly not before 2030, and a lot can change until then that might make the companies abandon their efforts altogether or downscale their investments to the point where that due date gets pushed back much further.

That's not even discussing how much easier it is for an already-established player to scale up their supply than for a brand-new competitor to go from zero to one.


If you keep digging, you will also find a small but vocal sock-puppet army who will doggedly insist that any claims of productivity gains are in fact just hallucinations by people who must not be talented enough developers to know the difference.

It's exhausting.

There are legitimate and nuanced conversations that we should be having! For example, one entirely legitimate critique is that LLMs don't tell their users that they're relying on libraries whose maintainers are seeking sponsorship. This is something we could be proactive about fixing in a tangible way. Frankly, I'd be thrilled if agents could present a list of projects we could toss a few bucks to with the click of a button. That would be awesome.
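The plumbing half-exists already, at least on npm: packages can declare a "funding" field in package.json, which is what "npm fund" reads. A minimal sketch of an agent-side pass over an installed tree (simplified; it skips scoped packages and .bin):

    // Sketch: list funding links declared by installed dependencies.
    // The "funding" field in package.json is real (it's what "npm fund"
    // reads); the traversal here is simplified for illustration.
    import { readFileSync, readdirSync } from "node:fs";
    import { join } from "node:path";

    for (const name of readdirSync("node_modules")) {
      try {
        const pkg = JSON.parse(
          readFileSync(join("node_modules", name, "package.json"), "utf8"),
        );
        if (pkg.funding) console.log(name, "->", JSON.stringify(pkg.funding));
      } catch {
        // not a plain package dir (e.g. ".bin" or an "@scope" folder); skip
      }
    }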

But instead, it's just the same tired arguments about how LLMs are only capable of regurgitating what's been scraped and that we're stupid and lazy for trusting them to do anything real.

