Hacker News | nobleach's comments

80 miles for me! I was a Space Shuttle era kid though. Saw the Challenger disaster during my lunchtime. And then on perpetual replay for the rest of the week on WESH/WCPX/WFTV most likely. Even still, just knowing we were launching all those people into space was awe-inspiring.

TBH, I was probably closer to 80 miles than 60 before we moved to Daytona... Flagler Beach. You?

I was a poor kid building computers in the mid-to-late '90s. I tried everything I could NOT to use a true Pentium. My first build (coming from an upgraded Compaq 386DX) was an AMD 486 "DX4". I had a Diamond Stealth PCI VGA card and 16 MB of DRAM. After that I tried a 233 MHz Cyrix 6x86. That chip was garbage. I had to run some software Pentium emulation to get Cubase to run. I went to a 300 MHz Celeron after that. That was my first time trying the new SDRAM! After that I FINALLY got a legit Pentium III 400 MHz! I could go on and on, as this is a lovely walk down memory lane, and there have been some fun dips back into AMD Athlon/Ryzen/etc.

> I tried a 233 MHz Cyrix 6x86. That chip was garbage.

Those chips were excellent value for mostly integer work, but had incredibly poor floating-point performance, which was a problem for gamers as the 3D era was really getting going around that time. I had one; it did me good service for a few years.


Yeah, I was all about recording music/running the first iteration of software synths. I was a Graphic Design major at that time, so Photoshop/Illustrator/QuarkXPress were my jam. Those surprisingly didn't run that badly - in real graphic design, no one used Eye Candy (the reason everything on the web in 1998 had drop shadows, outer glows, and lens flares), so rendering "3D" rarely came into play.

Not only performance - I strongly suspect there were issues with the implementation too, as the apps would just freeze/die frequently.

They were practically overclocked coming out of the factory, so if you had any issue at all with cooling, or attempted to clock them up even higher, they would be unstable.

"poor"

As a VERY long-time Linux user, I agree. Multi-monitor setups, where you can unplug the monitor and have your windows gather back onto your laptop screen, require WAY too much configuration. Having your audio switch back to the internal laptop speakers requires homebrewing a script. On my 2020 Dell XPS, I still haven't figured out how to enable the subwoofers - so I'm stuck with ThinkPad-quality audio. I have 3 ThinkPads (one with straight Arch Linux, 2 with CachyOS) and there's always some little piece I'm annoyed with. The X1C has good battery life; the T480 and P14s are meh. I JUST bought my first HiDPI Lenovo laptop this weekend. Getting that to be a decent tradeoff between readable text and mongo-duplo-massive UI has been "fun". (Yoga 15.3" Aura Edition - I really like it.) But running apps in Wine is darn near impossible - the text is for ants!

All of these issues go away with Mac and Windows. I'm not giving up on Linux, I'm just a realist.
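For what it's worth, the audio-switching homebrew looks roughly like this - a sketch assuming a PulseAudio/PipeWire setup driven through `pactl`; the sink name is just an example (it wasn't in the original comment), so check `pactl list short sinks` on your own machine:

```shell
#!/bin/sh
# Hypothetical homebrew script: route all audio back to the laptop's
# internal speakers, e.g. after unplugging a monitor that has speakers.

# Extract sink-input IDs (first column) from
# `pactl list short sink-inputs` style output.
sink_input_ids() {
    awk '{print $1}'
}

# Move every currently playing stream over to the given sink.
move_all_to() {
    target="$1"
    pactl list short sink-inputs | sink_input_ids | while read -r id; do
        pactl move-sink-input "$id" "$target"
    done
}

# Only touch audio if a PulseAudio/PipeWire daemon is actually reachable.
if pactl info >/dev/null 2>&1; then
    # Example sink name; yours will differ.
    INTERNAL_SINK="alsa_output.pci-0000_00_1f.3.analog-stereo"
    pactl set-default-sink "$INTERNAL_SINK"
    move_all_to "$INTERNAL_SINK"
fi
```

Wiring that to a udev rule or a hotkey is the easy part; the annoyance is that you have to build it at all, which is the point.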


>On my 2020 Dell XPS, I still haven't figured out how to enable the subwoofers

If I remember correctly, Dell never provided drivers/firmware/docs upstream to drive the subwoofer. ref: https://bugzilla.kernel.org/show_bug.cgi?id=215233


Have you ever watched someone USE those COBOL TUIs? Everyone from airline ticket agents, to local governments, to folks at Home Depot while looking up inventory. They could fly through menus and accomplish things. I remember when Best Buy switched to a Windows-based experience. It was terrible. Simply adding a mouse+windowing experience slowed everything way down. I saw it first hand at Target too. They went from an OS/2-based TUI to Windows NT. I know there'll always be those folks that think we're all just trying to play "leet Haxorz", but there's just something about those systems that people deeply connected with.


While Astro does indeed have its own type of components, it also supports React, Solid, and a host of others. So transplanting your current tree of components in, adding the React plugin, and saying "GO" is likely a fairly straightforward project. I moved a previous static site into an older version of Astro with very little trouble.


What would you call a person who, when presented with new information, refuses to change their mind? Dogmatic? Religious? An idiot? I'm sure there's some self-serving reason the guy wants to go to the moon. What we don't know is whether he's had that in mind the entire time.


I convinced one boss to let me spike out a project with it. I was in love with OCaml at the time. OCaml's docs are... I'm just going to say it, they're terrible. F#, on the other hand, has fantastic docs. In the end, the only real gripe I had was the significant whitespace. I'm just not a fan.


Way back when I was in IT Admin, I used to have this problem all the time. Some non-tech person emails a spreadsheet, another non-tech person edits it, and saves it. The original person complains that they can't see the changes. Yeah, because it's saved in some MS Windows Profile location that no sane human would ever visit. My solution was to ONLY email links to shared files on a shared resource. The LAST thing I'd ever think of is writing software to solve this problem!

These days if I were interviewing someone and they said, "I'd use the simple solution that is fairly ubiquitous", I'd say, "yes! you've now saved yourself tons of engineering hours - and you've saved the company eng money".


I attach a copy of the file and then provide the network location where it lives. That makes it easy for people to open a simple copy to look at, and they know where to go to access the original.


8GB of "unified" memory. That means it's also shared by the GPU. I realize these things aren't meant to be gaming rigs, or CAD workstations, but I do agree that this isn't very forward thinking.


I use a MacBook Air with 8 GB of memory and it's fine. If I've got a browser and VSCode and Blender and PrusaSlicer and Claude and Xcode all open it gets a little slow, but Mac is very good at memory management these days.

Someone using just a browser and Word would have absolutely no problem.


I’ve used a computer with 4 gigs for the last 15 years and it still does emails, recipes, and YouTube. I.e. 100% of my computer needs.


What's so special about this Mac memory management? It uses the SSD better and makes swapping faster? It predicts what I'm gonna use or stop using and it swaps in/out accordingly?


I'm not sure. I think it does swap more aggressively. I think the disk is also just really fast and has a higher speed connection to memory.

Qualitatively I'm running way more things in the background than I could on Linux and Windows machines with double the RAM, with far fewer hiccups.

I haven't tried a modern Surface or other high-end Windows laptop so maybe their swapping is comparable, but given the shocked reactions of non-Mac users at 8 GB of memory, I don't think so.


All of that is a yes, plus compressed memory is a big component of macOS.
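Part of why compression is such a big lever: idle pages tend to be highly redundant, so squeezing them in RAM is far cheaper than paging to disk. A tiny illustration of that redundancy, with gzip standing in for the kernel's much faster in-memory compressor (this is just a sketch of the principle, not how macOS actually does it):

```shell
# A freshly allocated, untouched 4 KiB page is all zeros. Compressing it
# shows why keeping "compressed memory" in RAM beats swapping: gzip is
# only a stand-in here, but the redundancy it exploits is the same.
page_size=4096
compressed=$(head -c "$page_size" /dev/zero | gzip -c | wc -c)
echo "zero page: $page_size bytes -> $compressed bytes compressed"
```

Real working sets compress less well than zeros, of course, but Activity Monitor's "Compressed" column shows the same effect on a live system.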


It's mostly for people who need to edit some documents, a few photos here and there and other things like that. 8GB of RAM is going to be enough for the average user.


It is quite forward thinking, but in Apple's case they are thinking you will need to upgrade.


Even though Pushing Daisies ended too abruptly, I thought he was GREAT in that series as well.


Yeah, someone saw a lovestruck pie maker and thought: Ronan the Accuser.

He has range!


Check out Wonderfalls, his first series.

