Hacker News

I don't think this is "recognizing customers", I think this is just market segmentation. I'm dreading Nvidia crippling deep learning for everyone not paying X more dollars (as is already the situation for server farms).

Long term, the aim has to be selling cards by usage rather than by cost of production (obviously with prices set above the cost of production by different amounts). Much like Adobe's Creative Suite: student pricing versus professional pricing, etc.



> I'm dreading Nvidia crippling deep learning for everyone not paying X more dollars (as is already the situation for server farms).

This already happened years ago, when NVIDIA removed the last GPUs with uncrippled double- and half-precision from its product lineup. The cheapest GPU you can buy for ML now is the Titan V, which was $3000 at launch.


I'm only planning at this point, so I don't know, but I'm very interested. I've seen the RTX 3080 reviewed as the most cost-effective chip you can get for deep learning. I also have the impression that a lot of research is moving to lower precision.

https://timdettmers.com/2020/09/07/which-gpu-for-deep-learni...


They have been going the other way recently: Titan V ($3000) -> Titan RTX ($2500) -> RTX 3090 ($1500). The 3090 beats the Titan RTX in double precision and is close to 2x in single precision.


Scientific computing needs double precision, but ML doesn't, right?
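Roughly, yes: neural nets tolerate reduced precision, while long scientific accumulations drift badly without float64. A minimal NumPy sketch of the failure mode (my own toy illustration, not anything from the thread: summing 100,000 values of 1e-4 naively):

```python
# Toy illustration: naive summation of many small values,
# comparing float64 against float16 accumulation.
import numpy as np

vals = np.full(100_000, 1e-4)

# float64 accumulates accurately: the true sum is 10.0
s64 = float(np.sum(vals, dtype=np.float64))

# naive float16 accumulation stalls once the running sum is large
# enough that adding 1e-4 rounds away to nothing
s16 = np.float16(0.0)
for v in vals.astype(np.float16):
    s16 = np.float16(s16 + v)

print(s64, float(s16))  # the float16 sum ends up far below 10.0
```

This is why HPC codes want real float64 throughput, while ML training gets away with fp16/bf16 by keeping a higher-precision accumulator (which is what mixed-precision training and Tensor Cores do).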


They're definitely doing it for themselves; in this case I see it as customer-base preservation. Most of their other moves have tried to push users off their lower-end products. This time they are trying to preserve lower-end product availability (smart).



