Hacker News | sisjohn's comments

I love this one! I'd like to see more color schemes like this one.


Just buy a smart car based on your own standards; that way, you can weigh its pros and cons yourself.


It would be interesting to perform complex mathematical computations and find solutions without the help of JS.


It would be a good way to build an appreciation of how JavaScript makes things simple.


I agree with this statement; it's one of their business models, after all.


I really do love the art. I wonder if someday we can walk around and explore inside these artworks.


Definitely, I can feel you on this one. There are a lot of negative connotations around being labeled as one.


One thing I love about VSCode is the flexibility it offers. Also, my favorite theme, Synthwave, really makes your editor look amazing.


I agree with this one; a lot of businesses run through Facebook and depend heavily on the platform as their means of communication.


I'm quite interested in seeing this extension, currently limited to HN and Twitter, work with Reddit and Facebook as well.


Any suggestions on what GPU to use to train large models?


Really depends on what you mean by large. If you mean truly large, you will need a cluster to train it in any reasonable amount of time; you'd probably want to look at servers built on the HGX platform (8 A100s per server). We use servers leased in bulk from traditional server providers (think Dell, HP, etc.). If you mean more like "as large as personally affordable," then you'd probably want to look at something like the RTX 3090: if you can get lucky and find it at MSRP, it has 24 GB of memory. Nvidia also has workstation cards with up to 48 GB if I remember correctly, but if I were buying cards for myself, I would wait until I could get two 3090s somewhere close to MSRP instead of paying the markup on the workstation cards (unless you want more than two in a workstation, in which case you'd need to go for those).
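As a rough way to sanity-check whether a card's memory is enough, here is a back-of-the-envelope sketch (assuming the common rule of thumb of roughly 16 bytes per parameter for Adam-style training, covering weights, gradients, and optimizer states; activation memory is extra and workload-dependent):

```python
def training_mem_gb(num_params: float, bytes_per_param: int = 16) -> float:
    """Rough training-memory estimate: weights + gradients + Adam
    optimizer states come to ~16 bytes per parameter. Activation
    memory is not included, so treat this as a lower bound."""
    return num_params * bytes_per_param / 1e9

# A ~1.3B-parameter model needs roughly 20.8 GB just for model state,
# which already nearly fills a 24 GB RTX 3090.
print(round(training_mem_gb(1.3e9), 1))
```

By this estimate, a single 24 GB card is only comfortable for models well under a billion parameters unless you use tricks like gradient checkpointing or offloading.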


Totally depends on your budget. The DGX A100 [1] is quite good if you have a fat wallet

[1] https://www.nvidia.com/en-us/data-center/dgx-a100/


2 x 3090FE is the best bang for your buck.


Do you need watercooling to keep them from running too hot?


You can tweak the power limit settings for your application. In many cases you can drop the power consumption (and heat generated) while still maintaining > 90% performance but this will depend on your actual use case [0].

In my experience, for many models you can reduce the power limit even further than what was tested in that guide while barely impacting performance.

[0] https://timdettmers.com/2020/09/07/which-gpu-for-deep-learni...
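A sketch of how you might cap the power limit with nvidia-smi (the 250 W figure is just an illustrative number; check your card's supported range with the query command first):

```shell
# Query the current and supported power limits for GPU 0
nvidia-smi -i 0 -q -d POWER

# Enable persistence mode so the setting survives between runs
sudo nvidia-smi -i 0 -pm 1

# Cap the board power at 250 W (must be within the supported range)
sudo nvidia-smi -i 0 -pl 250
```

Note the power-limit setting resets on reboot unless you reapply it, e.g. from a startup script.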


I use 2x3090 to train large language models, and mine don't thermal-throttle with air cooling even though they're right next to each other. Eth mining does generate too much heat though.


For ML? Nope. I think overheating issues are mostly for mining. I run models and 3D render quite a bit and never ran into problems.

