New lab PC



  • Hi!

    My scientific advisor, in his infinite benevolence, suggested that I buy a new workstation for the lab and use it while I'm there instead of my current workstation, an X220 ThinkPad I bought used a few years ago. The budget is approximately equivalent to $850, though it could be increased by up to 60% if really needed. Expected workload is mostly text editing, coding, tensor decompositions (lots of 102MB-sized matrix products and matrix inversions, preferably in parallel), possibly some FFT convolution thrown in; we might end up hosting a small web app on it, but that's unlikely. Target OS is Linux, though I'll probably have to install Windows when I leave the lab.
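    For a rough sense of the number-crunching part, this is the kind of thing it'll run (a minimal sketch; numpy linked against a threaded BLAS such as OpenBLAS or MKL is my assumption - that's what makes the matrix calls use all cores):

```python
import numpy as np

# ~100 MB double-precision matrices: 3600 * 3600 * 8 bytes ~= 104 MB
n = 3600
rng = np.random.default_rng(0)
a = rng.standard_normal((n, n))
b = rng.standard_normal((n, n))

prod = a @ b                           # matrix product; a threaded BLAS runs this in parallel
a_inv = np.linalg.solve(a, np.eye(n))  # inversion via LU factorisation, also threaded

def fft_convolve(x, y):
    """1-D convolution via FFT; equivalent to np.convolve(x, y, 'full')."""
    m = len(x) + len(y) - 1
    nfft = 1 << (m - 1).bit_length()   # round up to a power of two for FFT speed
    return np.fft.irfft(np.fft.rfft(x, nfft) * np.fft.rfft(y, nfft), nfft)[:m]
```

    A handful of matrices this size fits comfortably in 16 GB, so the CPU and BLAS threading matter more here than anything else in the build.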

    Here's the build I came up with (thanks @Tsaukpaetra for linking to PCPartPicker in your thread):

    | Type | Item | Price |
    | --- | --- | --- |
    | CPU | AMD Ryzen 5 3400G 3.7 GHz Quad-Core OEM/Tray Processor | $156.14 |
    | CPU Cooler | ARCTIC Freezer 12 CPU Cooler | $30.99 @ Newegg |
    | Motherboard | Asus PRIME B450M-A Micro ATX AM4 Motherboard | $141.60 @ Amazon |
    | Memory | Kingston HyperX Predator 16 GB (2 x 8 GB) DDR4-3200 CL16 Memory | $97.48 @ Amazon |
    | Storage | Transcend 480 GB 2.5" Solid State Drive | $79.30 @ Amazon |
    | Storage | Toshiba 2 TB 3.5" 7200RPM Internal Hard Drive | $124.17 @ Newegg |
    | Power Supply | EVGA BQ 500 W 80+ Bronze Certified Semi-modular ATX Power Supply | $59.34 @ Walmart |
    | Monitor | AOC 24B2XH 23.8" 1920x1080 75 Hz Monitor | $109.99 @ Amazon |
    | Case | Thermaltake Versa H24 ATX Mid Tower Case | $42.81 |
    | **Total** | | **$841.82** |

    Aside from the obvious problem of having to update the motherboard firmware (I may be able to use my existing Ryzen 2400G system for that, but are there easier solutions? A new enough but still cheap motherboard model I couldn't find?), I have three questions:

    1. How much more expensive would it be to use a discrete GPU, and is it worth it? The code I have available is currently CPU-only, and the workloads seem too small for a GPU to offer much of an advantage. GPU-based code also seems to be much less portable, which I'd rather avoid.

    2. When it comes to cases, I've been living under a rock. How do you choose a case in 2020? I would like one without RGB lights or transparent windows, but easily removable dust filters seem like a convenient thing to have. (Some cases have dust filters but also obvious unfiltered holes - why? Are those supposed to be exhaust vents?) Is there anything else to consider?

    3. How subjective is the choice of a monitor? I've read some advice to get a mildly cheap IPS one if I'm not interested in low latency or exact colour reproduction (and I'm not), so that's what I chose.


  • Discourse touched me in a no-no place

    @aitap said in New lab PC:

    How subjective is the choice of a monitor?

    If you don't need high resolutions, high refresh rates or super colour reproduction, you can make do with a very cheap one. At work, we usually just reuse those as much as we can even if we change the computers themselves. For basic use, the tech's very close to plateaued and prices are keen for resolutions that correspond to what TVs use.


  • Notification Spam Recipient

    @aitap said in New lab PC:

    How much more expensive would it be to use a discrete GPU, and is it worth it? The code I have available is currently CPU-only, and the workloads seem too small for a GPU to offer much of an advantage. GPU-based code also seems to be much less portable, which I'd rather avoid.

    It would probably be pretty expensive if you're doing anything moderately serious, but if the CPU is (currently) handling everything just fine, there's not much benefit in rewriting your code for potential speed gains...


  • Considered Harmful

    1. See if you can find the BOX variant of the processor. The stock AMD cooler (Wraith Prism, they call it now) is far better than the ones from the olden days, without all the hassle of mounting another part.

    2. Why no M.2 SSD? The only concern I can imagine is that it's less portable in case you ever need to move it. It's not a world of difference in everyday speed, but when you're building new, take advantage.
      My personal fav: ADATA SX8200 Pro

    3. An ascetic kind of case is indeed difficult to find these days, but I'm afraid Thermaltake used to have badly matched side panels and flimsy plastic drive trays. Mechanical drives may cause unwanted vibrations.

    4. I literally hate Toshiba mechanical drives. They're like bad bits of Seagate crossed with an angsty teenager sitting on its ass all day and having suicidal tendencies. My bad experience comes from 5 years ago, so perhaps things have changed. Disregard this point freely, but I just had to get it off my chest.


  • Banned

    @aitap Let's clear up a couple things first.

    1. Aftermarket cooling is a waste of money. Seriously. The stock cooler will be fine. We're talking about low-power CPUs that won't be overclocked. The benefit from better cooling will be exactly zero.

    2. You're buying a CPU based on outdated Zen+ architecture. For a few dollars more you can get a much faster CPU with more cores.

    https://www.newegg.com/amd-ryzen-5-3600/p/N82E16819113569

    Sure, you'll need a discrete video card for that, but you can get a used one on eBay for very cheap. Or if all-new parts are a hard requirement, you can get a GT 710 for 50 bucks, which you won't do any gaming on, but it should draw your Excel charts just fine. Or spend about $80 on something almost half-decent, like this GT 1030:

    1. How much more expensive would it be to use a discrete GPU, and is it worth it? The code I have available is currently CPU-only, and the workloads seem too small for a GPU to offer much of an advantage. GPU-based code also seems to be much less portable, which I'd rather avoid.

    I'd say that until runtime is measured in hours rather than tens of minutes, you can get by with just the CPU.

    2. When it comes to cases, I've been living under a rock. How do you choose a case in 2020?

    Cheapest one available. All you should care about is that it doesn't fall apart under the pressure of the power button. Everything else is irrelevant. Of course the answer would be different if you were building something beefier that needs good airflow.

    That said, one thing you may look at is whether there's USB3 in front, in case you care.

    3. How subjective is the choice of a monitor?

    It's pretty objective. But unless you do gaming or photo/video editing, it doesn't matter what you get. Just make sure it's at least FullHD (1920x1080). Preferably larger. More pixels = more stuff visible at once, and there's never too much stuff visible at once.



  • @Gąska said in New lab PC:

    More pixels = more stuff visible at once, and there's never too much stuff visible at once.

    👍


  • Banned

    @HardwareGeek I've recently upgraded from FullHD to QHD (2560x1440) and I absolutely love it. I can have two Java files side by side, and project tree in sidebar, and I don't have to use horizontal scrolling at all!



  • @Applied-Mediocrity Thanks for the heads up about M.2 SSD and the cooler, also for the warnings! I didn't even think about buying non-SATA storage. I'll ponder the HDD and the case some more, will see if I find a better option.


  • Discourse touched me in a no-no place

    @aitap said in New lab PC:

    M.2 SSD

    If you go M.2 make sure it's an NVMe M.2 and not SATA M.2.


  • Fake News

    @Gąska said in New lab PC:

    Cheapest one available. All you should care about is that it doesn't fall apart under the pressure of the power button. Everything else is irrelevant. Of course the answer would be different if you were building something beefier that needs good airflow.

    In the past I would see cheaper cases that came with their own power supply. Better to keep the reliable-brand PSU and remove the freebie should you encounter one, or pick a case without a PSU (back then, I guess, the PSU-included ones were so cheap because the supplies were ordered in bulk).



  • @Gąska My colleague reported problems with his stock Intel cooler, but then he's a gamer, so I guess I shouldn't be basing my assumptions on his experience. My FX6100 used to overheat with a stock cooler, but that's considered ancient by now.

    Thanks for the advice on discrete GPUs; I will choose a better CPU and a cheap video card.

    More pixels = more stuff visible at once, and there's never too much stuff visible at once.

    But that should probably be accompanied by increasing the physical dimensions of the monitor. I have a 7" laptop with a 1920x1200 display, and at its 323DPI the bugs in pixel-based application layouts almost outweigh the benefits of sharp fonts.


  • Discourse touched me in a no-no place

    @aitap said in New lab PC:

    My colleague reported problems with his stock Intel cooler, but then he's a gamer, so I guess I shouldn't be basing my assumptions on his experience. My FX6100 used to overheat with a stock cooler, but that's considered ancient by now.

    Stock coolers are better these days. On standard clock speeds, the stock one supplied should be fine.
    Aftermarket cooling isn't a waste of money but it's not necessary here.



  • @loopback0 said in New lab PC:

    Stock coolers are better these days. On standard clock speeds, the stock one supplied should be fine.

    Thanks!

    Fun fact: (the price I'm going to get for the Ryzen 5 3600 BOX) - (the Ryzen 5 3600 OEM price) is a bit more than the cost of that Arctic Freezer 12. Hopefully, this indicates that the Arctic Freezer is no better anyway.


  • Banned

    @aitap said in New lab PC:

    My FX6100 used to overheat with a stock cooler, but that's considered ancient by now.

    Even at the time, the FX series had monstrous power usage. Around 200W for the top models IIRC. For comparison, Ryzen 5 (and most Core i5's past and present) is rated at 65W.

    More pixels = more stuff visible at once, and there's never too much stuff visible at once.

    But that should probably be accompanied by increasing the physical dimensions of the monitor.

    Well, maybe. Most people are comfortable with FullHD on a 15.6" laptop screen with 125% scaling. That's about 113 PPI effective. A typical standalone screen is 24". At 2560x1440 with 100% scaling, it would give you about 122 PPI - roughly an 8% difference. So I don't think it would matter much. Now, if you were to go 4K, that would be a different conversation.
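    Spelling the arithmetic out (a throwaway sketch; the `ppi` helper is just for this post, and "effective PPI" here means physical PPI divided by the OS scale factor):

```python
import math

def ppi(w_px, h_px, diag_in):
    """Physical pixels per inch of a w_px x h_px panel with the given diagonal."""
    return math.hypot(w_px, h_px) / diag_in

laptop = ppi(1920, 1080, 15.6)   # ~141 PPI physical
laptop_eff = laptop / 1.25       # ~113 PPI effective at 125% scaling
desktop = ppi(2560, 1440, 24)    # ~122 PPI at 100% scaling
```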



  • @aitap said in New lab PC:

    How much more expensive would it be to use a discrete GPU, and is it worth it? The code I have available is currently CPU-only, and the workloads seem too small for a GPU to offer much of an advantage.

    First question would be whether you can deal with single precision (or even less) or you want/need to run in double precision (the latter would limit you to expensive workstation GPUs). Even if you can run in single precision, you probably still want at least an upper-middle-end GPU, or a high-end one (so you're probably looking at €350 at minimum, but more likely €500+).

    I guess with the budget that you mention, going for a beefier CPU will (like others have said already) give you more immediate benefits.

    Personally, I would go for more RAM (32GB) if possible. And, unless you actually need that amount of storage (2 TB + 480 GB), I'd trade the HD+SSD combo for e.g., a single 1TB M.2 NVMe SSD.

    GPU-based code also seems to be much less portable, which I wouldn't want to happen.

    Depends, I guess. If you use libraries that support off-loading to the GPU, it might be relatively straightforward. If you've written the core number crunching yourself, you'd have to port that.

    FWIW- the main problem here is that you get to choose between CUDA (=NVIDIA) and ${other stuff}. CUDA provides by far the most mature dev environment for GPU code, whereas ${other stuff} may be portable(-ish), but tends to be much more painful to develop in.
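    To sketch the library route (CuPy here is just my example of a numpy-compatible GPU library, not something from this thread; the import fallback keeps the code runnable on CPU-only machines):

```python
try:
    import cupy as xp   # GPU backend, if CuPy and a CUDA device are available
except ImportError:
    import numpy as xp  # CPU fallback - same API for these calls

def gram(a):
    """Compute A^T A - the kind of kernel that ports unchanged between backends."""
    return a.T @ a

a = xp.arange(12.0).reshape(3, 4)
g = gram(a)             # 4x4 Gram matrix, on whichever backend was imported
```

    The catch is the one above: this particular route only buys you NVIDIA; a genuinely vendor-neutral path still means ${other stuff}.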



  • Why are you even building this thing?

    You can get a 16 core Xeon with 96GB of memory for like $600.

    Yeah, it's like 3 years old, but I'd rather have a three-year-old machine that can keep my whole dataset in RAM than a new machine that can't.


  • Banned

    @Captain said in New lab PC:

    Why are you even building this thing?

    You can get a 16 core Xeon with 96GB of memory for like $600.

    Yeah it's like 3 years old

    Link? I'd like to get one myself.




  • Banned

    @Captain I knew there was a catch. These aren't 3-year-old Xeons. These are 8-year-old Xeons. Sandy Bridge-based. 32 nm process, compared to the Ryzen 3600's 7 nm. And there's two of them, for a total TDP of 230W.



  • @Captain said in New lab PC:

    @Gąska

    I got one last year for $750, so maybe it's a little older than I thought. Still worth a look though.

    No hard drive? 😕



    @Gąska Oh well, I'd still take something like this over building a workstation. Still a beastie, in service until 2015, with 20 MB of L3 cache per processor (and there's two of them).


  • Banned

    @Mason_Wheeler said in New lab PC:

    @Captain said in New lab PC:

    @Gąska

    I got one last year for $750, so maybe it's a little older than I thought. Still worth a look though.

    No hard drive? 😕

    Would you want an 8 year old hard drive, though?



  • @cvi said in New lab PC:

    Personally, I would go for more RAM (32GB) if possible. And, unless you actually need that amount of storage (2TB+400GB), I'd trade the HD+SSD combo for e.g., a single 1TB M.2 NVMe SSD.

    +1. There's no reason to buy HDDs at all these days; RAM is cheap, and you can never have enough.


  • And then the murders began.

    @dfdub said in New lab PC:

    There's no reason to buy HDDs at all these days

    For a workstation, sure. But I can't afford to populate my NAS with SSDs.


  • Banned

    @aitap one more thing. The prices for some of the components you chose are horrendous. You can easily get a mobo under $100, and the RAM you picked is twice the regular price.

    And like @cvi said, more RAM is better. Here's an example offer of 32GB for $107 (in 16GB sticks so there's room for more):

    https://www.newegg.com/g-skill-32gb-288-pin-ddr4-sdram/p/N82E16820232091

    And here's a mobo for $80:

    https://www.newegg.com/asus-prime-b450m-a-csm/p/N82E16813119137


  • Fake News

    @Gąska That mobo seems to be the same model, so maybe it's just the Amazon vs. Newegg price difference?


    As for RAM: 2 modules of 16 GB is indeed a good suggestion (though personally I would opt for Crucial's offering, as I believe they still fab their own memory chips and QC them - you can see that in the price).

    If possible I would consult ASUS' QVL to see what hardware they tested their mobo with. It's of course a limited list, but it will improve the chances that everything works out of the box. My brother picked parts for an early Ryzen build and he managed to pick a RAM kit which simply wouldn't boot. The shop where he bought it offered to try a second kit, to no avail, and simply substituted the kit for another brand - problem solved.



  • @cvi said in New lab PC:

    Personally, I would go for more RAM (32GB) if possible.

    Noted, thank you!

    FWIW- the main problem here is that you get to choose between CUDA (=NVIDIA) and ${other stuff}.

    ...and if I want the code to be really portable, I get to develop the same code twice, for CUDA and OpenCL! Then again, I've got a colleague who maintains OpenMP and MPI backends for his (much more serious) number crunching problems and can use them both at the same time, so it's all a question of developer skill and time.



  • @Gąska Thanks again for the warnings! That B450M-A is actually going to cost the university ~$78, but your RAM sticks cost ~$158 in places where our bureaucracy can order them. I think I'm going to opt for 2x16G 3200MHz Kingston HyperX Fury (HX432C16FB3K2/32), which is "only" ~$146. Both vendors/frequencies are mentioned in QVL (thanks @JBert!), but only in up to 2x8G configurations (I cannot buy the ones that are mentioned as 2x16G).



  • @aitap said in New lab PC:

    CUDA and OpenCL

    Technically, you can run OpenCL stuff on NVIDIA as well. At least up to some OpenCL version. I have no idea if you can run newer OpenCL versions anywhere, though - the OpenCL 2.2 spec was released ~2017 and according to Wikipedia there is still nothing out there that supports it.

    You'd be better off looking at something like OpenMP's GPU support (supposed to exist, haven't used it myself) and similar options. Those would be portable in a sense, albeit they might also currently only have NVIDIA backends (besides potentially CPU ones). (FWIW - I don't see AMD GPUs being used a lot for number-crunching stuff. I don't think we have a single AMD GPU at work, and the clusters/"supercomputers" that I know all use NVIDIA GPUs, if they have GPUs at all.)



    @Captain I've had a beast like that for years (Z820 with 15k RPM SCSI drives). Switching to a laptop (i5 with 1TB SSD) was actually an improvement on all levels. OK, one disadvantage is that I lost the office heater...


  • Considered Harmful

    @robo2 said in New lab PC:

    @Captain I've had a beast like that for years (Z820 with 15k rpm scsi drives). Switching to a laptop (i5 with 1TB SSD) was actually an improvement on all levels. OK, one disadvantage is that I lost the office heater...

    And NIST had to add a leap second that year, because apparently Earth had rotated a bit slower due to loss of angular momentum caused by you spinning up those beasts 🍹



  • @robo2 said in New lab PC:

    @Captain I've had a beast like that for years (Z820 with 15k rpm scsi drives). Switching to a laptop (i5 with 1TB SSD) was actually an improvement on all levels. OK, one disadvantage is that I lost the office heater...

    I have a dual-core i5 laptop. I like my 16-core Xeon better. To be fair, they both have SATA SSDs in them, not the fanciest NVMe or whatever it's called.

    I know my choice isn't for everyone, but I did put my money where my mouth is, specifically for a data analysis/ML workload. :-)



    @Captain Certainly. And when I still ran parallel fluid-analysis workloads, the Xeon came in handy. Nowadays that all runs "in the cloud", so my local workloads are much reduced. For lighter workloads the newer processors are just better. Or at least, I hope so. Maybe I just want to believe technology has progressed in this area, please let me keep my delusions...

