WTF Bites


  • Java Dev

    @Gąska Well... Would you pay $5000 for a heavily multi-threaded CPU for running primarily single-threaded tasks when you can get something better for single-threaded performance for under $500?

    Even discounting that, if you need a balance between multi-threaded and single-threaded performance, the HEDT platforms are the best. They are based on server CPUs but aimed at the desktop market, for the power users who need more threads/memory/PCIe lanes than the normal mainstream platforms can provide.

    Even so, AMD has a 32-core CPU that drops into their HEDT socket at a cost of under $2000. So that's 4 more cores, more PCIe lanes, same TDP, but it doesn't require two PSUs just to power the damn thing, or a motherboard that costs $1500. The only advantage Intel has is that it can handle an extra 64GB of RAM thanks to its 6-channel memory. Comparing Intel's offering to AMD's really shows how pathetic Intel's whole new platform is, especially considering Intel only has one(!) CPU that goes into it as of now.

    Intel Xeon W-3175X
    28c56t, 3.1GHz base, 4.3GHz boost, 192GB RAM (6-channel), 44 PCIe lanes, 38.5MB L3 cache, 255W

    AMD TR 2990WX
    32c64t, 3.0GHz base, 4.2GHz boost, 128GB RAM (4-channel), 60 PCIe lanes, 64MB L3 cache, 250W

    The only disadvantage of the AMD WX CPUs right now is a Windows bug causing them to perform worse than the 16-core 2950X, which is apparently the result of a fix for some old Xeon models fucking things up for the new AMD ones. Also, I would understand Intel's CPU more if it went into their current HEDT motherboards, but I guess Intel can't quite shrink their Xeon 8180 to plop into an LGA2066 socket just like that...


  • Considered Harmful

    @Atazhaia said in WTF Bites:

    for the power users

    Does that mean running several RGB real time controllers made using Node.js? Those would definitely be power users.



  • @Gąska said in WTF Bites:

    Why not?

    Oddly appropriate for this topic:

    Notice zero results from Tomb Raider from our new CPUs? This benchmark does not seem to like any arrangement above 12 cores per socket, and refuses to run.

    *snerk*


  • Banned

    @cvi the funniest part is that no, I didn't.


  • I survived the hour long Uno hand

    Imagine this scenario. It's 2019. You're setting up a new IaaS VM in Azure. It comes with Remote Desktop already configured and enabled, so that you can actually manage it over that shiny VPN connection you have set up to your Azure Virtual Network.

    Which of the following other firewall rules will be enabled by default?

    • Allow ICMP pings
    • Allow Multicast DNS
    • Allow UPnP
    • Allow Cast To Device streaming server
    All of them except pings :kermit_flail:

  • Banned

    @Atazhaia said in WTF Bites:

    @Gąska Well... Would you pay $5000 for a heavily multi-threaded CPU for running primarily single-threaded tasks when you can get something better for single-threaded performance for under $500?

    Gamers aren't rational people. And computer manufacturers know it. Besides, I think that a variant of Xeon specifically meant for desktops would be specially tuned for desktop workflows, so that it performs better there than the ones designated as server CPUs. The point wasn't that Xeon is better - the point was that Xeon is no worse.

    My last CPU cost me $250. That's the most I'm ever willing to spend on a CPU.

    Even discounting that, if you need a balance between multi-threaded and single-threaded performance, the HEDT platforms are the best. They are based on server CPUs but aimed at the desktop market, for the power users who need more threads/memory/PCIe lanes than the normal mainstream platforms can provide.

    And what exactly makes it better? Like, what's the actual difference between the architectures? Cache latencies? Northbridge throughput? Cycles per floating-point instruction? Heat dissipation issues? I just don't understand what makes server-based desktop CPUs inherently worse than non-server-based desktop CPUs for desktop use. Note that I'm talking about server-based desktop CPUs, not server CPUs. Server CPUs are underclocked to increase reliability and power efficiency. But server-based desktop CPUs, being desktop CPUs, wouldn't have those as such a high priority, and this allows the manufacturer to tune them better for desktop workflows.

    And I'm not saying Intel's offer is better than AMD's. I'm just saying there's nothing inherently wrong with desktop CPUs being based on server CPUs.


  • Discourse touched me in a no-no place

    @Gąska said in WTF Bites:

    I just don't understand what makes server-based desktop CPUs inherently worse than non-server-based desktop CPUs for desktop use.

    Server CPUs are typically tuned for doing more things in parallel, but with each being a bit slower. Exactly how best to do that depends on the details of the workload (especially how much data is being moved and in what pattern). There really isn't a general rule, and of those two CPU architectures, neither looked uniformly better than the other; e.g., if your problem doesn't fit in 128GB, only being able to address that much physical memory will be a fatal mark against it.

    HPC loads are similar, but the limiting factors there tend to be a combination of heat dissipation and communication synchronization delays. And HPC platforms tend to use fewer OS layers on the majority of their worker nodes.


  • Banned

    @dkf said in WTF Bites:

    Server CPUs are typically tuned for doing more things in parallel, but with each being a bit slower.

    And I imagine a server-based desktop CPU wouldn't be - at least not to the extent the architecture lets you avoid it. And Xeons are already very good at single-threaded work. I don't know how much more they can be improved, but I'd bet on at least 5-10% without too much fuss (when you're the manufacturer designing the new desktop CPU, not an end user).

    This new desktop Xeon might very well be worse than Threadripper, but it won't be because it's based on server architecture. It would be because this particular processor is worse.


  • Discourse touched me in a no-no place

    @Gąska said in WTF Bites:

    I don't know how much more they can be improved

    Tricky. It depends on the process technology in use, and the proportion of different types of gate used (fast high-leakage-current vs slower low-leakage-current; the leakage current is a significant fraction of what drives power consumption). As I understand it, there's a bunch of other technology trade-offs too at the hardware level, but that's a bit too far outside my expertise; I can understand the issues when I'm told that they're an issue but I can't spot them from afar…


  • I survived the hour long Uno hand

    @izzion
    BONUS WTF!!!

    So, when setting up a vNIC for an Azure VM, there are options in the portal to DHCP-static-reserve an IP address for it, and to configure which DNS servers are used (Azure default or a specified list at the vNet level, and inherit from the vNet or a specified list at the VM level). Of course, the vNet DNS settings I have configured are set up for after the Azure VM I'm working on is promoted to a domain controller, and the promotion isn't going to work with that VM trying to use itself as primary DNS, since it doesn't know shit about the AD structure yet. So I figure, temporary change, the VM is letting me use the Windows commands to override the DNS server settings on the network adapter, ez pz...

    NOOOOOOOOOOOPE! Congratulations, you just broke the hell out of your vNIC, which now won't talk to anything except the local vNet subnet (thankfully I already had a second VM for the file server that's going in Azure deployed...)

    :facepalm:


  • Banned

    @dkf on the other hand, just how different can two CPUs from the same architecture family, from the same manufacturer, be? It's very likely much of the design is shared between Cores and Xeons. It would definitely make economic sense to keep them as similar as possible if it didn't impact performance too much.



  • @Atazhaia said in WTF Bites:

    Intel Xeon W-3175X
    28c56t, 3.1GHz base, 4.3GHz boost, 192GB RAM (6-channel), 44 PCIe lanes, 38.5MB L3 cache, 255W

    Wikichip says it supports up to 512GB RAM. Anyway, the W series seems to be targeted at workstations rather than servers.

    For HPC nodes, you'd probably see a different series - a 512GB RAM limit is somewhat of a limitation there (even if there are only a few "large memory" nodes with more than 512GB, the centers seem to prefer having the same CPU in the whole cluster).


  • Discourse touched me in a no-no place

    @Gąska said in WTF Bites:

    just how different can two CPUs from the same architecture family, from the same manufacturer, be?

    I really don't know. Or rather I know for our stuff (where the answer is “very, to the point where they're not really the same architecture at all”) but we're not a commercial CPU maker so 🤷♂



  • @Applied-Mediocrity said in WTF Bites:

    Does that mean running several RGB real time controllers made using Node.js?

    I don't think that's realistic. That CPU only has 32 cores. 🐠


  • Java Dev

    @Gąska said in WTF Bites:

    just how different can two CPUs from the same architecture family, from the same manufacturer, be?

    Based on what I read, AMD uses the exact same cores for Epyc, Threadripper and Ryzen. They make 4-core chiplets and typically glue them together into 8-core modules, using one module for Ryzen and two or four for Threadripper and Epyc.

    Intel prefers their old monolithic design, where they make a few dies and just disable broken cores during binning. So at the moment it's an 8-core die for the mainstream, low-core-count (10) and high-core-count (18) dies for the X-series/entry-level Xeons, and a 28-core die for high-end Xeons.

    The benefit AMD has is higher yields from a less complicated design. Intel derided them for it at the start, but it seems Intel is now looking at using the same tech, since they are having production issues and low yields.


  • Notification Spam Recipient

    @izzion isn't Azure wonderful?


  • Discourse touched me in a no-no place

    @Atazhaia said in WTF Bites:

    They make 4-core chiplets and typically glue them together into 8-core modules, using one module for Ryzen and two or four for Threadripper and Epyc.

    We're doing something similar for our next chip. Except we're using simpler “chiplets” (I like that term!) and a lot more of them, around 30 or so. I forget how many exactly; it depends on just how much space is needed for the on-chip memory controller and comms controller.


  • Banned

    @dkf said in WTF Bites:

    “chiplets”

    c8a927a0-cc49-47ed-b50e-4f6afd6267e3-obraz.png



  • Updating to new insider build in my VM. Clicked 'Restart'. Um, this is different...

    f9ca7f5e-b777-4b2f-bf3e-2f198e9c09d9-image.png

    edit: Using the Start menu worked.


  • Java Dev

    @dcon Meanwhile, Windows on my computer kept reminding me to "keep my computer on" so updates could install automatically. I dismissed that message thrice in two hours. Yes, I got it the first time, and no, you will install them when I shut off my computer; I won't leave it on.



  • @Atazhaia said in WTF Bites:

    you will install them when I shut off my computer; I won't leave it on.

    I wouldn't risk angering our new Windows Update overlords if I were you. Remember, they control when your computer turns on and off.



  • @dcon said in WTF Bites:

    Updating to new insider build in my VM. Clicked 'Restart'. Um, this is different...

    Ugh. Insider VM updates take for-ev-er on my setup since the change to do updates in the background before restarting. I wish I could go back to the old way. :P


  • Banned

    Why do so many developers not get the difference between a tick in a checkbox and a small square in a checkbox!?


  • Banned

    As an example, just look at the pricing on the current Samsung 4K TV UHD TVs, as listed through the company's website. On April 29, if you wanted to buy a 55-inch MU8000, it would cost you $1,299.99 (as part of a $200-off sale). If you wanted to get the 65-inch version of that same TV, it would cost you $2,199.99. A $900 increase is a considerable amount of money, to be sure, but it's not outrageous. However, if you wanted to step up from the 65-inch model to the 75-inch one, it would cost $3,499.99--an increase of $1,300.

    A 70% increase isn't outrageous, but a 60% one is. This is why we shouldn't let the journalists do the math.
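
    For reference, those percentages, from the article's own (rounded) prices:

    # price jumps from the quoted article, rounded to the nearest hundred dollars
    print(900 / 1300)    # 55" -> 65": ~0.69, i.e. roughly a 70% increase ("not outrageous")
    print(1300 / 2200)   # 65" -> 75": ~0.59, i.e. roughly a 60% increase ("outrageous")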



  • @Gąska said in WTF Bites:

    As an example, just look at the pricing on the current Samsung 4K TV UHD TVs, as listed through the company's website. On April 29, if you wanted to buy a 55-inch MU8000, it would cost you $1,299.99 (as part of a $200-off sale). If you wanted to get the 65-inch version of that same TV, it would cost you $2,199.99. A $900 increase is a considerable amount of money, to be sure, but it's not outrageous. However, if you wanted to step up from the 65-inch model to the 75-inch one, it would cost $3,499.99--an increase of $1,300.

    A 70% increase isn't outrageous, but a 60% one is. This is why we shouldn't let the journalists do the math.

    It gets even funnier when you account for the screen area increasing with the square of the diagonal. As in, $/m².

    But this is tech reporting. You know the saying: "Those who can't do, teach. Those who can't teach, review. Those who can't review, report."

    This got hammered home to me by a news story about a journalist setting off a fire alarm when they tried to destroy a hard drive with a hammer. I wondered aloud at work how that was even possible. A coworker pointed out that the person in question was a former technology reporting lead for a newspaper, so they'd probably mistaken the laptop battery for the hard drive. This was later confirmed.


  • BINNED

    @acrow said in WTF Bites:

    It gets even funnier when you account for the screen area increasing with the square of the diagonal. As in, $/m².

    Does it, though?
    Assuming a 16:9 aspect ratio:

    >>> def price_per_area(price, diagonal):
    ...     return price / (diagonal**2 * 16*9 / (16*16+9*9))
    ... 
    >>> price_per_area(1300, 55)
    1.0057392102846647
    >>> price_per_area(2200, 65)
    1.2186061801446417
    >>> price_per_area(3500, 75)
    1.4561728395061728
    

    The larger ones are more expensive per area.



  • @topspin Hmm... true. And, as I just now checked, even the price increase per unit of added area increases as you go to the larger screens.
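
    A quick check, reusing topspin's 16:9 area formula with the same rounded prices (dollars per square inch of added area):

    def area(diagonal):
        # screen area of a 16:9 panel with the given diagonal, in square inches
        return diagonal**2 * 16*9 / (16*16 + 9*9)

    # marginal cost of the extra area at each step up in size
    print((2200 - 1300) / (area(65) - area(55)))  # ~1.76
    print((3500 - 2200) / (area(75) - area(65)))  # ~2.17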

    I am misled by my intuition. I am shocked. As an engineer, I feel ashamed of myself.



  • @acrow said in WTF Bites:

    @topspin Hmm... true. And, as I just now checked, even the price increase per unit of added area increases as you go to the larger screens.

    I am misled by my intuition. I am shocked. As an engineer, I feel ashamed of myself.

    It's not terribly surprising, though. I don't know anything about screens, but for chips, for example, doubling the size of the chip roughly doubles the manufacturing cost, because you get only half as many from a wafer (which, all else being equal, has a more or less fixed processing cost). However, it also roughly doubles the probability of a defect, so you don't get half as many from a wafer; you get less than half, because of the higher defect rate.
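
    As a back-of-the-envelope illustration, here's the textbook Poisson yield model; the wafer size and defect density below are made-up example numbers, not figures for any real process:

    import math

    WAFER_AREA = math.pi * 15**2  # 300 mm wafer, ~707 cm^2
    DEFECT_DENSITY = 0.1          # defects per cm^2 - made-up example value

    def good_dies_per_wafer(die_area):
        candidates = WAFER_AREA / die_area                      # ignoring edge losses
        yield_fraction = math.exp(-DEFECT_DENSITY * die_area)   # Poisson yield model
        return candidates * yield_fraction

    print(good_dies_per_wafer(1.0))  # ~640 good 1 cm^2 dies
    print(good_dies_per_wafer(2.0))  # ~289 good 2 cm^2 dies - less than half, not just half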

    I would think screens have a similar situation. Bigger ones cost more per area because they're more difficult to make. Also, perhaps, because they sell fewer of them, so manufacturing doesn't benefit from the economy of scale that the smaller sizes do.

    Another area where you see a similar phenomenon is the cost of diamonds. A 2 carat diamond costs far more than two 1 carat diamonds of the same quality, because it's much rarer to find diamonds that big.



  • @HardwareGeek I was aware of this. I just expected other per-screen costs to eclipse the increased defect rate to the extent that larger screens were more economical per unit of display area.
    This may or may not stem from my experience of purchasing computer monitors. As far as I can remember, I have never paid money for a TV. And it has been years since my last purchase of a monitor. When I last considered monitors, 22in and 24in models from the same manufacturer had a price difference of €10. However, this is Finland, so prices are a 🐠 .


  • Banned

    @HardwareGeek @acrow also, as I found out, factories that are even capable of producing >65in LCD panels are very few and far between. We should expect a massive drop in prices of 65-75in TVs by 2021, like we've had recently with 55in. This is a very bad time to buy a large TV.



  • @Gąska said in WTF Bites:

    This is a very bad time to buy a large TV.

    You want a bigger TV for cheap?

    Move your sofa closer to it 🧘♂


  • Banned

    @TimeBandit but then I have a smaller living room.


  • Considered Harmful

    @Gąska
    A Heringsvolk representative has been dispatched and will be with you shortly, to sort out any and all issues concerning the Lebensraum.


  • Banned

    @Applied-Mediocrity I guess it doesn't help that I live in Breslau.



  • @Gąska said in WTF Bites:

    I live in Breslau

    You're stuck in the cold too? :trollface:


  • Banned

    @TimeBandit why does every European city have an identically named town somewhere in America?

    And fuck you for linking to mobile Wikipedia.



    @Gąska probably because the people who named those towns came from Europe



  • @Gąska said in WTF Bites:

    @TimeBandit why does every European city have an identically named town somewhere in America?

    One thing I've learned while building my own world (fantasy, for games and stories) is that naming things isn't only a hard problem for computer science applications.

    That's why most names, if you actually look at them, are either
    a) duplicates
    or
    b) descriptions of physical features, frequently redundant ones (the famous Torpenhow Hill, "hill hill hill hill", or the River Avon, "river river").

    And there's only so many ways of describing things.

    Edit: Oh, and we Americans often copy our own names in different places. For example, there's a Kansas City, Missouri and a Kansas City, Kansas. The one in Missouri is bigger. (It's really the same city, separated by a state line/river).


  • BINNED

    @Benjamin-Hall When a name doesn't have to be indicative of anything nor meaningful in any way, it's very easy to stick some random syllables together and call it a day.

    Why, I've even written a simplistic context-free grammar that can do it. How's "raunard" sound?
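
    Something in this spirit, though this is just an illustrative sketch rather than the actual grammar (syllable parts made up on the spot):

    import random

    # toy grammar: Name -> Syllable Syllable [Syllable]; Syllable -> Onset Vowel Coda
    ONSETS = ["r", "n", "k", "l", "th", "v", "gr", ""]
    VOWELS = ["a", "au", "e", "i", "o", "u"]
    CODAS  = ["n", "rd", "l", "s", ""]

    def syllable():
        return random.choice(ONSETS) + random.choice(VOWELS) + random.choice(CODAS)

    def name():
        return "".join(syllable() for _ in range(random.randint(2, 3))).capitalize()

    print(name())  # e.g. "Raunard", with a bit of luck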


  • Banned

    @kazitor said in WTF Bites:

    How's "raunard" sound?

    Like a fantasy medieval scholar. Good bot.



  • @kazitor said in WTF Bites:

    @Benjamin-Hall When a name doesn't have to be indicative of anything nor meaningful in any way, it's very easy to stick some random syllables together and call it a day.

    Why, I've even written a simplistic context-free grammar that can do it. How's "raunard" sound?

    That runs into severe problems. Don't get me wrong--I did the "faceroll the keyboard" method for the first few, but then realized that it was
    a) hard to pronounce reliably
    b) prone to making things that sound like bad words (I mostly play with teenagers...)
    c) impossible for me to remember
    d) a poor fit for the cultures/world I was building.

    One of my biggest pet peeves about fantasy (and science fiction) writers is when they just make up their own words for things that are, at their core, regular "earth" things. No, that's not a Q'ark'ls'vdsa, it's a cow.


  • BINNED

    @Benjamin-Hall Would you be surprised if I said TV Tropes has a page for that™?
    https://tvtropes.org/pmwiki/pmwiki.php/Main/CallARabbitASmeerp



  • @Benjamin-Hall said in WTF Bites:

    No, that's not a Q'ark'ls'vdsa, it's a cow

    But is the Q'ark'ls'vdsa spherical and in vacuum?



  • @kazitor said in WTF Bites:

    @Benjamin-Hall Would you be surprised if I said TV Tropes has a page for that™?
    https://tvtropes.org/pmwiki/pmwiki.php/Main/CallARabbitASmeerp

    TV Tropes needs a time-waster warning!

    @cvi only if physicists need to examine it.


  • Java Dev

    @Benjamin-Hall said in WTF Bites:

    prone to making things that sound like bad words

