AMD Hype!


  • BINNED

    @Gąska said in AMD Hype!:

    @DogsB there was a time when, as part of driving test, you had to change a wheel.

    temporarily borrowing a belt onion from my mom, who lived in Soviet Poland

    That kind of sucks if you’re a woman or not particularly strong.


  • Notification Spam Recipient

    @Gąska said in AMD Hype!:

    @DogsB there was a time when, as part of driving test, you had to change a wheel.

    temporarily borrowing a belt onion from my mom, who lived in Soviet Poland

    there was also a time when people understood abstractions, but apparently that's a bygone era too.


  • Java Dev

    @dkf Nvidia feels like they are starting to make a lot of airheaded decisions lately. Not just buying ARM, but also in GPU design. They seem to be going in the same direction as AMD did with GCN, making one unified GPU architecture for both workstation and gaming use. Sure, it's cheaper, but it also means losing the ability to optimize the design. At the same time they are segmenting their GPUs more by killing off their only hybrid solution, the Titan, rebranding it as the RTX 3090 and making a last-minute change to push it as a gaming GPU.

    Meanwhile, AMD realized the unified GPU architecture had made them lose out on high-end gaming and are now developing a gaming-specific architecture instead. And if their teaser numbers hold up, they will now match the 3080 in performance, which would be an amazing jump forward and truly put them back in the game. The 3090 is such a halo product that it's not even funny. Twice the cost for 10% performance gain over the 3080, and you don't even get all of the workstation features that the Titan offered.


  • Discourse touched me in a no-no place

    @Atazhaia said in AMD Hype!:

    The 3090 is such a halo product that it's not even funny. Twice the cost for 10% performance gain over the 3080, and you don't even get all of the workstation features that the Titan offered.

    If they can reduce the price, it'll become relevant beyond a month or two. If not… well, there's lots of hardware that's been a flash in the pan (and lots of software too).



  • @topspin said in AMD Hype!:

    @Gąska said in AMD Hype!:

    @DogsB there was a time when, as part of driving test, you had to change a wheel.

    temporarily borrowing a belt onion from my mom, who lived in Soviet Poland

    That kind of sucks if you’re a woman or not particularly strong.

    The tire fasteners on a car shouldn't be so heavily torqued that a small woman can't loosen them with a decent lever and her body weight.
    But that said, I've had to use a 2 m pipe as a lever on lug nuts that, when they finally loosened, sounded like gunshots.


  • And then the murders began.

    @DogsB said in AMD Hype!:

    @Gąska said in AMD Hype!:

    @HardwareGeek it's kinda sad that you feel the need to explain what FPGA is on this forum.

    We're software developers not hardware engineers.

    That's the beauty of FPGAs: you can be both at once!


  • Banned

    @dkf said in AMD Hype!:

    @Atazhaia said in AMD Hype!:

    The 3090 is such a halo product that it's not even funny. Twice the cost for 10% performance gain over the 3080, and you don't even get all of the workstation features that the Titan offered.

    If they can reduce the price, it'll become relevant beyond a month or two. If not… well, there's lots of hardware that's been a flash in the pan (and lots of software too).

    They had to bribe the resellers to keep the 3080's MSRP at what it is. While the 3090 certainly has much bigger margins, I doubt either will go down in price anytime soon.



  • @dkf said in AMD Hype!:

    the Nvidia model (they sell the chips and cards based on the chips)

    Btw, can someone explain to me what exactly they are selling, and what various card sellers you actually buy the card from do on top of what Nvidia provides them?

    Like, if I search for e.g. "GTX 1650 super" I can see that an MSI isn't identical to a Gigabyte or a Zotac, or an Asus, but apart from putting their sticker and a different shape of fan on top of the card, what do these companies do, exactly, and how does that matter?

    Is it just that Nvidia does 90% of the final product and lets various people, basically, put a different sticker on it and make some money, or do these people really add value? If the former, why would Nvidia not do it themselves and keep that additional margin? If the latter, why is there so little attention given to the actual brand in most reviews? (I mean, most reviews do mention the brand, but if you search for e.g. "GTX 1650 super asus msi" the only actual comparison is some sort of automated website that is probably also able to generate a comparison page for "PHP for dummies" vs. "Raku user manual", so it's not like it's a very important part of the reviews, otherwise there would be tons of actual comparison pages -- and yet that page shows some different numbers for both brands, so...?)


  • Discourse touched me in a no-no place

    @remi said in AMD Hype!:

    do these people really add value?

    LEDs!

    Also, maybe a fancier cooling setup. But that would potentially be actual value.


  • Discourse touched me in a no-no place

    @remi said in AMD Hype!:

    Btw, can someone explain to me what exactly they are selling, and what various card sellers you actually buy the card from do on top of what Nvidia provides them?

    Nvidia supply the specifications and the reference design, but the manufacturers can manufacture them slightly differently.

    See the recent crashing issue on the 3080 which was related to which types of capacitor had been used.

    Also cooling can be different, speeds can be different (although ultimately still within spec), aesthetics etc.


  • Banned

    @remi Nvidia manufactures the GPU chips. Partner vendors take that chip, put it on a PCB, connect it with other components and sell a complete graphics card. Nvidia also sells some 1st party cards, but it's always in very limited numbers. They used to be called reference cards and their main characteristic was that their cooling sucked ass. Nowadays they call it Founders Edition, and put some actually decent cooling on it - directly competing with their partners. I don't know if they plan to switch to a 1st party business model completely, or if they want it to be a collector's item to add hype, but that's how it currently is.

    Edit: by manufacture, I mean they outsource the actual manufacturing to Samsung. Nvidia makes designs, orders them to be made, then takes some for itself and ships the rest to its partners.

    Edit 2: they outsource it to Samsung because TSMC was fully booked by AMD, in large part thanks to Nvidia earlier trying to strongarm them into cheaper prices and TSMC telling them to GTFO and cancelling the contract they already had. It's important because TSMC has a superior transistor node design, which would have allowed the RTX 3000 series to be even more powerful and more energy efficient.



  • @Gąska said in AMD Hype!:

    Edit: by manufacture, I mean they outsource the actual manufacturing to Samsung. Nvidia makes designs, orders them to be made, then takes some for itself and ships the rest to its partners.

    FWIW, almost all chip companies do this. It costs something like a billion (yes, with a b) dollars to build a new chip factory and a substantial fraction of that every couple of years when a new, smaller process comes out. Not many companies have that kind of capital, so most of them contract with companies like Samsung or TSMC that make chips for hire.



    NVIDIA is also somewhat invested in the HPC market. If I counted correctly, six out of the top ten supercomputers have a pile of NVIDIA GPUs (and a seventh seems to at least have a few).

    Number 1 is based around ARM chips.

    Pretty much all of the smaller clusters have some NVIDIA GPUs for computing. It'll take a while before smaller centers go for something other than x86_64 due to software lock-in, but with the number one system being ARM, I'm sure people will be looking into that alternative. Especially if there is a relatively sweet deal from NVIDIA.

    Edit: NVIDIA also acquired Mellanox recently. They are one of the traditional manufacturers for the high-end networking infrastructure that you find in supercomputers.


  • :belt_onion:

    @remi Really, as others have said, the big difference is build quality.

    While there is a small difference in performance between different manufacturers, most of the difference is in components/cooling design/aesthetics. Cheaper cards are more likely to fail; better cards run cooler and have slightly (or significantly) better parts.

    Most of the "comparison" websites are there to compare the raw card power, which doesn't really change much between the manufacturers.


  • Considered Harmful

    @sloosecannon said in AMD Hype!:

    Cheaper cards are more likely to fail

    Nope.


  • :belt_onion:

    @Applied-Mediocrity said in AMD Hype!:

    @sloosecannon said in AMD Hype!:

    Cheaper cards are more likely to fail

    Nope.

    Uh.. yes.
    The recent RTX 3080 story pretty much proves exactly this. Manufacturers who cheaped out on capacitors had issues; ones that didn't, worked fine.


  • Considered Harmful

    @sloosecannon It happens (Radeon 7850/70 had similar issues, with MSI being especially stubborn about it), but it does not prove anything. My four and a half years of managing warranty (2012-2016) says this. There is no obvious correlation between manufacturers and brands. Fancy windforces and ROG custom boards stuffed with chokes broke just as much as plastic reference blowers. Fancy designs are there to bleed your wallet and extend your peen. They do have better thermals, but that's because reference design is shit, not the other way round.


  • :belt_onion:

    @Applied-Mediocrity said in AMD Hype!:

    @sloosecannon It happens (Radeon 7850/70 had similar issues, with MSI being especially stubborn about it), but it does not prove anything. My four and a half years of managing warranty (2012-2016) says this. There is no obvious correlation between manufacturers and brands. Fancy windforces and ROG custom boards stuffed with chokes broke just as much as plastic reference blowers. Fancy designs are there to bleed your wallet and extend your peen.

    And I'm saying as someone who's worked returns in a computer hardware store, that the extremely cheap brands are especially likely to have issues. Usually, they don't. But if the wrong corners are cut, well...

    EDIT: Also, I recommend staying away from GIGABYTE for motherboards, but that's for an entirely different reason


  • Banned

    @HardwareGeek said in AMD Hype!:

    @Gąska said in AMD Hype!:

    Edit: by manufacture, I mean they outsource the actual manufacturing to Samsung. Nvidia makes designs, orders them to be made, then takes some for itself and ships the rest to its partners.

    FWIW, almost all chip companies do this. It costs something like a billion (yes, with a b) dollars to build a new chip factory and a substantial fraction of that every couple of years when a new, smaller process comes out. Not many companies have that kind of capital, so most of them contract with companies like Samsung or TSMC that make chips for hire.

    Yeah, I know. I just wanted to clarify. Also, seeing how Intel ran itself into a corner, it seems to be a superior business strategy to having your own fabs.



  • @sloosecannon said in AMD Hype!:

    EDIT: Also, I recommend staying away from GIGABYTE for motherboards, but that's for an entirely different reason

    I stopped using Gigabyte motherboards after the last one I had killed itself during a BIOS update. I don't miss those days.


  • Considered Harmful

    @sloosecannon said in AMD Hype!:

    And I'm saying as someone who's worked returns in a computer hardware store, that the extremely cheap brands are especially likely to have issues

    Alright, I'm not so unreasonable as to discount the experience of folks who have actually done shit. I did not see the corner-cutting problem with video cards, though. Sure, Palit, Club3D, Zotac, the least expensive. Not outliers by any measure.

    Gigabyte also used to do shady shit such as changing the entire board design for worse between revisions (their annoying rev. designation that was rarely to be found in supplier descriptions). But on the whole it was more of a concern to... uh, enthusiasts. These parts worked for small business and home users, and that's where most of the market was.

    It certainly happened with stuff where it's much easier to just take some units and stamp a logo on them. Power supplies were blatantly violating advertised specs so much that the EU got tired of it and made rules. The tablet craze swamped the market with no-name MediaTeks. And there certainly was bad stuff from big names, too.





  • @robo2 said in AMD Hype!:

    internally it is all still ucs2, afaik.

    UTF-16 as of Windows 2000, actually.

    Which is truly great because, unlike UCS-2, UTF-16 is variable-length (like UTF-8), but like UCS-2, each "character" takes up at least two bytes, which is a waste of memory most of the time (Asian languages being a notable exception).
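    The size trade-off is easy to check from any language that exposes encoders; a quick Python sketch (illustrative only, not tied to Windows internals):

```python
# Compare encoded sizes: UTF-16 spends two bytes per ASCII character
# where UTF-8 spends one, while CJK text is the case where UTF-16
# comes out ahead. Emoji need four bytes in both encodings
# (a surrogate pair in UTF-16).
for s in ["hello", "漢字", "😀"]:
    u8 = len(s.encode("utf-8"))
    u16 = len(s.encode("utf-16-le"))  # -le avoids counting a BOM
    print(f"{s!r}: utf-8={u8} bytes, utf-16={u16} bytes")
```

    So "hello" is 5 bytes in UTF-8 but 10 in UTF-16, while "漢字" is 6 vs. 4, which is the trade-off described above.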



  • @HardwareGeek said in AMD Hype!:

    I can't address AMD's business case for acquiring Xilinx,

    Probably for the same reason that Intel acquired Altera?



  • @Deadfast But emoji take up four bytes instead of nine.


  • :belt_onion:

    @Applied-Mediocrity said in AMD Hype!:

    @sloosecannon said in AMD Hype!:

    And I'm saying as someone who's worked returns in a computer hardware store, that the extremely cheap brands are especially likely to have issues

    Alright, I'm not so unreasonable as to discount the experience of folks who have actually done shit. I did not see the corner-cutting problem with video cards, though. Sure, Palit, Club3D, Zotac, the least expensive. Not outliers by any measure.

    Gigabyte also used to do shady shit such as changing the entire board design for worse between revisions (their annoying rev. designation that was rarely to be found in supplier descriptions). But on the whole it was more of a concern to... uh, enthusiasts. These parts worked for small business and home users, and that's where most of the market was.

    It certainly happened with stuff where it's much easier to just take some units and stamp a logo on them. Power supplies were blatantly violating advertised specs so much that the EU got tired of it and made rules. The tablet craze swamped the market with no-name MediaTeks. And there certainly was bad stuff from big names, too.

    Yeah. Overall you're not particularly likely to have issues, no matter who you go with, and perhaps the wording in my post wasn't super clear on that. But yeah, corner cutting sometimes leads to, well.. see above... Haha


  • Banned

    Click to embiggen.

    [image]



  • No mention of RT performance...


  • Banned

    "This is a level of realism that wasn't possible before."

    >shows World of Warcraft footage


  • Banned

    [image]


  • Discourse touched me in a no-no place

    @Gąska So… in a similar performance category to the 3090, with some games doing better (at 4K) in one and some in the other. Kudos to them for putting numbers up that admit they're not better across the board.


  • Banned

    I'm quite disappointed with pricing, though.

    6900 XT @ $999 - much more reasonable than the 3090's price, but if the performance really is the same, that's still not good value.
    6800 XT @ $649 - just on the edge of not being worth it compared to the 3080 at MSRP, and only if they really deliver matching performance. Although they may be hoping that the 3080 won't be sold anywhere close to MSRP anymore.
    6800 @ $579 - compared to the 3070, you're basically paying 80 bucks for more VRAM. And if you're not gaming at 4K, you don't need that VRAM anyway.

    As it stands, I think I'm gonna go with 3070 after all.


  • Discourse touched me in a no-no place

    @Gąska said in AMD Hype!:

    I'm quite disappointed with pricing, though.

    That'll depend on how they ramp up manufacturing volumes. Early releases of a particular chip are often expensive, but as they get the yields up, prices fall. (That boosts the fraction of demand that gets satisfied, etc.)
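    The yield effect is simple to sketch: a wafer costs roughly the same no matter how many dies on it work, so the cost per sellable chip scales inversely with yield. The numbers below are made up purely for illustration:

```python
# Toy cost model: a fixed wafer cost spread across the working dies.
def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
    return wafer_cost / (dies_per_wafer * yield_rate)

early = cost_per_good_die(10_000, 60, 0.4)   # hypothetical early, low-yield run
mature = cost_per_good_die(10_000, 60, 0.8)  # hypothetical mature process
print(round(early), round(mature))  # doubling the yield halves per-chip cost
```

    Which is why early runs of a new chip tend to launch expensive and drift down once the process matures.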


  • Discourse touched me in a no-no place

    @dkf said in AMD Hype!:

    That'll depend on how they ramp up manufacturing volumes

    They just need to actually make some to beat Nvidia 🍹


  • Java Dev

    As I am gaming at 4K, methinks I'll go with AMD. As 10 GB of VRAM is on the edge of acceptable for 4K right now, the 3080 is pretty dead at that resolution as there is zero future-proofing. The 6800 XT is a very compelling card for that, though. (Or the 6900 XT if I wanna be insane.) Think I'll do a combined CPU, GPU and RAM upgrade. Just gonna see how the release rush is; I don't need to get in on release and can just wait until supplies stabilize. Surprised it took AMD this long to actually add a benefit to running both AMD CPU and GPU, though.


  • Discourse touched me in a no-no place

    @Atazhaia said in AMD Hype!:

    Surprised it took AMD this long to actually add a benefit to running both AMD CPU and GPU, though.

    It's only recently they added a benefit to running AMD at all.


  • Banned

    @Atazhaia said in AMD Hype!:

    Surprised it took AMD this long to actually add a benefit to running both AMD CPU and GPU, though.

    Huh? I must've missed that part. Unless you just mean they finally made both a good CPU and a good GPU in the same year?


  • Discourse touched me in a no-no place

    @Gąska said in AMD Hype!:

    @Atazhaia said in AMD Hype!:

    Surprised it took AMD this long to actually add a benefit to running both AMD CPU and GPU, though.

    Huh? I must've missed that part. Unless you just mean they finally made both a good CPU and a good GPU in the same year?

    The Smart Access Memory thing which only works between the Ryzen 5000 and Radeon 6000.



  • @Atazhaia said in AMD Hype!:

    Surprised it took AMD this long to actually add a benefit to running both AMD CPU and GPU, though.

    The presentation also mentioned a specific chipset, so you also need to get the right kind of motherboard.


  • Java Dev

    @cvi I already have the chipset, it's just a matter of CPU and GPU upgrade.


  • Discourse touched me in a no-no place

    Straight outta Youtube comments...

    Calling a performance gain of 1-2% "rage mode" is like calling your chihuahua Cerberus


  • Banned

    @loopback0 hey, free performance!



  • @loopback0 said in AMD Hype!:

    Calling a performance gain of 1-2% "rage mode" is like calling your chihuahua Cerberus

    "Rage" may describe the state of gamers when they realize it doesn't make any noticeable difference in performance.


  • Banned

    @Zerosquare said in AMD Hype!:

    gamers when they realize

    [image]


  • Banned

    Update: 3070 has now been released, and it seems to be selling for closer to $600 than $500. So, assuming the AMD MSRPs will be real prices, there's no reason to buy 3070. And out of the two 6800s, the XT is just a better deal - 25% more performance for 15% more money.

    Goddamn looks like I'll be spending $700 on a GPU after all, just not the one I thought...
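    Taking the post's own deltas at face value, the value comparison is a one-liner (the 25% and 15% figures are the estimates above, not benchmarks):

```python
# Performance per dollar of the 6800 XT relative to the plain 6800,
# using the ~25% performance and ~15% price deltas quoted above.
perf_ratio = 1.25
price_ratio = 1.15
value_ratio = perf_ratio / price_ratio
print(f"{value_ratio:.2f}x the performance per dollar")  # about 9% better value
```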


  • Notification Spam Recipient

    @Gąska said in AMD Hype!:

    Update: 3070 has now been released, and it seems to be selling for closer to $600 than $500. So, assuming the AMD MSRPs will be real prices, there's no reason to buy 3070. And out of the two 6800s, the XT is just a better deal - 25% more performance for 15% more money.

    Goddamn looks like I'll be spending $700 on a GPU after all, just not the one I thought...

    I honestly didn't see that coming. I knew the MSRP would be bullshit for the 3080 and 3090, but I was pretty sure they were going to steal the everyman crown from AMD while they were at it and make that price honest. Nvidia appears to have shot themselves in the foot with this one. How AMD will squander this one is yet to be seen.


  • Discourse touched me in a no-no place

    Nvidia.com lists the 3070 FE at $499.99, which is pretty close to $500.


  • Banned

    @loopback0 it also listed the 3080 FE for $699, and we all know how that ended up.

    Edit: also, unlike its bigger brothers, the 3070 FE has fucked-up cooling that makes it 15-20 degrees hotter than partner cards.

