Have video cards become ridiculous?



  •  http://www.newegg.com/Product/Product.aspx?Item=N82E16814150500

    I'm assuming this is targeted at people who play video games, based on the packaging. 

    -- 4 GB RAM

    -- Comes in a case shaped like a gun.  (I'm not anti-gun, but it seems a bit weird.)

    -- Free 750 watt power supply included with purchase (most likely because of the 650 watt minimum power supply requirement)

    -- Only $909

    Even if you are seriously into games (I'm not), do you really need something like this for the best performance? Or is this just a "mine is bigger than yours" thing?  I have to admit, the six display ports are sort of cool.  Best of all, I love this review:

     Pros: Multi-player shooter fans beware. This card will own
    you. I'm not kidding. I installed this card in my computer and I decided
    to "test drive" it on Call Of Duty. As the game began to load, the gun
    rose from the shipping box and hovered over my monitor. Like the melting
    metal guy in Terminator 2 my hand melted into the mouse. 9,000 kills,
    4,600 exploding vehicles, 23 different games, and 63 hours later, I
    snapped out of my Gun card induced trance. All I could hear in my
    headset was sobbing opponents. The room smelled like urine, sweat, and
    cordite. As I lay on my bed the ceiling began to sound like a Huey.
    Thrup, thrup thrup thrup...as I slipped into unconsciousness...I heard
    Jim Morrison singing.


    Cons: 1) I wet myself.  2) What I really wanted was
    the Graphics card with the Chainsaw. I only purchased the gun card
    because they were out of the Chainsaw version. Ticked me off because
    the "Agriculture Tycoon: Midwestern Soybean Farmer" game only runs on the
    Chainsaw card. I was really looking forward to seeing the detailed
    graphics of my farm. I heard that the way the chaff flies off the
    combine blades looks amazing. This game barely runs on the gun
    card...well it did but I ended up killing my farm family and 475 of my
    cows...kind of lost it for a minute. That darn gun card.
    3) Every time I sign in to play multi-player shooters, the servers clear out.


    Other Thoughts: You should probably purchase three or
    four extra gaming mice and a cast iron mouse pad. I ended up buying a
    new desk because I burned a hole in mine.



  • @El_Heffe said:

    Even if you are seriously into games (I'm not), do you really need something like this for the best performance? Or is this just a "mine is bigger than yours" thing?

    The latter.



  • Mmm. A 9500GT ought to be enough for anybody.



  • @El_Heffe said:

    Even if you are seriously into games (I'm not), do you really need something like this for the best performance?
     

    No. The benefit from each additional gigabyte of video memory and each additional GPU operation per second shows diminishing returns. Plus (not that the fanboys will understand or even care), most games, apart from the newest, still only support 32-bit, making most of that 4 GB utterly useless.

    There is something to be said, though, for being able to run a game like CoD4 at its highest settings without much lag or chop. Would I spend $909 for that experience? Hell no. To me (and again, not like the fanboys will understand or even care), there's a certain irony in someone spending almost a thousand dollars on a device used to pwn me, while I used my thousand dollars to hook up with his mom in Vegas.

    EDIT: Actually, I guess I should take back that 64-bit thing. Most games made in the last 5 years support it. I guess I'm just showing my age.
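The 32-bit point above is easy to sanity-check with arithmetic. Here's a minimal sketch, assuming the classic 32-bit Windows defaults (the 2 GiB user-mode split is an assumption about the setup, not a fact about this card):

```python
# Back-of-envelope: what a 32-bit pointer can address.
GiB = 2 ** 30

total_va = 2 ** 32        # full 32-bit address space: 4 GiB
user_va = total_va // 2   # assumed default user-mode share on 32-bit Windows: 2 GiB

print(total_va // GiB)    # 4
print(user_va // GiB)     # 2
```

So even before the question of where video memory lives, a 32-bit game can only address 4 GiB of anything, and typically far less of it in practice.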



  •  @RHuckster said:

    most games, apart from the newest, still only support 32-bit, making most of that 4 GB utterly useless.

     

    True, but isn't graphics memory managed by the driver? From what I remember, DirectX and OpenGL basically allow games/applications to say "I want this in video card memory", and the graphics driver deals with the rest; the only time you would use pointers into VRAM would probably be in CUDA or shader code.

     

    Of course, that doesn't mean the 4GB isn't still absolutely useless; it's mostly just a numbers thing.  Especially in the case of vanity-shaped cards, it just says "hey, look how much disposable income I have!"



  • @BC_Programmer said:

    True, but isn't graphics memory managed by the driver? From what I remember, DirectX and OpenGL basically allow games/applications to say "I want this in video card memory", and the graphics driver deals with the rest; the only time you would use pointers into VRAM would probably be in CUDA or shader code.
     

    I'll admit I'm totally ignorant on this matter, but I was under the impression that a 32-bit application still has to treat the address space of graphics memory as a 32-bit integer, and that the runtime is generally agnostic about exactly where that memory is stored. That would mean that if you have, say, 3 GB of standard memory used up, you can only make use of 1 GB of the video card memory, whether it's shader code or what have you. You could argue that DirectX and OpenGL might have some kind of 64-bit-compatible helper that deals with the shading and such on the side, but AFAIK that's not commonly done, most likely due to the varying nature of graphics and physics engines, which are often home-grown by the game developer.

    Again, maybe I shouldn't really voice anything on this matter since I truly don't know anything about game programming and graphics/physics engines.

    @BC_Programmer said:

    Of course, that doesn't mean the 4GB isn't still absolutely useless; it's mostly just a numbers thing.  Especially in the case of vanity-shaped cards, it just says "hey, look how much disposable income I have!"

    Yes, but it's [i]inside[/i] your computer... and when you're playing a deathmatch online, nobody is going to know about it. It's only good for showing off in those stupid "Show us your rigs!" threads on gaming forums.

     



  • The memory on the graphics card does count toward the 4 GB limit of 32-bit operating systems, I know that much. The vanity-shaped cards I don't get, but the $600 normal-shaped cards are for playing Crysis Warhead at max settings at 2560x1600 at 60 fps.
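For a sense of why that 2560x1600-at-60fps target is demanding, here's rough throughput arithmetic; the resolution and frame rate come from the post above, while the overdraw factor is a made-up assumption for illustration:

```python
# Pixel throughput needed for 2560x1600 at 60 fps.
width, height, fps = 2560, 1600, 60
pixels_per_second = width * height * fps
print(pixels_per_second)              # 245760000 (~246 Mpixel/s)

overdraw = 3  # hypothetical: each pixel shaded ~3 times per frame
print(pixels_per_second * overdraw)   # 737280000
```

Roughly a quarter of a billion visible pixels per second before any overdraw, which is why this corner of the market exists at all.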



  • @delta534 said:

    The memory on the graphics card does count toward the 4 GB limit of 32-bit operating systems, I know that much.
    If you spent $900 on a video card then the 32 bit 4 GB limit probably isn't an issue, since you're probably using one of these and some of these.

    The last video card I bought cost $59.



  • @delta534 said:

    The memory on the graphics card does count toward the 4 GB limit of 32-bit operating systems, I know that much.

     

    Only the I/O addresses used by the video card count against the consumer Windows 4GB limit. Video memory isn't "mapped" into physical memory addresses, but physical memory addresses are needed to communicate with the card.
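That aperture point can be sketched numerically. This is a minimal illustration; the reservation sizes below are assumptions, not any real machine's memory map:

```python
# Why 32-bit consumer Windows sees less than 4 GB of RAM:
# MMIO apertures (video card BAR, other PCI devices, firmware)
# consume part of the 4 GiB physical address space, so that much
# RAM cannot be addressed.
GiB = 2 ** 30
MiB = 2 ** 20

physical_limit = 4 * GiB
mmio_reserved = 512 * MiB + 256 * MiB  # assumed GPU aperture + other devices

usable_ram = physical_limit - mmio_reserved
print(usable_ram / GiB)   # 3.25
```

Which is where the familiar "installed 4 GB, Windows sees ~3.25 GB" numbers come from, with the exact figure depending on the devices present.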



  • @El_Heffe said:

    @delta534 said:

    The memory on the graphics card does count toward the 4 GB limit of 32-bit operating systems, I know that much.
    If you spent $900 on a video card then the 32 bit 4 GB limit probably isn't an issue, since you're probably using one of these and some of these.

    The last video card I bought cost $59.

     

    Wait, you bought a video card? Don't they just, like, glue them onto motherboards or something?


  • @BC_Programmer said:

    Only the I/O addresses used by the video card count against the consumer Windows 4GB limit. Video memory isn't "mapped" into physical memory addresses, but physical memory addresses are needed to communicate with the card.
    This. However, "enthusiast" motherboard manufacturers INSIST in their shitty Chinglish manuals that you set that BIOS setting to match the video memory size.



  • @Lorne Kates said:

    Wait, you bought a video card? Don't they just, like, glue them onto motherboards or something?
     

    They're not as powerful.



  • @Lorne Kates said:

    This debate won't stop until someone mentions having gone down on the Countess of Lovelace for CPU time on Babbage's mainframe

    Babbage? Come on, he was a noob. Leibniz's binary calculator is where it's at.



  • @El_Heffe said:

    @delta534 said:

    The memory on the graphics card does count toward the 4 GB limit of 32-bit operating systems, I know that much.
    If you spent $900 on a video card then the 32 bit 4 GB limit probably isn't an issue, since you're probably using one of these and some of these.

    The last video card I bought cost $59.

    There is an Intel 990X; you need to keep up. However, the new-generation i7 2600K is not only better but cheaper.



  • @El_Heffe said:

    @delta534 said:

    The memory on the graphics card does count toward the 4 GB limit of 32-bit operating systems, I know that much.
    If you spent $900 on a video card then the 32 bit 4 GB limit probably isn't an issue, since you're probably using one of these and some of these.

    The last video card I bought cost $59.

    I have the first link, and the Corsair equivalent of the second.  My mobo is [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16813131642&cm_re=rampage_iii_extreme-_-13-131-642-_-Product]this[/url], my primary hard drive [url=http://www.amazon.com/gp/product/B00486UR2I]is this[/url], and this is my [url=http://www.amazon.com/gp/product/B002WRI97A]graphics card[/url].  Tinkering with my hardware is a hobby.  Yes, it's more than necessary, but I find it fun.  It's like collecting baseball cards.  Plus, when I do go to play a game, I know my experience will be awesome.

    After my wedding, next winter, I will be upgrading the video card... 

    PS: I bought one of [url=http://www.amazon.com/OCZ-OCZMSNIA-NIA-Impulse-Actuator/dp/B00168VU4U/ref=wl_it_dp_o?ie=UTF8&coliid=I1DC7W50TENSN9&colid=1CY7KTDH3BZG5]these[/url] a few years ago.



  • @dhromed said:

    @Lorne Kates said:

    Wait, you bought a video card? Don't they just, like, glue them onto motherboards or something?
     

    They're not as powerful.

    My glued-on-motherboard one has never been a limit for me. Works fine.

     



  • Glued-on AMD/ATI GPUs are not half bad; it's the Intel GPUs that suck.



  • @toshir0 said:

    My glued-on-motherboard one has never been a limit for me. Works fine.
     

    Addendum: not as powerful as contemporary discrete video cards, which are also upgradeable.



  • @dhromed said:

    @toshir0 said:

    My glued-on-motherboard one has never been a limit for me. Works fine.
     

    Addendum: not as powerful as contemporary discrete video cards, which are also upgradeable.

    True

