Today's Monitors



  • When I was young, a monitor was one of the few computer peripherals that achieved the "plug-and-play" ideal. If you had a monitor with an EGA/VGA style plug, it would light right up and work with most anything designed for that plug. I remember reading a few things about how graphics cards supported a variety of resolutions and refresh rates, and how not all monitors supported all modes, but seldom was there any problem. Occasionally someone would try 1280x1024 on a cheap monitor that didn't support it, or someone might try a 72-hertz refresh rate on a crappy old monitor that only supported 60. But this was rare, and the most common modes (e.g. 800x600 and 1024x768 @ 60 hertz, plus all of the standard VGA modes like 640x480 and 320x200) just worked, over 99% of the time.

    Today, on the other hand, when that magical first meeting between a monitor and a computer occurs, I have zero confidence that I am actually going to get a picture. I haven't had time to figure out all the issues at play, but in summary I think I'm dealing with some combination of the following:

    1) DVI seems to have some sort of handshake process, whereas the old "VGA" system just accepted video data as a dumb, incoming stream for display. So, any hot-swap feature is lost.

    2) There are so many more resolutions output by video cards now (e.g. at different aspect ratios) that there is more room for disagreement. It's not just a matter of supporting 320x200, 640x480, 800x600, 1024x768, and 1280x1024. I have a monitor on my desk which supports at least 20 resolutions (three different modes with width 1920, for example) and even this thing balks at a lot of the resolutions (e.g. the 4:3 aspect ratio ones) that I actually try to send it.

    3) Monitor manufacturers are removing low-resolution modes like 640x480, which aren't necessary from a marketing standpoint. The problem with this is not that I actually want to use my computer at low-resolution. The problem is that I used these low-res modes for troubleshooting. Before, I could just boot in "VGA mode" and any video hardware would work. This saved me time in cases where I didn't know whether the hardware at hand was broken, or whether there was some more esoteric problem. Now, I don't have this option.

    4) The monitors are just less durable. They're lighter and smaller, for one thing. Beyond obvious physical damage, the LCDs seem much more likely to spontaneously go dark. Of course, when this happens, I typically end up spending a couple of hours looking for other problems (Do I need to reboot for DVI? Am I at an unsupported resolution?)

    I think that the condition underlying these changes is basically a tendency toward hyperactive marketing, which has driven a steady increase in aspect ratio. The higher the aspect ratio becomes, the bigger the diagonal is for a given area. In other words, it's cheaper to make (say) a widescreen 20" monitor than a square 20" monitor. The square monitor (if we set 20" as fixed) will be bigger in area. Amazingly, consumers don't realize this, or in fact believe the opposite to be true! (That is, they think a 20" widescreen is actually bigger than a 20" square - it's not.) Manufacturers thus have a powerful incentive to make ever-wider (and shorter) monitors, and this results in the constant introduction of new resolutions.

    I also think that DVI (and its ilk, i.e. any non-VGA monitor plug) is a big part of the problem here. In addition to its infuriating little handshake process, I think the cables and plugs are significantly less durable. The driving force behind DVI, I think, is that it will eventually enable monitors that refuse to display pirated content (or, at least, content it perceives as pirated).

    All in all, I think this is an example of businesses using the bias-toward-progress in consumer electronics to hoodwink the general populace.



  • Or it could be that widescreen resolutions are the default for HD video.  They're also better for working, in my opinion.  I'd rather have widescreen where I can fit two windows comfortably side-by-side than square which is too narrow for that and which has more vertical space which is less useful.



  • @bridget99 said:

    1) DVI seems to have some sort of handshake process, whereas the old "VGA" system just accepted video data as a dumb, incoming stream for display. So, any hot-swap feature is lost.

    Use HDMI, it's always a digital signal. DVI can be either digital or analogue.
    @bridget99 said:
    2) There are so many more resolutions output by video cards now (e.g. at different aspect ratios) that there is more room for disagreement. It's not just a matter of supporting 320x200, 640x480, 800x600, 1024x768, and 1280x1024. I have a monitor on my desk which supports at least 20 resolutions (three different modes with width 1920, for example) and even this thing balks at a lot of the resolutions (e.g. the 4:3 aspect ratio ones) that I actually try to send it.

    Protip: LCDs look like shit if you send them anything other than their native resolution (usually the maximum) anyways.
    @bridget99 said:
    3) Monitor manufacturers are removing low-resolution modes like 640x480, which aren't necessary from a marketing standpoint. The problem with this is not that I actually want to use my computer at low-resolution. The problem is that I used these low-res modes for troubleshooting. Before, I could just boot in "VGA mode" and any video hardware would work.

    That's bad; which manufacturer? I have a newish Dell and a newer Acer monitor, and neither has this issue.
    @bridget99 said:
    I also think that DVI (and its ilk, i.e. any non-VGA monitor plug) is a big part of the problem here. In addition to its infuriating little handshake process, I think the cables and plugs are significantly less durable. The driving force behind DVI, I think, is that it will eventually enable monitors that refuse to display pirated content (or, at least, content it perceives as pirated).

    As far as I know the content protection stuff is limited to HDMI.



  • @morbiuswilters said:

    Or it could be that widescreen resolutions are the default for HD video.  They're also better for working, in my opinion.  I'd rather have widescreen where I can fit two windows comfortably side-by-side than square which is too narrow for that and which has more vertical space which is less useful.

    I've got no problem with changing from 4:3 to some other aspect ratio. But that's not what's happened. Instead, we've seen aspect ratio become wide open, with a resultant, economically-driven race to higher and higher aspect ratios. My biggest monitor supports 1280x1024 plus 1280x600, 1280x720, 1280x768, and 1280x960. Surely all of these are not necessary just to play HD content. Some of them (e.g. 1280x600) represent ultra-high aspect ratios that don't facilitate anything but South Korean chest-pounding.

    And although I think more resolution really helps in programming, it doesn't matter to me where it comes into play. In fact, I can see where a portrait-style display would really help programming, maybe even more than a widescreen. Certainly, it seems like a portrait display would help in situations where I have to scroll up/down through a long source file. Widescreens help when someone has gone past column 80 (which I avoid anyway), and they give Microsoft a place to cram IDE-related crap. But speaking personally I'd sooner take a tall, narrow monitor for development work.



  • @Lingerance said:

    Use HDMI, it's always a digital signal. DVI can be either digital or analogue. 

    I'm really past the point of caring about that. I mean, when someone piles up a bunch of hardware and tells you to set up a distributed system (something like a Beowulf cluster), whether or not the monitors are showing a pure digital signal is really irrelevant. I just desperately want to get everything lit up as soon as possible, and the transition to LCDs and DVI has really hurt me here.

    @Lingerance said:

    Protip: LCDs look like shit if you send them anything other than their native resolution (usually the maximum) anyways. 

    Believe me, I know, and I probably should have mentioned that in my first post. I really think hardware should offer a meaningful resolution-versus-color-depth-versus-performance tradeoff curve. It's not reasonable to just say "operate at the highest resolution all the time." Rephrased, that's the same as saying "just use the lowest framerate and the highest amount of RAM at all times."

    And again, picture quality means little to me here. I really, really just want everything to light up with minimum fuss when I'm setting up one of my clusters.

    @Lingerance said:

    That's bad; which manufacturer? I have a newish Dell and a newer Acer monitor, and neither has this issue.

    I've got a new Dell and it actually does seem to support 640x480. However, it's worth noting that many newer LCDs support 640x480, but not at the old VGA refresh rates (http://www.astahost.com/info.php/vga-mode-supported-message-monitor_t13886.html). And nobody seems to support good old 320x200, so I guess playing Wolfenstein3D is out.

    @Lingerance said:

     As far as I know the content protection stuff is limited to HDMI.

    Yes, and I made sure to throw all newfangled monitor plugs under the metaphorical bus in my OP.



  • @bridget99 said:

    I mean, when someone piles up a bunch of hardware and tells you to set up a distributed system (something like a Beowulf cluster)
    ... you run them in a near-headless state; they all plug into a KVM, which goes into an IP-KVM.
    @bridget99 said:
    whether or not the monitors are showing a pure digital signal is really irrelevant. I just desperately want to get everything lit up as soon as possible, and the transition to LCDs and DVI has really hurt me here.
    I was actually implying that HDMI has given none of the issues you mentioned, which I believe were probably attributed to DVI being able to do both signal types.
    @bridget99 said:
    so I guess playing Wolfenstein3D is out.
    Virtual machines.



  • @bridget99 said:

    1) DVI seems to have some sort of handshake process, whereas the old "VGA" system just accepted video data as a dumb, incoming stream for display. So, any hot-swap feature is lost.

     

    Good ol' I2C. It was originally designed to transmit data between permanently soldered chips in Philips TVs. Now it's everywhere. Whatever you might think, it's NOT PnP or hot-swap. Your graphics card runs through a list of addresses on startup ($00 to $FF, all monitor plugs on the same bus) to find your display. New display not on the list? Wait five minutes for your card to guess right. Torture for us impatient techs. Most cards stop at the first display detected. This is why you have to manually scan for component TVs with Nvidia cards; the card couldn't care less about anything without EDID data. It makes old monitors (2<) + Linux = a nightmare.
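
    For anyone curious what actually comes over that bus: on a typical Linux box with KMS drivers, the kernel exposes each connector's raw EDID under /sys/class/drm, so you can at least see whether the handshake produced anything sane. A minimal sketch in Python (the connector names and paths are assumptions; they vary from machine to machine):

        #!/usr/bin/env python3
        """Dump and sanity-check the EDID blocks the kernel has already read
        over DDC/I2C. Assumes a Linux system with KMS drivers; connector names
        like 'card0-DVI-D-1' differ from machine to machine."""

        import glob

        EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

        def check_edid(blob: bytes) -> str:
            if len(blob) < 128:
                return "no EDID (connector unplugged, or the handshake failed)"
            base = blob[:128]
            if base[:8] != EDID_HEADER:
                return "garbled header: bad cable or flaky DDC?"
            if sum(base) % 256 != 0:   # every 128-byte EDID block sums to 0 mod 256
                return "checksum failed"
            return f"looks sane, {base[126]} extension block(s)"  # byte 126 = extension count

        for path in sorted(glob.glob("/sys/class/drm/card*-*/edid")):
            with open(path, "rb") as f:
                blob = f.read()
            connector = path.split("/")[-2]
            print(f"{connector}: {len(blob)} bytes, {check_edid(blob)}")

    If the header or checksum comes back bad on one cable but fine on another, that points at the DDC wiring rather than at the card's mode list.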



  • @bridget99 said:

    @morbiuswilters said:

    Or it could be that widescreen resolutions are the default for HD video.  They're also better for working, in my opinion.  I'd rather have widescreen where I can fit two windows comfortably side-by-side than square which is too narrow for that and which has more vertical space which is less useful.

    I've got no problem with changing from 4:3 to some other aspect ratio. But that's not what's happened. Instead, we've seen aspect ratio become wide open, with a resultant, economically-driven race to higher and higher aspect ratios. My biggest monitor supports 1280x1024 plus 1280x600, 1280x720, 1280x768, and 1280x960. Surely all of these are not necessary just to play HD content. Some of them (e.g. 1280x600) represent ultra-high aspect ratios that don't facilitate anything but South Korean chest-pounding.

    And although I think more resolution really helps in programming, it doesn't matter to me where it comes into play. In fact, I can see where a portrait-style display would really help programming, maybe even more than a widescreen. Certainly, it seems like a portrait display would help in situations where I have to scroll up/down through a long source file. Widescreens help when someone has gone past column 80 (which I avoid anyway), and they give Microsoft a place to cram IDE-related crap. But speaking personally I'd sooner take a tall, narrow monitor for development work.

     

    Widescreen formats allow vendors to make displays with a smaller area:diagonal ratio. So while they become cheaper to make (because less area means fewer pixels, at a given pixel size), they seem to become larger. For example, a 1366x768 display has 20% fewer pixels than a 1280x1024 display. Eventually we will see the 60" überwidescreen display that offers an 8000x1 pixel resolution.
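
    The arithmetic behind that is easy to verify. A quick back-of-the-envelope sketch in Python, holding the diagonal fixed at 20" and comparing the pixel counts of the two modes mentioned above:

        import math

        def dimensions(diagonal_in, aspect_w, aspect_h):
            """Width, height and area (sq in) for a given diagonal and aspect ratio."""
            unit = diagonal_in / math.hypot(aspect_w, aspect_h)
            w, h = aspect_w * unit, aspect_h * unit
            return w, h, w * h

        for name, (aw, ah) in {"4:3": (4, 3), "16:10": (16, 10), "16:9": (16, 9)}.items():
            w, h, area = dimensions(20.0, aw, ah)       # same 20" diagonal for all
            print(f'20" {name}: {w:.1f} x {h:.1f} in, {area:.0f} sq in')

        wide, square = 1366 * 768, 1280 * 1024          # pixel counts from the post above
        print(f"1366x768 = {wide} px vs 1280x1024 = {square} px "
              f"({100 * (1 - wide / square):.0f}% fewer)")

    Same diagonal, roughly 11% less glass and 20% fewer pixels for the 16:9 panel, which is exactly the incentive described above.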



  • @whatthefrak said:

    @bridget99 said:

    1) DVI seems to have some sort of handshake process, whereas the old "VGA" system just accepted video data as a dumb, incoming stream for display. So, any hot-swap feature is lost.

     

    Good ol' I2C. It was originally designed to transmit data between permanently soldered chips in Philips TVs. Now it's everywhere. Whatever you might think, it's NOT PnP or hot-swap. Your graphics card runs through a list of addresses on startup ($00 to $FF, all monitor plugs on the same bus) to find your display. New display not on the list? Wait five minutes for your card to guess right. Torture for us impatient techs. Most cards stop at the first display detected. This is why you have to manually scan for component TVs with Nvidia cards; the card couldn't care less about anything without EDID data. It makes old monitors (2<) + Linux = a nightmare.

    I didn't realize I2C was at play here, but I'm not surprised. It is, in my experience, an unbearably insipid little standard. I use it to augment Windows-based (purported) solutions with big, custom-designed arrays of idiot lights / panic buttons. These are designed to allow the user to revert safely to an infantile, pre-computer-revolution state. (I'm headed there myself.)

     

    @ShamelessFurry said:

    Widescreen formats allow vendors to make displays with a smaller area:diagonal ratio. So while they become cheaper to make (because less area means fewer pixels, at a given pixel size), they seem to become larger. For example, a 1366x768 display has 20% fewer pixels than a 1280x1024 display. Eventually we will see the 60" überwidescreen display that offers an 8000x1 pixel resolution.

    One of my main points. What's sad is that people don't realize this and are thus trading their money for Pacific Rim-sourced garbage. I'm glad you realize this trend, but I must say that I've talked to many people who ought to know better but don't.



  • @bridget99 said:

    I didn't realize I2C was at play here, but I'm not surprised. It is, in my experience, an unbearably insipid little standard. I use it to augment Windows-based (purported) solutions with big, custom-designed arrays of idiot lights / panic buttons. These are designed to allow the user to revert safely to an infantile, pre-computer-revolution state. (I'm headed there myself.)

    How'd you think EDID worked? It's insipid for a different reason: THEY DON'T TELL YOU IT'S THERE! It's not in /dev, the PnP list, or anywhere else. I found out about it through pinout diagrams and a logic analyzer. It's amazing how much data is shuttled through this pipe. The monitor can force the card to not force a resolution. I fixed my HDMI TV problems by cutting these wires and tying them to + through pull-up resistors. Bam! Luck + a custom modeline = native resolution (not recommended, YMMV).

    This whole "PnP" debacle is definitely a ruse to keep the average user's head in the sand. It may even be worse than those fabled 100mil:1 contrast ratios. The whole problem could be fixed through a liberal application of de-hypeify, RS-485, and SMARTER firmware: none of this "please wait 5 minutes" bullshit, and no useless resolutions while essential ones like 320x200 and 640x480 go missing. I have an LCD TV at 1366x768. IT IS A PAIN. It won't let you run at the native res without modification.

    BTW, they already sell those 8000000x1 LCDs: the Times Square building-edge displays hook up through LVDS (the OEMs aren't stupid enough to use erratic DVI permanently) and the controllers run Wind0ws.
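
    For the "custom modeline" part, the numbers an X modeline needs are all sitting in the first detailed timing descriptor of that same EDID block. A rough sketch that turns a saved dump into a modeline ("edid.bin" is a hypothetical filename, e.g. a copy of /sys/class/drm/<connector>/edid):

        import struct

        def dtd_to_modeline(edid: bytes) -> str:
            """Build an xrandr-style modeline from the first 18-byte detailed
            timing descriptor (bytes 54-71) of a 128-byte EDID base block."""
            d = edid[54:72]
            pixclk_khz = struct.unpack("<H", d[0:2])[0] * 10   # stored in 10 kHz units
            if pixclk_khz == 0:
                raise ValueError("first descriptor is not a timing descriptor")

            h_active = d[2] | ((d[4] & 0xF0) << 4)
            h_blank  = d[3] | ((d[4] & 0x0F) << 8)
            v_active = d[5] | ((d[7] & 0xF0) << 4)
            v_blank  = d[6] | ((d[7] & 0x0F) << 8)
            h_sync_off   = d[8] | ((d[11] & 0xC0) << 2)
            h_sync_width = d[9] | ((d[11] & 0x30) << 4)
            v_sync_off   = (d[10] >> 4)   | ((d[11] & 0x0C) << 2)
            v_sync_width = (d[10] & 0x0F) | ((d[11] & 0x03) << 4)

            hss = h_active + h_sync_off
            hse = hss + h_sync_width
            vss = v_active + v_sync_off
            vse = vss + v_sync_width
            return (f'Modeline "{h_active}x{v_active}" {pixclk_khz / 1000:.2f} '
                    f"{h_active} {hss} {hse} {h_active + h_blank} "
                    f"{v_active} {vss} {vse} {v_active + v_blank}")

        with open("edid.bin", "rb") as f:    # hypothetical dump of the monitor's EDID
            print(dtd_to_modeline(f.read()))

    Feeding the result to xrandr --newmode / --addmode is the usual way to try a mode the card and monitor refuse to negotiate on their own; no wire cutting or pull-up resistors required, though as above, YMMV.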

    Sigh.



  • @whatthefrak said:

    @bridget99 said:

    I didn't realize I2C was at play here, but I'm not surprised. It is, in my experience, an unbearably insipid little standard. I use it to augment Windows-based (purported) solutions with big, custom-designed arrays of idiot lights / panic buttons. These are designed to allow the user to revert safely to an infantile, pre-computer-revolution state. (I'm headed there myself.)

    How'd you think EDID worked? It's insipid for a different reason: THEY DON'T TELL YOU IT'S THERE! It's not in /dev, the PnP list, or anywhere else. I found out about it through pinout diagrams and a logic analyzer. It's amazing how much data is shuttled through this pipe. The monitor can force the card to not force a resolution. I fixed my HDMI TV problems by cutting these wires and tying them to + through pull-up resistors. Bam! Luck + a custom modeline = native resolution (not recommended, YMMV).

    This whole "PnP" debacle is definitely a ruse to keep the average user's head in the sand. It may even be worse than those fabled 100mil:1 contrast ratios. The whole problem could be fixed through a liberal application of de-hypeify, RS-485, and SMARTER firmware: none of this "please wait 5 minutes" bullshit, and no useless resolutions while essential ones like 320x200 and 640x480 go missing. I have an LCD TV at 1366x768. IT IS A PAIN. It won't let you run at the native res without modification.

    BTW, they already sell those 8000000x1 LCDs: the Times Square building-edge displays hook up through LVDS (the OEMs aren't stupid enough to use erratic DVI permanently) and the controllers run Wind0ws.

    There is probably a specification that one can buy that discusses all these things. I've worked with several standards like that, where even getting the spec requires a bit of an investment.

    I've been toiling away at this stupidity all day, and what I've found is that it's often helpful to obtain and install "monitor drivers" on one's computer. This, at least, solves most of the problems related to standard-ratio resolutions. It doesn't remove the burdensome need to reboot, and it certainly does nothing to address the overall futility of the whole pathetic exercise, but it definitely helps.

    I wonder if there's a possible product idea here: some kind of inline DVI plug with a microcontroller, designed to just make all of this crap work without rebooting or installing drivers. I could probably design the circuitry and firmware if you could articulate the algorithm. Is there a digital signal that I can simply force high, so that everything will go "native"? What happens if the video card doesn't support the monitor's native resolution? (Hilarity ensues?)



  • @whatthefrak said:

    The monitor can force the card to not force a resolution.

    I wonder if that's what's wrong with my monitor (HP LP2065 + ATI card + DVI). After coming back from standby after a long while, or after being turned on, the OSD shows a resolution of 1200*1602 (!), or something like 1800*something, and I get either infinite distortion, or an image that is slightly scaled, or an image that randomly flips back and forth several times per second on and off the screen's actual pixel space. Also lots of chaotic strings of noise. Sometimes the noise is limited to specific areas within a single menu, or a div inside Firefox (wtf), and sometimes it's triggered by alt-tabbing to another application (wtf). The OSD is unaffected and sits unperturbed like a malicious deity hovering over a broken landscape.

    Pressing the input swap button once fixes it.

    It's truly odd.


  • :belt_onion:

    @bridget99 said:

    And although I think more resolution really helps in programming, it doesn't matter to me where it comes into play. In fact, I can see where a portrait-style display would really help programming, maybe even more than a widescreen. Certainly, it seems like a portrait display would help in situations where I have to scroll up/down through a long source file. Widescreens help when someone has gone past column 80 (which I avoid anyway), and they give Microsoft a place to cram IDE-related crap. But speaking personally I'd sooner take a tall, narrow monitor for development work.
    Spoken like a true troll, not hindered at all by actual facts. Visual Studio 2010 allows you to move the IDE-related "crap" to a second monitor. I have been using it for months and it just works.



  • @dhromed said:

    @whatthefrak said:

    The monitor can force the card to not force a resolution.

    I wonder if that's what's wrong with my monitor (HP LP2065 + Ati card + DVI). Pressing the input swap button once fixes it.

    Yes. Input swap basically forces the EDID to be exchanged again over the (believed shared) I2C bus.

    It's interesting, though, that when I have two PCs hooked up to my TV through HDMI, they seem to sync up on resolution. This has led me to believe that my TV has physical, non-isolated (EE WTF!) connections to both PCs on the I2C bus. Weird. When I change resolution on PC 1, PC 2 changes its output signal to match. The connections are straight through (multimeter confirmed). Stupid bastards at Sylvania! What if the PCs were on different circuits? Fire, that's what. Furthermore, this opens up some interesting questions. Is this the MPAA's way of multi-monitor HDCP broadcasting? If I modprobe I2C on this bus, do I get secret (if slow) inter-PC comms? HUGE IMPLICATIONS. I could connect a DVI cable between PCs and transmit secret spy data away from the all-seeing eye of TCP/IP. (Networked pr0n, anyone? :))




  • @whatthefrak said:

    Yes.
     

    Ok, thanks for the intel, but the rest of it is sounding a little like SpectateSwamp.



  • @Lingerance said:

    @bridget99 said:
    1) DVI seems to have some sort of handshake process, whereas the old "VGA" system just accepted video data as a dumb, incoming stream for display. So, any hot-swap feature is lost.
    Use HDMI, it's always a digital signal. DVI can be either digital or analogue. @bridget99 said:
    I also think that DVI (and its ilk, i.e. any non-VGA monitor plug) is a big part of the problem here. In addition to its infuriating little handshake process, I think the cables and plugs are significantly less durable. The driving force behind DVI, I think, is that it will eventually enable monitors that refuse to display pirated content (or, at least, content it perceives as pirated).
    As far as I know the content protection stuff is limited to HDMI.
     

     Just so you know, DVI-D (the digital flavour of DVI) is exactly the same as HDMI (minus the audio).  So DVI does indeed support the content protection stuff (HDCP).  For example, monitors which support DVI + HDCP will work with Blu-Ray players, PS3s, etc. which require HDCP at high resolutions, with the use of an HDMI-to-DVI cable.

    Also, DVI-A (the analog flavour of DVI) is really just VGA. DVI-I connectors support both DVI-D and DVI-A. Some computer DVI connectors have dropped support for DVI-A, though.

    @Lingerance said:

    I was actually implying that HDMI has given none of the issues you mentioned, which I believe were probably attributed to DVI being able to do both signal types. 

    I don't think his "handshaking" issues would be solved by switching to an HDMI cable, since the video part of HDMI is just DVI-D. Although, to be fair, in the past some people have had problems with certain combinations of monitors (Viewsonic) and video cards (nVidia) displaying DVI-A instead of DVI-D, due to EDID issues.



  •  Also, I thought it might be interesting to note that EDID was (and is) supported for VGA as well:

    http://www.extron.com/company/article.aspx?id=uedid&version=print

    Originally developed for use between analog computer-video devices with VGA ports, EDID is also now implemented for DVI, HDMI, and DisplayPort.

    ...

    Prior to the development of EDID, pins 4, 11, 12, and 15 on the VGA connector were sometimes used to define monitor capabilities. These ID bit pins carried either high or low values to define different screen resolutions. VESA extended this scheme by redefining VGA connector pins 9, 12, and 15 as a serial bus in the form of the DDC - Display Data Channel. This allowed for much more information to be exchanged, so that EDID and other forms of communication were possible between the source and the display.

    So maybe the handshaking issues the OP is experiencing are due to using a modern display (some of which still support VGA, and many of which support DVI-A), and not simply because he's using DVI instead of VGA.
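
    Since DDC really is just I2C on those connector pins, the same 128-byte block can be read straight off the wire from the EDID EEPROM, which sits at the standard address 0x50. A rough sketch using the smbus2 package (assumptions: Linux with the i2c-dev module loaded, root privileges, and BUS_NUMBER pointing at whichever I2C bus your card wires to that connector; "i2cdetect -l" helps find it):

        from smbus2 import SMBus

        EDID_ADDR = 0x50       # standard DDC2B address of the EDID EEPROM
        BUS_NUMBER = 2         # assumption; varies by card and connector

        edid = bytearray()
        with SMBus(BUS_NUMBER) as bus:
            for offset in range(0, 128, 32):           # SMBus caps block reads at 32 bytes
                edid += bytes(bus.read_i2c_block_data(EDID_ADDR, offset, 32))

        print("header ok:  ", edid[:8] == bytes([0x00, 0xFF, 0xFF, 0xFF,
                                                 0xFF, 0xFF, 0xFF, 0x00]))
        print("checksum ok:", sum(edid) % 256 == 0)

    That block is more or less all the "handshake" amounts to: the source reads it and picks a mode from it, whether the plug on the end is VGA, DVI, or HDMI.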



  • @dhromed said:

    @whatthefrak said:

    Yes.
     

    Ok, thanks for the intel, but the rest of it is sounding a little like SpectateSwamp.

     

    Indeed.

    @CodeSimian said:

    I don't think his "handshaking" issues would be solved by switching to an HDMI cable, since the video part of HDMI is just DVI-D. Although, to be fair, in the past some people have had problems with certain combinations of monitors (Viewsonic) and video cards (nVidia) displaying DVI-A instead of DVI-D, due to EDID issues.

    Indeed. Some cards (nVidia mostly, plus some of the ancient ATIs with composite out) request analog on the EDID bus first. The norm is for the monitor to hold out for DVI-D, but some monitors will use the first available connection, convenience be damned. The (hackish) solution is to use a DVI-to-HDMI dongle connected to an HDMI-to-DVI cable. This will force your respective display device to behave.

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)



    Personally, I love how today's graphics cards support higher resolutions and have HDMI output. It saves me the money I would have spent on a separate monitor, since I just connect the card to my TV's HDMI input. It also saves me having to buy a separate DVD player and TiVo for the TV. I've seen a lot of 19" LCD TVs coming out lately, and it really just looks like they took a regular LCD monitor and added ATSC/NTSC decoding.



  • @whatthefrak said:

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)
     

    Indeed!



  • @dhromed said:

    @whatthefrak said:

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)
     

    Indeed!

     

    Indeed!



  • @bob171123 said:

    @dhromed said:

    @whatthefrak said:

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)
     

    Indeed!

     

    Indeed!

     

    Indeed!



  • @dhromed said:

    @bob171123 said:

    @dhromed said:

    @whatthefrak said:

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)
     

    Indeed!

     

    Indeed!

     

    Indeed!

     

    Indeed!



  • @whatthefrak said:

    @dhromed said:

    @bob171123 said:

    @dhromed said:

    @whatthefrak said:

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)
     

    Indeed!

     

    Indeed!

     

    Indeed!

     

    Indeed!

    Combo breaker.



  • @Spectre said:

    @whatthefrak said:

    @dhromed said:

    @bob171123 said:

    @dhromed said:

    @whatthefrak said:

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)
     

    Indeed!

     

    Indeed!

     

    Indeed!

     

    Indeed!

    Combo breaker.

     

    Indeed!



  • @dhromed said:

    @Spectre said:

    @whatthefrak said:

    @dhromed said:

    @bob171123 said:

    @dhromed said:

    @whatthefrak said:

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)
     

    Indeed!

     

    Indeed!

     

    Indeed!

     

    Indeed!

    Combo breaker.

     

    Indeed!

     

    Indeed!



  • @bob171123 said:

    @dhromed said:

    @Spectre said:

    @whatthefrak said:

    @dhromed said:

    @bob171123 said:

    @dhromed said:

    @whatthefrak said:

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)
     

    Indeed!

     

    Indeed!

     

    Indeed!

     

    Indeed!

    Combo breaker.

     

    Indeed!

     

    Indeed!

     

    Indeed!


  • :belt_onion:

    @whatthefrak said:

    @bob171123 said:

    @dhromed said:

    @Spectre said:

    @whatthefrak said:

    @dhromed said:

    @bob171123 said:

    @dhromed said:

    @whatthefrak said:

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)
     

    Indeed!

     

    Indeed!

     

    Indeed!

     

    Indeed!

    Combo breaker.

     

    Indeed!

     

    Indeed!

     

    Indeed!

    Flurp!



    As for the topic of monitor vs. card scaling, it's a fairly big battle. I have a beautiful MultiSync 208UX that uses the most obscene rates (I had to tweak them using the card's now non-existent settings panel). When I don't run it at its native resolution (1600&mult;1200), my graphics card is set up to scale the image with nice nearest-neighbor scaling if it's somewhere near that resolution (so anything >1400&mult;1050); when it's anything else, it just centers the image and places a custom picture in the background. It's fun to play Doom with this monitor, as it nicely places a big cheat sheet in the background. It's cheating, I know, but I'm not going to do something that I don't have to :D



  • @Indrora said:

    &mult;
     

    The entity you're looking for is &times; ;)


     

     

     



  • @whatthefrak said:

    @bob171123 said:

    @dhromed said:

    @Spectre said:

    @whatthefrak said:

    @dhromed said:

    @bob171123 said:

    @dhromed said:

    @whatthefrak said:

    Wow, I used the word indeed twice, I must sound a bit like Teal'c. :)
     

    Indeed!

     

    Indeed!

     

    Indeed!

     

    Indeed!

    Combo breaker.

     

    Indeed!

     

    Indeed!

     

    Indeed!

     

    Indeed!

