My new 4k monitor-- God I wish I had taken notes while setting up this thing



  • Just a short list of stuff I encountered:

    • Windows 8.1 is perfectly capable of running two monitors at different DPIs, and in fact does that automatically to some extent... except:
      • The taskbar always runs at the DPI of the primary monitor (meaning I get to choose between a gargantuan taskbar on the secondary screen or a minuscule one there, depending on which monitor I make primary)
      • Windows doesn't correct for DPI when you're moving the mouse between screens. This means that if I move the mouse off my 4k screen anywhere below the 50% mark, it will not appear on the 1080p screen at all. If I move it off at the 50% mark, it appears on the 1080p screen at the 100% mark (there's a sketch of the math right after this list). This behavior is amazingly broken and so obviously wrong I was shocked to discover it. (Sure, that matches the control panel's diagram of the monitors, but the actual real-life monitors are ALMOST IDENTICAL IN PHYSICAL SIZE! It's only the DPI that's different!)
    • This monitor, a Dell, can't run 4k at 60 hertz unless it's set to DisplayPort 1.2 (the default is apparently 1.1?). But when I set it to DisplayPort 1.2, the GPU stopped supplying an image to both monitors. I found:
      • My GPU wouldn't run the 4K monitor using DP1.2 unless it was the only monitor. Initially, anyway. Once I unplugged the 1080p monitor and it ran the 4k monitor once, it worked from then on, even with both monitors plugged in. (Guess how many reboots it took to figure this out?)
      • When in DP1.2 mode, NVidia Control Panel doesn't work. Like... at all. It doesn't show the taskbar icon, and if you manually attempt to launch it, it doesn't show up on screen at all. If you turn off DP1.2, it's suddenly fine again. I've actually downgraded to DP1.1, even though that mode is limited to 30 Hz, just because DP1.2 was so buggy and unreliable.
    • To solve the "mouse won't move to other screen correctly and also taskbar looks like ass" problem, I experimented with NVidia driver features. There's a feature called "Surround" which, in theory, lies to the OS and tells it your two monitors are actually only one monitor.
      • THIS FEATURE IS HORRIBLY BROKEN BEYOND BELIEF IN LIKE 58 DIFFERENT WAYS DO NOT USE IT
      • If you ever do actually use it, remember the NVidia control panel will lie to you and tell you you've turned the feature off when you actually haven't. The checkbox that reads "Span displays with Surround" DOES NOTHING AT ALL.
      • The only way to disable the feature, should you have accidentally enabled it, is to "turn on" the Surround feature with the checkbox, go into its Configure page, and uncheck one of the checkboxes next to your list of monitors.
      • BTW, Nvidia Control Panel, you know the program that's super-helpful when you have high resolution monitors? Doesn't support high resolution monitors. AT ALL. Not even in stupid "high DPI compatibility mode". It's just plain broken.
    • Skype's UI runs in "high DPI compatibility mode" (i.e. it just dumbly scales shit and looks like ass), even though it was rewritten from scratch only a couple of months ago. WTF!? I expect this shit from shitty GUI apps like Steam; I didn't expect it from Skype at all.
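
    About that mouse math: here's a minimal sketch of it, assuming my actual panels (3840×2160 next to 1920×1080, nearly identical physical size). Windows maps cursor rows 1:1 in virtual-screen pixels, when what you'd obviously want on same-sized panels is a proportional mapping:

    ```cpp
    #include <cassert>

    // The proportional mapping you'd want between two panels of the same
    // physical size: halfway down one edge lands halfway down the other.
    int RemapRowProportional(int srcRow, int srcHeight, int dstHeight) {
        return srcRow * dstHeight / srcHeight;
    }

    int main() {
        // What Windows 8.1 actually does is map rows 1:1 in virtual-screen
        // pixels: row 1080 on the 4k panel (the 50% mark) lands on row 1080
        // of the 1080p panel (the very bottom), and rows 1081..2159 have
        // nowhere to go at all, so the cursor refuses to cross.
        assert(RemapRowProportional(1080, 2160, 1080) == 540);  // halfway stays halfway
        assert(RemapRowProportional(2159, 2160, 1080) == 1079); // bottom stays bottom
    }
    ```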

    So how did I finally solve my DPI problem? Well, I didn't, really.

    There's a feature of my video card where it can lie to the OS about the size of a monitor (bigger or smaller) and scale the image itself before delivering it to the monitor. So I told the card to tell Windows that I have two 4k monitors plugged in. It works, kinda-- I get the nice clear image on the actual 4k and an acceptable image on the 1080p. But it's clear I'm stressing the poor GPU to the limit-- it's lagging about 2-3 frames behind at all times now.
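
    Back-of-envelope for why it struggles (just pixel counts; I don't know NVidia's exact compositing path, so treat this as a sketch):

    ```cpp
    #include <cstdio>

    int main() {
        // Pixels the GPU has to produce per frame, before vs. after the trick.
        long long honest  = 3840LL * 2160 + 1920LL * 1080;  // real panels
        long long spoofed = 2 * 3840LL * 2160;              // two fake "4k" panels
        std::printf("honest: %lld px, spoofed: %lld px (%.0f%% more)\n",
                    honest, spoofed, 100.0 * (spoofed - honest) / honest);
        // ~10.4M vs ~16.6M pixels: ~60% more work every frame, plus a
        // downscale pass to squeeze one 4k stream onto the physical 1080p panel.
    }
    ```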

    What's the real solution without the lag? Uh.

    1. Throw out my perfectly good 1080p monitor and just run a single monitor from now on.

    2. Set the 4k monitor to run at 1080p and hope it has a good up-scaling algorithm. (At least 4k is exactly 2×2 pixels per 1080p pixel, so even dumb pixel-doubling would stay sharp.)

    That's it. Due to the incompetence of Microsoft and NVidia, it turns out that practically speaking, this 4k monitor is almost worthless to me. I hate computers.

    Based on my experiences with my laptop (the 13.3" 1080p screen), I thought Windows had mostly figured out the DPI issues and solved them at this point. Uh. Nope. Maybe Windows 10...



  • Microsoft only added the high-DPI "feature" to support the Surface Pro.

    So they never bothered testing anything higher than 2160 x 1440, because higher resolutions clearly do not exist.

    nVidia's "lie to the OS" feature was intended for XP and older, because those barely even pretended to support multi-monitor. It's probably not been touched since Vista!

    (Windows is also terrible at 1800 x 24. Don't ask why I know that)



  • @blakeyrat said:

    What's the real solution without the lag?

    Get another graphics card and buy another 4k monitor to pair with it.
    And demote the perfectly good 1080p monitor to some secondary computer (likely one that has a monitor of the same resolution).


  • Winner of the 2016 Presidential Election

    @lightsoff said:

    Microsoft only added the high-DPI "feature" to support the Surface Pro.

    Try to connect a non-HiDPI external monitor to your Surface Pro when you're using DPI scaling on the Surface itself. Hint: the output will be blurry as hell, because Windows first scales the content on the external monitor as well and then scales it back to the original size again. So it doesn't even work on the Surface; the "feature" is just horribly fucked up.


  • 🚽 Regular

    @blakeyrat said:

    I expect this shit from shitty GUI apps

    @blakeyrat said:
    I didn't expect it from Skype
    Conflict detected.

    @blakeyrat said:

    I hate computers.
    Don't we all.


  • 🚽 Regular

    @lightsoff said:

    Windows is also terrible at 1800 x 24. Don't ask why I know that

    That wasn't a typo, was it?  >_<


  • BINNED

    @blakeyrat said:

    - To solve the "mouse won't move to other screen correctly and also taskbar looks like ass" problem, I experimented with NVidia driver features. There's a feature called "Surround" which, in theory, lies to the OS and tells it your two monitors are actually only one monitor.

    • THIS FEATURE IS HORRIBLY BROKEN BEYOND BELIEF IN LIKE 58 DIFFERENT WAYS DO NOT USE IT

    Ah, the X server method!

    @blakeyrat said:

    That's it. Due to the incompetence of Microsoft and NVidia, it turns out that practically speaking, this 4k monitor is almost worthless to me. I hate computers.

    You pretty much get the same shit on Linux. Not sure about Mac. But yeah, running different resolution monitors is still a pain, no matter the OS and/or GPU.

    I also noticed that what applications using high DPI compatibility mode actually do (scale, don't scale, completely blow up...) seems to vary depending on which monitor is set as primary and which monitor the app was started on. Sadly, I don't know the exact parameters; I only played with this once, on Win 8 on someone's laptop, but I remember that getting it to scale consistently was a near-impossible task. Then again, it was written in Visual FoxPro, so you can guess how old it is.

    If you go back to the non-scaling-trick setup at some point, give this a go with Skype; maybe you can figure it out and get it to behave. And hope that it remembers the positioning between launches, of course.



  • So, disregarding problems with dual monitors, do you find 4K worth it, in terms of picture quality?



  • @Nprz said:

    Get another graphics card and buy another 4k monitor to pair with it.

    What's the real solution without the lag that ALSO doesn't cost ANOTHER FUCKING GRAND?



  • @Onyx said:

    You pretty much get the same shit on Linux.

    Well duh. Linux doesn't work even given common hardware. I wouldn't even slightly expect it to work with relatively unique hardware, like a 24" 4k monitor.

    @Onyx said:

    But yeah, running different resolution monitors is still a pain, no matter the OS and/or GPU.

    Different resolutions are perfectly fine on Windows; I've done that for decades. Different DPIs appear to be the problem here.

    If Windows would simply ask me what the physical size of each monitor was, it could easily make this work. I'm actually thinking of writing a "better resolution" app, but I'm not sure I can hook into the OS deeply enough to fix the mouse-wrapping issue.
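
    (For reference, the first thing I'd try is a low-level mouse hook. This is only a sketch, assuming WH_MOUSE_LL fires early enough to re-place the cursor-- the Win32 calls are real, but FixupIfCrossing() is the hypothetical part I'd still have to write:)

    ```cpp
    #include <windows.h>

    static HHOOK g_hook = nullptr;

    static void FixupIfCrossing(POINT pt) {
        // Hypothetical: detect a crossing between the 4k and 1080p monitors
        // and call SetCursorPos() with a proportionally remapped y.
        (void)pt;
    }

    static LRESULT CALLBACK MouseProc(int code, WPARAM wParam, LPARAM lParam) {
        if (code == HC_ACTION && wParam == WM_MOUSEMOVE) {
            const auto* info = reinterpret_cast<const MSLLHOOKSTRUCT*>(lParam);
            FixupIfCrossing(info->pt);
        }
        return CallNextHookEx(g_hook, code, wParam, lParam);
    }

    int main() {
        g_hook = SetWindowsHookExW(WH_MOUSE_LL, MouseProc,
                                   GetModuleHandleW(nullptr), 0);
        MSG msg;  // WH_MOUSE_LL requires a message loop on this thread
        while (GetMessageW(&msg, nullptr, 0, 0)) {
            TranslateMessage(&msg);
            DispatchMessageW(&msg);
        }
        UnhookWindowsHookEx(g_hook);
        return 0;
    }
    ```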

    @Onyx said:

    If you go back to the non-scaling-trick setup at some point, give this a go with Skype; maybe you can figure it out and get it to behave. And hope that it remembers the positioning between launches, of course.

    Of all the problems I've had with Skype, I've never seen it forget window positions.

    @cartman82 said:

    So, disregarding problems with dual monitors, do you find 4K worth it, in terms of picture quality?

    Yes and no.

    None of the slider stops in the control panel for "adjust the size of images" (or whatever that control panel is called) are very good for me-- the default makes everything too large, but lowering it one notch makes everything too small to read comfortably (not a big deal with SANE apps, but unfortunately, I have to run Steam.)

    The monitor itself is perfectly fine; it's gorgeous when the OS is supplying it the correct raw material. My GPU can run Skyrim on it with all the settings turned on at 30+ FPS. (Loading times take a lot longer, oddly, but once in the game it's fine.) Just reading text in applications that properly support high DPI, like Chrome, is amazing.

    So I'm happy with the product but I'm unhappy with how crappy the OS and my video card driver are at making use of the product.


  • :belt_onion:

    @Onyx said:

    high DPI compatibility mode

    I hate that with a burning passion. Windows just doesn't seem to do a good job of scaling things at all, which is weird, because that's usually in the realm of things Microsoft is good at. I have to turn that off on any computer I get, because if I don't, things get rendered at the monitor's DPI and then scaled up because RAISINS. And of course, since they're being scaled up, they look absolutely horrible.


  • ♿ (Parody)

    tl;dr; should have bought the two of them


  • Java Dev

    I've considered going 4k (but both screens at once; I've got experience with asymmetric setups). But I'm afraid of trouble similar to the OP's.



    Yeah but then I would have had to buy another video card to power the second one and... well, like I said, that's another grand. I like my early-adopter stuffs, but I'm not made of money.

    (Feel free to send me one if you want. It's an ASUS GTX 970; I think you need a similar or identical second card to use SLI.)



    The two really unforgivable bugs here are:

    1. The taskbar being scaled to the primary monitor, instead of whatever monitor it happens to be displayed on.

    2. The DPI not being compensated for when the mouse moves between monitors.

    Number 2 there requires the OS to know the physical size of the monitor, but it could just ASK me.
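
    (And honestly, it shouldn't even have to ask: the monitor already reports its physical dimensions over EDID, and Win32 will hand them to you in millimeters. A minimal sketch, primary monitor only-- and only as truthful as the EDID data itself:)

    ```cpp
    #include <windows.h>
    #include <cstdio>

    int main() {
        HDC screen = GetDC(nullptr);  // device context of the primary monitor
        int widthMm  = GetDeviceCaps(screen, HORZSIZE);  // physical width, mm
        int heightMm = GetDeviceCaps(screen, VERTSIZE);  // physical height, mm
        int widthPx  = GetDeviceCaps(screen, HORZRES);   // width in pixels
        std::printf("%d x %d mm, %d px wide -> %.0f DPI\n",
                    widthMm, heightMm, widthPx, widthPx / (widthMm / 25.4));
        ReleaseDC(nullptr, screen);
        return 0;
    }
    ```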



  • When something changes in technology, like 4k screens becoming available (at affordable prices), you have to wait between 2 and 5 years before it works properly.


  • Java Dev

    It could just ask the monitor.


  • ♿ (Parody)

    @blakeyrat said:

    Yeah but

    No, the real tl;dr; is that 4K just isn't ready for prime time yet.


  • I survived the hour long Uno hand

    I feel like something has to succeed in TV technology for a few years and become mainstream before it's ready for use in monitors.


  • FoxDev

    @boomzilla said:

    tl;dr; should have bought the two of them

    would have needed a new graphics card too. none of nvidia's current offerings can support more than one 4k monitor, and AFAIK neither can ATI's offerings


  • ♿ (Parody)

    I don't think that's true. It's a lot easier to pump that much image from your computer to a screen than via all the ways we get TV. I'm sure it will mature over time, but right now stuff is expensive on both sides (graphics cards and actual display) and it just makes all of the ancient DPI issues stick out even worse.


  • ♿ (Parody)

    @accalia said:

    would have needed a new graphics card too.

    Man, now I'm really starting to regret leaving the 🛂 off of that post.


  • I survived the hour long Uno hand

    It's not about easier so much as the tech becoming widespread enough that companies adjust their graphics drivers to handle it properly. When everyone has 4k TVs, they'll stop having excuses to avoid 4k monitors.



  • Really? LED backlight, for example, caught on quickly without any major issues or delays.


  • FoxDev

    @PleegWat said:

    It could just ask the monitor.

    Assuming the monitor is willing to tell the truth


  • ♿ (Parody)

    @Yamikuronue said:

    When everyone has 4k TVs, they'll stop having excuses to avoid 4k monitors.

    But it seems like people had ~1080p monitors before they had the TVs. And people aren't going to get 4K TVs if they can't get 4K signals.

    NB: By people, I'm talking about critical masses.


  • I survived the hour long Uno hand

    Eh, that's probably fair.


  • Java Dev

    There's that. I've currently got 2 monitors connected to my work laptop, both claiming to be Dell Inc. 24". They are not the same size.



  • @boomzilla said:

    But it seems like people had ~1080p monitors before they had the TVs. And people aren't going to get 4K TVs if they can't get 4K signals.

    Before TVs went 1080p, monitors were 1200p... then they reduced the monitors to 1080p (same screen for all).

    4K is a misnomer: there are at least five resolutions that qualify as 4K (e.g. UHD's 3840×2160 vs. DCI's 4096×2160), no two alike.

    My bet: TV will once again choose one resolution, and in the end all 4K monitors will be that one. In the meantime, we are doomed.


  • ♿ (Parody)

    @Jerome_Grimbert said:

    Before TVs went 1080p, monitors were 1200p... then they reduced the monitors to 1080p (same screen for all).

    Yeah, that was why I had the tilde in there. Actually, TV held us back on the desktop.



  • @boomzilla said:

    No, the real tl;dr; is that 4K just isn't ready for prime time yet.

    That doesn't even make sense. The monitor is fine. It's doing everything it's supposed to.

    @boomzilla said:

    Man, now I'm really starting to regret leaving the 🛂 off of that post.

    Why? What does the mailman mean?

    I don't speak your crazy moon language.


  • ♿ (Parody)

    @blakeyrat said:

    That doesn't even make sense. The monitor is fine. It's doing everything it's supposed to.

    As long as you have specific high end hardware and use it all by itself. The monitor itself may be fine, but your OP demonstrated what that's worth if the rest of the hardware (and software!) isn't ready to deal.

    @blakeyrat said:

    I don't speak your crazy moon language.

    We know. This is how we are able to talk about you in plain sight.



  • @blakeyrat said:

    Why? What does the mailman mean?

    I'm pretty sure it's used to indicate that the post is a troll and not to be taken seriously.

    Because any emoji with "trol" in it means trolling.



  • @boomzilla said:

    The monitor itself may be fine, but your OP demonstrated what that's worth if the rest of the hardware (and software!) isn't ready to deal.

    When the software's fixed, the hardware will still be here.

    @Choonster said:

    Because any emoji with "trol" in it means trolling.

    And mailman has "trol" in it. Obviously. Now I get it.



  • @blakeyrat said:

    And mailman has "trol" in it. Obviously. Now I get it.

    Hovering over the emoji displays :passport_control: for me.

    The trol = trolling thing seems to be based on the name of the emoji rather than what it looks like (e.g. :trolleybus:, a name for what appears to be a tram that I've never heard of before).


  • ♿ (Parody)

    @blakeyrat said:

    When the software's fixed, the hardware will still be here.

    And then, perhaps, it will be ready for prime time. At least we agree.


  • I survived the hour long Uno hand

    @Choonster said:

    trolleybus

    A trolleybus is an electric trolley:

    As compared to a normal trolley:

    http://trimet.org/images/schedules/trolley.jpg



  • Ah, I've never seen a bus powered by overhead wires before.

    The second image is what I'd call a tram, but I have heard them referred to as trolleys before.


  • BINNED

    @Choonster said:

    tram

    Aha, but you are mistaken! A trolleybus is very different from a tram. A tram is a train-like vehicle running on rails, whereas a trolleybus is first and foremost a bus running on a normal road. It does, however, get its power from an overhead electrical connection, not unlike trams and trains often do.


  • I survived the hour long Uno hand

    I'd call it a streetcar, but I grew up near San Francisco so.... :)


  • FoxDev

    @Choonster said:

    The second image is what I'd call a tram, but I have heard them referred to as trolleys before.

    IIRC, trolley is an oldy-timey word for tram


  • I survived the hour long Uno hand

    @RaceProUK said:

    trolley is an oldy-timey word for tram

    Maybe. But trolley can also mean

    http://api.ning.com/files/l4W8jVdh3X-ywNFGwKpSOFUSjBzodwEvim54qMviX3Qb5gSqF5Gh2ylTMR0jFLdta9hoqI6MVmKIZcobidVymt9Q5QUtbl/shoppingtrolleyvector1.jpg



  • This sort of thing is why I'm always wary of early adoption of new hardware standards. I'll stick with my two 1080p monitors until all of these kinks are ironed out.

    Also do you prefer running at 4K and 30Hz? I'm pretty sure 30Hz would make my head hurt on extended use, not to mention the loss of FPS in videogames...



  • @Choonster said:

    The trol = trolling thing seems to be based on the name of the emoji rather than what it looks like (e.g. 🚎, a name for what appears to be a tram that I've never heard of before).

    Really? We have them in Seattle. It's just a bus run on overhead electricity like a street trolley.

    @dstopia said:

    Also do you prefer running at 4K and 30Hz? I'm pretty sure 30Hz would make my head hurt on extended use, not to mention the loss of FPS in videogames...

    I honestly don't think my eyeballs can see much more than 30 FPS anyway. I genuinely don't notice the difference. (Nor do I have any trouble with older films, which are 24 FPS.)



  • @blakeyrat said:

    I honestly don't think my eyeballs can see much more than 30 FPS anyway. I genuinely don't notice the difference. (Nor do I have any trouble with older films, which are 24 FPS.)

    I've heard that claim a lot and it always sounds like bullshit to me (it is VERY noticeable to me in games. Hell, even 45 FPS is a big hit), but to each his own I guess. On old CRT monitors, which could normally go up to 85Hz, 60Hz gave off a very distinct fuzzy glare that would give me constant headaches. The lower the frequency, the worse it was.

    I haven't felt the same with 60Hz LCD screens, but I've never gone below that to test.

    Also aren't most films still 24 FPS? I've only seen The Hobbit in 48 FPS (it looks as if it was filmed on home video or something, kinda striking at first).


  • FoxDev

    @dstopia said:

    Also aren't most films still 24 FPS?

    I believe that is the case, yes
    @dstopia said:
    I've only seen The Hobbit in 48 FPS

    Same here. It's really weird watching at that framerate; I get a sort of 'hyperreal' vibe from it, like it's too smooth…



  • @blakeyrat said:

    Really? We have them in Seattle. It's just a bus run on overhead electricity like a street trolley.

    It looks like there aren't any left operating in Australia apart from a tourist service in the Blue Mountains.



  • @dstopia said:

    I've heard that claim a lot and it always sounds like bullshit to me (it is VERY noticeable to me in games. Hell, even 45 FPS is a big hit), but to each his own I guess.

    My brother sees the 60 hertz flashing of fluorescent lights that 99% of the population can't see. It drives him nuts if he goes into, like, a warehouse store which is lit by those big fluorescents hanging from the ceiling.

    So yeah.

    @dstopia said:

    Also aren't most films still 24 FPS?

    Pretty sure 2k and 4k digital films (that is, pretty much every film made since about 2005) are 30 FPS. I could be wrong.



  • @blakeyrat said:

    My brother sees the 60 hertz flashing of fluorescent lights that 99% of the population can't see. It drives him nuts if he goes into, like, a warehouse store which is lit by those big fluorescents hanging from the ceiling.

    This depends for me. I know this was a problem for me in middle school, but I think those were just really bad lights.

    Honestly though, it's very easy to tell when something drops from 60 FPS to 30. It's a bit harder to recognize 120 Hz, but I can do it. I've been running my monitor with LightBoost for a while now, and that makes a very visible difference, though I don't think I'll use it once I format for 10-- it looks like your framerate is low.



  • @blakeyrat said:

    I hate computers.

    Yeah, me too. I set up my gaming PC in my living room, using my TV as a display. Last year, during the soccer World Cup, I got the bright idea that I'd hook up a monitor too, so I could game on one screen and watch footy on the other.

    The amount of heartache just trying to get two fullscreen programs working on two different screens...

    Meanwhile my roommate just streams stuff on his iPad when he's playing games. I'm tempted to get one, honestly. At least it'll ****ing work!

