And today's randomly not-working OS X feature is...



  • Typing! (or, more pedantically, typing into a network login form in Finder)

    You see, our Mac is no longer joined to the Active Directory domain for reasons unfathomable to anyone, so now to connect to one of our Windows servers to copy files across, you have to find the server in the Finder and click the "Connect As..." button and enter your Windows credentials. How anyone is supposed to do this when typing is apparently disabled in the "Name" and "Password" fields is beyond me.

    Typing seems to work everywhere else. I can type in Xcode, I can make this post on TDWTF, but I cannot type into the login form shown below.




  • How about typing on the address bar? Oh wait...



  • I like how the icon for the remote machine is the BSOD.



  • @Evilweasil said:

    I like how the icon for the remote machine is the BSOD.

    Everyone knows that Windows doesn't actually run.. it simply hasn't crashed yet.



  • @Evilweasil said:

    I like how the icon for the remote machine is the BSOD.

    Specifically, Win 95/9x BSOD. On an old CRT.



  • @alegr said:

    @Evilweasil said:

    I like how the icon for the remote machine is the BSOD.

    Specifically, Win 95/9x BSOD. On an old CRT.

    The icon goes all the way up to a 1024x1024 px icon too :)


    That's actually incredible.. OSX's icons are larger than most displays the average Windows user has!



  • Cool bug, bro. I haven't come across that one (it worked last time I tried, in Lion), but I share your pain; it seems you haven't had much luck with Apple recently.



  • @ZPedro said:

    Cool bug, bro. I haven't come across that one (it worked last time I tried, in Lion), but I share your pain; it seems you haven't had much luck with Apple recently ever.

    FTFY



  • @Evilweasil said:

    I like how the icon for the remote machine is the BSOD.

    Is that icon used for every remote machine, or just Windows ones?
    If the latter, that's quite a bold thing to do. It's not like the US is litigation-happy or anything.



  • @gu3st said:

    @alegr said:

    @Evilweasil said:

    I like how the icon for the remote machine is the BSOD.

    Specifically, Win 95/9x BSOD. On an old CRT.

    The icon goes all the way up to a 1024x1024 px icon too :)


    That's actually incredible.. OSX's icons are larger than most displays the average Windows user has!

    It's the kind of wildly misdirected refinement you come to expect from an Apple product.



  • @topspin said:

    @Evilweasil said:

    I like how the icon for the remote machine is the BSOD.

    Is that icon used for every remote machine, or just Windows ones?
    If the latter, that's quite a bold thing to do. It's not like the US is litigation-happy or anything.

    It's the icon for any SMB share. Anything that says "I have AFP support" uses a generic Mac icon.. or an icon of the specific Mac model, if it knows which one it is.



    I also wouldn't really say that "Hi-Res Icons" are misdirected, especially since OS X is icon-heavy, Apple sells one of the few high-DPI computers available, and OS X actually has WORKING high-DPI support. (Go try Windows on 200% UI Scale.. I dare you.. where OSX will work flawlessly :) ).



  • @gu3st said:

    (Go try Windows on 200% UI Scale.. I dare you.. where OSX will work flawlessly :) ).

    Who knew? But then, why doesn't OSX work at 100%? Yet another configuration WTF, I guess.



  • @gu3st said:

    It's the icon for any SMB share.

    Strange choice of icon.

    Designed by someone who - when asked to power-cycle their computer - smacks the power button on the monitor?



  • @boomzilla said:

    But then, why doesn't OSX work at 100%?

    It does, but the 1024×1024 icon is actually a 512×512 "@2x" icon for use on high-resolution ("Retina") displays. The reason, apparently, is that on-screen items are measured in points, not pixels, and on a high-res display one point equals two pixels rather than one as on a normal screen.
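The point-to-pixel mapping described above can be sketched in a couple of lines (a hypothetical illustration of the idea, not Apple's actual API; the function name and scale factors are assumptions for the example):

```python
def points_to_pixels(points, scale_factor):
    """Convert a length in points to device pixels.

    On a standard display the scale factor is 1 (1 point = 1 pixel);
    on a high-resolution ("Retina") display it is 2, which is why a
    512x512-point icon ships with a 1024x1024-pixel "@2x" asset.
    """
    return points * scale_factor

print(points_to_pixels(512, 1))  # 512 pixels on a normal screen
print(points_to_pixels(512, 2))  # 1024 pixels on a Retina screen
```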



  • Not that Retina even matters, since last I checked Firefox and Chrome were still just scaled up from lower DPIs by the OS.

    And while I'm nitpicking, raster icons in 2012? Why the fuck haven't OSes started migrating to vector icons yet? I mean, it's not like we don't have plenty of mature vector formats already. God, even Microsoft fucking Bob had an all-vector interface.


  • @MiffTheFox said:

    God, even Microsoft fucking Bob had an all-vector interface.

    Must have screenshot of Bob running at 2880x1800.



  • @MiffTheFox said:

    Not that Retina even matters, since last I checked Firefox and Chrome were still just scaled up from lower DPIs by the OS.

    And while I'm nitpicking, raster icons in 2012? Why the fuck haven't OSes started migrating to vector icons yet? I mean, it's not like we don't have plenty of mature vector formats already. God, even Microsoft fucking Bob had an all-vector interface.

    Chrome now has native Hi-DPI support as of, I think, v18 or 19. Firefox.. I have no idea. Opera has Hi-DPI support, and of course Safari does.



  • @joe.edwards said:

    @MiffTheFox said:

    God, even Microsoft fucking Bob had an all-vector interface.

    Must have screenshot of Bob running at 2880x1800.

    Is 2048x1536 close enough? (Compare with the same scene rendered at 640x480.)

    There's also a 1600x900 screenshot out there, in case you want to see widescreen Bob.



  • @joe.edwards said:

    Must have screenshot of Bob running at 2880x1800.

    Sorry, VMWare only does 2560x1600, 2560x1920 and 6688x5016 (which is a bit troublesome for Bob).



  • @Gurth said:

    It does, but the 1024×1024 icon is actually a 512×512 "@2x" icon for use on high-resolution ("Retina") displays. The reason, apparently, is that on-screen items are measured in points, not pixels, and on a high-res display one point equals two pixels rather than one as on a normal screen.

    First... The 1024 x 1024 icon is a 1024 x 1024 pixel icon, for when you mouse over the 512 x 512 icon in the Dock: at that point it scales up to 2x the resolution, hence the "@2x" naming convention.

    Secondly... On-screen items are measured in pixels! NOT points! Points are a printed-text measurement ONLY, where 1 point is 1/72 inch. The whole idea that on-screen items are measured in points is a major misunderstanding from the web-design community (namely the Firefox fanboys), dating from when Firefox assumed 12 pt to be 16 pixels (which is correct only at 96 dpi; at the time the most common laptop display was 1280 x 800 at 15.6 inches, which is approx 97 dpi). As web designers were incorrectly using points to describe on-screen font sizes, Mozilla decided to hard-code 96 dpi as the assumed dpi rather than do the correct thing and attempt to figure out the actual display dpi (which you can't actually do unless the user or OEM specifies it to the OS).
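The arithmetic in the post above is easy to check (a small sketch; the function names are illustrative, not from any real API):

```python
import math

def points_to_pixels(pt, dpi):
    # 1 typographical point = 1/72 inch, so pixels = points * dpi / 72
    return pt * dpi / 72

def screen_ppi(width_px, height_px, diagonal_inches):
    # Physical pixel density: diagonal pixel count / diagonal size in inches
    return math.hypot(width_px, height_px) / diagonal_inches

print(points_to_pixels(12, 96))            # 16.0 -- the hard-coded 96 dpi case
print(round(screen_ppi(1280, 800, 15.6)))  # 97 -- that 15.6" laptop panel
```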



  • First…
    @Apple developer docs said:

    OS X refers to screen size in points, not pixels. A point is one unit in user space, prior to any transformations on the space. Because, on a high-resolution display, there are four onscreen pixels for each point, points can be expressed as floating-point values. Values that are integers in standard resolution, such as mouse coordinates, are floating-point values on a high-resolution display, allowing for greater precision for such things as graphics alignment.

    Note: The term points has its origin in the print industry, which defines 72 points as equal to 1 inch in physical space. When used in reference to high resolution in OS X, points in user space do not have any relation to measurements in the physical world.

    (source)

    Secondly… points have been used as an on-screen measurement for much longer than Mozilla has been around. The Xerox Star and original Macintosh had pixels that were 1 (typographical) point in size and thus had a 72 PPI screen resolution that became the norm — even if many screens don't physically use that resolution. The 96 PPI that Windows uses started a few years later because Microsoft tried to compensate for people viewing computer screens from further away than they normally do printed matter, and so increased Windows' virtual PPI to 96 while its physical PPI was supposed to be roughly 72. This meant that text set at, say, 10 point size would not be 10 pixels high (as on a Mac) but 13 pixels, making it easier to read from a distance.
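The 72-vs-96 virtual-PPI difference described above works out as follows (a quick sketch; `text_px` is just an illustrative name):

```python
def text_px(pt, virtual_ppi):
    # Rendered text height in pixels: points * (virtual PPI / 72)
    return round(pt * virtual_ppi / 72)

print(text_px(10, 72))  # 10 px -- classic 72 PPI Macintosh
print(text_px(10, 96))  # 13 px -- Windows' 96 PPI assumption
```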


  • @Gurth said:

    The reason, apparently, is that on-screen items are measured in points, not pixels, and on a high-res display one point equals two pixels rather than one as on a normal screen.

    And that is an abomination. REAL points are 1/72nd of an inch, not some multiple of pixels (unless you include PPI information, and 72ppi is pathetic - akin to 1024x768 on a 17" monitor).

    Of course, I still use CRTs where I can because LCD pixel densities actually do tend to be around about 72PPI, excepting ludicrously expensive "retina" garbage. 144PPI is STILL pathetic (noting that I am biased by the fact that I work at 300-600PPI at work, and as a result can't ever get an actual proper visualization of the thing that I'm working on)



  • @Weng said:

    LCD pixel densities actually do tend to be around about 72PPI

    Depends. Mostly false. Usually it's closer to 100dpi, but Really Huge displays simply have the same pixel resolution with a larger dot pitch, so yeah, that works out to about 72. But given the variability between monitors, it's completely impractical to use the precise number "72" except when referring to the number an OS may or may not use to display fonts and/or UI elements.

    @Weng said:

    I still use CRTs

    CRTs don't have any real resolution beyond 120dpi. It's just a blur.

    @Weng said:

    I work at 300-600PPI at work,

    Oh, so you do use retina garbage because those are the only screens that actually have 300dpi.



  • @dhromed said:

    @Weng said:

    I work at 300-600PPI at work,

    Oh, so you do use retina garbage because those are the only screens that actually have 300dpi.

    Those are the only consumer-level screens that have high DPI. Medical-imaging systems, for example, have had high-DPI monitors for longer than Retina has been around.



  • @Carnildo said:

    Medical-imaging systems, for example, have had high-DPI monitors for longer than Retina has been around.

    I have learned something!

