Boring Thursday! Here's a link



  • Hell, too much developer learnability is how some WTFs are created, as you have people developing for systems they don't even understand in the first place, such as how Microsoft makes it too easy for developers to create web applications without even understanding how a client/server architecture works. (This isn't an isolated incident! Google "asp.net close window" or "asp.net popup" (the first answer to this question probably deserves its own thread))

    Blakeyrat, before you say the client/server architecture is an implementation detail, it's a detail that the developer has to know because it's the developer performing the implementation. It's users who aren't supposed to worry about that, not devs. If you want to add custom JavaScript to a page (such as a third-party analytics package), you have to understand the HTML and JavaScript that WebForms creates. If you want to make the site reliable in the case of a bad internet connection, you have to write client-side code that's divorced from the server.



  • @MiffTheFox said:

    Blakeyrat, before you say the client/server architecture is an implementation detail, it's a detail that the developer has to know because it's the developer performing the implementation.
     

    Surely client/server architecture is a design consideration and release packaging concept?

    I certainly agree that it's a detail the developer needs to be aware of, but in many situations a coder just needs to concern themselves with ensuring that expected input gets processed and translated into expected output, plus detecting and dealing with situations where deviations from the normal path arise (such as unexpected inputs).

    Usually a lot of the entry/exit points to a system are abstracted away behind a set of modules and libraries - the coder doesn't need to be too bothered about the input's origin, just that input will happen and what behaviour to exhibit when it does.



  • Discussing the Ultimate Correct Pigeonholes for these things is completely futile. What a developer "needs" to know is an arbitrary breadth of knowledge the limits of which are placed by different people at different points along the creation of the materials, the tools and the product.

    There's someone who isn't satisfied until they know how hammers themselves are made, one who is satisfied knowing hammers magically appear out of nowhere and just knows how to hit a nail, and even one who is satisfied with an abstracted component-connection system, regardless of whether this is implemented as hammer+nail or glue.

    I like to think I'm a richer person for knowing that the elements in my hammer were produced in a heavy star preceding our current one, and that evolution on this planet allowed wood for the handle to exist and iron to be mined, which is then chopped and lathed and dug and smelted and combined into a hammer.

    So maybe there's someone who thinks it's absolutely necessary for a webdev to know bit twiddling or how ports on a server work, and someone who thinks document.getElementById() is an implementation detail of jQuery.



  • @dhromed said:

    beautiful rant
    Will you have my babies?

     

    I don't want them.



  • @dhromed said:

    What a developer "needs" to know is an arbitrary breadth of knowledge the limits of which are placed by different people at different points along the creation of the materials, the tools and the product.
     

    That put it better than I did.

    I dislike the blanket "it's an implementation detail that you don't need to know about". It's just as bad as someone ignoring or dismissing information out of hand because they've decided it doesn't concern them, without actually reviewing the details and making a more informed decision. True, they could get bogged down in unnecessary minutiae, but all big WTFs began life as a small acorn left to grow, unwatched.



  • @Cassidy said:

    @blakeyrat said:

    No; the attitude is that you (meaning the general "you", meaning everybody) shouldn't have to learn anything at all to use any computer at all.
     

    And yet helpdesks are continuously plagued by users struggling to use unfamiliar technology without learning any of the fundamentals that would equip them with the necessary skills and knowledge to achieve their objective.

    Helpdesks would not be plagued by such calls if the products were better-designed. I don't know what your point is there. Were you confused by the word "should"?

    @Cassidy said:

    @blakeyrat said:

    Of course anybody who uses Linux won't get that because the thought is so fucking foreign to them.

    It's also a very foreign thought for non-IT kit. I don't know of many items that people can expect to use solely by picking them up and finding they're magically imbued with the skills and knowledge to use them.

    And yet people should be able to use them. Usability isn't just about computers.

    @Cassidy said:

    You can guess I don't share the Great Aunt Ethel viewpoint of computing. Like ASheridan, I grew up with different types of computer, and in each situation I accepted that I would be expected to put in some effort to understand what was at my fingertips in order to gain maximum benefit from it.

    That's fine. As long as you don't inflict your view on other people.

    @Cassidy said:

    I really think that a lot of discoverability techniques have conditioned people into believing they can become fluent users with as little effort as possible, and blame the system when it doesn't do what they want it to (but does exactly as they wrongly asked).

    They blame the system because the system is at fault. The system caused the problem. The system fucked up.

    @Cassidy said:

    but I'd hate to think of the effort required by designers and programmers to cater for the "don't need to learn it, I just want it to work" crowd.

    Worthwhile things are usually difficult.



  • @MiffTheFox said:

    If you want to add custom JavaScript to a page (such as a third-party analytics package), you have to understand the HTML and JavaScript that WebForms creates.

    I agree you have to.

    I don't agree you should have to.

    I don't get how a community that spends all its time abstracting things-- abstracting data storage, abstracting I/O devices-- gets confused by abstract philosophical concepts. You're ok working with an ORM, but the instant someone talks about how things should work in an ideal world you're totally lost? What's up with that?



  • @Cassidy said:

    I dislike the blanket "it's an implementation detail that you don't need to know about". It's just as bad as someone ignoring or dismissing information out of hand because they've decided it doesn't concern them, without actually reviewing the details and making a more informed decision. True, they could get bogged down in unnecessary minutiae, but all big WTFs began life as a small acorn left to grow, unwatched.

    I kind of agree with you (and by extension Dhromed), but my problem is that a lot, or even a majority, of programmers are completely obsessed with implementation details that truly do not matter and constantly waste time talking or debating over them. So I tend to err on the side of "implementation details don't fucking matter, do something fucking productive instead of debating whether to use IIS or Apache, Christ" if only to help developers be aware that the shit they're wasting time on doesn't fucking matter.

    That's not to say that there's no difference between IIS and Apache, or that the difference between IIS and Apache doesn't matter at all. But it is to say it doesn't matter proportionally to the amount of time idiot developers spend debating it. IIS and Apache are responsible for maybe 0.5% of your application's experience, so you shouldn't spend more than 0.5% of your development time talking about it. And that's being generous; it's probably far less than 0.5%.

    Maybe I err too far in the other direction. Whatever.



  • @blakeyrat said:

    Helpdesks would not be plagued by such calls if the products were better-designed. I don't know what your point is there. Were you confused by the word "should"?
     

    My point is that no matter how well-designed the product is, there will still be someone out there that won't be able to use it because they're a lazy edge case that refuses to put in any effort to learn from glaringly obvious discoverability. It's not that they can't use the product, but that they won't, through their unwillingness to invest effort.

    It's the old "lead a horse to water but can't make it drink" adage - there's a point at which those crafting the product need to stop giving and expect something in return from those using it. Move the point too close to the user and you're expending unnecessary effort on minority cases, bumping up investment costs. Move it too far from the user and you'll watch your userbase disappear in droves to someone who has moved it closer to them. It's simply a gamble: people will stick with something good until something better comes along, putting the original product in the shade.

    I suppose you're dreaming of a nirvana world where products are designed such that the user needs to put in no effort whatsoever and all the onus is on the designer/builder to craft them to fit that pattern. Whilst we both agree that it's an unrealistic situation due to uncontrollable variables in end-user behaviour, I feel that striving for that goal is an expensive choice that represents lower returns and will increase the cost of deliverables.

    @blakeyrat said:

    And yet people should be able to use them. Usability isn't just about computers.

    They can use them. We're debating whether they're able to use them without any learning or training whatsoever. I believe there is a responsibility upon the user to put in some effort on their part; your argument delegates this solely to the product design, treating a user's lack of capability as a fault in the design process because it couldn't possibly be anything the user is doing wrong. Unfortunately that's a flawed argument, because many of the things I cannot do, someone else can - using exactly the same product.

    This essentially brings us back to the original argument: the fact that Shamus cannot achieve what others can may not necessarily be down to design issues but human issues.

    @blakeyrat said:

    That's fine. As long as you don't inflict your view on other people.

    Correct. I don't inflict my viewpoints, I share them. People are free to debate them, and in the past good reasoning has caused me to reflect and change my viewpoint.

    @blakeyrat said:

    They blame the system because the system is at fault.

    Again, you seem stubbornly fixated upon this direction. I find it strange that someone who exhibits a high level of intelligence and has worked in analytics suddenly shows an unwillingness to properly analyse the problem, jumping directly to this conclusion instead.

    @blakeyrat said:

    but my problem is that a lot, or even a majority, of programmers are completely obsessed with implementation details that truly do not matter and constantly waste time talking or debating over them.

    I find this a lot - when I teach design courses, I'm continuously pulling coders out of their low-level viewpoint and back into a higher-level world when they need to view the bigger picture and focus more upon "how it should work (behave)" and not upon "how do we build it".
    Granted it's not all programmers - more the inexperienced ones.

    @blakeyrat said:

    That's not to say that there's no difference between IIS and Apache, or that the difference between IIS and Apache doesn't matter at all.

    ... but the differences shouldn't really matter to someone writing web code. To develop for a platform, they need to know the capabilities of that platform and how to utilise them. Worry more about what to build with your tools and less about how the tools were manufactured.



  • @Cassidy said:

    My point is that no matter how well-designed the product is, there will still be someone out there that won't be able to use it

    Well yeah, duh. It's an ideal. Ideals don't have to be obtainable.

    World peace is an ideal. It's probably not attainable. A crime rate of zero is an ideal. It's probably not attainable.

    See how it works? This is simple elementary school stuff here, guys.

    In any case, my biggest beef as far as Linux goes isn't that it's an unusable mess, it's that the people developing Linux don't give a shit that it's an unusable mess, don't put any effort into improving it, and generally reject the entire idea that it needs improving in the first place. Linux development is completely stagnant. I hate that. You should hate that. Everybody should hate that. It's terrible.


  • @Cassidy said:

    My point is that no matter how well-designed the product is, there will still be someone out there that won't be able to use it because they're a lazy edge case that refuses to put in any effort to learn from glaringly obvious discoverability.
    You know you're arguing with one of those people, right?


  • @blakeyrat said:

    In any case, my biggest beef as far as Linux goes isn't that it's an unusable mess, it's that the people developing Linux don't give a shit that it's an unusable mess, don't put any effort into improving it, and generally reject the entire idea that it needs improving in the first place. Linux development is completely stagnant. I hate that. You should hate that. Everybody should hate that. It's terrible.

    It's ironic that in another browser tab, I'm reading an article about users leaving Ubuntu because of the new interface.



  • @blakeyrat said:

    Linux development is completely stagnant. I hate that. You should hate that. Everybody should hate that.
     

    I should hate it if it's true, but it ain't. I've seen advances in many things Linux - services that run under Linux, advances in the desktop, etc.

    You, by your own admission, rarely use it - so you're not exactly experienced in the advances or changes it has made. This was evident from a post where I asked you how you thought software management should work in Ubuntu and you proceeded to describe exactly the graphical package management system already present in Ubuntu (and other distros), so it was clear you were completely unaware of the tools at your disposal.

    I recall a previous post in which you described your experiences of getting SSHd working on a MUD server - how difficult and arduous it was - then berated me and some others for bringing up things that were wrong with earlier versions of Windows. The boot's on the other foot here: you have a good working knowledge of Windows and MacOS but your Linux experience is very light in comparison, meaning many rants you have posted about it come across as vitriolic ignorance. And sadly, I know you are more than capable of picking this stuff up, but you won't, because (a) you seem biased against it, and (b) it doesn't form part of your everyday work so you have no interest in it.

    Let it go; stick to writing about the things you know best. I've learned much from those pieces because that's when you're at your best. Your Linux rants are almost stereotypical trollbait for the furry-toothed, bad-hygiene generic Linux fanboi waiting to snap off his leash and launch into a bigoted anti-MS defence.

    It ain't worth it.



  • @PedanticCurmudgeon said:

    It's ironic that in another browser tab, I'm reading an article about users leaving Ubuntu because of the new interface.
     

    URL us up, then.

    Unless it's that one about how Unity has pissed many people off and forums have been flooded with people asking how to flick the desktop back to a more traditional and familiar skin. Jesus, that got repetitive - fast.



  • @Cassidy said:

    I should hate it if it's true, but it ain't. I've seen advances in many things Linux

    Hah, bullshit. There are advances in the form of "adding more shit", but there are no advances in the form of "rethinking our bad decisions from decades ago". The CLI is fucking awful, and has seen zero improvement. It's great that there are GUI tools, but that doesn't mean the CLI tools should be kept in stasis for decades.

    Oh, and some of those "advancements", like git for example, are so user-hostile I almost think it was made that way on purpose. I mean, how is it possible for dozens of people to work on this thing and end up with such a fucking terrible experience? It boggles the mind.

    @Cassidy said:

    I recall a previous post in which you described your experiences of getting SSHd working on a MUD server - how difficult and arduous it was - then berated me and some others for bringing up things that were wrong with earlier versions of Windows.

    The difference is that the process of getting SSH working hasn't changed, whereas Windows has.

    @Cassidy said:

    The boot's on the other foot here: you have a good working knowledge of Windows and MacOS but your Linux experience is very light in comparison, meaning many rants you have posted about it come across as vitriolic ignorance.

    This has been explained several times in this thread. You're not reading what I type.

    @Cassidy said:

    Let it go; stick to writing about the things you know best.

    I'm passionate about computers. I want the power computers provide to be available to everybody, regardless of their capabilities or educational level. I think that's a noble goal.



  • @blakeyrat said:

    I'm passionate about computers. I want the power computers provide to be available to everybody, regardless of their capabilities or educational level. I think that's a noble goal.

    Okay, let me break this down for you.

    Language is a series of (a series of (a series of symbols) that form words) that form statements. From any given level, you can't see the others without a lot of work.

    For example, take the letter W. It's a symbol like any other. On its own, it's meaningless. It's only when it gets put together with other letters that it becomes a word.

    There are infinite ways to combine symbols to make nonsensical words. For example, dffjciebnfocosnw. It has no meaning. From the level of letters, you wouldn't know that. You can also combine words to make nonsensical sentences. Dog pants go green eat. Every word in that sentence is a real English word, yet the sentence is without meaning.

    However, in order to understand one level, even if it gives you no help knowing what it means, you have to know at least a little about the levels below it. You can't read a sentence written in Japanese if all you know is English letters. You can read a sentence in English (without understanding it) if you know French or German, but switching out one of the lower levels makes it impossible to understand.

    Computer programs are written in language as well. Without knowing how computers work, you can't write a computer program. Of course, you can write one, but it will end up on a certain website.

    If you don't know the words of computer programs, you can't communicate with computers. You can pretend by typing words into Lotus Notes, but the computer has no way of understanding what you mean. It simply follows a list of rules written in its language. It understands the symbol level but not the word or sentence level of human languages.

    Suggested reading



  • Ben, I agree with everything you just typed, but I don't understand how it's a reply to what you quoted. What is your point exactly?


  • @Cassidy said:

    URL us up, then.

    Unless it's that one about how Unity has pissed many people off and forums have been flooded with people asking how to flick the desktop back to a more traditional and familiar skin. Jesus, that got repetitive - fast.

    It is that one. I actually don't think Unity's that bad, but I understand the complaints.


  • @blakeyrat said:

    Ben, I agree with everything you just typed, but I don't understand how it's a reply to what you quoted. What is your point exactly?
    He's got a point; you just missed it, and there's no hope of you getting it because it conflicts with ideas you seem to be unwilling to give up. It would be best if you assumed he was writing for the rest of us and moved on.



  • @blakeyrat said:

    The CLI is fucking awful, and has seen zero improvement.
     

    I remember using the C shell years ago, and welcomed new features found in the Korn shell, then some new bash features (tab-completion, history recall, etc). I also like the colour enhancements some commands exhibit, and Ubuntu's suggestions of packages to install when a command isn't matched. Perhaps you deem this "adding more shit", but I've seen Unix admins stuck in their old ways pleasantly surprised at these new features displayed by the young whippersnapper Linux.

    @blakeyrat said:

    The difference is that the process of getting SSH working hasn't changed, whereas Windows has.

    I'd like evidence of your belief that the process of getting SSH working has still remained as difficult as when you first tried it. Someone (Boomzilla?) posted the same lines as I use to get an SSHd server running: (i) a package management command to download and install, (ii) optionally amend some config settings (port number, etc) then (iii) start the service. I'd also (iv) set the service to automatically restart at reboot time.

    These can be done via commands (yeah, they're not too discoverable but they're consistent across all services) or via some graphical applet. Of course, you can manually download pure source, compile, curse when you need to find missing libraries, spend ages satisfying dependencies, write your own startup scripts.... but advances in operating system tools have taken a lot of that work out and streamlined it greatly.
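    For the record, on a Debian/Ubuntu box the whole four-step dance is roughly this (a sketch - package and service names vary between distros, and I'm assuming sudo rights):

        # (i) download and install the SSH server
        sudo apt-get install openssh-server

        # (ii) optionally amend settings such as the port number
        sudo nano /etc/ssh/sshd_config

        # (iii) start the service, and (iv) have it come back at reboot
        sudo service ssh start
        sudo update-rc.d ssh defaults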

    @blakeyrat said:

    I'm passionate about computers. I want the power computers provide to be available to everybody, regardless of their capabilities or educational level. I think that's a noble goal.

    We're in agreement on that destination. I just don't feel the journey should consist entirely of the designers/developers/builders carrying the user there - the user is expected to share some of the workload. A user's computing achievements are related to their capabilities and/or educational level; a user improving their own capabilities opens up more possibilities to them.



  • @PedanticCurmudgeon said:

    It is that one. I actually don't think Unity's that bad, but I understand the complaints.
     

    Ah, yeah.

    I got the impression Unity became Linux's Vista: those that stuck with it and accepted change soon gained the benefits it brought; those that hated it only used it for a short period before deciding that they'd rather return to their familiar comfort zone than flounder with something new.

    I think the way in which it was foisted upon users (and thus people's first experiences) greatly influenced perceptions of it. Someone told me about Microsoft studying the failures of Unity takeup in preparation for Metro.



  • @Cassidy said:

    I think the way in which it was foisted upon users (and thus people's first experiences) greatly influenced perceptions of it. Someone told me about Microsoft studying the failures of Unity takeup in preparation for Metro.

    When did all the Unity buzz die down? With the release of 11.10? If so, that'd be consistent with how Microsoft's new user interfaces (Aero, Metro) have been picked up.



  • @Cassidy said:

    I'd like evidence of your belief that the process of getting SSH working has still remained as difficult as when you first tried it. Someone (Boomzilla?) posted the same lines as I use to get an SSHd server running: (i) a package management command to download and install, (ii) optionally amend some config settings (port number, etc) then (iii) start the service. I'd also (iv) set the service to automatically restart at reboot time.

    These can be done via commands (yeah, they're not too discoverable but they're consistent across all services) or via some graphical applet. Of course, you can manually download pure source, compile, curse when you need to find missing libraries, spend ages satisfying dependencies, write your own startup scripts.... but advances in operating system tools have taken a lot of that work out and streamlined it greatly.

    Have you ever used Remote Desktop in Windows? There's no comparison. Not even fucking close. Not even remotely fucking close. And Remote Desktop has been in Windows since, what, Windows 2000? Linux still hasn't gotten their shit together to make something comparable.

    The Linux and OS X world has VNC, but VNC is slow, insecure in many ways (doesn't encrypt by default, can be used as spyware), doesn't provide drag&drop between the two computers, doesn't send audio over the wire, etc etc.



  • @blakeyrat said:

    Have you ever used Remote Desktop in Windows? There's no comparison. Not even fucking close. Not even remotely fucking close. And Remote Desktop has been in Windows since, what, Windows 2000? Linux still hasn't gotten their shit together to make something comparable.
    XP actually, and only in Pro (in 2000, only Server had it as an optional add-on), and with a limited number of connections.

    @blakeyrat said:

    The Linux and OS X world has VNC, but VNC is slow, insecure in many ways (doesn't encrypt by default, can be used as spyware), doesn't provide drag&drop between the two computers, doesn't send audio over the wire, etc etc.

    Linux actually also has X, which is older than Remote Desktop, and can be easily tunneled through ssh if you want.
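    (For reference, "tunneled through ssh" is literally one command, assuming X11 forwarding is enabled in the remote sshd_config:

        # run a remote GUI program on the local display, carried over SSH
        ssh -X user@remotehost xterm

    Everything rides the SSH connection, so you get encryption for free.)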



  • @Cassidy said:

    I'd like evidence of your belief that the process of getting SSH working has still remained as difficult as when you first tried it. Someone (Boomzilla?) posted the same lines as I use to get an SSHd server running: (i) a package management command to download and install, (ii) optionally amend some config settings (port number, etc) then (iii) start the service. I'd also (iv) set the service to automatically restart at reboot time
    Installed Debian 6.0.  Had SSH server installed on build.  After it booted up, SSH daemon was already running.  Edited the config file to disallow root logins (PermitRootLogin no).  Was already set to only allow SSHv2.  Restarted SSH daemon.  Done.

    Not hard at all for someone who knows what he's doing.  Hell, if I spent some time looking at it I could probably figure out a way to make it a single command line that changed the config and restarted the daemon.  As it is, it's not worth it to me because it's all of about a 60 second change-and-restart anyway.
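    Something like this would probably do it - an untested sketch, assuming Debian's stock config path and init script:

        # flip PermitRootLogin to no and bounce the daemon in one go
        sudo sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin no/' /etc/ssh/sshd_config && sudo /etc/init.d/ssh restart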



  • @ender said:

    @blakeyrat said:
    The Linux and OS X world has VNC, but VNC is slow, insecure in many ways (doesn't encrypt by default, can be used as spyware), doesn't provide drag&drop between the two computers, doesn't send audio over the wire, etc etc.
    Linux actually also has X, which is older than Remote Desktop, and can be easily tunneled through ssh if you want.
    And that presumes you need a GUI in the first place.  Which for many Linux functions is simply not the case.

    I've had problems getting RDP working right when connecting to a multi-monitor system from a single-monitor system.  A problem I never had with VNC.  VNC acted completely intuitively for me.  All of this with the understanding, however, that I rarely need to communicate with a remote GUI.  This week was the first time in months that I had any kind of a need for it.



  • @blakeyrat said:

    Have you ever used Remote Desktop in Windows?
     

    Yes. I've also used RDP from Linux to Windows, and RDP from Windows to Linux. I've also used Teamviewer on both systems for desktop sharing, and an X server running on a thin client to present a graphical desktop running off Unix and Linux servers, long before Windows 2000 included a remote desktop facility.

    (arse. What Ender said. Missed that.)

    But that doesn't answer the question of how being unaware of advances in the Linux world means you've automatically decided there haven't been any.

    @nonpartisan said:

    Installed Debian 6.0.  Had SSH server installed on build...

    Yeah, all that. Mint and Ubuntu (Desktop Edition) are the only ones I've found that don't have the SSH server installed by default.

    @nonpartisan said:

    Edited the config file to disallow root logins (PermitRootLogin no).

    A WTF for me is that this directive defaults to YES when it's missing. This and the port number are the two settings I always change on new installs.
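    For reference, the two lines in /etc/ssh/sshd_config end up looking something like this (the port number here is just an example):

        Port 2222
        PermitRootLogin no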

     



  • @ender said:

    XP actually, and only in Pro (in 2000, only Server had it as an optional add-on), and with a limited number of connections.

    So I said it was in Windows 2000, and you're correcting me by saying it was in Windows 2000.

    Brilliant. Thanks for that correction. That was much-needed. You are a hero to all of us, Ender.

    No seriously: WHY THE FUCK DID YOU TYPE THAT? THEN HIT "POST"? THEN NOD TO YOURSELF AND THINK, "THAT IS A QUALITY POST!" What is wrong with you?

    @ender said:

    Linux actually also has X, which is older than Remote Desktop,

    Oh yes I love this one! Whenever you point out some feature Linux has that sucks shit, they always come back with "but it's existed longer!" Here's two reasons that's a horrible argument to make to someone with more than 4 braincells:

    1) If I'm making a purchasing decision, I'm making it in 2013, not 1997. It doesn't fucking matter what features Linux had in 1997. It could not possibly be less relevant to the issue at-hand. You immediately going to the "well Linux had it longer!" argument just tells me you got nothing. If you got nothing, just be a man and admit it.

    2) If Linux has had an equivalent feature for far longer than Windows, and it still sucks (which it does in this particular case, so much so that I didn't even bother to mention it in the comparison), that only backs-up my point that Linux developers don't give even a tiny shit about improving their product.

    So your argument is 1) pathetic, and 2) backs-up my case. Brilliantly done, Sherlock.

    @ender said:

    and can be easily tunneled through ssh if you want.

    Well duh. That's true of fucking anything. The point is: why should you have to?

    And I'm not even saying "why should you have to?" as some utopian ideal, in this case I'm just saying "why should you have to?" Everybody wants encryption when remote-controlling computers. Everybody. The Linux community is (and I'm generalizing here, but cope with it) full of people who constantly promote using encryption to protect their privacy. So why the hell doesn't X11 (or VNC for that matter*) encrypt the data by default? Why is Microsoft the company that caters to Linux-user needs better than Linux? Does that make sense?

    Is it purely because Linux users have an attitude of, "if the user doesn't know you need to manually set up encryption, FUCK THEM!" Because that's the only reason I can think of for the current retardness. Or maybe Linux users love the CLI so much they wanted to add CLI commands to even the process of remotely controlling a GUI? Because why wouldn't you want more CLI commands! WE LOVE CLI COMMANDS! What's your explanation?

    ASTERISK: I'm pretty sure the variety of VNC Apple uses in OS X does indeed encrypt by default, but we're talking about Linux here. I'm just heading off the pedantic dickweeds.



  • @nonpartisan said:

    I've had problems getting RDP working right when connecting to a multi-monitor system from a single-monitor system.
    That's weird - I've got 2 monitors at home, 4 monitors at work, and I regularly connect to both computers from machines that just have a single monitor, and never had any problems (one nice thing about RDP is that it doesn't care about the remote machine at all - it'll always use resolution settings that are specified by the client).
    @nonpartisan said:
    A problem I never had with VNC.
    Interesting, because that's exactly where I did have problems with certain VNC versions - when the server had multiple monitors, certain versions would only get the picture from the main monitor, and one version actually crashed if mouse was moved to a secondary monitor while a session was established (that is, if somebody working on the machine locally moved the mouse while VNC client was connected to it).



  • @Cassidy said:

    (arse. What Ender said. Missed that.)

    You'd be better off not hitching your cart to that particular horse.

    @Cassidy said:

    But that doesn't answer the question of how being unaware of advances in the Linux world means you've automatically decided there haven't been any.

    I'm sure there have been advancements. My problem is how behind the entire Linux environment is. Using Linux today is like using Windows a decade ago, and that's actually being generous-- it was far easier to set up a simple web server in Windows XP or Server 2003 than it is in Ubuntu Server. If you're driving 20 MPH, and you do some work on your car and now it can go 30 MPH-- well that's a huge improvement. But you're still not going to beat the McLaren P1 on the track.



  • @ender said:

    @nonpartisan said:
    I've had problems getting RDP working right when connecting to a multi-monitor system from a single-monitor system.
    That's weird - I've got 2 monitors at home, 4 monitors at work, and I regularly connect to both computers from machines that just have a single monitor, and never had any problems (one nice thing about RDP is that it doesn't care about the remote machine at all - it'll always use resolution settings that are specified by the client)
    To be more specific, for me it always takes everything that is open on the secondary monitors and smashes it back onto the primary monitor.  When I go sit in front of that computer again, it's still all smashed onto the primary and I have to reorganize everything.  What a pain in the ass with four monitors.  With VNC, it showed a massive desktop that I could scroll through all over the place.  Was never able to get RDP to work right with either XP or 7, so I just gave up on it.  We used to make VNC a standard install on new builds; that's no longer the case (mostly because common applications are available through Citrix, systems have been upgraded so they no longer have fat clients but use Web-based interfaces, so it's less of an issue).  I had more reason to perform remote GUI connections years ago.  Not so much any more.  So don't know, don't care any longer.



  • @blakeyrat said:

    it was far easier to set up a simple web server in Windows XP or Server 2003 than it is in Ubuntu Server.
    That's a bunch of horseshit.  I tell it to install the server on initial install and it's done.  Next bit of FUD?

    @blakeyrat said:

    But you're still not going to beat the McLaren P1 on the track.
    Windows is no McLaren P1, and you, sir, are no race car driver.



  • @nonpartisan said:

    @blakeyrat said:
    it was far easier to set up a simple web server in Windows XP or Server 2003 than it is in Ubuntu Server.
    That's a bunch of horseshit. I tell it to install the server on initial install and it's done. Next bit of FUD?

    Right. Oh wait you need a security certificate on the server. Oh wait there's no GUI for that in Linux (natch, and even if there was there's no way to remotely control the server's GUI so it wouldn't matter) and it's a long, involved, 23476-step process done entirely on the CLI? And the tutorials are all out-of-date or for the wrong distro or plain wrong? Oh and to get the cert file onto the server now you need to set up a SFTP server, which is another 23476-step process? You're right! LINUX IS BETTER! Much better than Windows, where you could just RDS to the server, generate the cert request in a relatively-easy GUI, then copy-and-paste the cert files onto the server's desktop and install them using another relatively-easy GUI.

    There's no comparison.

    If the only criteria was "ships with a web server installed by default" then Ubuntu and Windows 2003 are both dead-even-- they both do, it just has to be turned on. That's not the point.

    @nonpartisan said:

    Windows is no McLaren P1

    I never said it was. Windows sucks too. Computers are, in my estimation, something like 15-20 years back from what they should be due to company ignorance/hostility towards usability (including Microsoft for a long period and Apple right now), the inclination for developers to assume users are developers, and the general fear of change that pops up with anything computer-related.

    Computers suck shit. If we could build a computer equivalent of the McLaren P1, I'd pay $10k for it.

    Another fucking annoying habit Linux users have: if you criticize Linux, they always assume you love Windows.



  • @blakeyrat said:

    @nonpartisan said:
    @blakeyrat said:
    it was far easier to set up a simple web server in Windows XP or Server 2003 than it is in Ubuntu Server.
    That's a bunch of horseshit. I tell it to install the server on initial install and it's done. Next bit of FUD?
    . . . blah blah blah pointlessness snipped . . .
    None of that had anything to do with your criterion of "a basic Web server."  And if you're going to say that any of it was implied, then Ender's point about RDP being an optional add-on in W2K becomes relevant and you lose that point too.

    @blakeyrat said:

    Ubuntu and Windows 2003 are both dead-even-- they both do
    Thank you for agreeing with me.

    @blakeyrat said:

    Another fucking annoying habit Linux users have: if you criticize Linux, they always assume you love Windows.
    To continue the car analogy, you can't be traveling at 200 MPH, slam on the brakes, hop out and yell "AAAAHAHAHAHAHAHAHAHAHAHAHA!!!  I wasn't TALKING ABOUT THAT!!!!"  Windows has been your yardstick up to this point.  So if your comparison is about how far Linux has (or hasn't) come against Windows, you can't say that Linux is now doing 30 MPH against a McLaren P1 and not be implying that Windows is the P1.

     



  • @blakeyrat said:

    Right. Oh wait you need a security certificate on the server. Oh wait there's no GUI for that in Linux
     

    Incorrect. There are GUI tools to administer Apache under both Linux and Windows. There's also browser-based remote administration, if you want to count that as a GUI tool.

    @blakeyrat said:

    and even if there was there's no way to remotely control the server's GUI so it wouldn't matter

    Incorrect. RDP and other remote desktop systems have already been discussed.

    @blakeyrat said:

    and it's a long, involved, 23476-step process done entirely on the CLI?

    Incorrect, as discussed above.

    @blakeyrat said:

    And the tutorials are all out-of-date or for the wrong distro or plain wrong?

    There are many online tutorials that have been neglected when it comes to updating content with new developments. There are many that are accurate. This isn't a Linux problem, any more than an out-of-date tutorial for configuring IE6 settings is Microsoft's fault, or a tutorial for an outdated version of Mail.app is Apple's fault. You just need to match the tutorial version with the software version.

    @blakeyrat said:

    Oh and to get the cert file onto the server now you need to set up a SFTP server, which is another 23476-step process?

    Incorrect. There are many commands and tools to copy files over onto the server. I'd SSH into the server and use "wget" meself if there's no server-based service that accepts file transfer.
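    For example - hostnames invented for illustration:

        # push the cert from your workstation with scp (part of OpenSSH)...
        scp mysite.crt admin@webserver:/etc/ssl/certs/

        # ...or pull it down at the server end with wget
        wget http://fileshare.example.com/mysite.crt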

    @blakeyrat said:

    LINUX IS BETTER!

    No, it's just different. It's not any better or worse; it simply has other routes to the same destination - some overlap, some are completely different, but the end goals are the same. Some people prefer the longer and scenic route so criticise the shorter and steeper route. Some prefer the road and will complain about how boggy the grass is. It's all about understanding that different users (or admins) have different tastes and requirements. Subjective taxonomy and all that.

    @blakeyrat said:

    you could just RDS to the server, generate the cert request in a relatively-easy GUI, then copy-and-paste the cert files onto the server's desktop and install them using another relatively-easy GUI.

    And your lack of knowledge of Linux systems means you've never considered the same process could be performed under Linux.

    @blakeyrat said:

    Another fucking annoying habit users have: criticize something based upon limited experience of that something.

    FTFY.

     @blakeyrat said:

    Computers are, in my estimation, something like 15-20 years back from what they should be due to company ignorance/hostility towards usability (including Microsoft for a long period and Apple right now), the inclination for developers to assume users are developers, and the general fear of change that pops up with anything computer-related.

    *nods*. Fear of change is what affected Unity takeup, and it's also currently the subject of debate about Metro/Surface.



  • @nonpartisan said:

    Not hard at all for someone who knows what he's doing. 
    Isn't that the point?  You may get just as many points for the "efficiency" portion of usability, but you don't get points for learnability, memorability, errors, or satisfaction.



  • @Sutherlands said:

    @nonpartisan said:
    Not hard at all for someone who knows what he's doing. 
    Isn't that the point?  You may get just as many points for the "efficiency" portion of usability, but you don't get points for learnability, memorability, errors, or satisfaction.
     

    Wasn't the original point that Shamus didn't know what he was doing?

    Or rather, didn't know how to achieve his goal when in unfamiliar territory, but if the system was better designed then he could ...?



  • @Sutherlands said:

    @nonpartisan said:
    Not hard at all for someone who knows what he's doing. 
    Isn't that the point?  You may get just as many points for the "efficiency" portion of usability, but you don't get points for learnability, memorability, errors, or satisfaction.
    When installing the OS, there's a screen that asks "What fucking servers do you want me to install????", listing SSH, database servers (PostgreSQL or MySQL), a Web server (Apache), and a few others -- I don't remember what all right off.  You select "SSH Server" and move on.  If that's too hard for Blakey then he's hopeless . . . although really I don't think the first necessarily has any bearing on the second.

    Turning off root login was my preference.  But even without it, an SSH server that was already pretty secure would've been up and running without any further intervention on his part. That's a pretty easy way to do it for Linux.  So his point that installing an SSH server on Linux is basically equivalent to hand-to-hand combat and it has been for years is blatant FUD because he's too fucking lazy to find out the right way to do it.  The right way stares him in the face.  But all he wants to do is bitch and moan.



  • @nonpartisan said:

    When installing the OS,

    I spawned it from Amazon AWS, there was no install process.



  • @nonpartisan said:

    Turning off root login was my preference
     

    As an aside... does Debian disable the root account, like Ubuntu does? I'm guessing that PermitRootLogin won't have any effect since there's no root shell to launch.

    (I've never rootwalked into any Deb server I've administered - I always used sudo to elevate my normal login once on)
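    (By "elevate" I just mean something like:

        sudo -i    # full root shell, authenticated with your own password

    so the root account itself never needs a password or a remote login.)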



  • @blakeyrat said:

    @nonpartisan said:
    When installing the OS,

    I spawned it from Amazon AWS, there was no install process.

    Now that would've been a nice fucking detail to have, now wouldn't it?  How about [url=http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/AccessingInstancesLinux.html#AccessingInstancesLinuxSSHClient]looking at the Amazon documentation[/url]??



  • @Cassidy said:

    @nonpartisan said:

    Turning off root login was my preference
     

    As an aside... does Debian disable the root account, like Ubuntu does? I'm guessing that PermitRootLogin won't have any effect since there's no root shell to launch.

    (I've never rootwalked into any Deb server I've administered - I always used sudo to elevate my normal login once on)

    Not by default.  Debian is what I'm most experienced with.  You can root in from the console.  With a default SSH server install, you can root in over SSH.  But I normally turn off PermitRootLogin.

     



  • @nonpartisan said:

    Now that would've been a nice fucking detail to have, now wouldn't it?

    It was in the original thread this whole bullshit is about. You didn't even read it!?



  • RDP is nice when it works, but it's difficult to bring up a connection when the two computers aren't on the same domain and LAN.

    In Linux's defense, X was designed for use over a secure, firewalled LAN.

    Interestingly, VPN fixes both these problems.



  • @nonpartisan said:

    Not by default.  Debian is what I'm most experienced with.  You can root in from the console.  With a default SSH server install, you can root in over SSH.  But I normally turn off PermitRootLogin.

    If you use the advanced installer, it's an option. I've always turned it off there.



  • @MiffTheFox said:

    but it's difficult to bring up a connection when the two computers aren't on the same domain and LAN.
     

    The LAN part is just a networking/routing issue, isn't it?

    I can RDP into my work's Terminal Services from SWMBO's Ubuntu box (at home) or my work's netbook (from public WiFi spots) - neither is on the same LAN.

    (they don't belong to the same domain either - the domain name only matters when specifying login credentials)
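    (For anyone wanting to try it from the Linux side, rdesktop is the usual client - a sketch with made-up names:

        rdesktop -u cassidy -d WORKDOMAIN ts.example.com

    The -d switch is only there for the login credentials, as above.)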



  • @nonpartisan said:

    You can root in from the console.
     

    Okay, so the "root account disabled" bit is limited to Ubuntu (and Mint) and doesn't feature in stock Debs. Interesting.

    (disabling root login on the console - or from any TTY - is another config change I make to fresh installs)

    @nonpartisan said:

    With a default SSH server install, you can root in over SSH.

    Yeah, this bit I know - and I still think that defaulting to YES when unspecified is a WTF. I've heard the default in BSDs is NO, which suggests the Linux community tampered with OpenSSH along the way to relax the security, but I don't know the full details.

    Hmm... more of a WTFery than I thought.



  • @blakeyrat said:

    @nonpartisan said:
    Now that would've been a nice fucking detail to have, now wouldn't it?

    It was in the original thread this whole bullshit is about. You didn't even read it!?

    This thread was about some idiot who can't rub two brain cells together to figure out how to do what he wants to do under Linux.  I immediately countered with someone who has been using Linux for several years without issue.  You brought up a whole buttload of unsubstantiated vitriol comparing Windows and Linux without putting any context into it.

    NOWHERE in this thread did you mention Amazon or AWS.  And even if you did and I missed it, YOU'RE STILL WRONG!  Because based on Amazon's own documentation, you need to allow the traffic through and you need to install SSH keys, but nowhere does it say you need to install an SSH server!  So the server is already installed!
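    And once the key pair is in place, connecting is a single command, per that same documentation (instance address and key path made up):

        ssh -i /path/to/my-key-pair.pem ubuntu@ec2-198-51-100-1.compute-1.amazonaws.com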

    Do us all a favor.  Go stand on that track with the McLaren traveling at you at high speed.  Don't get out of the way.  Don't let the driver swerve.



  • @blakeyrat said:

    That's fine. As long as you don't inflict your view on other people.
    Oh, like the way you do?

    @blakeyrat said:

    They blame the system because the system is at fault. The system caused the problem. The system fucked up.
    Actually it's a bit more like the way people blamed the Titanic sinking on ginger people; they had no real idea of what happened so they reached out to blame the first thing they could. An inept user blaming the computer for doing what it was told to do by said user is still an inept user; they just feel that by blaming they've regained some measure of control.



  • @blakeyrat said:

    So I tend to err on the side of "implementation details don't fucking matter, do something fucking productive instead of debating whether to use IIS or Apache, Christ" if only to help developers be aware that the shit they're wasting time on doesn't fucking matter.

    That's not to say that there's no difference between IIS and Apache, or that the difference between IIS and Apache doesn't matter at all. But it is to say it doesn't matter proportionally to the amount of time idiot developers spend debating it. IIS and Apache are responsible for maybe 0.5% of your application's experience, so you shouldn't spend more than 0.5% of your development time talking about it. And that's being generous; it's probably far less than 0.5%.



    There's a reason Apache runs on about 60% of the web servers worldwide and IIS on only about 16%, so I'd say it's more than an implementation detail. Of course, given that your rants on this forum about all things web show a lack of knowledge, it's no surprise you don't know about the differences that make it a pretty important part of the decision-making process.

     

