Ubuntu WTFs



  • So I have this old Pentium 3 laptop whose hard disk I had wiped clean with the intention of putting it into the trash.  Then I realized that for my move to the other end of the country, at least in the first few days before I have time to buy something new, I would need some kind of computer to e-mail all the people I need to e-mail, and my desktop does not fit in my checked luggage.  So I reconsidered and brought it with me.  I figured since it was wiped anyway I might as well install Linux on it, and chose Ubuntu since I heard it was the best for desktop-style use.

    Now I haven't really used Linux since oh 2002 or so.  I used it as my main desktop operating system from about 1998-2000, then relegated it to a fileserver/firewall role from 2001-2002, before getting rid of it entirely and switching to Windows XP.  Back in those days you had to do everything by editing config files since all autoconfiguration tools were utter rubbish.  I've got more of a life now and don't have time for that kind of nonsense anymore.  But I'd heard Linux had gotten so much easier now.

    I installed Ubuntu from a LiveCD while the laptop was not connected to the net (important point: you'll see why later) and everything went nice and smoothly with very little input on my part.  Well, it got stuck at one point for a stupid reason (I forget exactly what, but it was stupid) and required me to click Ok to continue, but that's a very minor flaw.  The installed Ubuntu booted fine.  Then, I tried what I considered an acid test: while the system was running, I hotplugged my D-Link USB WiFi adapter into a USB port which was itself on a PCMCIA card.  Then I opened the Network control panel and lo and behold, a "Wireless Connection" was present.  Impressive!  So, happy with my new Ubuntu, I left with it to the Bay Area.

    Arrived at my new apartment a thousand miles away, [i]now[/i] the WTFs begin to rear their ugly heads.  My new roommate has a bog-standard Linksys G router configured at 192.168.1.1 with DHCP and WPA-PSK.  In other words, almost the factory defaults for one of the most commonplace home NAT routers in the world.  My roommate's Macbook Pro has no difficulty using the WiFi, nor does my just-purchased BlackBerry Curve.  However, when I try to join it from my Ubuntu laptop using only the graphical interface, the connection works for about 30 seconds, then mysteriously drops forever.  Running ifconfig reveals that it loses the IP address for some reason.  The same thing happens whether I use DHCP or a static IP address (although switching between the two can cause a reset that makes it work again -- but only for another 30 seconds).  I spent an hour reading help wikis and trying to get it to work using command-line utilities, but in vain.  I'm sitting here with a too-short Cat-5 wire plugged straight into the router.

    Secondly.  I tried running svn to get some code of interest to me from a public repository, and Ubuntu told me to run apt-get to install it.  Fair enough.  But then, apt-get mysteriously fails with "E: Couldn't find package subversion".  Which raises the question of why the advice to run apt-get was there at all, if the package doesn't exist.  Anyway, long story short, I did some poking around, and it turns out that the problem was in /etc/apt/sources.list.  It was full of lines like this:

    # Line commented out by installer because it failed to verify:
    #deb http://ca.archive.ubuntu.com/ubuntu/ gutsy main restricted

    Screw you, Ubuntu.  So because I didn't have Internet access when I initially installed you, you assume those servers have ceased to exist and knock them out [i]permanently[/i]?  (unless the user is savvy enough to go edit that file with a text editor)  Ever hear of a little thing called testing, Ubuntu team?  Or simple logic for that matter?
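
    For anyone else who hits this: assuming your sources.list looks like the lines above, stripping the installer's comment markers and refreshing the package lists should put things right --

    $ sudo sed -i 's/^#deb /deb /' /etc/apt/sources.list
    $ sudo apt-get update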


    But, these are just minor kinks right?  Later in the year I'm sure they'll have it all sorted out.  2009: the year of desktop Linux!



  •  Administration -> Software Sources

    You can then just enable them and it will ask to reload the packages. No need to edit the text file at all. (More tutorials should really say this)

    And IIRC restricted is never enabled by default...



  •  Hmm, okay, that's not nearly as WTFy as what I had inferred had happened, then.  I guess "failed to verify" must just mean it didn't have the proper flags to be enabled (maybe even something I specified at one point in the install but forgot about).  I guess at worst the WTF here is unclear error messages, then.



  • Okay, yeah, come to think of it, I was too quick to rant here and should cut Ubuntu a little slack.  I've never known WiFi to be perfectly reliable anyway, and it's not like other desktop operating systems have a package manager at all.



  • In my ongoing experiment with Ubuntu as Worthy Alternative on my other box, I'm finding that it's quite adequate.

    But I still don't understand how to really install custom software. The application manager is easy point-and-click, but I tried to download Quod Libet -- just for kicks, see what it can do -- and I have no clue how to get it running. All I have is a big set of files and a couple of python scripts that for some reason I can't run. I don't even know if it requires any sort of installation.

    I installed Celestia just fine, except that I don't remember where it went, physically.

     



  • I had a similar escapade yesterday installing Ubuntu 7.10 on a test machine. The first time I booted the live CD and ran the installer, it complained about not being able to verify a repository, and commented it out in the sources list. It also killed the installation at about 95% with some bizarre error, and left the machine's hard disk unbootable. (You'd get grub, but that's pretty much it.)

    So I fired up the live CD a second time, and verified the network settings. Yup, it saw the network card, and was set for DHCP. But ifconfig shows the interface as up with no address. Hmm. I ran dhclient manually from the terminal, and lo and behold, I get a network address. This time, the installer ran perfectly.
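
    In case it helps anyone, the whole sequence from the terminal was roughly this (eth0 is an assumption -- check the first command's output for your actual interface name):

    $ ifconfig -a          # list all interfaces, even ones without an address
    $ sudo ifconfig eth0 up
    $ sudo dhclient eth0   # request a lease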

    Was there something weird about my hardware, or do you actually have to manually invoke dhclient just so you can have a working network connection to prevent the installer from failing miserably? Don't get me wrong, once I got it to install properly, it worked like a charm, and was one of the more pleasant Linux desktop distributions I've seen, but little stuff like this is what perpetually prevents it from being the year of Linux on the desktop.



  •  I would say bad program then; most good programs come either with a simple line to add to the sources list (which you can also do in the Software Sources app, where you can add the complete deb line), or as a .deb file, which means just double-clicking to install.
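
    Such a line looks like this (a sketch -- the URL and components here are just the standard gutsy ones; substitute whatever the program's site actually gives you):

    deb http://archive.ubuntu.com/ubuntu/ gutsy main universe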

     If I look at your program, the .tar.gz wasn't really needed:

    http://www.sacredchao.net/quodlibet/wiki/Guide/Requirements

    The best way to get Quod Libet is to run

     $ sudo apt-get install quodlibet quodlibet-ext quodlibet-plugins

     Which means it should just show up with the other apps



  • As for the restricted repository being commented out, that's a licensing issue. The default setup assumes you only want GPL software, which usually means you're incredibly lucky if all your hardware, protocol and other software needs are met, even for what is considered basic stuff (playing all the video formats you need, a fully-accelerated graphics driver, and so on). A shame indeed; I guess legal issues are what really makes free software so clunky, and why all the hackers in the world can't do anything about it. It may seem an ultimate, unfixable dead end, but the Ubuntu guys got it right: fixing non-technical issues requires non-technical workarounds, that is, they have great FAQs and great forums. Perhaps they could use more in-software hints as a workaround for what they can't legally automate.

     And since no software ships without bugs, what really matters is the follow-up.



  • @seaturnip said:

    Okay, yeah, come to think of it, I was too quick to rant here and should cut Ubuntu a little slack.  I've never known WiFi to be perfectly reliable anyway, and it's not like other desktop operating systems have a package manager at all.

     

    Linux has come on in leaps and bounds in terms of wireless support. When I first installed Linux in 2005 I had to mess around for a couple of hours with ndiswrapper to get it to work.

    When I installed Ubuntu a year or two ago, my wireless card was configured for me by the time it had finished installing. 

    I think a large part of it is having the right hardware - if you've got supported hardware, Ubuntu works great "out of the box". If you don't, be prepared to spend time Googling! 



  • @db2 said:

    So I fired up the live CD a second time, and verified the network settings. Yup, it saw the network card, and was set for DHCP. But ifconfig shows the interface as up with no address. Hmm. I ran dhclient manually from the terminal, and lo and behold, I get a network address. This time, the installer ran perfectly.
     

    You know, I tried to get 7.10 to install the other day, but the wireless wouldn't work.  I bet I just needed to run the dhcp client manually to get it to work.  Kinda stupid considering 6.10 works flawlessly on the same machine.



  • @Heron said:

    @db2 said:

    So I fired up the live CD a second time, and verified the network settings. Yup, it saw the network card, and was set for DHCP. But ifconfig shows the interface as up with no address. Hmm. I ran dhclient manually from the terminal, and lo and behold, I get a network address. This time, the installer ran perfectly.
     

    You know, I tried to get 7.10 to install the other day, but the wireless wouldn't work.  I bet I just needed to run the dhcp client manually to get it to work.  Kinda stupid considering 6.10 works flawlessly on the same machine.

    Yeah, give it a try. I think I only had to do "sudo dhclient" from the terminal and wait a few seconds while it got an address. I don't think I had to give it any command line arguments, but it'll probably tell you what's missing if it needs any. A coworker of mine had the exact same issue with another distribution a while back, though his troubles were post-install. Suse, maybe? I wonder if there's something wonky about our DHCP server. (It is a Windows box, after all.)



  • My absolute favourite when it comes to Ubuntu (well, most Linux distros really) is applying a small patch to a piece of software...

     

    In this case, patching KDE for transparency support with Compiz Fusion. The patch modifies only KDesktop from KDE; however, in order to apply it you must (see the command sketch after this list):

    1) install build-essential fakeroot debhelper debconf kdebase-dev kdelibs-dev

    2) download the build dependencies for kdebase

    3) download the KDE source

    4) create two files in the kdesktop folder (the patch marks them "(Revision 0)", i.e. as brand-new source files rather than revisions, but nobody seems to have updated patch so that it recognises this and creates the files itself instead of spouting out that they are missing)

    5) patch the source code

    6) modify kdesktop's Makefile.in adding in references to the two new files that were created earlier

    7) build the debian package for kdebase (building ALL of kdebase not just kdesktop) 

    8)  install the debian package (reinstalling ALL of kdebase not just kdesktop)
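
    In commands, that dance looks roughly like this (a sketch -- the patch filename is made up, and apt-get source needs deb-src lines enabled):

    $ sudo apt-get install build-essential fakeroot debhelper debconf kdebase-dev kdelibs-dev
    $ sudo apt-get build-dep kdebase        # pull in the build dependencies
    $ apt-get source kdebase                # fetch and unpack the source
    $ cd kdebase-*
    $ patch -p1 < ~/kdesktop-transparency.patch   # hypothetical patch file
    # (plus the manual file creation and Makefile.in edits from steps 4 and 6)
    $ dpkg-buildpackage -rfakeroot -us -uc  # rebuilds ALL of kdebase
    $ sudo dpkg -i ../kdebase*.deb          # reinstalls ALL of kdebase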

     

    I remember that installing the 17KB patch I required meant downloading hundreds of MBs of extra crap and wasting countless hours of valuable time building lots of binaries that I already had installed on the system, in the exact same state as the outputted 'new' binaries...

    all for one silly little feature, when I could have just copied the exact specific binaries that were modified (e.g. JUST KDESKTOP) over the original kdesktop files and not have wasted soooooooo much time.....



  • Actually, if you run any current Ubuntu-derived installer without a network connection, all the software sources get commented out, without giving you any notification that this is happening. (And this produces WTF behavior from any program which uses apt-get, because all of them expect at least one valid source and give error messages which say nothing.) It's a real bug, it's been reported, and they're going to try to address it in the next major release.

    I recently got a pair of older (but still usable -- one was a 500 MHz P3, the other a 1.8 GHz P4) laptops and tried all three Ubuntu distributions, to check out the claims that I don't cut Linux enough slack and am out of date with my complaints. Sorry, Linux fanboys, but all three versions had major GUI bugs and/or failures which would prevent me from recommending Linux to anyone who wasn't already command-line savvy. (And my complaints have nothing to do with speed, although frankly X11 is utter rubbish as a GUI. I've run a better GUI on a 16 MHz Mac LC with no VM and 4 MB of RAM than any X11 GUI I've ever seen, regardless of hardware. Yeah, the LC's GUI had no sparkles or wobbly windows or rotating cubes, but it was responsive, intuitive, and functional. X11 GUIs all fail on at least one of the three, usually all three.)

    From memory (i.e. these are merely the bugs which stuck in my head after a few weeks):

    All LiveCDs: no GUI way to mount a disk with write permission, or even figure out why all disks are mounted without write permission. (News flash: people considering a switch to Linux will often want to back up their files and then erase the whole disk. If all local disks mount read-only, they usually can't do this. And saying "you can do it from the terminal" is not an answer. Most people are not comfortable with the command line. If your GUI can't do this, then your GUI is a failure.)

    All Ubuntu-derived systems: the "no network connection on install" bug above, and the useless error messages which would never inspire anyone to go to "Software Sources" to fix the problem, or to the actual file (whose path I forget) to remove the comment characters. Plus the GUI controls for GUI devices are utterly insufficient in all variants: there's no way to change monitor depth (see the Xubuntu section below for why this is important), the "select a driver for the video card" section looks like it works but actually does nothing, trackpad controls are conspicuously absent (at least on both the laptops I had, which both had the trackpads correctly detected and operational), etc. (They all also fail to detect the original ATI Radeon Mobility graphics card properly, but that's apparently a regression of some kind, because the bug reports indicate that it used to work. I'm willing to file that under "nobody's perfect".) All default GUI text editors have no ability to do a "sudo open" or "sudo save", which is unbelievably stupid if controlling the GUI requires editing a root-owned text file. The ability to launch multiple instances of a single program is nice -- except that some programs (such as package managers) are meant to only have one instance, and allowing multiple instances of them encourages the user to do things which are useless, which is a GUI failure.
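
    (The usual workaround for the sudo-save problem, for what it's worth, is to launch the whole editor as root -- assuming the gksu package is installed:

    $ gksudo gedit /etc/X11/xorg.conf

    -- which of course just proves the point about needing the terminal.)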

    Ubuntu (GNOME): help buttons which do nothing at all (not even a "there's no help available for this window" or "oops an error occurred no help for you" message). Plus there's no GUI feedback when things are being done. (Launch a program, for example, and until the window actually draws, you can't tell that the computer has even noticed your action. Not such a big deal -- unless you launch a program which takes more than a fraction of a second to launch, and are used to any other GUI. Oh, and did I mention that the computer periodically actually did ignore my GUI actions, so that the lack of feedback when things worked became more frustrating with every passing minute?) And then there's the fact that all user interface elements are built out of the same 4 primitives, so that it can be almost impossible to figure out whether a control is really a button (or whatever) or merely looks like one.

    Kubuntu (KDE): the system update application chokes on the first run on one of the updates currently available and crashes, leaving apt-get's database locked. On subsequent runs, it says something to the effect of "The database is locked. Fix it? Yes/No/Quit"; "Quit" quits, "No" runs the program without the ability to actually update, and "Yes" makes it crash. This is a known, reproducible bug, but amazingly isn't considered important enough to warrant a fresh release for some reason, probably because you can fix it from the command line. (See above for my comments on "you can do it from the terminal".)

    Xubuntu (Xfce): actually, Xubuntu is pretty good. Except that there's a bug wherein certain old video cards can't open the GUI's Terminal windows unless you change the monitor depth (or other more esoteric video card settings), and the OpenOffice.org package does not install the gtk plugin, so all OpenOffice windows look crappy unless you are savvy enough to figure out that you need to run the package manager (and then make it work -- I can't imagine trying to talk my father through that one!). And there's the "limited primitives" problem, but it isn't rubbed in your face as in GNOME.

    Really, the phrase "not ready for prime time" comes to mind. And I reiterate the statement I made a while ago: the problems with Linux GUIs will never be fixed by Linux programmers, because they don't really understand that there are problems in the first place. A quick and dirty rule: if your user ever, under any circumstances, has no other option than to use the command line, then your GUI is a failure at whatever task the user wants to do. Xubuntu comes closest to the ideal, but if you want to do anything but the most basic GUI device configuration, it still fails.

    I'll try the next major release, but after several years of Linux geeks telling me how X11-based GUIs are on a par with Windows and Mac and being disappointed every single time, I'm not getting my hopes up. After a while, you all start to sound like the Linux zealot in the Mandatory Fun Day comics on the main page, disconnected from reality but sure of the superiority of Linux.



  • @The Vicar said:

    ...

     

    TL/DNR



  • Another really fun bug in Xubuntu's 6.10 (?) installer: when the install program formats a drive, Thunar detects the new partition as if it had just been hot-plugged, and Thunar's auto-mount feature mounts it; the installer then needs to do something while the drive is still unmounted, which fails because Thunar helpfully mounted it. That being said, I much prefer Arch, which has some irritating traits as well (where can PATH be set other than ~/.bashrc, ~/.bash_profile and etc?).



  • @The Vicar said:

    ... 

     

     As someone who has an Ubuntu box as well as an XP box, I cannot agree with you more.

    Linux will never be ready for prime time, just like developers never find their own bugs (that's why we have QA).

    The horribly non-standard implementations of GUIs across distros show that developers like to create eye candy without actually thinking about the end users. In the paid software world that I exist in, we have things such as usability studies and focus groups, and we still don't get things perfect.

     





  • @mfah said:

    I think Eric Raymond put it best:

    The CUPS programmers were contacted about that and they did fix the issue; in fact, I found that I was able to use a networked printer from an XP machine whereas the other XP machines couldn't.



  • @Lingerance said:

    @mfah said:
    I think Eric Raymond put it best:

    The CUPS programmers were contacted about that and they did fix the issue; in fact, I found that I was able to use a networked printer from an XP machine whereas the other XP machines couldn't.

    While that's good news, including to those like me who would like Linux to be good but are disappointed so far, if that's your only comment on those articles, then you have missed the point. (And you're probably one of those Linux programmers who, as I mentioned a few posts back, don't even realize there's a problem.)



  • @Jonathan Holland said:

    @The Vicar said:

    ... 

     

     As someone who has an Ubuntu box as well as an XP box, I cannot agree with you more.

    Linux will never be ready for prime time, just like developers never find their own bugs (that's why we have QA).

    The horribly non-standard implementations of GUIs across distros show that developers like to create eye candy without actually thinking about the end users. In the paid software world that I exist in, we have things such as usability studies and focus groups, and we still don't get things perfect.

     

    I am not sure what universe you live in but none of that makes the slightest bit of sense. What nonstandard implementation? Or, even, what implementation? The majority of software in a distro is upstream, even for handholding distros like Ubuntu.

    You are aware that developers ARE the end users, right? And that software goes through a testing phase as well, just with a broader audience? You make it sound like Bugzilla, mailing lists or any other form of contact doesn't exist.



  • @The Vicar said:

    then you have missed the point.
    Actually I haven't; I honestly don't think that Linux is ready for the computer-illiterate. I use it for my day-to-day work and generally prefer the more advanced distributions (Arch, Lunar, Slackware, etc); I actually commented on Xubuntu's messed-up installer a few posts down/up.



    May as well complain about something again: /media, what is the point of its existence? /mnt does the. Exact. Same. Thing. And why does Ubuntu mount by UUID? It certainly isn't useful to power users, it confuses the shit out of me, so why would it be remotely better for a normal user? Use LABEL instead, which is actually human-readable (see the sketch below); Arch* is the only distro (I've used) that lets me set a LABEL for my disks during the install.



    *Using advanced/quick install only.
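
    Concretely, the difference in /etc/fstab looks like this (a sketch -- the UUID, label and device are made up):

    # Ubuntu's default, mounting by UUID:
    UUID=3e6be9de-8139-11d1-9106-a43f08d823a6  /home  ext3  defaults  0  2
    # The same mount by label, after e.g. 'sudo e2label /dev/sda3 home':
    LABEL=home                                 /home  ext3  defaults  0  2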



  • @aythun said:

    You are aware that developers ARE the end users, right? And that software goes through a testing phase as well, just with a broader audience?

    That's, uh, kind of the problem.

    Developers -- and I speak as one myself -- can easily talk themselves into believing that a UI bug is a feature. We get attached to our work, and are reluctant to give it up if it turns out badly. Some bad ideas get removed from Linux, usually because they can be shown directly to be bad ideas in an easily measurable way. (If, to give a fictional example, I could demonstrate that swapping the first and second bytes of an executable magically improved load times on i386 hardware by ~5% across the board, then swapped first and second bytes would triumph.)

    Interface problems, though, are less susceptible to testing. The moreso because alternate interfaces will not be self-evidently better or faster to the programmer(s) of the original interface. If I spend six months designing a program, its UI is going to be convenient for me, but by that time I won't even be close to a normal user. In Linux (and other Unix derivatives), a lot of awkward (and sometimes downright bad) interface ideas have been meticulously preserved because old programmers are attached to them, and new programmers are told that that's the right way to do things. (Look at sendmail!) You can do tests to find GUI problems, but it's hard work and takes a long time. Go listen to that Google talk by the Graphing Calculator guy's experience at Apple: he not only had random Apple employees playing with the program making suggestions, he also had math teachers trying the program without being allowed to help them along, to see what behaviors they expected and weren't getting.

    If programmers were the only users, this would not be a problem. In fact, that would be ideal. Unix systems would be horribly arcane, but would only be used by people who like things to be arcane, which would weed them out of other OSes. But they aren't. And Linux users are constantly trying to push for greater mainstream acceptance of Linux -- in other words, to increase the number of non-programmer users of Linux. This makes the shortcomings of Linux UI into everyone's business.

    Linux GUIs are pretty much a huge bundle of WTFs, mostly arising because Linux GUI programmers have mostly forgotten what it's like to use a computer for a purpose other than programming. Linux GUIs are uniformly based on X11, which more or less sacrifices speed and compactness for the ability to run remote sessions, which most users outside of university CS departments and professional server administration don't need. (Compare X11's performance, for example, to the windowing/graphics system in Windows on the same hardware. Or compare it to that of Mac OS 9 on dramatically inferior hardware. Either of the two other windowing systems will run circles around X11 on older hardware, usually with a fraction of the memory. They would do the same on newer hardware, but thanks to Moore's Law X11 escapes having to deal with the problem.)

    And thanks to the different widget libraries, and the way that programmers in different libraries have had to reinvent different wheels, there's no guarantee that GUI elements (such as progress bars, column headings, and popup menus) will look the same across a single session, let alone between two different logins. It takes more effort even to get help from someone else on Linux than it does on Windows or the Mac OS, because the descriptions of just about everything are so fluid. ("Click on the close box for the window, it's the little box in the upper right." "There's nothing in the upper right. There are two circles and a triangle in the upper left, though, which one should I click?") Apple and Microsoft learned a few decades ago that "There's More Than One Way To Draw It" in GUIs is fun for programmers, but a nuisance for most other people, including their own technical support people. Linux developers not only embrace "flexible" (i.e. inconsistent) GUIs, they often think they're superior. When the GNOME team drew up some GUI guidelines and began to enforce them in the included utilities, they drew lots of criticism from Linux zealots, even though GNOME's usability went way up from the perspective of non-programmers as a result.

    And then, on top of this, there's the attitude among Linux users that GUIs aren't really important, especially in programs which do things that can be done from the terminal. This attitude is why you can't set the monitor color depth using a control panel, why the graphics card driver selection panel doesn't work, why the system updater on Kubuntu can choke and burn out the whole apt-get system, and why none of the GUI text editors can do a sudo-save. A programmer will be able to figure things out, right? Of course, it means that even a savvy programmer from another platform will find Linux massively inconvenient -- I can't ever see myself actually being productive on a current Linux distribution, and I know I'm not alone -- and a non-programmer is unlikely to be able to get much out of Linux at all.

    @aythun said:

    You make it sound like Bugzilla, mailing lists or any other form of contact doesn't exist.

    As long as Linux programmers remain so clueless about designing GUIs, it doesn't matter whether Bugzilla exists or not, because programmers are usually the ones to file bugs on OSS products. Most other people either don't know they can file bugs, or don't care enough to deal with following up on a bug report.



  • @Lingerance said:

    @mfah said:
    I think Eric Raymond put it best:

    The CUPS programmers were contacted about that and they did fix the issue; in fact, I found that I was able to use a networked printer from an XP machine whereas the other XP machines couldn't.

    Did you honestly think those articles were only about CUPS?



  • @The Vicar said:

    Go listen to that Google talk by the Graphing Calculator guy's experience at Apple: he not only had random Apple employees playing with the program making suggestions, he also had math teachers trying the program without being allowed to help them along, to see what behaviors they expected and weren't getting.

    I don't know of this talk, but you're referring to The Graphing Calculator Story, which is a fantastic tale. The spontaneously combusting monitor is the highlight though :)

    I saw a screenshot recently of SimplyMEPIS Linux, where two open windows didn't even look like each other. The legal insanity that led to the KDE vs GNOME split means that to this day, you can't get programs to look like they're from the same system. I find it sad and angering that my simple Win2k desktop is far cleaner and easier to follow than KDE or GNOME, with sharp clear icons and a taskbar free of random clutter and unreadable buttons (KDE desktops have finally reached Mac OS 10.1.5's level of maturity, with transparent inactive title bars and taskbar buttons that make sure you can't actually read any of the captions; Apple have long since realised that this is ridiculous, and Mac OS X's design has come a long way over the years).

    The REAL WTF™ of course is putting a Pentium III laptop in the trash. That's ridiculous – unless it's completely knackered, it's far from unusable. My PIII laptop can seriously outperform my aunt's brand new Vista PC (specced well above my laptop). So can my PII desktop. It would be quite funny to find someone who thinks Vista's performance on a new but cheap PC is normal and watch their head explode as my ~eight-year-old PII blows it out of the water.



  • @Daniel Beardsmore said:

    @The Vicar said:
    Go listen to that Google talk by the Graphing Calculator guy's experience at Apple: he not only had random Apple employees playing with the program making suggestions, he also had math teachers trying the program without being allowed to help them along, to see what behaviors they expected and weren't getting.

    I don't know of this talk, but you're referring to The Graphing Calculator Story, which is a fantastic tale. The spontaneously combusting monitor is the highlight though :)

    Sorry. I wrote a much longer reply, then reread it, decided it was way too long (even for me), and edited the heck out of it without paying too much attention. That paragraph really suffered as a result. (So did the transitions between paragraphs. Too bad.) That is the talk I meant, and he does indeed speak of having to watch users without being able to "help" them. Motto #1 for UI design (whether GUI or CLI) is "you will not be there to hold the users' hands so it better behave the way they'll expect it to behave".

    @Daniel Beardsmore said:

    The REAL WTF™ of course is putting a Pentium III laptop in the trash. That's ridiculous – unless it's completely knackered, it's far from unusable. My PIII laptop can seriously outperform my aunt's brand new Vista PC (specced well above my laptop). So can my PII desktop. It would be quite funny to find someone who thinks Vista's performance on a new but cheap PC is normal and watch their head explode as my ~eight-year-old PII blows it out of the water.

    Agreed. At the very least, stick Xubuntu on your old machine and give it to somebody who can't afford a computer of their own. The 500 MHz PIII which I mentioned above ran 7.10 Xubuntu fast enough to make an acceptable web/mail/word processing terminal with only 192 MB of RAM.



  • @Daniel Beardsmore said:

    The legal insanity that led to the KDE vs GNOME split means that to this day, you can't get programs to look like they're from the same system.

    It annoys me that I now have two keyboard mapping managers in Ubuntu -- one GNOME one in System prefs, and one KDE one in, um, Applications -> Other. And the KDE one looks like crap.

    Also, I can't find Celestia. It's installed, and runs, but it's not in /usr/lib, where I found Firefox and everything else. Surely it's not running straight from the autopackage?



  • @dhromed said:

    Also, I can't find Celestia. It's installed, and runs, but it's not in /usr/lib, where I found Firefox and everything else. Surely it's not running straight from the autopackage?
    Try /opt or /usr/local/lib. If that fails: locate celestia; might need to do updatedb as root first.
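
    If it went in as a .deb (which it will have, via the package manager), you can also just ask dpkg to list every file the package installed -- assuming the package is simply named celestia:

    $ dpkg -L celestia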



  • @mfah said:

    I think Eric Raymond put it best:

    http://www.catb.org/~esr/writings/cups-horror.html
    http://www.catb.org/~esr/writings/luxury-part-deux.html

     

    I can summarize by telling you my experience:

    I wanted to install Linux. I also wanted to configure my touchpad to not click if a flake of dust lands on it. Well, being a complete Linux newbie at the time, I looked it up... and some more... and about 1 week of headache later (yes, I didn't spend the necessary 10 hrs on Google) I found all the configuration parameters, which gave me the luxury of typing it all in by hand into /etc/X11/xorg.conf. YAY THANKS LINUX.
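
    For reference, the incantation ends up looking roughly like this (a sketch -- identifiers and device paths vary; MaxTapTime set to 0 is the synaptics driver's way of disabling tap-to-click):

    Section "InputDevice"
        Identifier "Synaptics Touchpad"
        Driver     "synaptics"
        Option     "Device"     "/dev/psaux"
        Option     "Protocol"   "auto-dev"
        Option     "MaxTapTime" "0"    # 0 = a tap is never a click
    EndSection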

    Hell, I may dislike Windows, I may even despise Microsoft, but I never EVER had to open up Windows' version of xorg.conf and do all sorts of funky stuff, and RANDOMLY it would revert to default (thanks, RedHat, for the automatic unwanted recovery). I even made a script that would restore my xorg.conf on startup to what I wanted it to be, in case Fedora Core decided that it was not "hip" enough to run on it.

    I am a programmer, not an encyclopedia. I don't know/care about 99% of computer crap. Sure it's cool to know how Linux implements its wifi drivers (oh, if you thought the touchpad was hard, try wifi on a kernel that is conveniently not compatible with any wifi drivers and nobody tells you, it just says that 2.6.0.1 != 2.6.0.1 (sorry, I forgot the exact numbers) -- drove me insane) but I would LOVE to not give a shit; hell, with Windows I didn't have to...

    Seems that Linux is the Windows "alternative" for the corporations, not for desktop users; Ubuntu tries, but it's like trying to make a redneck gay, you just can't get there :P



  • A friend with a Dell PC (something I'd take to be the most likely machine to work with Linux) tried all the likely suspects including Fedobunto and Clbuttubunto (and plain Debian), and not one of them managed to actually work (this was only a few months ago). He's middle-aged and not very technical, but unhappy with Apple and Adobe's home-phoning (or so his router lights tell him), and he can't really figure Windows out. Finally he tried SimplyMEPIS as a last resort, and his experiences have been pretty positive, and he loves it.

    Everyone's experiences will be different, but for him at least, SimplyMEPIS is the one that works well. Maybe others here will agree.

    [Edit: Since that wasn't clear – he also owns various Macs, including an eMac that's had unending hardware and software problems including icons spontaneously vanishing in the Finder as you watch (even a shop technician witnessed that, but guessing a cause ... who knows); the eMac is now scrapped although he saved some parts including the CPU.]



  • Well, in my experience installing an off-the-shelf boxed version of XP (not a specifically adapted OEM recovery version) is not at all easier or more fun than installing Linux. BTW, as always, a lot of the criticism sounds like "Linux is bad because it isn't exactly like Windows". In my family, Linux is the primary desktop operating system (XP available for games, though) and everyone, including my kids and my wife, none of whom are computer wizards, is perfectly happy to use it. Of course it helps to have a geek like me in the family to do the setup and administration, but exactly the same can be said about Windows. Anyway, even to me, it seems unlikely that Linux will ever take over the desktop market. But new kinds of devices, like Asus' Eee-PC, are the perfect playground for Linux, especially since MS has made Vista very hardware-hungry.


  • ♿ (Parody)

    @The Vicar said:

    And thanks to the different widget libraries, and the way that programmers in different libraries have had to reinvent different wheels, there's no guarantee that GUI elements (such as progress bars, column headings, and popup menus) will look the same across a single session, let alone between two different logins.

     

    Yeah, because everything is standard in Windows.  Well, except for Classic vs Luna vs Aero.  And apps without manifests, so they don't look the same.  And for any app that uses skins.  And, oh yeah, for all the custom widgets that Office uses.

    Still, on either platform, I've never had a problem figuring out what all the elements are.  Maybe this is a real problem for people who still can't find the Any Key, but those people aren't getting much done regardless of the system.  In fact, they're probably most productive in a green screen environment where they don't have any choices.  And I've seen at least as many interface WTFs on windows as linux.  User testing still doesn't guarantee that you get a good interface (not that it isn't a good idea), but almost no one does this (big companies like Apple and MS are the exceptions).



  • @ammoQ said:

    Well, in my experience installing an off-the-shelf boxed version of XP (not a specifically adapted OEM recovery version) is not at all easier or more fun than installing Linux. BTW, as always, a lot of the criticism sounds like "Linux is bad because it isn't exactly like Windows". In my family, Linux is the primary desktop operating system (XP available for games, though) and everyone, including my kids and my wife, none of whom are computer wizards, is perfectly happy to use it. Of course it helps to have a geek like me in the family to do the setup and administration, but exactly the same can be said about Windows. Anyway, even to me, it seems unlikely that Linux will ever take over the desktop market. But new kinds of devices, like Asus' Eee-PC, are the perfect playground for Linux, especially since MS has made Vista very hardware-hungry.

     

    As a Linux "enthusiast" and eee PC owner and Vista user, I totally agree.

    Linux works well for novice users. Novice users don't install anything themselves, let alone an OS. Compare a PC that comes pre-loaded with Linux (such as the eee or gPC) to one that comes pre-loaded with Windows (with similar specs). Note that the Linux PC is cheaper, faster and comes with all the apps the novice user needs, without the trial versions and other bloat that comes with the Windows PC.

    Linux also works well for advanced users. We are happy to use the command line and text config files (something I personally find far easier than hacking the registry on Windows; and the Linux command line is actually useful, unlike on Windows, where it exists only for a few advanced utilities). We are also happy to install programs from source (./configure, make, sudo make install) and some of us can even hack the source code if it doesn't work.  We can usually figure out even the ugliest and worst-designed of GUIs. (Although I personally try to stick to GTK+ based apps, for consistency and because GTK+'s GUI design tools make it hard to produce ugly UIs). We know which tools to install to make things easier.
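
    The classic source-install dance, for anyone who hasn't seen it (a sketch; the tarball name is made up):

    $ tar xzf someapp-1.0.tar.gz
    $ cd someapp-1.0
    $ ./configure        # check dependencies, generate the Makefiles
    $ make               # compile
    $ sudo make install  # installs under /usr/local by default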

    However, Linux does not work very well for "intermediate" users. I do not mean advanced users who are still learning; they enjoy the challenge of Linux. I mean the users who are pretty advanced in the world of Windows (or Mac), but when dropped into the world of Linux, which is at least as different from Windows and Mac OS X as they are from each other, get lost quickly. They expect GUIs for everything and don't feel at all comfortable with the command line. They prefer registry editing to text files simply because regedit is more "GUI" than a text editor. If for some reason X won't start, then the only thing they know how to do is re-install. If a GUI for something isn't installed by default, then they assume that it doesn't exist (because they heard that Linux is all about hacking text files). A hint to the person with touchpad problems: GSynaptic (or QSynaptic). They come up with all sorts of reasons to rationalise their dislike of Linux, rather than try to learn it, often even "forgetting" that the same thing happens on their usual OS (everything made up of 4 primitives? which 4? I don't think I've ever seen anything that looks like a button that wasn't, except in some really badly designed VB apps on Windows; no GUI feedback? OK, maybe the mouse cursor isn't that obvious, but what does Windows do when an app takes a while to start?).



  • @ammoQ said:

    Well, in my experience installing an off-the-shelf boxed version of XP (not a specifically adapted OEM recovery version) is not at all easier or more fun than installing Linux.

    There are a couple of annoyances. It would be nice to just say "I am in Great Britain" rather than have to crawl through international settings putting everything right (and another tick box saying "Give me international fonts so foreign sites look pretty and don't show '???? ?? ??????'"). The hard drive partitioning part is probably the hardest part, as it means nothing to most people, and some of it is directly related to how completely screwed up the IBM PC is to begin with.

    I can't tell, though, just how many WTFs are Windows and how many are IBM's and other companies. Why does Windows struggle to recognise hardware so much? Why does it ship, on the CD, drivers that blatantly don't work, e.g. video drivers that crash the system frequently. The most fun part with reinstalling Windows is taking the whole damn computer apart to read off chipset numbers from components when Windows is too stupid to recognise them. I blame Windows, because Knoppix recognised the Synaptics trackpad on my laptop when Windows XP did not.

    @mallard said:

    Linux works well for novice users. Novice users don't install anything themselves, let alone an OS.

    Of course they do. The design of Mac OS and Windows has never stopped them. Some Web surfing or a URL from a friend takes you to a page, where you find a download link, click it, and then run what downloaded. (I do really dislike what Apple did with disk images though.)

    The Linux people, though, somehow managed to botch Linux so badly that downloading binaries is useless, so all you get to download is source code. So then you need to find the app in a package manager and hope that the people who made your distro aren't in a jihad against that product (*cough* Debian *cough*). Some software still requires typing all sorts of arcane commands to install, a pretty vicious learning cliff (and where do you find the climbing gear?)

    When I started playing with Gentoo though, inside coLinux, I didn't assign Linux enough RAM to run emerge, which just crashed and burned maniacally without explanation. So I did compile one or two tools from source because there was no way to use their stupid package manager.

    I guess the only down side to making installation so trivial in Windows is that, given the pervasive evil in the Windows world, we get people installing all sorts of things they really shouldn't be ...


  • ♿ (Parody)

    @Daniel Beardsmore said:

    The Linux people, though, somehow managed to botch Linux so badly that downloading binaries is useless, so all you get to download is source code. So then you need to find the app in a package manager and hope that the people who made your distro aren't in a jihad against that product (*cough* Debian *cough*). Some software still requires typing all sorts of arcane commands to install, a pretty vicious learning cliff (and where do you find the climbing gear?)

    It's not useless, but there are certainly more caveats than running with something that's closed and controlled by a single entity.  It does make it more difficult for malware, so there's that.  You also don't have the lock down from backwards compatibility that Windows does.  There are good and bad parts to that.  If you're really interested in running all the latest, experimental stuff, then you probably should learn how to do stuff from the command line.  But for most things, a well designed package manager plus front end (i.e., apt + synaptic) will be more than sufficient for most users.  Not every OS is right for every person, but I think you're ignoring the benefits of linux in this aspect. 

    @Daniel Beardsmore said:

    When I started playing with Gentoo though, inside coLinux, I didn't assign Linux enough RAM to run emerge, which just crashed and burned maniacally without explanation. So I did compile one or two tools from source because there was no way to use their stupid package manager.
     

    coLinux is TRWTF.  It's a neat idea, but just running in VMWare or VirtualBox is much better. 



  • @boomzilla said:

    coLinux is TRWTF.  It's a neat idea, but just running in VMWare or VirtualBox is much better.

    I have VirtualPC 2007 and it has precisely the same pre-boot RAM allocation in the config as coLinux. You're suggesting that VMWare and VirtualBox have a different method of allocating RAM for the virtual machine that doesn't permanently eat a chunk of RAM from the host OS?



  •  I'm still figuring out why I need IDE support in order to access my SATA drives...



  • @zipfruder said:

     I'm still figuring out why I need IDE support in order to access my SATA drives...

     

    SATA was designed to be essentially 100% software-identical to IDE. The electrical/physical specifications are completely different, of course, but in theory you could rip an IDE controller out of a machine, slap in a SATA controller, and the system wouldn't know anything had changed.

    So no surprise that SATA support is done via the IDE system. No reason to have two separate driver sub-systems when they'd be sharing 99% or more of the same code anyways. 


  • ♿ (Parody)

    @Daniel Beardsmore said:

    @boomzilla said:

    coLinux is TRWTF.  It's a neat idea, but just running in VMWare or VirtualBox is much better.

    I have VirtualPC 2007 and it has precisely the same pre-boot RAM allocation in the config as coLinux. You're suggesting that VMWare and VirtualBox have a different method of allocating RAM for the virtual machine that doesn't permanently eat a chunk of RAM from the host OS?

     

    It's not necessarily the RAM allocation, but the VMs of VMWare/VirtualBox/VirtualPC are way ahead of coLinux's.  Not to mention that you need to set up your own X server on Windows.  Like I said, coLinux was a neat idea, but I can't think of a single way it's superior (except maybe in the # of WTFs) to a generic VM running the distro of your choice.



  • @MarcB said:


    SATA was designed to be essentially 100% software-identical to IDE. The electrical/physical specifications are completely different, of course, but in theory you could rip an IDE controller out of a machine, slap in a SATA controller, and the system wouldn't know anything had changed.

    So no surprise that SATA support is done via the IDE system. No reason to have two separate driver sub-systems when they'd be sharing 99% or more of the same code anyways. 

     

     If the code is identical, why do I need to enable it separately from SATA? Should not said code exposing SATA devices on /dev be referenced directly by the SATA source? It's odd enough that one accesses SATA devices with the same nomenclature as SCSI ones (/dev/sd??).



  • @Daniel Beardsmore said:

    The Linux people, though, somehow managed to botch Linux so badly that downloading binaries is useless
    Incorrect: unrar (rarlabs.com), VirtualBox (non-OSE), Firefox (rather than Bon Echo or whatever), the ATI fglrx and Nvidia video drivers and a number of other commercial apps for Linux are shipped as binaries. If libraries are going to be a problem, the company releasing the binary will statically link against them.



    For the most part, if you want to learn Linux, use a VM and possibly an X server for Windows, setting the display as IPofVMsGateway:0 (IIRC) so that your Linux windows appear like Windows windows.
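
    Inside the VM, that amounts to something like this (a sketch -- the address is whatever host is actually running the X server, and that X server must accept remote connections):

    $ export DISPLAY=192.168.0.1:0   # hypothetical address of the Windows-side X server
    $ firefox &                      # the window opens over there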



  • @ammoQ said:

    BTW, as always, a lot of criticism sounds a lot like "Linux is bad because it isn't exactly like Windows".

    No. Linux is bad because:

    1. Its GUIs are slower than the Windows and Mac GUIs on the same hardware.
    2. Its GUI utilities fail a lot more frequently than those included with the Windows and Mac GUIs, requiring users to grub around on the command line or in text files to actually accomplish the tasks which the utilities are supposed to achieve. (Usually, this seems to be a lack of error-checking on the parts of the programmers, combined with a "what the heck, they can always drop to the command line if they really need to do this" attitude.)
    3. Its GUIs are less configurable within the GUI itself than the Windows and Mac GUIs, requiring users to deal with the xorg.conf file directly. Since xorg.conf requires, in effect, magic keynames (how do you turn off tap-clicking on a touchpad?) this is a serious problem deserving of its own item in this list.
    4. GUI Programs for Linux use competing libraries for interface elements, and thus no Linux GUI is actually consistent unless you are restricted to the relatively small number of programs which come with the desktop environment.
    5. Linux programmers such as yourself don't actually see these as problems, meaning that the situation is unlikely ever to improve. (And it's hard to see how it could, anyway. Perhaps you could write a substitution library that used GTK+ widgets in response to Qt calls, or vice versa. That would certainly help. But it would take a long time to write and would probably be very difficult to implement. It would probably be simpler to abandon X11 entirely in favor of something less amateurish.)

    @boomzilla said:

    Yeah, because everything is standard in windows. Well, except for Classic vs Luna vs Aero. And apps without manifests, so they don't look the same.

    If Windows were actually a good GUI, I'd defend it. It isn't. In many small but noticeable ways it is inferior to the Mac OS GUI. But it's still better than the options available for Linux. The differences between GUI elements in the three Windows variants are slight, and as programs are updated the older variants disappear. Not so with Linux, where Konqueror will look just as WTF-y in GNOME a year from now as it did last year.

    @boomzilla said:

    And for any app that uses skins.

    Skins are another bad GUI idea. If I could wave a magic wand and banish bad GUI design, skins would go too.

    @boomzilla said:

    Still, on either platform, I've never had a problem figuring out what all the elements are. Maybe this is a real problem for people who still can't find the Any Key, but those people aren't getting much done regardless of the system. In fact, they're probably most productive in a green screen environment where they don't have any choices.

    Yeah, you can figure them out if you're computer savvy. The point is that you shouldn't have to do that. You shouldn't be required to be switching contexts all the time. It's just one more distraction, something to take away a little more of your attention from what you're working on. You can work with a flickering monitor, loud muzak, and coworkers talking in the background, too. But I don't see Linux users recommending any of that.

    And what about the non-computer-savvy? Go to your local public library, if you live in a town where the library has a computer lab, and sit in on one of their computer classes for novices. These things still get heavy attendance today; they didn't vanish after the dot-com bust. There are still masses of people who actually do have trouble with this sort of thing, and what you are basically saying is that they can go screw themselves: if they can't match your skills, then they aren't worth helping. You are, of course, entitled to your opinion. But there's a substantial portion of the world, also entitled to its opinion, which says that if Linux can't be helpful to everyone, we'll look elsewhere, thank you very much.

    You don't have to put up with these constant switches on the Mac, and you can avoid them on Windows without too much trouble. (Particularly by sticking with XP.) But the only way to avoid them on Linux is to restrict your options to a ridiculous degree. The minute you even accept so much as the Ubuntu Add/Remove Programs list, the interface chaos begins.


  • Discourse touched me in a no-no place

    @The Vicar said:

    1. Its GUIs are slower than the Windows and Mac GUIs on the same hardware.
    2. Its GUI utilities fail a lot more frequently than those included with the Windows and Mac GUIs, requiring users to grub around on the command line or in text files to actually accomplish the tasks which the utilities are supposed to achieve. (Usually, this seems to be a lack of error-checking on the parts of the programmers, combined with a "what the heck, they can always drop to the command line if they really need to do this" attitude.)
    3. Its GUIs are less configurable within the GUI itself than the Windows and Mac GUIs, requiring users to deal with the xorg.conf file directly. Since xorg.conf requires, in effect, magic keynames (how do you turn off tap-clicking on a touchpad?) this is a serious problem deserving of its own item in this list.
    4. GUI Programs for Linux use competing libraries for interface elements, and thus no Linux GUI is actually consistent unless you are restricted to the relatively small number of programs which come with the desktop environment.
    5. Linux programmers such as yourself don't actually see these as problems,
    As a relatively new user of Linux (used seriously for the past 3 months) I don't actually see them as problems, because I don't recognise those problems (well, at least on the distro that was 'foisted' on me in my current job).


  • ♿ (Parody)

    @The Vicar said:

    No. Linux is bad because:

    1. Its GUIs are slower than the Windows and Mac GUIs on the same hardware.

    Maybe on some hardware. This is not universally true.

    @The Vicar said:


    2. Its GUI utilities fail a lot more frequently than those included with the Windows and Mac GUIs, requiring users to grub around on the command line or in text files to actually accomplish the tasks which the utilities are supposed to achieve. (Usually, this seems to be a lack of error-checking on the parts of the programmers, combined with a "what the heck, they can always drop to the command line if they really need to do this" attitude.)
    3. Its GUIs are less configurable within the GUI itself than the Windows and Mac GUIs, requiring users to deal with the xorg.conf file directly. Since xorg.conf requires, in effect, magic keynames (how do you turn off tap-clicking on a touchpad?) this is a serious problem deserving of its own item in this list.

    This is something else that is changing (even though it's being done by "Linux programmers").

     @The Vicar said:

    4. GUI Programs for Linux use competing libraries for interface elements, and thus no Linux GUI is actually consistent unless you are restricted to the relatively small number of programs which come with the desktop environment.
    5. Linux programmers such as yourself don't actually see these as problems, meaning that the situation is unlikely ever to improve. (And it's hard to see how it could, anyway. Perhaps you could write a substitution library that used GTK+ widgets in response to Qt calls, or vice versa. That would certainly help. But it would take a long time to write and would probably be very difficult to implement. It would probably be simpler to abandon X11 entirely in favor of something less amateurish.)

    The vast array of stuff that runs on Linux is perhaps less consistent than the stuff that runs on some other particular system, but there's plenty of inconsistency to go around -- because of time if nothing else.

    @The Vicar said:

    If Windows were actually a good GUI, I'd defend it. It isn't. In many small but noticeable ways it is inferior to the Mac OS GUI. But it's still better than the options available for Linux. The differences between GUI elements in the three Windows variants are slight, and as programs are updated the older variants disappear. Not so with Linux, where Konqueror will look just as WTF-y in GNOME a year from now as it did last year.

    To each his own.  I've never understood the fervor for UI purity, or why Konqueror looks 'WTF-y' in GNOME.  In any case, IMHO, figuring out what things do between apps is a much bigger deal than the differences in how the widgets look.

    @The Vicar said:

    Skins are another bad GUI idea. If I could wave a magic wand and banish bad GUI design, skins would go too.
    I certainly agree with you there.

    @The Vicar said:

    @boomzilla said:
    Still, on either platform, I've never had a problem figuring out what all the elements are. Maybe this is a real problem for people who still can't find the Any Key, but those people aren't getting much done regardless of the system. In fact, they're probably most productive in a green screen environment where they don't have any choices.

    Yeah, you can figure them out if you're computer savvy. The point is that you shouldn't have to do that. You shouldn't be required to be switching contexts all the time. It's just one more distraction, something to take away a little more of your attention from what you're working on. You can work with a flickering monitor, loud muzak, and coworkers talking in the background, too. But I don't see Linux users recommending any of that.

     Who's required to be switching contexts?  If it's that much of a problem, stick to one.  Again, I think it's a bigger context switch to change apps than to change from KDE to GNOME.

    @The Vicar said:

    And what about the non-computer-savvy? Go to your local public library, if you live in a town where the library has a computer lab, and sit in on one of their computer classes for novices. These things still get heavy attendance today; they didn't vanish after the dot-com bust. There are still masses of people who actually do have trouble with this sort of thing, and what you are basically saying is that they can go screw themselves: if they can't match your skills, then they aren't worth helping. You are, of course, entitled to your opinion. But there's a substantial portion of the world, also entitled to its opinion, which says that if Linux can't be helpful to everyone, we'll look elsewhere, thank you very much.

    This is the most retarded thing you've said.  Obviously, there are people who can't figure out computers.  I've met them, and had to deal with them quite a bit.  And again, even going between apps is too difficult for many of them.  They can usually memorize a sequence of keystrokes, which tends to be the navigation involved in an old style green-screen terminal app.  If the only GUI they can figure out is a Mac (because according to you, there is ultimate consistency there, and everything is easy--see, I can make straw men, too) then they should stay there, or use whatever is easiest for them.

    @The Vicar said:

    You don't have to put up with these constant switches on the Mac, and you can avoid them on Windows without too much trouble. (Particularly by sticking with XP.) But the only way to avoid them on Linux is to restrict your options to a ridiculous degree. The minute you even accept so much as the Ubuntu Add/Remove Programs list, the interface chaos begins.
    If you say so.  How do you manage if a car has the gas tank on the other side?  Talk about interface chaos.  And don't even get me started about manual vs automatic transmissions.



  • @The Vicar said:

    No. Linux is bad because:

    1. Its GUIs are slower than the Windows and Mac GUIs on the same hardware.


    Vista. BTW, that hasn't been a problem for me for the last 10 years or so. Because of the stupid deep integration of Explorer in Windows, it's easy to show that a Linux GUI is actually more responsive than Windows XP. (Haven't tried Vista yet.) Just point Explorer to a slow (or even nonexistent) network drive, and while it tries to contact the server, the whole GUI is unresponsive.


    2. Its GUI utilities fail a lot more frequently than those included with the Windows and Mac GUIs, requiring users to grub around on the command line or in text files to actually accomplish the tasks which the utilities are supposed to achieve. (Usually, this seems to be a lack of error-checking on the parts of the programmers, combined with a "what the heck, they can always drop to the command line if they really need to do this" attitude.)

    You overestimate the usefulness of GUI tools. When things get complicated, users need instructions. It's easier to copy-paste some commands for the command line than to follow click-here, enter-that instructions for a GUI, especially when the instructions were written for the English professional edition of the OS while the user runs the German home edition. Besides that, chances are that 10-year-old instructions for the command line still work even with the latest versions of Linux. Try that on Windows.
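
    Compare handing someone a single pasteable command (a sketch; this was the standard way to regenerate the X configuration on Debian-family systems of that era) with walking them through half a dozen localized dialogs:

    # answer the prompts and the X config is rebuilt -- same command in any locale
    sudo dpkg-reconfigure xserver-xorg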


    3. Its GUIs are less configurable within the GUI itself than the Windows and Mac GUIs, requiring users to deal with the xorg.conf file directly. Since xorg.conf requires, in effect, magic keynames (how do you turn off tap-clicking on a touchpad?) this is a serious problem deserving of its own item in this list.

    See 2. 

    4. GUI Programs for Linux use competing libraries for interface elements, and thus no Linux GUI is actually consistent unless you are restricted to the relatively small number of programs which come with the desktop environment.

    Different file dialogs are annoying (though file dialogs are not consistent in a Windows+Office installation either); beyond that, nobody even notices. On the web, every fscking page has its own style for buttons etc., so why should anybody still care about the buttons of the GUI?

    5. Linux programmers such as yourself don't actually see these as problems, meaning that the situation is unlikely ever to improve. (And it's hard to see how it could, anyway. Perhaps you could write a substitution library that used GTK+ widgets in response to Qt calls, or vice versa. That would certainly help. But it would take a long time to write and would probably be very difficult to implement. It would probably be simpler to abandon X11 entirely in favor of something less amateurish.)

    You obviously have no fscking clue.

     http://freedesktop.org/wiki/Home

     http://gtk-qt.ecs.soton.ac.uk/



    Ever had explorer.exe crash on you? Yeah, well, when it doesn't come back I need to restart my computer. Or Ctrl-Alt-Del, New Task, explorer.exe, and HOPE that the taskbar appears.  50% of the time I gotta reboot, which at work can take 15 minutes to get all my dev crap open again.

    Linux: Ctrl-Alt-Backspace = kill X, and restart it. I actually had the file manager die on me once in GNOME (ONCE) and it restarted quickly.
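
    And if it doesn't come back by itself, recovery is two commands (a rough sketch, assuming GNOME's Nautilus; the session manager usually respawns it before you even get this far):

    killall nautilus    # kill the wedged file manager / desktop process
    nautilus &          # respawn it by hand if the session doesn't do it for you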

    Linux definitely performs better than XP. If you use an emulator and compare benchmarks then yeah, Windows is better, but other than that I like Linux when it comes to usability. The requirements to run Linux + Compiz/Beryl/Fusion vs. Windows XP are comparable, and it gives better visuals than Vista can ever hope to achieve.

    And complaints about Linux themes? Well, the point of GTK is to unify, which is what GNOME and KDE are based on. Sure, if you still use your old X Window System apps, things don't look unified. Hell, can you customize Windows themes? To get Windows themes you gotta pay Microsoft. GTK lets you customize EVERYTHING, and even independently, so one set of window colors, one set of window skins, another for icon sets, etc... Or a unified theme. And most are free and extensible.
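
    For instance, picking the widget theme, icon theme, and font independently is just a few lines in ~/.gtkrc-2.0 (a sketch; the theme names are whatever you happen to have installed):

    # ~/.gtkrc-2.0 -- widget theme, icon theme, and font chosen independently
    gtk-theme-name = "Clearlooks"
    gtk-icon-theme-name = "Tango"
    gtk-font-name = "Sans 10"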

    I still haven't figured out why Linux does not have a Windows-style but better installer. We could use one; it makes installations a breeze. I prefer it over packages caz packages can't be customized so easily. It's customizable via the command line, and who the heck wants that?!



  • @dlikhten said:

    Well, the point of GTK is to unify, which is what GNOME and KDE are based on.

    KDE uses Qt, not GTK.

    There has never been a case where I couldn't respawn explorer.exe.

     Then again, I've only seen it crash once or twice.




  • @MarcB said:

    If the code is identical, why do I need to enable it separately from SATA? Shouldn't said code exposing SATA devices on /dev be referenced directly by the SATA source? It's odd enough that one accesses SATA devices with the same nomenclature as SCSI ones (/dev/sd??).

    Very interesting question.

    The IDE and SATA subsystems of linux have gone through many iterations which are tied directly to the evolution of the hardware interfaces.

    Originally, when the first SATA chipsets came out, they were rolled into the IDE subsystem because they shared many similarities with IDE controllers (this was actually a bad thing: the IDE layer was very complicated, and the standard itself very clunky and tied to the aging PC ISA).

    Later SATA chipsets were standard PCI devices modeled on SCSI controllers, and since they were very different they needed separate drivers; these were implemented in Linux as SCSI host controllers, but with many features gimped out (since they didn't do SCSI, only ATAPI, and had a fixed bus configuration).

    Eventually Intel introduced AHCI, and most SATA chipset manufacturers started implementing THAT instead of their own interfaces so that BIOS and OS support could be simplified. AHCI would be the SATA equivalent of EHCI for USB or the IDE specification for IDE interfaces.

    To unify AHCI and the many similar, simple SATA driver standards in Linux, the libata project was created. Libata unifies many SATA types, and even some of the weirder IDE variations, and makes them appear as SCSI disks, since it is a SCSI host emulation layer inside the SCSI subsystem. The reason is that any modern IDE or SATA drive uses the ATAPI interface, which is a cut-down version of the SCSI protocol; moreover, it simplifies support for SATA CD-ROMs and DVD burners from the perspective of user-space apps like cdrdao and cdrecord.

    SATA drivers that are not part of libata are still usually treated as SCSI devices, because SCSI commands tend to map well onto the ATAPI commands that SATA encapsulates. The IDE subsystem and the hdX labels are reserved for drives that are PC architecture-specific and conform to the old IDE interface. hdX is also used on other platforms that have an equivalent low-level old-school HD system.

    sdX is used for devices implementing SCSI or a SCSI-like interface; essentially EVERYTHING ELSE: USB, SATA, SCSI, FireWire, all of that. And you'll see that on PCs, Macs, Sun boxes, Nokia tablets, whatever.
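
    You can watch the emulation at work from a shell (illustrative; exact output depends on the kernel version):

    cat /proc/scsi/scsi    # SATA ports show up as emulated SCSI hosts
    ls -l /sys/block/      # libata disks register as sdX alongside real SCSI ones
    hdparm -I /dev/sda     # yet you can still talk native ATA to the drive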



  • @boomzilla said:

    @The Vicar said:

    1. Its GUIs are slower than the Windows and Mac GUIs on the same hardware.

    Maybe on some hardware. This is not universally true.

    It's universally true on hardware with drivers for both Linux and its competitor (depending on which one you're testing), but on fast enough hardware, the difference becomes unnoticeable to a human being. One millisecond more to draw a window is too small to be noticed, and if you're dealing with a good video card and a decent processor, that's the range we're talking about.

    The delay is still important, though. GUIs bloat over time. (Let's call it what it is.) Three years ago Linux geeks were saying "it runs about as fast as other OSes on current hardware", just like they say now. The difference is that I can take the hardware from three years ago and run Windows and Linux on it (or Mac OS X and Linux, for PPC versions of Linux), and Linux will look like crap compared to the other OS. If I want to make my computer an investment, something I intend to use and keep on using for a long time, then Linux currently is not a wise decision, and won't be until this sort of thing stops being taken for granted.

    @boomzilla said:

    @The Vicar said:

    2. Its GUI utilities fail a lot more frequently than those included with the Windows and Mac GUIs, requiring users to grub around on the command line or in text files to actually accomplish the tasks which the utilities are supposed to achieve. (Usually, this seems to be a lack of error-checking on the parts of the programmers, combined with a "what the heck, they can always drop to the command line if they really need to do this" attitude.)
    3. Its GUIs are less configurable within the GUI itself than the Windows and Mac GUIs, requiring users to deal with the xorg.conf file directly. Since xorg.conf requires, in effect, magic keynames (how do you turn off tap-clicking on a touchpad?) this is a serious problem deserving of its own item in this list.

    This is something else that is changing (even though it's being done by "Linux programmers").

    Yeah, but the bar is rising at least as fast as the work progresses. The problems being solved now are the sort which should have been solved five years ago at the very least. Since you seem to object to bringing up Windows so often, let's take a different example: the BeOS GUI. It's incredibly fast on old hardware. It took less than six years to reach the point where it could be used as a primary OS. (Power Computing, a maker of Mac clones during the period when Apple was granting licenses, actually shipped machines with a BeOS alternate install!) And that was with a small team of engineers who also wasted a bunch of time building their own hardware.

    GNOME has been around for going on 11 years. In terms of man-hours (whether you think that's a good measurement or not), GNOME almost certainly had more work done on it in 2007 than BeOS had during the entire run of Be, Inc. from 1990 to 2001. Yet GNOME still sucks compared to BeOS in a number of ways -- the BeOS GUI was easy to use, intuitive, clean, and coherent, in addition to being fast, which is not even strictly necessary if the experience is good enough. If the Haiku Project ever completes a full OS and builds OpenOffice and either Firefox or some KHTML/WebKit-based browser, I will without hesitation switch my Linux terminals (which are meant for non-programmers) to Haiku. And that's a GUI which stopped changing in 2001!

    (What are the GNOME programmers working on? Rotating cubes, wobbly windows, and other eye candy. BeOS couldn't do any of that. That's a point in its favor.)

    @boomzilla said:

    @The Vicar said:

    If Windows were actually a good GUI, I'd defend it. It isn't. In many small but noticeable ways it is inferior to the Mac OS GUI. But it's still better than the options available for Linux. The differences between GUI elements in the three Windows variants are slight, and as programs are updated the older variants disappear. Not so with Linux, where Konqueror will look just as WTF-y in GNOME a year from now as it did last year.

    To each his own. I've never understood the fervor for UI purity

    It's because a GUI is nothing but metaphors, and different people have different thresholds for how much the metaphor can be polluted before it collapses for them. If you're a programmer, or even someone who regularly uses more than one OS, then you probably understand how the computer is "thinking" about things, and that raises the amount of pollution you can accept. But there are people whose tolerances are much lower, and if you're pushing for broad acceptance, you should target them. Expert users can always find ways to make an OS more productive for themselves, but someone who can't use an OS in the first place can't somehow magically compensate for that.

    It's funny -- nobody disagrees with the idea that checkboxes and radio buttons have two distinct uses and should never substitute for each other, because it confuses users. But if you start building random interface elements out of normal buttons, as happens all too frequently in GNOME, that's okay.

    @boomzilla said:

    , or why Konqueror looks 'WTF-y' in GNOME.

    Two visibly different objects playing the same role == metaphor pollution. A GTK+ widget usually does not look like a functionally equivalent Qt widget.

    @boomzilla said:

    In any case, IMHO, figuring out what things do between apps is a much bigger deal than the differences in how the widgets look.

    You mean like supporting typed clipboard data, which Apple managed to do back in 1983 but Linux still doesn't really manage? (But look at the rotating cubes and wobbly windows!)

    @boomzilla said:

    Obviously, there are people who can't figure out computers. I've met them, and had to deal with them quite a bit. And again, even going between apps is too difficult for many of them.

    Many, but not even close to a majority, or even a plurality. By making the experience uneven, you place an extra barrier in the way, which raises the percentage who can't even get started.

    @boomzilla said:

    If the only GUI they can figure out is a Mac (because according to you, there is ultimate consistency there, and everything is easy--see, I can make straw men, too) then they should stay there, or use whatever is easiest for them.

    Most of them would probably be quite productive with Macs. I'd be reasonably happy with that, because I've gone all-Mac now except for my two Xubuntu laptops, which are a very recent development, and I wouldn't mind seeing Apple's market share grow. But I'd be happier if the choice was "any of them will make you productive" instead of "you have a choice between 'free' and 'productive'".

    @boomzilla said:

    @The Vicar said:
    You don't have to put up with these constant switches on the Mac, and you can avoid them on Windows without too much trouble. (Particularly by sticking with XP.) But the only way to avoid them on Linux is to restrict your options to a ridiculous degree. The minute you even accept so much as the Ubuntu Add/Remove Programs list, the interface chaos begins.
    If you say so. How do you manage if a car has the gas tank on the other side? Talk about interface chaos. And don't even get me started about manual vs automatic transmissions.

    Well, for one thing, a single car's gas tank doesn't change from side to side, and it doesn't suddenly swap between manual and automatic transmission. But if it were like a session with Linux, the gas tank would change sides every so often, the car would have automatic transmission on some trips but not on others, all the controls on the dashboard would be made out of welded-together pushbuttons, and occasionally when you turned on the windshield wipers the engine would die. And if you complained about it, the manufacturers would say, "well, you can always stick your head out the window, so we don't really think that seeing through the windshield is important enough to warrant assigning any of our engineers to the problem. We're focussing on these really nifty-looking spoilers and tailfins instead."


  • @The Vicar said:

    Well, for one thing, a single car's gas tank doesn't change from side to side, and it doesn't suddenly swap between manual and automatic transmission. But if it were like a session with Linux, the gas tank would change sides every so often, the car would have automatic transmission on some trips but not on others, all the controls on the dashboard would be made out of welded-together pushbuttons, and occasionally when you turned on the windshield wipers the engine would die. And if you complained about it, the manufacturers would say, "well, you can always stick your head out the window, so we don't really think that seeing through the windshield is important enough to warrant assigning any of our engineers to the problem. We're focussing on these really nifty-looking spoilers and tailfins instead."

    I'd use the metaphor that there is a gas tank, but the manufacturer forgot to put a filler door on the body. So to fill it, you have to remove the tank first, fill it, and then put it back.

    When asked about it, the manufacturer stated that adding a filler door was going to be done, but they couldn't agree on where to place it, or on whether it should be locked or not. Eventually a prototype car was made with a filler door, but it was placed on the underside of the gas tank under the car, and opening it usually made a mess.



  • @dlikhten said:

    To get Windows themes you gotta pay Microsoft.

    No you don't... Just download them. Where the fuck did you get this idea from?

    Microsoft offers a bunch of stuff for free: http://www.microsoft.com/windowsxp/downloads/desktop/default.mspx

    So do others:  http://www.wincustomize.com/

    If the themes don't go far enough, just use a different shell.

    @dlikhten said:

    I still haven't figured out why Linux does not have a Windows-style but better installer. We could use one; it makes installations a breeze

    This is coming from the same person who whined and complained and said Windows was impossible to install, and that you just install Linux because Windows is too hard...

     I mean, you did say:

    @dlikhten said:

    Also, Windows just installed on E: (that's right, I had an external HDD plugged in which took C: for some reason)

    Their partitioner sucks

    After installing I usually spend 2-3 hrs installing updates.

    Let me break it down:

    Install: 30 minutes

    Update to SP1: 30 minutes

    Update to SP2: 45 minutes

    Deal with .net framework errors: 45 minutes

    Deal with other errors which ALWAYS appear (like Windows Installer not being registered): 30 minutes


    Have you actually learned how to do the incredibly simple task of installing a Windows machine? None of the things in your list is truthful at all; I just assumed that you were not up to the skill level to install Windows (wow!) and were throwing out decoys so no one would realize you were full of crap.

    @dlikhten said:

    I prefer it over packages caz packages

    Why do you seem to think this baby talk is cute?

