I tried renaming a file in Ubuntu



  • @ender said:

    Actually, I'm pretty sure that mv supported renaming files from day one - after all, as far as the OS is concerned, renaming and moving a file (within the same filesystem) is the same operation.

    Lesson learned: when you write a MEGA-POST, literally nobody reads it.

    I should have just unsubscribed this morning.
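    ender's point above - that within one filesystem a rename and a move are the same operation - can be sketched with a throwaway directory (paths here are purely illustrative):

```shell
# Within a single filesystem, mv performs a rename(2) whether the target
# is a new name in the same directory or a path elsewhere -- it is the
# same syscall either way.
d=$(mktemp -d)                       # throwaway playground
touch "$d/old.txt"
mv "$d/old.txt" "$d/new.txt"         # "rename": new name, same directory
mkdir "$d/sub"
mv "$d/new.txt" "$d/sub/new.txt"     # "move": same operation underneath
```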



  • @heterodox said:

    I'm familiar with Perl and I don't recognize the syntax error, or at least what my next action should be.

    Check how you got the expression wrong and fix the error. You're even given a location where the parser failed. If you don't know what you want to do, that's not Perl's problem.

    @heterodox said:

    You say "all the CLI utilities assume" like it's an immutable fact of nature. It's a huge design flaw.

    Then you're reading too much into that statement. Of course you can write a shell which will hold the user's hand and say "you seem to be trying to destroy your filesystem; try sudo rm -rf /*", and no law of physics will forbid you from doing this. Hell, some people already tried that, and fish is a thing. But traditional Unix prefers programs which are dumb (as in both: stupid and mute/terse) and behave (relatively) consistently instead of second-guessing what the user meant.


  • Discourse touched me in a no-no place

    @ender said:

    @HardwareGeek said:
    FWIW, Cygwin has the non-Perl version described above.
    Here's a native Windows port that seems to originate from an ancient util-linux package.
    @FrostCat said:
    FWIW, on Windows 8, move acts like mv in that it can rename a file in place. I don't know how long that feature has been there.
    Since move.exe was introduced on DOS 6?

    Har de har. It might not have been around that long and I wasn't going to make the effort to look, because it's not really all that important. Have you got DOS 6 handy to verify that you can rename a file with move.exe? BTW, move is a shell intrinsic in Win8. I don't know--nor do I care--when that happened.



  • @heterodox said:

    @gilhad said:

    Argument 1 IS a regular expression, and a very simple one. Do you think "your regular expression is too easy for me" would be a good message in this case? The regular expression index.php matches index1php, index2php, index3php... and also index.php, of course.

    Whatever, I'm not familiar with the utility. I'd never even heard of it before this discussion. And your pedantic dickweedery does not undermine the point I was trying to make, which you clearly understand.

     

    The problem was NOT that the first parameter wasn't a regular expression. The problem was that rat did not read the documentation and passed only the 2nd and 3rd arguments (as the 1st and 2nd), which were syntactically correct values, and the program complained about the missing last parameter. The program was given two of three parameters, both of which passed as acceptable values in those positions, and there was no way the program could know that rat wanted something totally different from what he asked for.

     

    It's the same problem when you use the ternary operator like this:

     x=b?c:

    The compiler will complain about the missing 3rd operand, not the first, even if you wanted to type

    x=a?b:c
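    gilhad's ternary example can be reproduced with shell arithmetic, which supports the same ?: operator; a parser reports the error where it ran out of input, not where the typo conceptually happened:

```shell
# The intended expression, with all three operands present:
a=1 b=2 c=3
result=$((a ? b : c))
echo "$result"
# Dropping the first operand shifts everything left; uncommented, the
# line below fails complaining about a *missing operand* at the end,
# even though the conceptual mistake was losing 'a' at the start:
# result=$((b ? c :))
```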

     

    @heterodox said:

    @spamcourt said:

    1) argument 1 need not be a regular expression - it may be any series of statements which when executed results in $_ containing the new name, and 2) if you know what this utility does and how to use it, then you must be somewhat familiar with Perl, and hence recognise this syntax error.

    I'm familiar with Perl and I don't recognize the syntax error, or at least what my next action should be. That is, without reading the code of the utility, which is something I absolutely should not have to do. You say "all the CLI utilities assume" like it's an immutable fact of nature. It's a huge design flaw.

     

     

    I would first try to type 

    man rename

    and I would be given the answer immediately, without needing to read the code of the utility

     



  • @blakeyrat said:

    That's because they defined the system specifically to *exclude* the average Joe, so they could remain the "high priest of technology"
    Bullshit. "They" defined the system in the days when computers were only used by "high priest[s] of technology." The "average Joe" had probably never seen a real computer, much less used one, and nobody expected he ever would. Even the first popular hobbyist computer, the Altair 8800, wouldn't be invented for another five years or so, and it was hardly aimed at the average Joe; it required enough computer knowledge to toggle machine instructions in from the front panel switches.

    It's a really safe bet that "they" never even thought of the average Joe when defining UNIX. It is not malice to fail to anticipate a usage model 40 years ahead of time.

     



  • @blakeyrat said:

    2) Users shouldn't have to understand what a filesystem is to use a computer.

    Nobody is talking about grandma at home who just wants to see some cute cat pictures. Or even Mr. Non-technical VP who just wants to get his fantasy football scores. Yes, linux is pretty shitty for those people. We are talking about developers and sysadmins. Those people SHOULD understand what a filesystem is.

    @blakeyrat said:
    Why would you assume that someone who is an expert in the CLI will make a decent piece of software? My experience tells me they usually write shit software that is shit.

    I wrote software for Mac Classic; Mac Classic didn't even have a CLI. Does that mean, by your logic, that all Mac Classic software, literally every single program on the OS, was "something resembling working software?" Is that seriously what you're suggesting?

    I can't debate with people this delusional. (And insulting.)

    There is no correlation between "people who know the CLI well" and "people who are good at writing software" (and, for that matter, "idiots who have no experience, training or talent"). Not until you fucking demonstrate one with evidence.


    Did I say anything specifically about the CLI? No. My point is not restricted to CLI. It's a general point that someone whose job it is to write/develop/maintain software should never take the attitude that "I'm not going to read the manual or learn to use the tools, and the tools are broken because I have to do that to use them."

     


  • Discourse touched me in a no-no place

    @blakeyrat said:

    Why the fuck doesn't the CLI [have undo]?
    Because the CLI's typically implemented as a much thinner layer over the OS's API than the GUI; the undo feature of the GUI requires a lot of extra code to implement (unless the filesystem is really unusual). The same is true on Windows as on Linux; the command line tools there don't offer undo either.

    Normal users are recommended to use the GUI these days because of all the extra protection code; sysadmins don't need so many training wheels.
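    The "lot of extra code" is easy to underestimate: even a toy one-level undo needs state that mv itself never keeps. A hypothetical sketch, not anything mv actually does:

```shell
# A toy one-level undo for moves: remember how to reverse the last one.
# A real undo feature would need a full journal, conflict handling, etc.
mv_logged() {
    mv -- "$1" "$2"
    undo_src=$1          # remember the reversal
    undo_dst=$2
}
undo_last() {
    mv -- "$undo_dst" "$undo_src"
}

d=$(mktemp -d)
touch "$d/a"
mv_logged "$d/a" "$d/b"
undo_last                # "$d/b" goes back to being "$d/a"
```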



  • @HardwareGeek said:

    It's a really safe bet that "they" never even thought of the average Joe when defining UNIX. It is not malice to fail to anticipate a usage model 40 years ahead of time.

    Yeah, well, at the risk of being a broken record, why the fuck hasn't it been fixed since then?

    You people could anticipate my obvious replies and head them off, you know. It would save me typing.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    @skotl said:
    UNIX (and Linux) have managed without you perfectly well for the last forty years...

    That's why Linux is so popular and beloved? Oh wait. Nobody uses it, except people on Android cellphones who only use it because they don't know it's Linux. And even die-hard life-long Unix lovers are on MacBook Airs.

    Demonstrably something is wrong.

    Here you're just delusional. Most big businesses use it because you don't fork out as much money up front as you do for Windows, and for a long time, Linux had better performance than Windows on the same hardware. That's why, for example, most web hosting companies don't have Windows. Most of the places I've worked that are of any substantial size have Linux somewhere doing server duties, whether it's web or database hosting, if nothing else.


  • Considered Harmful

    @spamcourt said:

    Hell, some people already tried that and fish is a thing.

    After reading their homepage, I was left unsure whether it's just a joke or something real.



  • @Snooder said:

    We are talking about developers and sysadmins. Those people SHOULD understand what a filesystem is.

    I agree.

    But they still shouldn't have to make use of that knowledge to install a web server. No more than they should have to understand every nuance of TCP/IP to write a text file.

    And you're using the word "should", which is great, but here in reality there are people who do things they shouldn't, and the system should be smart enough to cope with that in a reasonable way. The system is designed to be used by humans, not robots.

    @Snooder said:

    Did I say anything specifically about the CLI? No. My point is not restricted to CLI. It's a general point that someone whose job it is to write/develop/maintain software should never take the attitude that "I'm not going to read the manual or learn to use the tools, and the tools are broken because I have to do that to use them."

    I agree with that; how is it relevant to this thread? I'm still confused.



  • @dkf said:

    sysadmins don't need so many training wheels.

    Why not? What would be the harm in having them available? Like I said above, even fucking SQL (a system used by the geekiest of geeks) has "training wheels" in-place-- those database admins obviously value them, why not Linux sysadmins?

    I think Linux sysadmins (not Windows ones, generally speaking) like the fact that there's no safety net, because it helps them keep "newbs" off their precious systems. I mean, how could the high priesthood function if just anybody could log in and use the computer?


  • ♿ (Parody)

    @Snooder said:

    @blakeyrat said:
    2) Users shouldn't have to understand what a filesystem is to use a computer.

    Nobody is talking about grandma at home who just wants to see some cute cat pictures. Or even Mr. Non-technical VP who just wants to get his fantasy football scores. Yes, linux is pretty shitty for those people. We are talking about developers and sysadmins. Those people SHOULD understand what a filesystem is.

    And even Linux has graphical file managers for those people (and anyone else who wants to use them). Not that this should detract from blakey's straw men and their moving goalposts.



  • @blakeyrat said:

    But they still shouldn't have to make use of that knowledge to install a web server.
    What does a web server do? Because I think it serves files. And sometimes executes them to serve their output. They reside in this thing called a... uhm... uhhhh...

    @blakeyrat said:

    @Snooder said:
    Did I say anything specifically about the CLI? No. My point is not restricted to CLI. It's a general point that someone whose job it is to write/develop/maintain software should never take the attitude that "I'm not going to read the manual or learn to use the tools, and the tools are broken because I have to do that to use them."

    I agree with that; how is it relevant to this thread? I'm still confused.

    Because the very first post demonstrated that you did exactly that?



  • @Snooder said:

    @blakeyrat said:

    2) Users shouldn't have to understand what a filesystem is to use a computer.

    Nobody is talking about grandma at home who just wants to see some cute cat pictures. Or even Mr. Non-technical VP who just wants to get his fantasy football scores. Yes, linux is pretty shitty for those people. We are talking about developers and sysadmins. Those people SHOULD understand what a filesystem is.

    @blakeyrat said:
    Why would you assume that someone who is an expert in the CLI will make a decent piece of software? My experience tells me they usually write shit software that is shit.

    I wrote software for Mac Classic; Mac Classic didn't even have a CLI. Does that mean, by your logic, that all Mac Classic software, literally every single program on the OS, was "something resembling working software?" Is that seriously what you're suggesting?

    I can't debate with people this delusional. (And insulting.)

    There is no correlation between "people who know the CLI well" and "people who are good at writing software" (and, for that matter, "idiots who have no experience, training or talent"). Not until you fucking demonstrate one with evidence.


    Did I say anything specifically about the CLI? No. My point is not restricted to CLI. It's a general point that someone whose job it is to write/develop/maintain software should never take the attitude that "I'm not going to read the manual or learn to use the tools, and the tools are broken because I have to do that to use them."

     

    The problem here is something I like to call artificial complexity. You have task X that has some intrinsic difficulty because you have to figure out what you want to do. Then you get additional difficulty from the tools you use. The Linux CLI has much more artificial complexity than it needs.

    Or in other words, don't make me have to think about file system details while I'm trying to do something that's already complicated. It's distracting and unnecessary.

    Of course, the mv command is just a tiny, not especially significant, example of this.



  • @blakeyrat said:

    @Snooder said:
    Did I say anything specifically about the CLI? No. My point is not restricted to CLI. It's a general point that someone whose job it is to write/develop/maintain software should never take the attitude that "I'm not going to read the manual or learn to use the tools, and the tools are broken because I have to do that to use them."

    I agree with that; how is it relevant to this thread? I'm still confused.



    It's relevant because that is exactly what you did. You had a new tool that you don't know how to use. You tried to use a command that you assumed would work a certain way. It didn't work. You didn't then decide to check the manual, learn what the correct command should be, or learn what the command that you used really does. Instead you came here to write a rant about how Ubuntu is shit because it didn't work the way you, an inexperienced user, expected it to.

     


  • Discourse touched me in a no-no place

    @ender said:

    @blakeyrat said:
    You realize of course that if "rename" just did what its name implies it did, this whole post wouldn't exist?
    But it does what its name implies; it simply doesn't do it the same way rename on Windows does it (because guess what - Linux isn't Windows). I'm not sure why Debian (and Ubuntu, which is its derivative) defaults to perl-rename unlike other distributions (which use rename from util-linux, which shows a brief help text instead of just erroring out when there are only 2 arguments), but that's the way it is. Look at what happens if I try the Linux syntax on Windows:
    W:\Users\ender>rename foo bar *.txt
    The syntax of the command is incorrect.

    BTW, that's a shit error message. A better one is
    Usage: rename [expression] [files]

    Which is (more or less) what man rename actually says right at the top.  An even better one is the actual first example:
           For example, to rename all files matching "*.bak" to strip the
           extension, you might say
    
                   rename 's/\.bak$//' *.bak
    

    Blakey's got a point in that many, many tools are written poorly (and I'm not talking about "cryptic" filenames (which actually generally aren't) or "arcane syntax" (which generally aren't, either.))

    It is actually a shame that so much software is fairly braindead, in examples as simple as people not giving a thought to tab order in forms, or having otherwise-deranged workflows that require you to bounce all over a window, switching from keyboard to mouse constantly. An example of pretty good software is the electronic state income tax app Massachusetts used in the '90s, which was entirely driven by user input. It started off asking for your name and SSN, then your filing status, or something basically like that. From then on, the UI changed so that it only asked specifically for what you needed based on prior input. So if you chose "single" filing status, it wouldn't ask for dependents; any other choice would, and so on. Outside of Microsoft, almost nobody gets that right, and even they often don't.

    E2A: oh my god formatting.



  • @blakeyrat said:

    @HardwareGeek said:
    It's a really safe bet that "they" never even thought of the average Joe when defining UNIX. It is not malice to fail to anticipate a usage model 40 years ahead of time.

    Yeah, well, at the risk of being a broken record, why the fuck hasn't it been fixed since then?

    You people could anticipate my obvious replies and head them off, you know. It would save me typing.

     

    Why should they fix what is not broken (for them)? If you do not like it, do not use it - there are much more suitable environments for you - maybe MS Bob would suit you well.

     

    I know. You live a very miserable life, as you are forced by your boss to do things you are not able (nor willing) to do properly, and you cannot afford to look for work suiting your skills. And your boss is not willing to spend money on somebody who would know what to do. Nor does he want to spend money to buy you a point-and-click-and-it-is-magically-done system to deploy whatever you are forced to deploy.

     

    Nobody here envies you such a miserable position, and you are a really good example of why good fathers make their sons learn in order to achieve something. I think they all say: "Look at blakeyrat, how miserable he is. Now go and learn, so you do not end up as miserable as him."

     


  • Discourse touched me in a no-no place

    @heterodox said:

    I'm familiar with Perl and I don't recognize the syntax error, or at least what my next action should be. That is, without reading the code of the utility, which is something I absolutely should not have to do. You say "all the CLI utilities assume" like it's an immutable fact of nature. It's a huge design flaw.

     

    Indeed. Nobody should expect a bare argument of index.php to get wildcard expanded, because no other utility does that; in the shell, you use ? for a single character.
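    The distinction being drawn: the shell's '?' glob and a regex's '.' both match a single character, but glob expansion happens in the shell before the command ever runs. A quick illustration with scratch files in a temp directory:

```shell
d=$(mktemp -d)
cd "$d"
touch index.php index1php index2php
# The shell expands index?php before any command sees it ('?' = one char),
# so all three names match:
glob_count=$(printf '%s\n' index?php | wc -l)
# In a regex, '.' plays the same single-character role, so the pattern
# index.php also matches all three:
re_count=$(printf '%s\n' * | grep -c '^index.php$')
echo "$glob_count $re_count"
```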



  • @spamcourt said:

    They reside in this thing called a... uhm... uhhhh..

    Implementation detail, irrelevant to the process of serving files to a web browser.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    @HardwareGeek said:
    It's a really safe bet that "they" never even thought of the average Joe when defining UNIX. It is not malice to fail to anticipate a usage model 40 years ahead of time.

    Yeah, well, at the risk of being a broken record, why the fuck hasn't it been fixed since then?

    Well, get started, then!



  • @blakeyrat said:

    Lesson learned: when you write a MEGA-POST, literally nobody reads it.
    I did read it, but I admit I forgot to reply to this part: @blakeyrat said:
    Seriously, I haven't tried it. I assume Linux creates a new folder called "poop", and a new "folder2" inside it. Which would mean: a single move command can result in THREE operations taking place, TWO of which are making new directories.
    <font color="#00a000">ender@deepthought</font> <font color="#0000a0">/home/ender/tmp/foo $</font> mv bar ~/temp/foo/bar
    mv: cannot move 'bar' to '/home/ender/temp/foo/bar': No such file or directory

    @FrostCat said:
    Have you got DOS 6 handy to verify that you can rename a file with move.exe?
    As a matter of fact, I do (in VMWare).
    Demonstration of move command in DOS 6.22


  • Discourse touched me in a no-no place

    @blakeyrat said:

    @spamcourt said:
    They reside in this thing called a... uhm... uhhhh..

    Implementation detail, irrelevant to the process of serving files to a web browser.

    Uh, this isn't exactly "what color do you want your fire?" or "do people even want fire that can be fitted nasally?" though. As a developer you need to have some inkling of how the server expects to find the files.


  • Discourse touched me in a no-no place

    @ender said:

    @blakeyrat said:
    Lesson learned: when you write a MEGA-POST, literally nobody reads it.
    I did read it, but I admit I forgot to reply to this part: @blakeyrat said:
    Seriously, I haven't tried it. I assume Linux creates a new folder called "poop", and a new "folder2" inside it. Which would mean: a single move command can result in THREE operations taking place, TWO of which are making new directories.
    <font color="#00a000">ender@deepthought</font> <font color="#0000a0">/home/ender/tmp/foo $</font> mv bar ~/temp/foo/bar
    mv: cannot move 'bar' to '/home/ender/temp/foo/bar': No such file or directory

    @FrostCat said:
    Have you got DOS 6 handy to verify that you can rename a file with move.exe?
    As a matter of fact, I do (in VMWare).
    Demonstration of move command in DOS 6.22

    Well isn't that special. It's still really not all that relevant.



  • @blakeyrat said:

    If you think I've ever said, "WHOA GUYS CMD IS FUCKING GREAT AWESOME SO BEST SOFTWARE SQUEEE" then you're mistaken; it sucks shit too. The reason it doesn't matter as much that it sucks shit is, 1) it's been deprecated for literally decades at this point, and 2) you never need to actually USE the thing to do some general task (like installing a web server).

    What about installing Windows Server Core -- so no UI. You use PowerShell to do pretty much everything.



  • @blakeyrat said:

    @spamcourt said:
    They reside in this thing called a... uhm... uhhhh..
    Implementation detail, irrelevant to the process of serving files to a web browser.
    Yes, completely irrelevant. Where do I put those files I want to serve? Hmm, I'll put them in /dev. The server should pick them up from there. Right?


  • ♿ (Parody)

    @blakeyrat said:

    @spamcourt said:
    They reside in this thing called a... uhm... uhhhh..

    Implementation detail, irrelevant to the process of serving files to a web browser.

    Eh...maybe. But the task you started out with was how to deploy a bunch of files to the web server. I guess, if you want to be a pedantic dickweed, you could say that the files you're deploying are just an implementation detail, but they're your detail, not some filesystem developer's. Really, you have an amazing talent for writing blazingly stupid things.



  • @FrostCat said:

    BTW, that's a shit error message. A better one is
    Usage: rename [expression] [files]
    Here's what the util-linux rename says when you call it with 2 parameters:
    <font color="#00a000">ender@deepthought</font> <font color="#0000a0">~ $</font> rename foo bar
    rename: not enough arguments

    Usage:
    rename [options] expression replacement file...

    Options:
    -v, --verbose explain what is being done
    -V, --version output version information and exit
    -h, --help display this help and exit


    And I agree, the perl rename error text is awful.



  • @e4tmyl33t said:

    An OS should be inherently discoverable, at least in a useful world.
     

    Why?


  • Considered Harmful

    @Cassidy said:

    @e4tmyl33t said:

    An OS should be inherently discoverable, at least in a useful world.
     

    Why?


    I want to live in a useful world.



  • @blakeyrat said:

    @stinerman said:
    The "fucking hard" one has to be OS agnostic.

    Except they aren't. Those directions won't work on Windows, for example-- they have to put Windows in another section. And it won't work with Linux distributions that don't have apt-get installed, I believe some of those are still popular.

    Then those people are idiots. There should be a "Debian and derivatives" section, a "Red Hat and derivatives" section, and then maybe Windows and OSX if it runs on those platforms.



  • @gilhad said:

    The problem was that rat did not read the documentation and passed only the 2nd and 3rd arguments (as the 1st and 2nd), which were syntactically correct values, and the program complained about the missing last parameter. The program was given two of three parameters, both of which passed as acceptable values in those positions, and there was no way the program could know that rat wanted something totally different from what he asked for.

    Not actually true, because Blakey was working with the rename command supplied with Ubuntu, whose first argument is expected to be a Perl substitution or translation function and whose subsequent arguments are filenames. You can certainly use that version of rename with only two arguments: rename s/half/fuck/ totalhalfwit.txt works just fine.

    What it doesn't do is resemble the DOS rename command's syntax even slightly: instead of rename fromthis tothat its semantics are rename thisway thisfile andthatfile andtheseothers... As I explained above, this is largely an adaptation to the environment it will most commonly be used in i.e. a POSIX-like shell that pre-expands wildcard filespecs.

    The fact that it spews gibberish instead of issuing meaningful error messages is a genuine deficiency and is, as others have observed, a sadly familiar experience for those of us accustomed to using Perl-based tools for almost any purpose. You pretty quickly learn to deal with this by Googling some random selection of words from the error message along with the command name, rather than actually trying to read it. Perl is like an angry low-paid Chinese factory worker - it can do an amazing amount with very little instruction, but it's unlikely to speak politely to you in fully formed English sentences when it can't make out what you want of it.
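    For anyone who would rather not lean on the Perl rename at all, the effect of the 's/\.bak$//' example can be sketched in plain shell - roughly the substitution-per-matching-file behavior the utility provides, without invoking it:

```shell
# Plain-shell equivalent of: rename 's/\.bak$//' *.bak
# (strip the .bak extension from every matching file)
d=$(mktemp -d)
cd "$d"
touch one.bak two.bak
for f in *.bak; do
    mv -- "$f" "${f%.bak}"    # drop the suffix from each name
done
ls
```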


  • ♿ (Parody)

    @Cassidy said:

    @e4tmyl33t said:
    An OS should be inherently discoverable, at least in a useful world.

    Why?

    Why not? A better question would be how he defines "useful world" and "inherently discoverable".



  • @blakeyrat said:

    That's because they defined the system specifically to *exclude* the average Joe, so they could remain the "high priest of technology" and draw a larger salary.
     

    Rubbish. They designed a system that did what they wanted it to do, and didn't feel the need to expend effort to add in hand-holding measures for people unwilling to learn how to work it.

    @blakeyrat said:

    I just want the Linux community to give a shit about the quality of their software.

    They do. They just don't give a shit about the users that want to use it.

    @blakeyrat said:

    Which means all these people are being *brainwashed* by this shitty operating system. It's goddamned criminal.

    For a moment I thought you were talking about people who follow a point-and-click wizard and are conditioned into believing they're more highly skilled than they actually are.

    @blakeyrat said:

    Or, more seriously, because a lot of people never bother doing a cost/benefit analysis of software that considers total cost over a few years, and correctly accounts for the number of hours wasted by broken bullshit in Linux.

    Yeah, I guess amateurs like the NYSE would know very little about financial analysis. There seem to be quite a few cowboys that have succumbed to the Rock Star bullshit, but luckily these are small unknowns, likely to go under soon.

    @blakeyrat said:

    you know EXACTLY what my problem with CLI tools is. Hint: it's not a "principle", it's an actual physical disability which makes them extraordinarily difficult for me to use.

    I've already suggested a GUI replacement that could assist in your administrative workload.

     



  • @Lorne Kates said:

    END OF THREAD
     

    You're no fun.



  • @JoeCool said:

    What about installing Windows Server Core -- so no UI. You use PowerShell to do pretty much everything.

    That's not so bad, because PowerShell comes with pre-canned aliases that look reasonably familiar to POSIX speakers, and if you ignore all the wanky object-oriented bullshit you can actually get work done with it. PowerShell is not cmd; it's how Redmond thinks a CLI should work given infinite resources. It has the same bugs as bash (i.e. it's too big and too slow) but unlike bash it won't tell you that in its man page: you have to figure it out yourself by banging your head against it.



  • @witchdoctor said:

    The problem here is something I like to call artificial complexity. You have task X that has some intrinsic difficulty because you have to figure out what you want to do. Then you get additional difficulty from the tools you use. The Linux CLI has much more artificial complexity than it needs.

    Or in other words, don't make me have to think about file system details while I'm trying to do something that's already complicated. It's distracting and unnecessary.

    Of course, the mv command is just a tiny, not especially significant, example of this.

     

    Then you probably want a more protective system. Maybe Windows, or OS X, or some such.

     

    I like the way Linux is, and the authors of Linux probably like it this way too, because they made it this way.

    I do not want to be restricted by some Artificial Stupidity which tries to guess what I would like to do, then tries to correct my commands and restrict my movements just to fit its own (wrong) assumptions.

    This is what I hated most about Windows. The approach of "we know better than you what you want to do. And we will force you to do it as we decided you would want it if you were a really stupid user - not as you really want to do it."

     

    (And most of the time I do not need to know much about what's under the hood - installing Apache is as easy as typing

    emerge apache

    /etc/init.d/apache2 start

    In graphical suites like KDE or GNOME there are even GUI wizards, where you just select Apache, click install and then click start (or something like that).

    )

     

    Somehow I found Linux (especially Gentoo) more intuitive and easier to work with than any version of Windows (from 3.1 to 7) or MS-DOS (from 3.2 to 6).

    ---

    For me the artificial complexity lies in the "GUI wizards" which protect me from simply setting my IP/routes and force me to guess what their authors had in mind when they set up choices like "home network", "company network" and such.

    I have a network with a DHCP server - at home, and for company purposes too - and I was never sure what to choose, or why I could not just say: you are connected by RJ45 to a DHCP server - talk to it and connect me to the internet.

    On Linux I just run "/etc/init.d/net.eth0 start" with the default empty /etc/conf.d/net config and voila - I am connected. And if I do not want the DHCP server's care, I just say what IP I want and what routes in the config file, run "/etc/init.d/net.eth0 restart" and voila - I have a totally different IP and routes, just as I wanted.

    ---

    Some people prefer other distributions, or other OSes - fine with me - let everyone find what best suits his needs and use it.

    Some people customize Linux to their vision of easy use - why not, as long as they keep it in their distribution and leave me to use my distribution how I like it.

    The goal is not to have all systems identical - the goal is to have systems to choose from, to find the one that suits you best (and then maybe even customize it further to your needs).

     

    Rat is angry because some people made a system for themselves and then gave everybody the right to use it and improve it. Those people were not paid by rat, so they did not ask him. They did it in their free time, and did it just for themselves. Other people had similar needs and started using that system too (for free). And that system became so popular that it is used everywhere from small embedded systems, through internet servers, to the biggest supercomputers. And it rules in those areas because of its quality and price. And it even fits on home PCs and netbooks, so many people use it there too. And some made enhancements even for not-so-skilled users (KDE, GNOME, ...), but underneath it is still a system that geeks made for geeks, not for ignoramuses.

     

     

     



  • @flabdablet said:

    @JoeCool said:
    What about installing Windows Server Core -- so no UI. You use PowerShell to do pretty much everything.

    That's not so bad, because PowerShell comes with pre-canned aliases that look reasonably familiar to POSIX speakers, and if you ignore all the wanky object-oriented bullshit you can actually get work done with it. PowerShell is not cmd; it's how Redmond thinks a CLI should work given infinite resources. It has the same bugs as bash (i.e. it's too big and too slow) but unlike bash it won't tell you that in its man page: you have to figure it out yourself by banging your head against it.

    I never said it was bad. BlakeyRat's basic argument is that you should never have to use a CLI - use the UI like in Windows. But even Windows has a Core server with no UI.



  • @JoeCool said:

    BlakeyRat's basic argument is that you should never have to use a CLI - use the UI like in Windows. But even Windows has a Core server with no UI.

    Read him more carefully and you'll notice his Platonic ideal is not Windows but Mac OS versions before NeXTSTEP got their Unix in his Macintosh. He harks back to a completely CLI-free Golden Age, an age where men were men and ponies were free for the taking.

    I guess he never used Macintosh Programmer's Workshop, which was basically an xterm with IDE pretensions.



  • @flabdablet said:

    Not actually true, because Blakey was working with the rename command supplied with Ubuntu, whose first argument is expected to be a Perl substitution or translation function and whose subsequent arguments are filenames. You can certainly use that version of rename with only two arguments: rename s/half/fuck/ totalhalfwit.txt works just fine.

     

    OK, my bad. I have Gentoo and my rename works the other way.

    Still, I think a simple "man rename" would show something even on Ubuntu (but I have no way to confirm that)
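    Since the two rename variants keep coming up: a hedged sketch of the difference, emulating what the Perl-based rename does to a single filename (the file names here are made up for the demo):

    ```shell
    # util-linux rename (Gentoo etc.):  rename <from> <to> <files...>
    # Ubuntu's Perl rename:             rename 's/from/to/' <files...>
    # Rough emulation of the Perl variant's behavior, using sed for the substitution:
    cd "$(mktemp -d)"                    # scratch directory so the demo is isolated
    touch totalhalfwit.txt
    for f in totalhalfwit.txt; do
      new=$(printf '%s\n' "$f" | sed 's/half/full/')   # apply the substitution
      [ "$f" = "$new" ] || mv -- "$f" "$new"           # rename only if it changed
    done
    ls total*.txt                        # prints: totalfullwit.txt
    ```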

     


  • sekret PM club

    @boomzilla said:

    @Cassidy said:
    @e4tmyl33t said:
    An OS should be inherently discoverable, at least in a useful world.
    Why?
    Why not? A better question would be how he defines "useful world" and "inherently discoverable".

    Well, I may not be able to define "useful world", but "inherently discoverable" would be something like the following:

    • Anything that can be done on the CLI has a GUI equivalent. These equivalents will be able to perform every function the CLI version can.
    • If you have to use CLI commands, said commands make some sort of sense in their naming conventions.
    • GUIs are easy to understand, have things in their logical places, and come with help documentation that doesn't require a decoder ring or 5 years of existing experience to decipher.
    • GUIs don't look like utter ass (most Java programs, I'm looking at you). This is sort-of related to the above point, but it bears repeating because a junk GUI is more useless than a jargony one.
    • Programs are responsive and communicative to the user. This means error messages either make sense to someone who isn't a developer, or contain enough information that troubleshooting and/or contacting the company for support doesn't take forever. It also means that programs don't go unresponsive when processing a task, and instead show a progress bar or some other form of indicator that the process is working, preferably showing an accurate percentage towards completion.
    • Operating systems and programs take into account both common use cases: the experienced user/power user, and the complete new user who has never seen this before in their life.
    • Operating systems can allow for a level of customization, to permit users to tailor their experience to what works best for them.

    As to *why* should an OS be inherently discoverable? Is that a serious question? Why *shouldn't* it be? Why would anyone WANT an OS that makes it difficult to determine how to do things, whether by design or by accident?

    I run a Windows gaming machine, a Macbook Pro, and an Ubuntu media center. Out of the three of them, the OSX machine probably hits closer to more of the above points than the others. Yes, the Ubuntu box may be more "customizable" in terms of choosing a window manager, and choosing a skin for it, and the like...but I've found that process to be annoyingly bothersome so I set it once and have left it since, even though I'm starting to get sick of the skin. Not to mention the issues I had getting it to properly output 1920x1080... 


  • Considered Harmful

    @flabdablet said:

    It has the same bugs as bash (i.e. it's too big and too slow) but unlike bash it won't tell you that in its man page: you have to figure it out yourself by banging your head against it.

    My ex-wife used man bash when we had too many arguments.



  • @e4tmyl33t said:

    @boomzilla said:

    @Cassidy said:
    @e4tmyl33t said:
    An OS should be inherently discoverable, at least in a useful world.
    Why?
    Why not? A better question would be how he defines "useful world" and "inherently discoverable".

    Well, I may not be able to define "useful world", but "inherently discoverable" would be something like the following:

    • Anything that can be done on the CLI has a GUI equivalent. These equivalents will be able to perform every function the CLI version can.
    • If you have to use CLI commands, said commands make some sort of sense in their naming conventions.
    • GUIs are easy to understand, have things in their logical places, and come with help documentation that doesn't require a decoder ring or 5 years of existing experience to decipher.
    • GUIs don't look like utter ass (most Java programs, I'm looking at you). This is sort-of related to the above point, but it bears repeating because a junk GUI is more useless than a jargony one.
    • Programs are responsive and communicative to the user. This means error messages either make sense to someone who isn't a developer, or contain enough information that troubleshooting and/or contacting the company for support doesn't take forever. It also means that programs don't go unresponsive when processing a task, and instead show a progress bar or some other form of indicator that the process is working, preferably showing an accurate percentage towards completion.
    • Operating systems and programs take into account both common use cases: the experienced user/power user, and the complete new user who has never seen this before in their life.
    • Operating systems can allow for a level of customization, to permit users to tailor their experience to what works best for them.

    As to *why* should an OS be inherently discoverable? Is that a serious question? Why *shouldn't* it be? Why would anyone WANT an OS that makes it difficult to determine how to do things, whether by design or by accident?

    I run a Windows gaming machine, a Macbook Pro, and an Ubuntu media center. Out of the three of them, the OSX machine probably hits closer to more of the above points than the others. Yes, the Ubuntu box may be more "customizable" in terms of choosing a window manager, and choosing a skin for it, and the like...but I've found that process to be annoyingly bothersome so I set it once and have left it since, even though I'm starting to get sick of the skin. Not to mention the issues I had getting it to properly output 1920x1080... 

     

    I actually prefer a "useful system" over a "GUI-ridden, inherently discoverable" one.

    Maybe I am just affected by my history, but please tell me: how, in your ideal world, would a GUI enable the user to do things like:

    - every hour, download some web page, find all people mentioned there (by name and link), follow all the links, look at which picture (green dot or red cross) sits near the name on the linked page, and make me a local html page with a table of those with a green dot, with some pictures (copied to local disk for speed and independence from the internet connection) and links to the original pages. Also make a graph of when those people had a green dot and when they had a red cross (hour by hour, over months)

    - connect to a distributed versioning system and get the newest version of a list. For each line in the list, ensure that the mentioned directory exists (if not, create it) and contains the updated related project (you can derive the name of the project from the name of the last directory in the path). Whatever is after a # sign, consider it just a comment and ignore it.

    - My GF's workplace has blocked all the usual mail servers outside her company, but she is able to use a remote desktop (or whatever it is called in English - I mean logging in to Windows via KRDC over VPN), and I am (so she is too) able to send email via my ISP, but only from inside his network (so not from an external address). A click-and-drag interface to let her send emails via my connection to my ISP (we are both on the local net)

    I am afraid that such a GUI would be too complex to be "inherently discoverable" (suppose that such requests are generated on the fly, not when the GUI was designed, so the GUI would have to allow all such actions and all other possible combinations of them)

    Those are real scripts that I wrote in bash (which called and piped other CLI tools like sed, grep, git, wget ...).
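    The second task above (the versioned list of project directories) can be sketched in a few lines of shell - a hedged sketch, with projects.list, its contents, and the final update command all as hypothetical stand-ins:

    ```shell
    cd "$(mktemp -d)"                          # scratch directory for the demo
    # a hypothetical list file: one directory path per line, '#' starts a comment
    cat > projects.list <<'EOF'
    work/tools/frobnicator   # main tool
    work/libs/widgets
    # a full-line comment
    EOF

    while IFS= read -r line; do
      line="${line%%#*}"                       # drop everything after '#'
      line="$(printf '%s' "$line" | sed 's/[[:space:]]*$//')"  # trim trailing spaces
      [ -n "$line" ] || continue               # skip blank/comment-only lines
      mkdir -p "$line"                         # ensure the directory exists
      project="$(basename "$line")"            # project name = last path component
      echo "update $project in $line"          # stand-in for e.g.: git -C "$line" pull
    done < projects.list
    ```

    The real script would replace the final echo with the actual DVCS update call.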

     

     

     


  • sekret PM club

    @gilhad said:

    I actually prefer a "useful system" over a "GUI-ridden, inherently discoverable" one.

    Maybe I am just affected by my history, but please tell me: how, in your ideal world, would a GUI enable the user to do things like:

    - every hour, download some web page, find all people mentioned there (by name and link), follow all the links, look at which picture (green dot or red cross) sits near the name on the linked page, and make me a local html page with a table of those with a green dot, with some pictures (copied to local disk for speed and independence from the internet connection) and links to the original pages. Also make a graph of when those people had a green dot and when they had a red cross (hour by hour, over months)

    - connect to a distributed versioning system and get the newest version of a list. For each line in the list, ensure that the mentioned directory exists (if not, create it) and contains the updated related project (you can derive the name of the project from the name of the last directory in the path). Whatever is after a # sign, consider it just a comment and ignore it.

    - My GF's workplace has blocked all the usual mail servers outside her company, but she is able to use a remote desktop (or whatever it is called in English - I mean logging in to Windows via KRDC over VPN), and I am (so she is too) able to send email via my ISP, but only from inside his network (so not from an external address). A click-and-drag interface to let her send emails via my connection to my ISP (we are both on the local net)

    I am afraid that such a GUI would be too complex to be "inherently discoverable" (suppose that such requests are generated on the fly, not when the GUI was designed, so the GUI would have to allow all such actions and all other possible combinations of them)

    Those are real scripts that I wrote in bash (which called and piped other CLI tools like sed, grep, git, wget ...).

    Why would all that need to be part of a single GUI? That would be insane. The logical GUI way to do it would be to have a task scheduler that would invoke the necessary programs with whatever pre-made configurations you had saved in the order you needed them to do it. Each program you call (the equivalents of sed, awk, grep, etc.) would have its own GUI where you set those configurations up and then save them for recurring use.

    I'm not arguing that the CLI should be removed entirely. For power-users, it may entirely be necessary to use a CLI to do things that the GUI can't do readily...but there's no reason to nearly *mandate* CLI usage just to use or configure a machine, as a lot of Linux users tend to argue for on forums and the like. Any common activity that someone who's completely unfamiliar with the operating system may want to do should have an easily understandable GUI attached to it. A properly designed operating system should be able to be installed, configured, and ready to use by a complete newbie for everyday usage without ever seeing a CLI or needing someone to decipher any part of the interface.

    And the term I think you may have been looking for in that last example is "screen sharing" or "remote desktop", basically the visual equivalent of an SSH session.

     



  • @e4tmyl33t said:

    Well, I may not be able to define "useful world", but "inherently discoverable" would be something like the following:
    Mostly good points, but a lot of this has nothing to do with the OS. Confusingly-named utilities (not part of the OS), badly designed GUIs, unresponsive and uncommunicative programs – crappy software is crappy software, no matter what OS it runs on, and it's not the OS's fault.


  • Considered Harmful

    @HardwareGeek said:

    @e4tmyl33t said:

    Well, I may not be able to define "useful world", but "inherently discoverable" would be something like the following:
    Mostly good points, but a lot of this has nothing to do with the OS. Confusingly-named utilities (not part of the OS), badly designed GUIs, unresponsive and uncommunicative programs – crappy software is crappy software, no matter what OS it runs on, and it's not the OS's fault.


    If the OS (distro) ships with crappy software bundled into the default installation, it does reflect badly on the OS/distro.


  • sekret PM club

    @HardwareGeek said:

    @e4tmyl33t said:

    Well, I may not be able to define "useful world", but "inherently discoverable" would be something like the following:
    Mostly good points, but a lot of this has nothing to do with the OS. Confusingly-named utilities (not part of the OS), badly designed GUIs, unresponsive and uncommunicative programs – crappy software is crappy software, no matter what OS it runs on, and it's not the OS's fault.

    You or I could make that distinction. What about Bob, the guy who's never sat in front of a computer in his life? If he goes to copy a file on a brand new computer, and doing so either requires said confusingly-named utilities or a terrible UI, what is he likely to blame for this? He won't say "Dammit, these programs make it so hard to use them!", he'll say "What is this <OS> shit? Why don't it work?"

    Like it or not, many utilities (such as blakey's move/rename/whatnot), are seen as "part of the OS" by pretty much anyone who uses a computer. If they're distributed with the OS, there should be a single, unifying user experience across the entire platform, enforced by said OS. Third-party software I'll give you will always have these kinds of problems, as much as we might wish otherwise, but stuff that *comes with the OS* should be treated as part of the OS and therefore shouldn't be crap.



  • @flabdablet said:

    I guess he never used Macintosh Programmer's Workshop, which was basically an xterm with IDE pretensions.

    It was shit. That's why everybody used THINK C or CodeWarrior instead, and even for free nobody touched MPW.



  • @e4tmyl33t said:

    Well, I may not be able to define "useful world", but "inherently discoverable" would be something like the following:

    • Anything that can be done on the CLI has a GUI equivalent. These equivalents will be able to perform every function the CLI version can.
    • If you have to use CLI commands, said commands make some sort of sense in their naming conventions.
    • GUIs are easy to understand, have things in their logical places, and come with help documentation that doesn't require a decoder ring or 5 years of existing experience to decipher.
    • GUIs don't look like utter ass (most Java programs, I'm looking at you). This is sort-of related to the above point, but it bears repeating because a junk GUI is more useless than a jargony one.
    • Programs are responsive and communicative to the user. This means error messages either make sense to someone who isn't a developer, or contain enough information that troubleshooting and/or contacting the company for support doesn't take forever. It also means that programs don't go unresponsive when processing a task, and instead show a progress bar or some other form of indicator that the process is working, preferably showing an accurate percentage towards completion.
    • Operating systems and programs take into account both common use cases: the experienced user/power user, and the complete new user who has never seen this before in their life.
    • Operating systems can allow for a level of customization, to permit users to tailor their experience to what works best for them.

    That's a great list, I approve.



  • @gilhad said:

    Maybe I am just affected by my history, but please tell me: how, in your ideal world, would a GUI enable the user to do things like:

    - every hour, download some web page, find all people mentioned there (by name and link), follow all the links, look at which picture (green dot or red cross) sits near the name on the linked page, and make me a local html page with a table of those with a green dot, with some pictures (copied to local disk for speed and independence from the internet connection) and links to the original pages. Also make a graph of when those people had a green dot and when they had a red cross (hour by hour, over months)

    AppleScript/Automator/whatever the fuck Apple calls it now can do that easily. VBScript or JScript in Windows. That's a solved problem in every GUI I know of. And goddamned Windows Task Scheduler is nice to work with after touching that Cron bullcrap.

    @gilhad said:

    - connect to a distributed versioning system and get the newest version of a list. For each line in the list, ensure that the mentioned directory exists (if not, create it) and contains the updated related project (you can derive the name of the project from the name of the last directory in the path). Whatever is after a # sign, consider it just a comment and ignore it.

    Again: JScript, VBScript or (presumably) AppleScript can do this easily. The only quirk is it might have to query the CLI interface of the DVCS depending on whether the open source programmers who wrote it were too lazy to put in scripting hooks. (Hint: they undoubtedly were. Lazy fucking open source programmers.)

    @gilhad said:

    - My GF's workplace has blocked all the usual mail servers outside her company, but she is able to use a remote desktop (or whatever it is called in English - I mean logging in to Windows via KRDC over VPN), and I am (so she is too) able to send email via my ISP, but only from inside his network (so not from an external address). A click-and-drag interface to let her send emails via my connection to my ISP (we are both on the local net)

    Ok I'm having trouble with your English, but are you saying you need a CLI to create a click-and-drag interface for sending email outside of a firewall? I'm a bit lost.

    @gilhad said:

    Those are real scripts that I wrote in bash (which called and piped other CLI tools like sed, grep, git, wget ...).

    Ok, you've spent years learning Bash. How long have you spent learning VBScript or JScript or AppleScript?

    Your argument doesn't boil down to "the CLI is better", it boils down to "I know the CLI better". That's different.


Log in to reply