The Linux command line sucks



  • @dkf said in The Linux command line sucks:

    but want to use a different model? ....

    Then write a set of adapter classes which do the transformations....


  • Discourse touched me in a no-no place

    @thecpuwizard said in The Linux command line sucks:

    Then write a set of adapter classes which do the transformations.

    Of course. And then there's little to no benefit from using object models, as you're off into adapter class land, with the number of adapter class hierarchies growing as the square of the number of component-providing DLLs involved…

    Which was my real point…



  • @dkf said in The Linux command line sucks:

    then there's little to no benefit from using object models

    The flip side is that comprehensive models are hard [and have other problems], and complete models are impossible (a complete model would no longer be a model, it would be the thing itself). Thus a minimal model for a specific domain is usually best.

    This leads to the direct consequence that crossing domains (i.e. using different libraries) means crossing models. It is the responsibility of the person doing the integration to do the necessary work and fill in the missing pieces; this work cannot be done by either of the library producers.
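
    To illustrate the kind of integration work that implies (a hedged sketch of my own, with made-up types rather than anything from a real library): the adapter is a thin class owned by the integrator, mapping one library's model onto the other's and filling in the pieces neither producer knows about.

    class LibAUser {                     # hypothetical type from "library A"
        [string]$FullName
        [string]$Mail
    }

    class LibBContact {                  # hypothetical type from "library B"
        [string]$DisplayName
        [string]$EmailAddress
        [string]$TimeZone
    }

    class UserToContactAdapter {
        # The integrator owns this mapping, including defaults for fields
        # that only one of the two models has.
        static [LibBContact] Adapt([LibAUser] $user) {
            $contact = [LibBContact]::new()
            $contact.DisplayName  = $user.FullName
            $contact.EmailAddress = $user.Mail
            $contact.TimeZone     = 'UTC'    # missing piece, filled in by the integrator
            return $contact
        }
    }

    # Usage:
    [UserToContactAdapter]::Adapt([LibAUser]@{ FullName = 'Ada Lovelace'; Mail = 'ada@example.com' })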


  • ♿ (Parody)

    @thecpuwizard But eventually all the models will include left pad.


  • Discourse touched me in a no-no place

    @thecpuwizard said in The Linux command line sucks:

    Thus a minimal model for a specific domain is usually best.

    True… and everyone insists that their own model is the perfect minimal one and expects you to build on top of it. Interoperation is annoying precisely because the domains tend to practice megaphone diplomacy.


  • Considered Harmful

    @dkf said in The Linux command line sucks:

    @jbert said in The Linux command line sucks:

    Again, you can use PowerShell today:

    But you then have to deal with the fun that comes with integrating two libraries with slightly incompatible object models.

    Sucks a lot less than string typing and no library functions.



  • @dkf said in The Linux command line sucks:

    their own model is the perfect minimal one

    Well, for their specific purposes (which is the scope) it can be pretty close to perfect… :)

    and expects you to build

    YES.

    on top of it.

    Not necessarily (contrast with building alongside it).

    Interoperation is annoying

    For so many reasons… not just related to computers…


  • Fake News

    @pie_flavor said in The Linux command line sucks:

    @bulb said in The Linux command line sucks:

    Of course for users, most of the time, Powershell does what you mean. Except when you didn't mean it. Then it makes it harder to understand what's wrong.

    Please do what I said the first time, which is to find an example.

    I found an example for @Bulb, although I have to add that it is more an example of how .NET interop can mess things up:

    $sb = New-Object System.Data.Common.DbConnectionStringBuilder
    $sb.ConnectionString = "server = 'smtp.gmail.com'; port = 587; user = you@gmail.com"
    $sb["server"] # No output gets shown
    

    It turns out that PowerShell fumbles the assignment to ConnectionString here: because $sb also implements the IDictionary interface, PowerShell treats the statement as $sb["ConnectionString"] = … (adding a dictionary entry) rather than setting the actual .NET property.
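
    For what it's worth, one workaround I would expect to work here (my assumption, not something from the post above): call the property's setter method directly, since the dictionary adapter only reroutes property-style assignment.

    $sb = New-Object System.Data.Common.DbConnectionStringBuilder
    # Invoke the real .NET property setter instead of using assignment syntax,
    # so PowerShell cannot turn it into $sb["ConnectionString"].
    $sb.set_ConnectionString("server = 'smtp.gmail.com'; port = 587; user = you@gmail.com")
    $sb["server"]   # should now show the server value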



  • @remi said in The Linux command line sucks:

    learning all commands that exist

    Since any application can be invoked as a command, "all commands" means "all applications." I don't think anybody suggests that's a reasonable expectation.

    @remi said in The Linux command line sucks:

    Plus, remember we are talking about a tool that is destined to be used to do various little tasks on the side of your main one. It's not a programming language that you're spending all your day using, it's something that you switch to in the middle of another task to get a tidbit of information, to fix an issue with something etc.

    Hm, in my world, the shell is not just "used to do various little tasks on the side of [my] main one." Using the shell itself is not my main job, but it's directly involved in doing my main job. Invoking editors (gvim, TYVM, and I tend to have 10 or so windows open), compiling, running tests, searching log files, searching the code base to find which file contains a particular error string, committing changes, etc. are all done using the CLI. I wouldn't be surprised if I typically spend, in an 8-hour workday, an hour or more interacting directly with the shell (although, toby faire, a good bit of that is waiting for results from one CLI application or another).



  • @hardwaregeek there was a time with ms-dos that one normal person could know what all files in his dos directory were doing there.


  • Banned

    @hardwaregeek said in The Linux command line sucks:

    @remi said in The Linux command line sucks:

    learning all commands that exist

    Since any application can be invoked as a command, "all commands" means "all applications." I don't think anybody suggests that's a reasonable expectation.

    Somebody up the thread suggested reading the entire Unix manual before even starting a shell for the first time.



  • @sockpuppet7 said in The Linux command line sucks:

    @hardwaregeek there was a time with ms-dos that one normal person could know what all files in his dos directory were doing there.

    There was also a time when I knew what all the punch cards in my reference rack did :) :) :)



  • @gąska said in The Linux command line sucks:

    Somebody up the thread suggested reading the entire Unix manual before even starting a shell for the first time.

    That is a bit off from what I said (though not too far). My point was that in other fields, people do invest serious time and research before jumping in or having the expectation that "they know what they are doing". In some cases it is years before they actually touch the tools of the trade in an independent fashion.


  • Banned

    @thecpuwizard mostly because of government regulations and licenses. Which are only good for maintaining high prices, not high quality.



  • @kian said in The Linux command line sucks:

    The thing I find odd is that for not much more complexity, you could do something like have a REPL that can call C functions on shared libraries instead of a shell that calls executables, and that would give you some basic type safety, multiple entry points into a "program" with different parameters for each representing different things it can do, parameter assistance and such. With just the C headers you could even have library defined data types you use as parameters instead of stringly typed data. The REPL could handle the marshalling and serialization and problems like that. No need to use C specifically in the REPL, of course, a saner language could be used. It's just that calling C functions in a shared library is no more difficult, conceptually, than calling a program. Simpler, since the parameters are available and you can understand what they do without looking at the body most of the time.

    There are several Unix shells that work in just that way, including the aforementioned port of PowerShell - in fact, the second most common standard shell, TENEX C Shell (tcsh), is exactly that for a C subset. Well, that was the idea, at any rate - it doesn't quite work that way in practice, it just changes some of the scripting syntax, but that was sort of what it was supposed to be.
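
    (To make that concrete with the tool already under discussion, here is a hedged sketch of calling a C function from a shared library straight from an interactive PowerShell session. The library name libc.so.6 is a glibc-specific assumption; on Windows you would P/Invoke something like kernel32.dll instead.)

    # Declare a typed binding to a C function in a shared library, then call it
    # from the prompt like any other command.
    Add-Type -Namespace Native -Name Libc -MemberDefinition '
        [DllImport("libc.so.6")]
        public static extern int getpid();
    '

    [Native.Libc]::getpid()   # the caller's process id, as a typed int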

    Most of them are simplified REPLs of languages such as Python (pysh, quasishell), Perl (psh, pl), or Scheme (scsh), but a number are novel designs (e.g., LUSH, which is another Lisp shell but not based on a specific dialect).

    And yes, there are ones which imitate classic Microsoft BASIC dialects such as Commodore and Applesoft (as well as other BASICs such as Dartmouth BASIC, the various Tiny BASICs, or Wozniak's Integer BASIC from before Apple decided to buy theirs from MS).

    For that matter, it is fairly easy to go old-skool and set up a REPL as a replacement for the existing shell; the shell is just a userland application, after all, and you can run any TTY mode program as your shell by changing your login preferences. You probably don't want to do that with, say, sed 😱 , but you could.

    There is even this explanation of setting up a Python REPL as a drop-in shell replacement, and this similar one for the CLISP implementation of Common Lisp, though 'drop-in' is a bit of an exaggeration. Similarly, while it isn't a shell per se, BASIC-256 is sometimes used as another way to recreate the feeling of power one gets when driving a MOS6502-based system :face_with_stuck-out_tongue_winking_eye:.

    There are a lot of shells for Linux; writing a simple one is a stock project for systems programming courses (I called mine 'dosh', and it sucked donkey balls; I can post the code, but I can't imagine anyone would want to see it). Just on my own Mint system, Synaptic lists the following ones for installation:

    • bash
    • csh
    • tcsh
    • zsh
    • ksh
    • busybox
    • lshell
    • pdksh
    • mksh
    • fish
    • rush
    • dash
    • sash
    • yash
    • posh
    • lush
    • scsh
    • mosh

    Some of those aren't full shells, and I may have missed a few; the way the term 'shell' gets overloaded can be confusing.

    But none of this is really relevant. Why has Bourne-Again survived so long as the primary *nix shell when it is obviously crap? Simple: because it's there, and it is always there. Every Unix-like system expects to have a derivative of the Thompson shell, Bourne shell or C Shell, and their descendants survive because they are the lowest common denominator for *nix shell programming.

    So why do most of the existing distros make one of them the default user shell, rather than setting up a better shell or a language REPL and relegating BASH and tcsh to legacy scripting? A number of reasons, none particularly good but all things which pressure the distro developers into using the existing shells. Three that I can see off the top of my head are:

    • Laziness - if they introduced a new shell, or defaulted to a REPL, they would have to support that setup.
    • Peer Pressure - doing something different is risky; it is easier to follow the herd.
    • Not wanting to break existing documentation - Most Linux users expect that if they go looking for a solution to a problem online, they will be able to use the advice on their system regardless of distro, or find some version of it that works on their distro. This usually leads to them getting a BASH command to type in, or a script to run. While running a copy of BASH from a different shell is certainly possible, it becomes an extra step that they would have to explain to the users, and which the users would have to remember to do.
    • It's a Forking Headache - trying to get Linux distros, or Linux users, to all agree on anything? You'd have better luck with economists.

    It's very similar in some ways to why shell persists as the default user experience in the first place, really.



  • @thecpuwizard said in The Linux command line sucks:

    @sockpuppet7 said in The Linux command line sucks:

    @hardwaregeek there was a time with ms-dos that one normal person could know what all files in his dos directory were doing there.

    There was also a time when I knew what all the punch cards in my reference rack did :) :) :)

    There was a time when I could read the punch cards without using a card reader. :belt_onion:



  • @thecpuwizard said in The Linux command line sucks:

    My point was that in other fields, people do invest serious time and research before jumping in or having the expectation that "they know what they are doing".

    Two words: Dunning-Kruger.



  • @scholrlea said in The Linux command line sucks:

    LUSH, which is another Lisp shell

    That would drive me to drink.

     

     

     

    :rimshot:



  • @scholrlea said in The Linux command line sucks:

    So why do most of the existing distros make one of them the default user shell, rather than setting up a better shell or a language REPL and relegating BASH and tcsh to legacy scripting? A number of reasons, none particularly good but all things which pressure the distro developers into using the existing shells. Three that I can see off the top of my head are:
    [4 reasons listed]

    Every UNIX/Linux system I've ever used, and that's a bunch over the last 30+ years, has had either bash (or sh, back in the day) or csh/tcsh as the default login shell. (There might have been one or two that used zsh, but it's similar enough to not invalidate my point.) As an experienced user logging into a new system (e.g., at a new job) for the first time, having a significantly different default shell would badly violate the principle of least astonishment. I might as well be a complete noob, at least for a few days until I learn the basics of the new shell.



  • @hardwaregeek Way, way back when I was working on Unix systems, it was either sh or ksh as the login shell.


  • Banned

    @scholrlea said in The Linux command line sucks:

    you can run any TTY mode program as your shell by changing your login preferences. You probably don't want to do that with, say, sed , but you could.

    Fun fact: you could do the same thing in Windows 3.1 with GUI applications.



  • @hardwaregeek said in The Linux command line sucks:

    @scholrlea said in The Linux command line sucks:

    So why do most of the existing distros make one of them the default user shell, rather than setting up a better shell or a language REPL and relegating BASH and tcsh to legacy scripting? A number of reasons, none particularly good but all things which pressure the distro developers into using the existing shells. Three that I can see off the top of my head are:
    [4 reasons listed]

    Crap. I added another example and then forgot to increment the count. Mii r teh dumazz.



  • @sockpuppet7 said in The Linux command line sucks:

    @hardwaregeek there was a time with ms-dos that one normal person could know what all files in his dos directory were doing there.

    You still can. It's not hard to get a copy of MS-DOS, and studying the files it comes with would be about the same effort now as it was back when.



  • @scholrlea said in The Linux command line sucks:

    @hardwaregeek said in The Linux command line sucks:

    @scholrlea said in The Linux command line sucks:

    So why do most of the existing distros make one of them the default user shell, rather than setting up a better shell or a language REPL and relegating BASH and tcsh to legacy scripting? A number of reasons, none particularly good but all things which pressure the distro developers into using the existing shells. Three that I can see off the top of my head are:
    [4 reasons listed]

    Crap. I added another example and then forgot to increment the count. Mii r teh dumazz.

    Oh. I just thought you were a member of the Spanish Inquisition.


  • ♿ (Parody)

    @hardwaregeek said in The Linux command line sucks:

    @thecpuwizard said in The Linux command line sucks:

    @sockpuppet7 said in The Linux command line sucks:

    @hardwaregeek there was a time with ms-dos that one normal person could know what all files in his dos directory were doing there.

    There was also a time when I knew what all the punch cards in my reference rack did :) :) :)

    There was a time when I could read the punch cards without using a card reader. :belt_onion:

    Blonde, brunette, etc, etc, etc.



  • @hardwaregeek said in The Linux command line sucks:

    @remi said in The Linux command line sucks:

    Plus, remember we are talking about a tool that is destined to be used to do various little tasks on the side of your main one. It's not a programming language that you're spending all your day using, it's something that you switch to in the middle of another task to get a tidbit of information, to fix an issue with something etc.

    Hm, in my world, the shell is not just "used to do various little tasks on the side of [my] main one." Using the shell itself is not my main job, but it's directly involved in doing my main job. Invoking editors (gvim, TYVM, and I tend to have 10 or so windows open), compiling, running tests, searching log files, searching the code base to find which file contains a particular error string, committing changes, etc. are all done using the CLI. I wouldn't be surprised if I typically spend, in an 8-hour workday, an hour or more interacting directly with the shell (although, toby faire, a good bit of that is waiting for results from one CLI application or another).

    What you describe is pretty much exactly what I was getting at (perhaps I didn't put it very well). You do not sit down in the morning in front of your computer saying "right, today I'm going to use the shell". You want to compile, edit, search logs etc., and to do so you need to use the shell, but no two commands are exactly the same (well, yes, some are, but not that many), and you don't get judged on how well you used the shell, but on how well you ended up doing the rest of your job. For these purposes, who cares if you used the wrong shell command, if you had to create an intermediate file for search results that you edited manually rather than using the perfect shell command, if you made a sub-optimal shell command or a UUOC or didn't protect spaces or whatever?

    My point is that, for the vast majority of us, you definitely do not need to understand all the subtleties and commands of the shell; you just need to know it "well enough" to get through to the information that you want. Of course, knowing it well will help you, but even a basic knowledge of it will get you through, and therefore (remember, this was the start of this thread of discussion!) it is pointless to require people to get extensive training in it before letting them loose.

    Compare and contrast with, say, the programming language in which the main application you're developing is written. That one you can't really wing (or rather, you shouldn't...) with approximate, good-enough knowledge, because it's built on all the code written before, and the code you write will be added to that and stay "forever".



  • @hardwaregeek said in The Linux command line sucks:

    There was a time when I could read the punch cards without using a card reader.

    Well, the text was usually printed across the top edge, so no real magic there :) [but I get the intent; it's been quite a few years (ok, decades) since I actually tried reading Hollerith, and I doubt I could recognize much myself anymore.]


  • Discourse touched me in a no-no place

    @remi said in The Linux command line sucks:

    This one, you can't really wing it (or rather, you shouldn't...)

    Have you noticed where you are? 🏆



  • @thecpuwizard said in The Linux command line sucks:

    @hardwaregeek said in The Linux command line sucks:

    There was a time when I could read the punch cards without using a card reader.

    Well, the text was usually printed across the top edge, so no real magic there :) [but I get the intent; it's been quite a few years (ok, decades) since I actually tried reading Hollerith, and I doubt I could recognize much myself anymore.]

    Likewise. I recently learned there were quite a few encodings. The one our punches/readers used was pretty straightforward. IIRC, the digits were simple — one punch in rows 0 – 9 of the card. Letters were 2 (or 3?) punches — one in one of the upper rows and one in rows 0 – 9 for the first ten letters, the next upper row and 0 – 9 for the next ten letters, and either the next upper row or the two upper rows and 0 – 5 for the last six letters. Punctuation I have no idea; I don't think I knew it back in the day.

    Getting back on topic (:doing_it_wrong:), however much you may think the Linux command line sucks, it's almost infinitely better than non-interactive punched cards.



  • @sockpuppet7 said in The Linux command line sucks:

    @hardwaregeek there was a time with ms-dos that one normal person could know what all files in his dos directory were doing there.

    ...was there? 5 points for you if you can tell me, without looking it up, what GRAPHICS.COM did.


  • :belt_onion:

    @maciejasjmj said in The Linux command line sucks:

    ...was there? 5 points for you if you can tell me, without looking it up, what GRAPHICS.COM did.

    I can, but I don't know if you were addressing "you" in the general sense or @sockpuppet7. :P


  • Trolleybus Mechanic

    @maciejasjmj said in The Linux command line sucks:

    @sockpuppet7 said in The Linux command line sucks:

    @hardwaregeek there was a time with ms-dos that one normal person could know what all files in his dos directory were doing there.

    ...was there? 5 points for you if you can tell me, without looking it up, what GRAPHICS.COM did.

    Was that the one that let you switch the display mode (VGA, EGA, CGA)?



  • @maciejasjmj I think it was something with codepages, but I'm not sure, it was a long time ago

    edit: looking it up, it was for printing the screen on the printer http://www.easydos.com/graphics.html


  • Fake News

    @pie_flavor @Bulb

    I just found a PowerShell SNAFU for your collection. Parsing this JSON:

    {
      "lastBuildFolderCreatedOn": "01/26/2018 15:41:12 +01:00",
      "lastBuildFolderNumber": 63
    }
    

    with the following command line (I originally wanted to parse an entire directory full, but I quickly scaled down to just trying one):

    ls mappings.json | % { get-content $_ | convertfrom-json }

    convertfrom-json : Invalid object passed in, ':' or '}' expected. (1): {
    At line:1 char:41
    + ls mappings.json | % { get-content $_ | convertfrom-json }
    +                                         ~~~~~~~~~~~~~~~~
        + CategoryInfo          : NotSpecified: (:) [ConvertFrom-Json], ArgumentException
        + FullyQualifiedErrorId : System.ArgumentException,Microsoft.PowerShell.Commands.ConvertFromJsonCommand
    

    Welp, turns out that Get-Content returns an array of lines, and ConvertFrom-JSON will try to parse each line separately. You need to pass it a single string by using one of the following three constructs:

    • ls mappings.json | % { get-content $_ -raw | convertfrom-json }
    • ls mappings.json | % { get-content $_ | out-string | convertfrom-json }
    • ls mappings.json | % { (get-content $_) -join "``n" | convertfrom-json }
      I got this last one from the docs. (Also, PowerShell uses just a single backtick to escape a newline but my excuse is markdown.)
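
    A quick way to see why (my own illustration, not from the docs): compare what Get-Content produces with and without -Raw.

    (Get-Content mappings.json).GetType().Name        # Object[] - one string per line
    (Get-Content mappings.json -Raw).GetType().Name   # String   - the whole file as one string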

     

    EDIT: Mind you, I then went on to parse an entire directory tree of JSON files to filter out just a few fields each using this:

    ls -recurse SourceFolder.json | % { get-content $_ -raw | convertfrom-json | select -property agent_builddirectory,definitionName }

    So I stand by my point that it's at least somewhat awesome.



  • @jbert - don't have time to test, but just the following should work...

    Get-Content -Raw -Path <jsonFile>.json | ConvertFrom-Json



  • My new favorite from a few weeks ago is:

    The crontab command uses -e for editing and -r for removing a crontab. It deletes without asking, and, you know, QWERTY keyboards have their E and R keys awfully close to each other.

    So, a few weeks ago I just fucked root's crontab on a, thank goodness, staging server. That evening I wanted to go full Blakey mode and write a long rant about how even CLI tools, once they get popular, really need a UX expert too. CLIs are UIs too, after all!

    The basic rule is this: you don't put destructive and non-destructive options on adjacent keys; if you must, you make them differ in case; if you cannot, you make the destructive one interactive by default (unless an option on a key from a different area and in a different case is specified); if you still cannot, you ship an alias which damn well makes it interactive, and you recommend that distributions fucking include it in all their default *rc files for interactive shells.

    But apparently dinosaurs like Paul Vixie are all on Dvorak so that's not their problem.


  • Considered Harmful

    @wft Or do what Powershell does, which is 'don't use stupid tiny abbreviations that you'll forget for things since we no longer have twentieth-century constraints on our shells'.
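
    (As an aside of my own, for illustration: the long names come with discoverable safety nets, and the short forms survive only as optional aliases. On Windows PowerShell, with a hypothetical scratch.txt:)

    Get-Alias rm, del                    # both are merely aliases of Remove-Item
    Remove-Item .\scratch.txt -WhatIf    # dry run: shows what would be deleted, deletes nothing
    Remove-Item .\scratch.txt -Confirm   # prompts before actually deleting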


  • Discourse touched me in a no-no place

    @jbert said in The Linux command line sucks:

    • ls mappings.json | % { (get-content $_) -join "``n" | convertfrom-json }
      I got this last one from the docs. (Also, PowerShell uses just a single backtick to escape a newline but my excuse is markdown.)

    There are ways. Dirty, dirty ways. Or you can use HTML tags.
    ls mappings.json | % { (get-content $_) -join "`n" | convertfrom-json }


  • area_can

    @wft alternatively, make the dangerous thing harder by requiring you to type crontab -remove


  • Considered Harmful

    @dkf Or you can just do it with Markdown.
    ls mappings.json | % { (get-content $_ ) -join "`n" | convertfrom-json }



  • @wft said in The Linux command line sucks:

    The basic rule is this: you don't put destructive options and non-destructive options on adjacent keys,

    Wusss - let's go for short cryptic names that do not require any arguments or switches or options for destructive behavior 👹 👹 👹 👹



  • @bb36e you just need to add the -i switch: prompt before deleting user's crontab :facepalm:


  • area_can

    @timebandit said in The Linux command line sucks:

    @bb36e you just need to add the -i switch: prompt before deleting user's crontab :facepalm:

    ah right, because it's not the Unix Way™ if the only safeguards available for a destructive command are disabled by default


  • :belt_onion:

    @thecpuwizard said in The Linux command line sucks:

    @jbert - don't have time to test, but just the following should work...

    Get-Content -Raw -Path <jsonFile>.json | ConvertFrom-Json

    He did present that as one of the three options.


  • Fake News

    @heterodox Indeed, and the ls mappings.json part is in there because I later went for the -Recurse option to find each and every file with that name in a bunch of folders.



  • @scholrlea said in The Linux command line sucks:

    busybox

    That's not a shell, that's an entire Linux userspace statically linked into one file.



  • @dkf said in The Linux command line sucks:

    @jbert said in The Linux command line sucks:

    • ls mappings.json | % { (get-content $_) -join "``n" | convertfrom-json }
      I got this last one from the docs. (Also, PowerShell uses just a single backtick to escape a newline but my excuse is markdown.)

    There are ways. Dirty, dirty ways. Or you can use HTML tags.
    ls mappings.json | % { (get-content $_) -join "`n" | convertfrom-json }

    Guys, Markdown specifically says that you can use any number of backticks inside an inline code element as long as it's not the same as the starting tag. `` ` ``


  • Considered Harmful


  • Impossible Mission - B

    @pie_flavor said in The Linux command line sucks:

    @wft Or do what Powershell does, which is 'don't use stupid tiny abbreviations that you'll forget for things since we no longer have twentieth-century 1970s constraints on our shells'.

    FTFY. Those constraints stopped being relevant well before the year 2001.


  • Notification Spam Recipient

    @maciejasjmj said in The Linux command line sucks:

    @sockpuppet7 said in The Linux command line sucks:

    @hardwaregeek there was a time with ms-dos that one normal person could know what all files in his dos directory were doing there.

    ...was there? 5 points for you if you can tell me, without looking it up, what GRAPHICS.COM did.

    My memory is heavily corrupted in that area, but wasn't it supposed to set the extended character glyphs? Now I'm curious...

    Edit: of course not. Why would I even try to use corrupted data?

