Replacing shells



  • @dkf said in Is my interview question too hard?:

    This sort of thing is why the usual fix to making a shell script robust is to stop using shell scripts. Virtually any other programming language would do better (but if the candidate thinks that writing C++ for this is a good plan, he should be smacked upside the head anyway).

    But which programming language, actually.

    The name of the game is still orchestrating external processes, so it shouldn't be too verbose setting up pipelines.

    I used to do some such things in Perl. That is fairly nice when you just need to read the output of a process, but setting up more complex pipelines isn't that nice. And it's a pain in the … actually, just like Bash, it was brought to all Windows development machines with Git, so there's that. It's still a bit of forgotten lore.

    These days I might pull out Python instead. It's still quite a bit more verbose than shell, what with all the quoting, but setting up pipelines is acceptable with a bit of wrapping, and since it grew async support, one can even handle communicating with multiple processes reliably. Plus it has a complete-ish standard library.
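    For instance, a two-stage pipeline with the standard subprocess module looks roughly like this (the commands here are just placeholders):

```python
import subprocess

# Rough equivalent of: printf 'b\na\nc\n' | sort
# Wiring stdout of one process to stdin of the next bypasses the
# shell entirely, so no quoting of arguments is needed.
producer = subprocess.Popen(["printf", "b\\na\\nc\\n"],
                            stdout=subprocess.PIPE)
consumer = subprocess.Popen(["sort"], stdin=producer.stdout,
                            stdout=subprocess.PIPE, text=True)
producer.stdout.close()  # so the producer gets SIGPIPE if sort exits early
output, _ = consumer.communicate()
print(output)  # a, b, c on separate lines
```

    That's where the "bit of wrapping" comes in: a small helper that chains the Popen calls makes this read almost like shell.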

    The problems with Python are:

    • It is relatively big. 155MiB here. On an embedded device we had quite a bit of the standard library stripped out, but it was still around 30MiB, compared to the 300KiB that busybox (shell + core utils) takes.
    • It's a bit problematic on Windows. Git does not pull it in (unlike Bash and Perl), the Microsoft Store can't be relied on (the customer notebook I have has the Store blocked), and Azure CLI brings its own copy that can't easily be taken advantage of.

    Powershell aims to be this, but my experience with it so far is … mixed. It can be installed on Linux now, so it's available on all systems, but

    • It's actually rather bad at manipulating text, so processing output of the external commands or reading things from files is ugly. I especially had problems working with text encoding.
    • It actually tends to be horribly outdated on Windows, and so do the .NET DLLs installed on the system. I wanted to use it for the ability to call .NET libraries, but gave up exactly because the version on the system was too old and I couldn't find a good way to give it a newer one.
    • It's just ugly. The inconsistency between cmdlets and functions in how they handle arguments, and the many options that can be specified for a cmdlet argument, make it unnecessarily harder to learn, I think.

    Then we have node. I'm not installing that anywhere. That's just pain.

    … deno might be much better though. Especially with the way you can directly import packages from servers, so it does not need much setup. I am eagerly awaiting a Debian package, though.

    Any other options?

    … TCL. TCL might actually be good. It is also depended on by Git, so it's actually available. It just seems to be completely forgotten lore by now.


  • Discourse touched me in a no-no place

    @Bulb I don't know Powershell at all, and Node hardly more than that. I know that Python, Perl, Ruby and Tcl would all do the task fine. Pick your poison. (I'm not sure about Lua. That tends to have a very restrictive language profile; I don't know if subprocess launching is in there by default.)

    Most compiled languages have either large runtimes or require that you deal directly with the complexity of system calls. (The difference on Windows is that the actual system call takes an unparsed command line and apps don't have consistent ideas about how to parse it.) For something like this, that's very fiddly. Better to use someone's existing code.



  • @dkf said in Replacing shells:

    @Bulb I don't know Powershell at all, and Node hardly more than that. I know that Python, Perl, Ruby and Tcl would all do the task fine. Pick your poison.

    Yeah, that's the problem. Shell is everywhere. Including Windows now, thanks to Git. The other ones … every environment has a different one.

    (I'm not sure about Lua. That tends to have a very restrictive language profile; I don't know if subprocess launching is in there by default.)

    Lua can launch subprocesses if you install a package providing the interfaces. But while Lua is very small, it also has ❄ syntax and semantics and is generally a bit of a pain to switch to and from. And nobody has it installed by default.

    Most compiled languages have either large runtimes

    Java's goal was to be available everywhere, but it very much failed at that and isn't installed, by default, almost anywhere. If you have it, go ahead—actually, Groovy could be a good choice then—but you often don't. And .NET is in Windows by default, but it may be an old version, and it has changed a lot lately.

    or require that you deal directly with the complexity of system calls. (The difference on Windows is that the actual system call takes an unparsed command line and apps don't have consistent ideas about how to parse it.) For something like this, that's very fiddly. Better to use someone's existing code.

    Windows does have standard functions to quote and unquote the arguments, and most languages use them to provide the Unix interface—except the .NET languages that are Windows-native.
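    For what it's worth, you can see that convention from Python: subprocess.list2cmdline is the helper Popen uses to build the command line on Windows, and it follows the Microsoft C runtime quoting rules (shown here purely for illustration, not the Win32 API itself):

```python
import subprocess

# list2cmdline applies the MS C runtime quoting rules:
# arguments containing whitespace get double-quoted, and
# embedded double quotes get backslash-escaped.
cmdline = subprocess.list2cmdline(["tool.exe", "a b", 'say "hi"'])
print(cmdline)  # tool.exe "a b" "say \"hi\""
```

    The receiving program (or its runtime) then unquotes by the same rules, which is how the Unix-style argv interface is provided on top of the flat command line.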


  • Discourse touched me in a no-no place

    @Bulb said in Replacing shells:

    Windows does have standard functions to quote and unquote the arguments, and most languages use them to provide the Unix interface—except the .NET languages that are Windows-native.

    Yes. It depends on which basic runtime you're using. Most code these days uses some variant of the MSVC runtime, and that's pretty sane in this area, but that's very much not universally used. I remember reading that, for example, the Delphi runtime has quite a different set of quirks, and if you have to call builtins of cmd then you're in for a bad time as they're very quirky in places.

    I prefer to admit I know less about this than I actually do.



  • @dkf said in Replacing shells:

    the Delphi runtime

    :whywouldyouusethat:?

    if you have to call builtins of cmd then you're in for a bad time as they're very quirky in places.

    :whywouldyoudothat:?

    You pass the whole cmd command to cmd /c as one argument and that should generally work, but I never actually tried to do that … I don't even think there is a good reason to (cmd being much weaker than sh, and I don't generally use sh that way either).
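    In subprocess terms, the whole command is a single argv element; a sketch using the POSIX sh -c analogue (on Windows the shape would be the same with ["cmd", "/c", command]):

```python
import subprocess

# The entire command line, pipe and all, is one argument to the
# shell; only the shell parses it, not the spawning code.
command = "echo hello | tr a-z A-Z"
result = subprocess.run(["sh", "-c", command],
                        capture_output=True, text=True)
print(result.stdout)  # HELLO
```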



  • @Bulb said in Replacing shells:

    Yeah, that's the problem. Shell is everywhere. Including Windows now, thanks to Git. The other ones … every environment has a different one.

    In my experience with Python and Perl, for most everyday uses (which it sounds like your use case is), the differences between ports were never an issue.



  • @Dragoon It's not so much the difference between the ports as the lower chance it's already installed. Because my intent usually is to bootstrap the toolchain in as few steps as possible, to make it easier to reproduce if something fails, or if it needs to be moved, or when installing a new whatever system.


  • Discourse touched me in a no-no place

    @Bulb said in Replacing shells:

    :whywouldyoudothat:?

    Delphi? :mlp_shrug: Ask the users...

    Call into cmd? Usually for start, and that is one of the quirkiest commands in it.



  • @Bulb said in Replacing shells:

    Powershell aims to be this, but my experience with it so far is … mixed.

    You forgot to mention PowerShell's schizophrenia: powershell vs. pwsh. The former is no longer maintained but it comes pre-installed on Windows. The latter is the multiplatform one. They don't necessarily work the same...



  • @Deadfast said in Replacing shells:

    @Bulb said in Replacing shells:

    Powershell aims to be this, but my experience with it so far is … mixed.

    You forgot to mention PowerShell's schizophrenia: powershell vs. pwsh. The former is no longer maintained but it comes pre-installed on Windows. The latter is the multiplatform one. They don't necessarily work the same...

    Don't forget the one that comes with Visual Studio (which isn't necessarily any of the others) and the one for Azure (which may actually be bash instead of PowerShell).

    I'll move on from Command Prompt one of these days. Probably.


  • Fake News

    @Deadfast said in Replacing shells:

    @Bulb said in Replacing shells:

    Powershell aims to be this, but my experience with it so far is … mixed.

    You forgot to mention PowerShell's schizophrenia: powershell vs. pwsh. The former is no longer maintained but it comes pre-installed on Windows. The latter is the multiplatform one. They don't necessarily work the same...

    Correct.

    I like PowerShell a lot because it has the .NET Runtime sitting beneath it to give it some scripting power and has the possibility of writing cmdlets in .NET languages all while being a decent shell, but it doesn't come without quirks.

    One of the differences between Windows PowerShell and pwsh is that pwsh uses UTF-8 (without BOM) by default for file & console output, while Windows PowerShell defaults to UTF-16, and if you request UTF-8 you get a BOM at the start of your files.

    Meanwhile, integrating with native Windows commands is a bit of a pain. pwsh might use UTF-8 for writing files, but talking to commands still uses some legacy codepage by default, both for passing arguments/input to commands and for reading their output back into a string. There are workarounds, as seen in this lengthy stackoverflow answer, but it is a pain nonetheless.
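    The BOM in question is just three bytes prepended to the stream; any language shows it. For example, Python's utf-8-sig codec produces exactly what Windows PowerShell's "UTF8" encoding writes:

```python
# "utf-8-sig" is UTF-8 with a byte order mark; plain "utf-8" has none.
with_bom = "hi".encode("utf-8-sig")
without = "hi".encode("utf-8")
print(with_bom)  # b'\xef\xbb\xbfhi'
print(without)   # b'hi'
```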


  • Considered Harmful

    Windows was not designed, it was rather, as is well known, aggregated via theft per fashion sense and cargo cultery, as by a gulp of magpies. Therefore it should be no surprise there is no sensical shell. It was never engineered to have one. You need a finite set of concerns to build a concise shell - for instance some grand unifying principle. It hainit got one. Everything's A :arrows:.

    This won't change.



  • @Bulb said in Replacing shells:

    Lua can launch subprocesses if you install a package providing the interfaces. But while Lua is very small, it also has ❄ syntax and semantics and is generally a bit of a pain to switch to and from. And nobody has it installed by default.

    The default interfaces include os.execute and io.popen. It's a bit primitive, but workable. I would imagine doing stuff in parallel to be more of an issue.

    The main advantage that I would see with lua is that you can distribute a single .exe/binary for the interpreter (similar to e.g. premake). In that case, you can avoid needing to install anything, just make sure a known-good version of the interpreter is included with your script. It's not huge (I'd guess a few MB statically linked), so you can even check it into a repo or add to a wiki.


  • Discourse touched me in a no-no place

    @cvi said in Replacing shells:

    The default interfaces include os.execute and io.popen. It's a bit primitive, but workable. I would imagine doing stuff in parallel to be more of an issue.

    The big problem with them is that they're system() and popen() in Lua clothing, with all the quoting fun you'd expect (see the discussion earlier that triggered this thread).

    Yes, I know this just by reading the docs. Duh. They're pretty explicit about what the functions really do.



  • @dkf said in Replacing shells:

    see the discussion earlier that triggered this thread

    I vaguely remember that from a different thread now that you mention it. Remembering stuff across threads is :kneeling_warthog:.

    But, yeah, you're right. Basically all of Lua's standard lib is a thin wrapper over the C APIs.

    You can still go down the premake route. Essentially, they distribute a custom .exe which is mainly a Lua interpreter with a bunch of extras added to it. The added part could be relatively light-weight (but written in C); the part about not requiring additional installation would still hold true.

    Edit: Inb4 "but it's still Lua". True. There are a few similar options that are less well known. If you can tolerate JS, you can probably put together something similar with e.g. duktape.


  • Java Dev

    @dkf said in Replacing shells:

    @cvi said in Replacing shells:

    The default interfaces include os.execute and io.popen. It's a bit primitive, but workable. I would imagine doing stuff in parallel to be more of an issue.

    The big problem with them is that they're system() and popen() in Lua clothing, with all the quoting fun you'd expect (see the discussion earlier that triggered this thread).

    Yes, I know this just by reading the docs. Duh. They're pretty explicit about what the functions really do.

    Eh, only place I've ever seen lua is in video game scripting/modding. In which case those functions don't get included anyway so it doesn't matter.



  • @cvi said in Replacing shells:

    The main advantage that I would see with lua is that you can distribute a single .exe/binary for the interpreter (similar to e.g. premake).

    You can do the same thing with Python. And your arrays (or "arrays") don't index from 1...

    The main advantage of Lua is its ease of integrating with existing C or even C++ APIs, and its speed when used through LuaJIT. It almost makes one forgive the array indexing problem. That's the reason it's commonly used in games as @PleegWat pointed out. That's probably not really applicable here though.



  • @Deadfast said in Replacing shells:

    You can do the same thing with Python.

    TIL.

    Minor nitpick: that seems to give you a pile of files (~13MB in a minimal test). All of premake is a single 2.1MB file on Linux (separate binaries for Windows+Linux+MacOS land you at a grand total of three files and 5MB).

    FWIW- premake is a "meta build system" (similar to CMake) that uses Lua as its configuration+scripting language. It ships as a single binary. IME it does a better job at generating clean project files for e.g. VS (though rumors say that recent CMake has also shaped up).

    I use it as an example here because it is somewhat similar to what OP asked - e.g., describing tasks (=builds). The main difference is that premake doesn't run stuff itself (similar to CMake), but the reasons for choosing Lua as a basis nevertheless hold. Cross-platform, lightweight basis (~2MB), battle-tested implementation and language (with quirks, such as the aforementioned 1-based indexing; but no significant whitespace, so you win some and you lose some). I figure you could get something up and running in it with similar effort as in Python, but you'd sidestep the installation and size issues.



  • @cvi said in Replacing shells:

    Minor nitpick: that seems to give you a pile of files (~13MB in a minimal test).

    You can configure it to spit out a single binary and possibly also exclude unneeded libraries to reduce the size, but this is hardly a fun process. It's not really documented and you get weird errors until you get it just right.

    @cvi said in Replacing shells:

    FWIW- premake is a "meta build system" (similar to CMake) that uses Lua as its configuration+scripting language. It ships as a single binary. IME it does a better job at generating clean project files for e.g. VS (though rumors say that recent CMake has also shaped up).

    I'm familiar with Premake. I really appreciate it for what it does. I'm also familiar with the fact that it has a set of custom os.* functions. Unfortunately the fact that those are necessary is not a good indication of Lua itself being up to the task that @Bulb is seemingly after.



  • @cvi said in Replacing shells:

    @Bulb said in Replacing shells:

    Lua can launch subprocesses if you install a package providing the interfaces. But while Lua is very small, it also has ❄ syntax and semantics and is generally a bit of a pain to switch to and from. And nobody has it installed by default.

    The default interfaces include os.execute and io.popen. It's a bit primitive, but workable. I would imagine doing stuff in parallel to be more of an issue.

    Is it always there? I doubt that given the base library is intended for embedding.

    The main advantage that I would see with lua is that you can distribute a single .exe/binary for the interpreter (similar to e.g. premake). In that case, you can avoid needing to install anything, just make sure a known-good version of the interpreter is included with your script. It's not huge (I'd guess a few MB statically linked), so you can even check it into a repo or add to a wiki.

    Yeah, but you have to build it yourself, because in the distro binaries it's usually in a separate shared library.

    … in this day and age, it's probably better to use deno.

    @cvi said in Replacing shells:

    Edit: Inb4 "but it's still Lua". True. There are a few similar options that are less well known. If you can tolerate JS, you can probably put together something similar with e.g. duktape.

    Lua is conceptually similar to JavaScript, just with a lot of extra quirks like 1-based indexing (by default, configurable, 🤮) and unusual operators (not-equals being ~= rather than != is the biggest offender to muscle memory, but there are a bunch more). So if you can tolerate Lua, you sure can tolerate TypeScript, and just use deno.

    @PleegWat said in Replacing shells:

    Eh, only place I've ever seen lua is in video game scripting/modding. In which case those functions don't get included anyway so it doesn't matter.

    I had it in my previous project. There was a massively ugly Java concoction that compiled some domain-specific program-in-XML to Lua and used that to avoid having to write a runtime for the XML, and allow some customization (eventually someone did write that runtime and they (I left the project last year) are slowly switching over to it).

    Lua is simple enough, but has a lot of quirks. And is not that different from javascript anyway.



  • @Bulb said in Replacing shells:

    … TCL. TCL might actually be good. It is also depended on by Git so it's actually available.

    Really?? TIL, that might be useful

    Btw, I have solved a similar problem by running all commands in docker (because that one is required anyway). You know what they say - if the only tool you have is a pneumatic drill, everything looks like a pavement.


  • 🚽 Regular

    @PleegWat said in Replacing shells:

    Eh, only place I've ever seen lua is in video game scripting/modding.

    I haven't seen it myself, but I have heard Lua is used in nginx and in tiling window managers, for what it's worth.



  • @Bulb said in Replacing shells:

    Is it always there? I doubt that given the base library is intended for embedding.

    It's included in the sources by default. It's very easy to include/exclude when you set up a Lua context.

    @Bulb said in Replacing shells:

    Lua is simple enough, but has a lot of quirks. And is not that different from javascript anyway.

    Fair. If you can find the necessary infrastructure in a tolerable form (e.g., w.r.t. installation and so on), then having a small code base that's easy to build and shuffle around isn't that much of an advantage. I don't see doing a custom build (e.g. for the static binary) as a major issue, but YMMV. (If I have sources that I can build from scratch and have "portable" binaries, there's a good chance that I can get it running next week and 10 years down the road.)

    I'd also argue against one of the early comments (quoted in the OP). If you're an X-language team, then doing minor automation in X isn't such a bad choice. Everybody on your team already knows X and presumably has the necessary software to do X things. Adding a second language into the mix, even if it's "more appropriate" for the task, is extra complexity.

    For example, we automated our data wrangling with C++ once upon a time. I could have done it quicker in bash, but then the Windows guys would have trouble (both running it and modifying it). Cmd is retarded enough that not even the Windows guys suggested it; nobody really knew Powershell back then (and I'm pretty sure it didn't yet exist for *nix). Lua bindings would have been neat, but cost more developer time. Same for a bunch of similar choices.

    So auto-smacking the guy who suggests C or C++ on the head is dumb too. If that's what your team is doing already anyway, it might be the path of least resistance. Replace with C# or Rust or whatever as appropriate.


  • Discourse touched me in a no-no place

    @cvi said in Replacing shells:

    If that's what your team is doing already anyway

    Auto-smackings for the whole team then!



  • @Zecc said in Replacing shells:

    I haven't seen it myself, but I have heard Lua is used in nginx and in tiling window managers, for what it's worth.

    It's found its way into quite a few different things. It's prominent in games, mainly because it's always been fairly lightweight, easy to build and easy to embed. That used to be a big driver originally; nowadays, there are quite a few other options available.



  • @Deadfast said in Replacing shells:

    I'm also familiar with the fact that it has a set of custom os.* functions. Unfortunately the fact that those are necessary is not a good indication of Lua itself being up to the task that @Bulb is seemingly after.

    It's not perfect either. Realistically, you'd probably end up adding some stuff (like premake, but you can probably get away with less). Mainly, it addresses a few of the problems mentioned in the OP, specifically size and installation.

    (On the other hand, @Bulb doesn't seem too enthused by it either, so I guess that point has been settled for this specific discussion. 🙂)



  • @cvi said in Replacing shells:

    easy to embed

    The C API is fairly simple and obvious. Hm, but then so is the Python one and Perl has that h2xs tool to generate the binding or you can use SWIG, so … I might be advertising more than anything else.



  • @Kamil-Podlesak said in Replacing shells:

    @Bulb said in Replacing shells:

    … TCL. TCL might actually be good. It is also depended on by Git so it's actually available.

    Really?? TIL, that might be useful

    Well, sort of. gitk and git gui are written in Tcl/Tk, which means the Windows distribution has it, but on Linux those are usually a separate package, so Linux servers may not have it.

    Btw, I have solved a similar problem by running all commands in docker (because that one is required anyway). You know what they say - if the only tool you have is a pneumatic drill, everything looks like a pavement.

    Yeah, I often do that too. But sometimes I have this chicken-and-egg problem: I need a bit of code to sniff out which container to run the rest in, because I don't want to copy parameters all over the place.

    If @devcontainers/cli didn't have so many nasty dependencies, it would otherwise be perfect for most CI/CD needs. But node and C++ compiler and python? :angry:



  • … very much on the same topic.


  • BINNED

    @Bulb said in Replacing shells:

    … very much on the same topic.

    This post gave me PTSD.

    I just want to develop some software, but as of late I spend an increasingly larger fraction of time on figuring out how to build shit and deploy it on different machines. None of which I find interesting at all, rather frustrating.

    One of the first problems in this area is bootstrapping. In general, you can paper over quite a bit of complexity by writing some custom script to do all the grunt work. But how do you run it?

    Although scripting and plumbing should be a way to combat complexity, just getting to the point where every contributor to your software can run scripts requires ~~a docker container~~ a great deal of futzing with the environment!

    Sigh. Yes.

    However ...

    Deno doesn’t solve the problem of just being already there on every imaginable machine. However, it strives very hard to not create additional problems once you get the deno binary onto the machine.

    that just adds more dependencies. Besides my rather strong disdain for running webshit, which is terrible enough as is, outside of the browser, I don't want to bring in even more dependencies.


  • Considered Harmful

    @topspin said in Replacing shells:

    I just want to develop some software

    I just wanna feeel real love 🎹



  • @Applied-Mediocrity said in Replacing shells:

    @topspin said in Replacing shells:

    I just want to develop some software

    I just wanna feeel real love 🎹

    I'm afraid those two statements are incompatible with each other.


  • Considered Harmful

    @Zerosquare My idea was that they are equally unlikely to happen, but :also-yes:.



  • @topspin said in Replacing shells:

    @Bulb said in Replacing shells:

    … very much on the same topic.

    This post gave me PTSD.

    I just want to develop some software, but as of late I spend an increasingly larger fraction of time on figuring out how to build shit and deploy it on different machines. None of which I find interesting at all, rather frustrating.

    Grouches like you are why we have dedicated DevOps engineers now :tro-pop:

    Well, you always had to build the software, and you always had to either install it or release it in such a way that someone else could install it. In the times of make it was a lot of work too, and every developer had to know it. The IDEs made the build part easier. But that just means not everybody has to deal with the release and deploy parts, not that those became any easier. They did not.

    One of the first problems in this area is bootstrapping. In general, you can paper over quite a bit of complexity by writing some custom script to do all the grunt work. But how do you run it?

    Although scripting and plumbing should be a way to combat complexity, just getting to the point where every contributor to your software can run scripts requires ~~a docker container~~ a great deal of futzing with the environment!

    Sigh. Yes.

    However ...

    Deno doesn’t solve the problem of just being already there on every imaginable machine. However, it strives very hard to not create additional problems once you get the deno binary onto the machine.

    that just adds more dependencies. Besides my rather strong disdain for running webshit, which is terrible enough as is, outside of the browser, I don't want to bring in even more dependencies.

    Well, you can either add dependencies, or stick with 44-year-old technology, because that was about the last time a significant fraction of the computing world could agree on a common base functionality. And unfortunately, as powerful as regular expressions are, most data these days is structured and requires an L2 (context-free) parser, which the common Unix tools don't include.


  • Discourse touched me in a no-no place

    @Bulb said in Replacing shells:

    And unfortunately, as powerful as regular expressions are, most data these days is structured and requires an L2 (context-free) parser, which the common Unix tools don't include.

    Yacc. And it is awful.



  • @dkf Isn't that more-or-less replaced by bison these days? I have no clue whether it's any less awful, as I've never used it.

    Also, it's not included by default with WSL, although it's only a sudo apt install away.



  • @dkf That needs a compilation step, and while 44 years ago you'd usually install a C compiler on servers, these days you don't.


  • Java Dev

    We're working on automating our dev setups. One of the manual steps is updating a json file with information from the local node.

    🧝♂ Should be easy enough with some sed
    pleegwat I'm not doing it in a language which doesn't understand json
    🧝♂ I guess you could but you're just making it difficult
    ⏰ time passes
    🧝♂ You know, we could add an interface for editing keys in this part of the json
    pleegwat 💭 And this is why I will do this in a language which understands json.

    Now, my immediate choices are Python and JavaScript. My Python is near-nonexistent. But I suspect invoking node from bash is trickier than I'm willing to bother with.



  • @PleegWat said in Replacing shells:

    My Python is near-nonexistent.

    Same. But if you just want to read the thing, change a few items, and dump it out again, Python seems like a tolerable solution. (If you want a whole graphical interface or something around it, YMMV. Haven't had to try that, thankfully.)



  • @PleegWat said in Replacing shells:

    We're working on automating our dev setups. One of the manual steps is updating a json file with information from the local node.

    🧝♂ Should be easy enough with some sed
    pleegwat I'm not doing it in a language which doesn't understand json

    That's when you install jq, the “sed for json”. It's packaged in all major Linux distributions, it is in Chocolatey for Windows, and it's fairly easy to use.

    But I suspect invoking node from bash is trickier than I'm willing to bother with.

    Why would it be? node script.js and that's that. It's a hassle to install, though; deno is much simpler in that regard. Also in the regard of pulling dependencies—you just import them from URLs directly, and then you can deno bundle it to create a stand-alone script so your build is not dependent on the infernet.


  • Considered Harmful

    ~~@PleegWat~~ @Bulb Excuse me, sir, do you have a moment of time to talk about our Lord and Savior ~~Rust~~ deno? :tro-pop:



  • @Applied-Mediocrity It's still ducktapescript. But if you intend to sink so low that you consider node, it is a huge improvement.



  • @Applied-Mediocrity … by the way, you should not have struck out Rust. Deno is written in Rust 😉


  • BINNED

    @PleegWat said in Replacing shells:

    And this is why I will do this in a language which understands json.
    My Python is near-nonexistent.

    That shouldn't matter, you need like 5 lines to edit a bit of json. It's something a stackoverflow monkey could do, so you'll probably be able to pick it up in half an hour.
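    Roughly those five lines; a sketch with a made-up file name and key (the value would be whatever your script gets from the local node):

```python
import json

# Hypothetical settings file; created here so the sketch is
# self-contained, but in practice it already exists.
path = "config.json"
with open(path, "w") as f:
    f.write('{"settings": {"foo": "old"}}')

# Read, poke a key, write back.
with open(path) as f:
    data = json.load(f)
data["settings"]["foo"] = "from-the-local-node"
with open(path, "w") as f:
    json.dump(data, f, indent=2)
```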


  • Discourse touched me in a no-no place

    @topspin said in Replacing shells:

    That shouldn't matter, you need like 5 lines to edit a bit of json. It's something a ~~stackoverflow~~ ChatGPT monkey could do, so you'll probably be able to pick it up in half an hour.

    FTFTheKidsTheseDays


  • Java Dev

    @Bulb said in Replacing shells:

    That's when you install jq, the “sed for json”. It's packaged in all major Linux distributions, it is in Chocolatey for Windows, and it's fairly easy to use.

    This seems like a more reasonable approach.



  • @PleegWat I use jq a lot. Sometimes to modify a value in JSON—I even assemble a json file with it, entry by entry, as I get various connection strings from different commands, which I then pass to helm as values in the project—but most often simply because az (the Azure CLI), docker and kubectl can all output json but have different syntaxes for queries to extract specific values—so it's easier to just pipe the json through jq and always use the same syntax.


  • Java Dev

    @Bulb I successfully implemented it for my usecase - updating an existing json settings file with some values from shell variables.

    jq --arg foo "$foo" '. * { settings: { foo: $foo } }' < src/config.json > work/config.json
    

    I think it's closer to awk than sed.



  • @PleegWat said in Replacing shells:

    I think it's closer to awk than sed.

    Might be. It's true that it can do some simple calculations, which awk can, but sed can't.

    < src/config.json > work/config.json

    I wish it had the -i option like sed, because I sometimes end up calling it multiple times with different values obtained from different places, and need to put them all in one file.

    @PleegWat said in Replacing shells:

    jq --arg foo "$foo" '. * { settings: { foo: $foo } }'

    Could be simplified a bit to jq --arg foo "$foo" '.settings.foo = $foo'


  • Java Dev

    @Bulb said in Replacing shells:

    Could be simplified a bit to jq --arg foo "$foo" '.settings.foo = $foo'

    Since I'm actually setting multiple values, I instead simplified it to

    jq --arg foo "$foo" --arg bar "$bar" --arg baz "$baz" \
        '. * { settings: { $foo, $bar, $baz } }'
    

    Also in this case the copy is intentional. The template file is in the source tree, the version with values set for the local environment should not be committed to git.

