My primary use case for Amber is when I need to write a Bash script but don’t remember the silly syntax. My most recent Bash mistake was misusing test -n and test -z. In Amber, I can just use something == "" or len(something) == 0
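For anyone who mixes them up the same way: a quick sketch of the test -n / test -z semantics, including the classic unquoted-variable trap (variable names here are just for illustration):

```shell
# test -n checks for a NON-empty string; test -z checks for an EMPTY one.
s=""
if [ -z "$s" ]; then echo "empty"; fi      # prints "empty"
s="hello"
if [ -n "$s" ]; then echo "non-empty"; fi  # prints "non-empty"

# The classic trap: with an UNQUOTED empty variable, [ -n $s ] expands to
# [ -n ], and test with a single argument just checks that the argument
# ("-n" itself) is non-empty -- so it succeeds when you expect it to fail.
s=""
if [ -n $s ]; then echo "looks non-empty!"; fi  # prints "looks non-empty!"
```

Always quoting the variable avoids the trap, which is exactly the kind of bookkeeping Amber's == "" spares you.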
This is an interesting idea and from the looks of it well executed, but I am having trouble imagining a scenario were I would prefer to use Amber over a scripting language like Python. If your bash scripts are getting long enough to warrant the use of Amber you are probably already in a situation where you can justify installing Python.
To some extent: sunk cost fallacy, and performance.
I had a launcher script which required its run to complete in under 50 ms to be usable. Python just did not make the cut (it would call external stuff and more). I know I should not expect performance from shell scripts, but it started as essentially a find over about 20-30 files and a cat of at most 100 lines. It was fast enough then. Then I kept adding stuff, and it kept slowing down. I thought of converting it to Python, even did some initial bits, but performance was not good.
Beyond a certain point, I kinda stopped caring about optimising (I replaced bash with dash, cut most if not all calls to external tools to a minimum, and tried to minimise slow calls (think calling some tool again and again vs a vectorised operation, or not reading a variable from memory multiple times)). At some point it reached 300-400 ms, and I decided to split it into two things: an executor part which caches the output of some commands under certain run conditions, and a launcher part which just reads the output file (by now almost 1 MiB).
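The executor/launcher split described above can be sketched roughly like this (the cache path, the 5-minute staleness rule, and the find call are all hypothetical stand-ins for the real commands and run conditions):

```shell
#!/bin/sh
# Hypothetical executor half: refresh the cache only when it is stale.
CACHE="${TMPDIR:-/tmp}/launcher-cache.txt"   # /tmp is often tmpfs, i.e. RAM-backed

# Example run condition: regenerate if the cache is missing or older than
# 5 minutes (-mmin is a GNU/BSD find extension, not strict POSIX).
if [ ! -f "$CACHE" ] || [ -n "$(find "$CACHE" -mmin +5 2>/dev/null)" ]; then
    # The slow pass: call the external tools once and cache their output.
    find "$HOME/bin" -maxdepth 1 -type f > "$CACHE" 2>/dev/null || : > "$CACHE"
fi

# Launcher half: the fast path is a single read of the precomputed file.
cat "$CACHE"
```

The design win is that every expensive external call moves behind the run condition, so the common case is just one open-and-read of a file that the kernel likely already has in memory.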
At some point I decided to learn Rust, and the first thing I wrote was that launcher. I implemented the caching and run conditions better, moved the larger files (by now it reads multiple megabytes, 15+) to the /tmp dir, which is mounted in memory, and tried to minimise variable reads. Now it lists 10 times more files in a third to a fifth of the time.
tl;dr: a stupid person did not shift to a compiled program for a long time and kept optimising a shell script
If your program relies on too many external tools then I think it makes more sense to use bash than to abuse os.system.
It’s called subprocess.run.

I agree, but I can envision scenarios where you are integrating into someone else’s workflow/machine and they (or their build system etc.) are expecting a shell script. Python is ubiquitous, but sometimes you just want to work like everything else.
One reason is that Python is not built-in on macOS anymore, so it’s hard to justify using it for management scripts. Particularly when you do not have control of the execution environment to begin with. I’ve written some obnoxiously complicated bash (or zsh) scripts because I want to make sure it will run on a vanilla Mac with no additional dependencies. 10 years ago I would’ve done all that stuff in Python, but not anymore. Thanks, Apple!
From a technical perspective, sure, I could push out a portable python environment and it wouldn’t affect the rest of the system. But that comes at a cost. I don’t want to fight for it, and I don’t want to be responsible for maintaining it. It’s easier to just use bash/zsh.
Python is also too heavy for some embedded devices. Not sure if I can count on Amber scripts to run in a busybox environment but maybe?
That said, if the question is “is it worth learning a whole new thing when I already know bash/zsh”, I am not so sure. But in principle, I dig it, regardless of how practical it is with my specific background and needs. I mean, if I learned about this 20 years ago I feel like I might still be reaping rewards.