curl https://some-url/ | sh
I see this all over the place nowadays, even in communities that, I would think, should be security conscious. How is that safe? What’s stopping the downloaded script from wiping my home directory? If you use this, how can you feel comfortable?
I understand that we have the same problems with the installed application, even if it was downloaded and installed manually. But I feel the bar for making a mistake in a shell script is much lower than in whatever language the main application is written. Don’t we have something better than “sh” for this? Something with less power to do harm?
Yeah, I hate this stuff too. I usually pipe it into a file, figure out what it’s doing, and manually install the program from there.
FWIW I’ve never found anything malicious in these scripts, but my internal dialogue starts screaming when I see them in the wild. I don’t want to run some script without knowing what it’s touching; malicious or not, it’s a PITA.
As a Linux user, I like to know what’s happening under the hood as best I can, and these scripts go against that.
| sh
stands for shake head at bad practices.
You could just read the script file first… Or YOLO trust it like you trust any file downloaded from a relatively safe source… At least you can read a script.
For security reasons, I review every line of code before it’s executed on my machine.
Before I die, I hope to take my ‘93 Dell OptiPlex out of its box and finally see what this whole internet thing is about.
Not good enough. You should really be inspecting your CPU with a microscope.
What’s stopping the downloaded script from wiping my home directory?
It isn’t any more dangerous than running a binary downloaded from them by any other means. It isn’t any more dangerous than the downloaded installer programs common on Windows.
TBH macOS has had the more secure idea here: by default it sandboxes applications that are downloaded directly without any sort of installer. Linux is starting to head in that direction now with things like Flatpak.
What’s stopping the downloaded script from wiping my home directory?
What’s stopping any Makefile, build script, or executable from running
rm -rf ~
? The correct answer is “nothing”. PPAs are similarly open; things are a little safer if you only use your distro’s default package sources, but it’s always possible that a program will want to be able to delete something in your home directory, so it always has permission.
Containerized apps are the only way around this, where they get their own home directory.
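If you want that kind of isolation ad hoc for a single untrusted installer, one option is a sandboxing tool like firejail. This is only a rough sketch: it assumes firejail is installed, and the URL is a placeholder.
# --private gives the command an empty, throwaway home directory, so an
# accidental rm -rf ~ only hits the sandbox copy, not your real files.
firejail --private sh -c 'curl -fsSL https://example.com/install.sh | sh'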
Don’t forget your package manager, running someone’s installer as root
It’s roughly the same state as when Windows Vista rolled out UAC in 2007 and everything still required admin rights because that’s just how everything worked… but unlike Microsoft, Linux distros never did the thing of splitting installs into admin vs. unprivileged-user installers.
Plenty of package managers have.
Flatpak doesn’t require any admin rights to install a new app.
NixOS doesn’t run any code at all on your machine just for adding a package, assuming it’s already been cached; if it hasn’t been cached, the build runs in a sandbox. The cases other package managers handle with post-install configuration scripts are a different mechanism, which may have root access depending on what it is.
Gonna ignore Nix since they have two users, but Flatpak is fair. However, Flatpak is a sandboxing scheme, which is distinct from per-user installs. In many cases it can be the better route, but not always. I think the reason it’s popular on Linux is also the DLL hell problem.
idk if two users is fair; it may just be my circles, but I see NixOS mentioned more than almost anything else on Lemmy/HN/etc. in the past couple of years.
If you’re worried, download it into a file first and read it.
This is just the usual poor Linux security. Even giants like Docker do this.
Docker doesn’t do this anymore. Their install script got moved to “only do this for testing”.
Use a convenience script. Only recommended for testing and development environments.
Now, their install page recommends packages/repos first, and then a manual install of the binaries second.
And don’t forget to
sudo
!
The security concerns are often overblown. The bigger problem for me is I don’t know what kind of mess it’s going to make or whether I can undo it. If it’s a .deb or even a tarball to extract in /usr/local then I know how to uninstall.
I will still use them sometimes but for things I know and understand - e.g. rustup will put things in ~/.rustup and update the PATH in my shell profile and because I know that’s what it does I’m happy to use the automation on a new system.
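If you want to look before you leap with this sort of installer, you can usually save the script and ask it what it would do. The URL below is the one from rustup’s own instructions; the filename is my choice, and I believe (but haven’t re-checked recently) that the script answers --help.
# Save the installer instead of piping it, then list its options before
# committing to anything.
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs -o rustup-init.sh
sh rustup-init.sh --help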
Damn, that’s bad misinformation. It’s a security nightmare.
So tell me: if I download and run a bash script over https, or a .deb file over https and then install it, why is the former a “security nightmare” and the latter not?
Both are a security nightmare, if you’re not verifying the signature.
You should verify the signature of all things you download before running it. Be it a bash script or a .deb file or a .AppImage or to-be-compiled sourcecode.
The best thing is to just use your repo’s package manager. Apt will not run anything that isn’t properly signed by a package team member’s release PGP key.
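For anything that does live outside the package manager, the verification looks roughly like this. The filenames are placeholders, and the key still has to reach you through a channel you already trust.
# Import the project's release key (obtained out of band), then check the
# detached signature over the file you actually downloaded.
gpg --import project-release-key.asc
gpg --verify install.sh.asc install.sh
# A published checksum is weaker than a signature, but better than nothing.
sha256sum --check install.sh.sha256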
I have to assume that we’re in this situation because the app does not exist in our distro’s repo (or Homebrew or whatever else). So how do you go about this verification? You need a trusted public key, right? You wouldn’t happen to be downloading that from the same website that you’re worried might be sending you compromised scripts or binaries? You wouldn’t happen to be downloading the key from a public keyserver and assuming it belongs to the person whose name is on it?
This is such a ridiculously high bar to avert a “security nightmare”. Regular users will be better off ignoring such esoteric suggestions and just looking for lots of stars on GitHub.
No, you download the key from many distinct domains and verify it matches before TOFU
Ah yes, so straightforward.
Fortunately, package managers already do this for you. Open a bug report asking for the package to be added to apt. Easy.
For example: a compromised host could detect whether you are downloading the script or piping it straight into a shell, and serve different content in each case.
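That’s possible because a piped shell executes the stream as it arrives, before the whole script has been transferred. You can see the streaming behaviour locally with a harmless stand-in for the server; this is purely an illustration, no real download involved.
# The brace group plays the role of a slow server: sh runs the first line
# immediately, long before the rest of the "script" has arrived.
{ echo 'echo first chunk already executed'; sleep 5; echo 'echo rest of the script arrived'; } | sh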
I’m confident that if the host is compromised I’m screwed regardless.
No it isn’t. What could a Bash script do that the executable it downloads couldn’t do?
It’s not just protection against security, but also human error.
https://github.com/MrMEEE/bumblebee-Old-and-abbandoned/issues/123
https://hackaday.com/2024/01/20/how-a-steam-bug-once-deleted-all-of-someones-user-data/
Just because I trust someone to write a program in a modern language they are familiar with, doesn’t mean I trust them to write an install script in Bash, especially given how many footguns Bash has.
Hilarious, but not a security issue. Just shitty Bash coding.
And I agree it’s easier to make these mistakes in Bash, but I don’t think anyone here is really making the argument that curl | bash is bad because Bash is a shitty error-prone language (it is).
Definitely the most valid point I’ve read in this thread though. I wish we had a viable alternative. Maybe the Linux community could work on that instead of moaning about it.
Hilarious, but not a security issue. Just shitty Bash coding.
It absolutely is a security issue. I had a little brain fart, but what I meant to say was “Security isn’t just protection from malice, but also protection from mistakes”.
Let’s put it differently:
Hilarious, but not a security issue. Just shitty C coding.
This is a common sentiment people say about C, and I have the same opinion about it. I would rather we have systems in place that don’t give people the opportunity to make mistakes.
I wish we had a viable alternative. Maybe the Linux community could work on that instead of moaning about it.
Viable alternative for what? Packaging?
I personally quite like the systems we have. The “install anything from the internet” approach is exactly how Windows ends up with so much malware. The best way to package software for users is via a package manager, which not only puts more eyes on the software, but many package managers also have built-in functionality that makes the process more reliable and secure. For example, signatures create a chain of trust. I really like Nix as a distro-agnostic package manager, because due to the unique way they do things, it’s impossible for one package’s build process to interfere with another.
If you want to do “install anything from the internet” it’s best to do it with containers and sandboxing. Docker/podman for services, and Flatpak for desktop apps, where it’s pretty easy to publish to flathub. Both also seem to be pretty easy, and pretty popular — I commonly find niche things I look at ship a docker image.
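To make that concrete, these are the kinds of commands I mean instead of curl | sh. The app ID, package name, and image name are placeholders, and the Nix line assumes a flake-enabled Nix.
flatpak install flathub org.example.SomeApp        # sandboxed desktop app from Flathub
nix profile install nixpkgs#some-package           # distro-agnostic, no root required
docker run --rm -p 8080:8080 example/some-service  # service isolated in a container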
This is a common sentiment people say about C, and I have the same opinion about it. I would rather we have systems in place that don’t give people the opportunity to make mistakes.
The issue with C is it lets you make mistakes that commonly lead to security vulnerabilities - allowing a malicious third party to do bad stuff.
The Bash examples you linked are not security vulnerabilities. They don’t let malicious third parties do anything. They don’t have CVEs; they’re just straight-up data loss bugs. Bad ones, sure. (And I fully support not using Bash where feasible.)
Viable alternative for what? Packaging?
A viable way to install something that works on all Linux distros (and Mac!), and doesn’t require root.
The reason people use curl | bash is precisely so they don’t have to faff around making a gazillion packages. That’s not a good answer.
You’re telling me that you don’t verify the signatures of the binaries you download before running them, too?!? God help you.
I download my binaries with apt, which will refuse to install the binary if the signature doesn’t match.
No because there’s very little point. Checking signatures only makes sense if the signatures are distributed in a more secure channel than the actual software. Basically the only time that happens is when software is distributed via untrusted mirror services.
Most software I install via curl | bash is first-party hosted and signatures don’t add any security.
No publishing infrastructure should be trusted. There are countless historical examples of this.
Use crypto. It works.
Crypto is used. It is called TLS.
You have to have some trust of publishing infrastructure, otherwise how do you know your signatures are correct?
TLS is a joke because of X.509.
We don’t need to trust any publishing infrastructure, because the PGP private keys don’t live on the publishing infrastructure. We solved this issue in the ’90s.
By definition nothing
The point you appear to be making is “everything is insecure so nothing is” and the point others are making is “everything is insecure so everything is”
No, the point I am making is there are no additional security implications from executing a Bash script that someone sends you over executing a binary that they send you. I don’t know how to make that clearer.
Back up your data, folks. You’re probably more likely to accidentally
rm -rf
yourself than download a script that will do it.
To be fair, that’s because Linux funnels you to the safeguard-free terminal, where it’s much harder to visualize what’s going on and there are fewer checks to make sure you’re doing what you mean to be doing. I know it’s been a trend for a long time for software devs to think they are immune from mistakes, but… they aren’t. And nor is anyone else.
You have the option of piping it into a file instead, inspecting that file for yourself and then running it, or running it in some sandboxed environment. Ultimately though, if you are downloading software over the internet you have to place a certain amount of trust in the person your downloading the software from. Even if you’re absolutely sure that the download script doesn’t wipe your home directory, you’re going to have to run the program at some point and it could just as easily wipe your home directory at that point instead.
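For anyone who wants the inspect-first route spelled out, here is a rough sketch. The URL and image tag are placeholders, and the container step assumes Docker (or Podman) is available.
# Fetch to a file instead of a shell, and actually read it.
curl -fsSL https://example.com/install.sh -o install.sh
less install.sh
# If it still feels dodgy, try it in a throwaway container first, where it
# cannot touch your real home directory.
docker run --rm -it -v "$PWD/install.sh:/install.sh:ro" ubuntu:24.04 sh /install.sh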
All the software I have is downloaded from the internet…
You should try downloading the software from your mind brain, like us elite hackers do it. Just dump the binary from memory into a txt file and exe that shit, playa!
You should start getting it from CD-roms, that shit you can trust
I got my software from these free USB sticks I found in the parking lot.
Ah, you’re one of my users
Steady on Buck Rogers, what is this, 2025!?
It is kind of cool, when you’ve actually written your own software and use that. But realistically, I’m still getting the compiler from the internet…
Indeed, looking at the content of the script before running it is what I do if there is no alternative. But some of these scripts are awfully complex, and manually parsing the odd Bash stuff is a pain, when all I want to know is: 1) what URL are you downloading stuff from? 2) where are you going to install the stuff?
As for running the program, I would trust it more than a random deployment script. People usually place more emphasis on testing the former, not so much the latter.
You have the option of piping it into a file instead, inspecting that file for yourself and then running it, or running it in some sandboxed environment.
That’s not what projects recommend though. Many recommend piping the output of an HTTP transfer over the public Internet directly into a shell interpreter. Even just
curl https://... > install.sh; sh install.sh
would be one step up. The absolute minimum recommendation IMHO should be
curl https://... > install.sh; less install.sh; sh install.sh
but this is still problematic.
Ultimately, installing software is a laborious process which requires care, attention, and the informed use of GPG. It shouldn’t be simplified for convenience.
Also, FYI, the word “option” implies that I’m somehow restricted to a limited set of options in how I can use my GNU/Linux computer, which is not the case.
Showing people that are running curl piped to bash the script they are about to run doesn’t really accomplish anything. If they can read bash and want to review the script then they can by just opening the URL, and the people that aren’t doing that don’t care what’s in the script, so why waste their time with it?
Do you think most users installing software from the AUR are actually reading the PKGBUILDs? I’d guess it’s a pretty small percentage that do.
Showing people that are running curl piped to bash the script they are about to run doesn’t really accomplish anything. If they can read bash and want to review the script then they can by just opening the URL
What it accomplishes is providing the instructions (i.e. an easily copy-and-pastable terminal command) for people to do exactly that.
If you can’t review a bash script before running it without having an unnecessarily complex one-liner provided to you to do so, then it doesn’t matter because you aren’t going to be able to adequately review a bash script anyway.
If you can’t review a bash script before running it without having an unnecessarily complex one-liner provided to you
Providing an easily copy-and-pastable one-liner does not imply that the reader could not themselves write such a one-liner.
Having the capacity to write one’s own commands doesn’t imply that there is no value in having a command provided.
unnecessarily complex
LOL
I don’t think you realize that if your goal is to have a simple install method anyone can use, even redirecting the output to install.sh like in your examples is enough added complexity to make it not work in some cases. Again, those are not made for people that know bash.
even redirecting the output to install.sh like in your examples is enough added complexity to make it not work in some cases
You can’t have any install method that works in all cases.
if your goal is to have a simple install method anyone can use
Similarly, you can’t have an install method anyone can use.
I mean, if you think that it’s bad for Linux culture because you’re teaching newbies the wrong lessons, fair enough.
My point is that most people can parse that they’re essentially asking you to run some commands at a URL, and if you have even a fairly basic grasp of Linux it’s easy to do that in whatever way you want. I don’t know if I personally would be any happier if people took the time to lecture me on safety habits, because I can interpret the command for myself.
curl https://some-url/ | sh
is terse and to the point, and I know not to take it completely literally.
Linux culture
snigger
you’re teaching newbies the wrong lessons
The problem is not that it’s teaching bad lessons, it’s that it’s actually doing bad things.
most people can parse that they’re essentially asking you to run some commands at a url
I know not to take it completely literally
Then it needn’t be written literally.
I think you’re giving the authors of such installation instructions too much credit. I think they intend people to take it literally. I think this because I’ve argued with many of them.
Who the fuck types out “snigger” haha
Teleports behind you
I don’t just cringe, I open a bug report. You can be the change to fix this.
One of the few worthwhile comments on Lemmy…
Can we also open bug reports for open-source projects that base their community on Discord?
I usually read it first.
Download it and then read it. Curl has a different user agent than web browsers.
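If you are worried about exactly that kind of user-agent trickery, fetch the script both ways and compare. The URL is a placeholder.
# What curl | sh would receive (curl's default user agent) versus what a
# browser-looking client would receive; any difference is a red flag.
curl -fsSL https://example.com/install.sh -o install-as-curl.sh
curl -fsSL -A 'Mozilla/5.0' https://example.com/install.sh -o install-as-browser.sh
diff install-as-curl.sh install-as-browser.sh && echo 'identical either way'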
Yeah, I guess if they were being especially nefarious they could supply two different scripts based on user agent. But I meant what you said anyways… :) I download and then read through the script. I know this is a common thing and people are wary of doing it, but has anyone ever heard of there being something disreputable in one of these scripts? I personally haven’t yet.
I’ve seen it many times. It usually takes the form of fake websites impersonating the real thing. It is easy to manipulate Google results. Also, there have been a few cases where a bad design and a typo resulted in data loss.