I set up a quick demonstration to show the risks of curl|bash and how a bad actor could hide a malicious script that appears safe.
It’s nothing new or groundbreaking, but I figure it never hurts to have another reminder.
Never have I ever piped curl to bash.
I never thought about opening it in a browser. I always used curl to download such a script and view it where it was supposed to be run.
Oh, people will keep using it no matter how much you warn them.
Proxmox-helper-scripts is a perfect example. They’ll agree with you until that site comes up, and then it’s “it’ll never, ever get hacked and subverted, nope, can’t happen, impossible”.
Wankers.
I was looking at that very thing last night.
But then I realized, “why can’t immich just create usable packages like we had before?” and moped back out.
But, for a moment, I was sure a little inspection and testing would make the Internet equivalent of NYC MTA coin-sucking magically safe. It looked so eeeeasy.
Yes, this has risks. At the same time, anytime you run any piece of software you are facing the same risks, especially if that software is updated from the internet. Take a look at the NIST docs on software supply chain risks.
But those are two very different things. I can very easily give you a one-liner using curl|bash that will compromise your system. To get the same level of compromise through a properly authenticated channel such as apt/pacman/etc., you would need to either compromise their private keys and attack before they notice and rotate them, or sneak malicious code into an official package. Either of those is orders of magnitude more difficult than writing a simple bash script.
I would feel more comfortable running curl|bash from a trusted provider than doing apt-get from an unknown software repo. What you are trying to do is establish trust in your supply chain; the delivery vehicle is less important.
This is a bit like saying crossing the street blindfolded while juggling chainsaws and crossing the street on a pedestrian crossing while the light is red for cars both carry risk. Sure. One’s a terrible idea though.
Not completely correct. A lot of updaters work with signatures to verify that what was downloaded is signed by the correct key.
With curl|bash there is no such check in place.
So strictly speaking, it is not the same.
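A minimal sketch of the check that package managers perform and plain curl|bash skips: compare the downloaded artifact against a known-good digest before executing it. Here a local file stands in for the download, and the "published" digest would in reality come out of band from a separate, trusted channel, not from the same server.

```shell
# Sketch: verify a digest before running anything. The install.sh here is a
# harmless stand-in for a real download; the payload is a placeholder.
printf 'echo "installer ran"\n' > install.sh             # stands in for the download
published=$(sha256sum install.sh | awk '{print $1}')     # pretend this came out of band
actual=$(sha256sum install.sh | awk '{print $1}')
if [ "$published" = "$actual" ]; then
    bash install.sh
else
    echo "checksum mismatch; refusing to run" >&2
fi
```

With curl|bash there is no step like this: whatever bytes arrive are executed immediately.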
Signatures do not help if your distribution infra gets compromised. See Solarwinds and the more recent node.js incidents.
Please tell me you are not seriously equating a highly sophisticated attack like the SolarWinds compromise with piping curl to bash?
This is incorrect. If the update you download is compromised then the signature is invalid and the update fails.
To achieve a compromised update you need to either compromise the update infrastructure AND the key, or compromise the infrastructure AND exploit the local updater into accepting the invalid or forged signature.
If I can control your infra, I can alter what counts as a valid signature. It has happened. It will happen again. Digital signatures are not sufficient by themselves to prevent supply chain risks. Depending on your threat model, you need to assume advanced adversaries will seek to gain a foothold in your environment by attacking your software supplier. In these types of attacks, threat actors can and will take control of the distribution mechanisms, deploying trojaned backdoors as part of legitimately signed updates. It is a complex problem and I highly encourage you to read the NIST guidance to understand just how deep the rabbit hole goes.
Cybersecurity Supply Chain Risk Management Practices for Systems and Organizations
No, you cannot. The public key either needs to be present in the updater or is served from infrastructure that is not the one you compromised. The way most software suppliers do it, the public key ships inside the updater.
Not sure how else to explain this. Look at the CISA bulletin on Shai-Hulud: the attacker published valid, signed binaries that were installed by hundreds of users.
"CISA is releasing this Alert to provide guidance in response to a widespread software supply chain compromise involving the world’s largest JavaScript registry, npmjs.com. A self-replicating worm—publicly known as “Shai-Hulud”—has compromised over 500 packages.[i]
After gaining initial access, the malicious cyber actor deployed malware that scanned the environment for sensitive credentials. The cyber actor then targeted GitHub Personal Access Tokens (PATs) and application programming interface (API) keys for cloud services, including Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure.[ii]
The malware then:
- Exfiltrated the harvested credentials to an endpoint controlled by the actor.
- Uploaded the credentials to a public repository named Shai-Hulud via the GitHub /user/repos API.
- Leveraged an automated process to rapidly spread by authenticating to the npm registry as the compromised developer, injecting code into other packages, and publishing compromised versions to the registry.[iii]"
After gaining initial access, the malicious cyber actor deployed malware that scanned the environment for sensitive credentials.
So as I said, the keys got compromised. That’s what I said in the second post.
Apt is great
This helped a lot. I had no clue I could post the curl string in the URL bar of a browser to view the script. Thanks for the education!
Shit, are URLs esoteric knowledge now?
You had no idea you could paste a URL into a browser’s location bar?
You didn’t know that the tool for handling URLs written in C (very creatively named C-Url) handles URLs? It’s also written in C, if you didn’t know.
Piping curl to bash is no different than manually running an sh script you don’t know…
No, it is different, as it adds an entire layer of indirection and unknown to the mix, increasing the risk in the process.
True, but this is specifically about scripts you think you know, and how curl bash might trick you into running a different script entirely.
a more cautious user might first paste the url into the address bar of their web browser to see what the script looks like before running it.
Wow, I never thought anyone would be that dumb.
Why wouldn’t they just wget it, read it, and then execute it?
Oh, the example in the article is the nice version of this attack.
Checking the script as downloaded by wget or curl and then piping curl to bash is still a terrible idea, as you have no guarantee you’ll get the same script in both cases.
Anytime I see a project that has this in their install instructions, I don’t use that project.
It shows how dumb the devs are
Yes, this is the correct approach from a security perspective.
@K3can@lemmy.radio love the early 2000s stylesheet/color theme of your blog 🙂
Thanks! I like to keep things simple. The colors are based on Counter Strike 1.6. 😁
And if you’re into the classic styling, my homepage is a direct homage to my old 2000s sites.
You mean blindly running code is bad? /s
you’d have to be mad to willingly pipe a script to bash without checking it. holy shit
Is it different from running a bash script you downloaded without checking it? E.g. the installer that you get with GOG games?
Genuine question, I’m no expert.
It is, see https://github.com/m4tx/curl-bash-attack
That’s an interesting proof of concept, but I don’t think it shows it’s different. That’s a server side attack, whoever has control of the server could just have the script download a malicious binary instead and you wouldn’t be able to tell from the script.
It’s really only about trusting the source. Your operating system surely has thousands of scripts that you’ve never read and never checked. And wouldn’t have time to. And people don’t complain about that.
But it’s really bad practice to run random things from random sites. So the practice of downloading a script and running it is frowned upon. Mostly as a way of maintaining good security hygiene.
I have no problems with running scripts from the internet, AFTER you check them. Do NOT blindly run a script you found on the internet. As others have said download them, then check them, then and only then run them if they’re safe. NEVER pipe to bash, ever.
Ok but not everyone has that skill. And anyway, how is this different to running a binary where you can’t check the code?
It’s exactly the same. Don’t run binaries you don’t trust fully. But I get what you mean. miley_cyrus_nude.jpg.exe is probably gonna end badly.
Yeah I get that, but I would install docker, cloudflared, etc by piping a convenience script to bash without hesitation. I’ve already decided to install their binary, I don’t see why the install script is any higher risk.
I know it’s a controversial thing for everyone to make their own call on, I just don’t think the risk for a bash script is any higher than a binary.
I won’t lie, I use curl | bash as well, but I do dislike it for two reasons:
Firstly, it is much, much easier to compromise the website hosting than the binary itself, usually. Distributed binaries are usually signed by multiple keys from multiple servers, resulting in them being highly resistant to tampering. Reproducible builds (two users compiling a program get the same output) make it trivial to detect tampering as well.
On the other hand, website hosting infrastructure is generally nowhere near as secure. It’s typically one or two VPSes, and there is no signature or verification that the content is “official”. So even without tampering with the binary, I can still tamper with the bash script to add extra goodies to it.
Secondly (though not really relevant to what OP is talking about), just because I trust someone to give me a binary in a mature programming language they have experience writing in doesn’t mean I trust them to give me a script in a language known for footguns. A Steam bug in their bash script once deleted a user’s home directory. There have also been issues with AUR packages, which are basically bash scripts, breaking people’s systems. When it comes to user/community-created scripts, I mostly trust them not to be malicious, and am more fearful of a bug or mistake screwing things up. But at the same time, I have little confidence in my ability to spot these bugs.
Generally, I only make an exception for running bash installers if the program being installed is a “platform” that I can use to install more software. K3s (Kubernetes distro), or the Nix package manager are examples. If I can install something via Nix or Docker then it’s going to be installed via there and not installed via curl | bash. Not every developer under the sun should be given the privilege of running a bash script on my system.
As a sidenote, docker doesn’t recommend their install script anymore. All the instructions have been removed from the website, and they recommend adding their own repos instead. Personally, I prefer to get it from the distro’s repositories, as usually that’s the simplest and fastest way to install docker nowadays.
Firstly, it is much, much easier to compromise the website hosting than the binary itself, usually. Distributed binaries are usually signed by multiple keys from multiple servers, resulting in them being highly resistant to tampering. Reproducible builds (two users compiling a program get the same output) make it trivial to detect tampering as well.
Yeah this is a fair call.
But at the same time, I have little confidence in my ability to spot these bugs.
This is the key thing for me. I am not likely to spot any issues even if they were there! I’d only be scanning for external connections or obviously malicious code, which I do when I don’t have as much trust in the source.
As a sidenote, docker doesn’t recommend their install script anymore.
Yeah I used it as an example because there are very few times I ever remember piping to bash, but that’s probably the most common one I have done in the past.
The difference, though, is you can check a script. If it’s an open source project, you can also compile from source. But I get what you mean.
You can, but to me it seems weird to say it’s crazy to pipe to bash when people happily run binaries. If anything, the convenience script is lower risk than the binary since people have probably checked it before you.
I wouldn’t pipe a random script to bash though, nothing where I wouldn’t trust the people behind it.
Most developers I’ve looked at would happily just paste the curl|bash thing into the terminal.
I often would skim the script in the browser, but a. this post shows that’s not foolproof, and b. a sufficiently sophisticated malicious script would fool a casual read.
Most developers I’ve looked at would happily just paste the curl|bash thing into the terminal.
I mean, I typically see it used for installing applications, and so long as TLS is used for the download, I’m still not aware of a good reason why you should check the Bash script in particular in that case, since the application itself could just as well be malware.
Of course, it’s better to check the Bash script than to not check it, but at that point we should also advise to download the source code for the application, review it and then compile it yourself.
At some point, you just have to bite the bullet, and I have not yet seen a good argument why the Bash script deserves special treatment here. Having said that, for cases where you’re not installing an application, yeah, reviewing the script allows you to use it without having to trust the source to the same degree as you do when installing an application.
And you better inspect and execute a downloaded copy, because a malicious actor can serve a different file for curl/wget than to your browser
They can even serve a different file for curl vs curl|bash
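A sketch of the server-side trick being described, written as a plain shell function: pick the response body by User-Agent, the way a CGI script or web framework handler could. The function name and both payload strings are hypothetical placeholders, not real malware.

```shell
# Hypothetical User-Agent cloaking: curl/wget get one script, browsers
# another. A real attacker would do this in the web server or app layer.
serve_script() {
    case "$1" in
        curl/*|Wget/*) printf 'echo "payload for the terminal"\n' ;;  # sent to curl/wget
        *)             printf 'echo "harmless installer"\n' ;;        # shown to browsers
    esac
}

serve_script "curl/8.5.0"     # what a piped install would receive
serve_script "Mozilla/5.0"    # what an inspecting browser would see
```

This is why inspecting the script in a browser proves nothing about what the terminal will get.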
Yeah they do, I remember that the demo was pretty impressive ten or fifteen years ago!
Does curl send a different useragent when it’s piped?
Searching for those words just vomits ‘hOW to SeT cUrL’s UseRaGenT’ blog spam.
It’s timing-based. When piped a script, bash executes each line completely before taking the next line from the input, and curl has a limited output buffer. So:
- Include an operation that takes a long time: a sleep, or, if you want it less obvious, a download, an unzip operation, an apt update, etc.
- Fill the buffer with more bash commands.
- Measure on the server whether at some point curl stops downloading the script.
- If it does, serve a malicious payload.
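The client-side behavior the steps above exploit is easy to see locally. In this toy demo, a brace group stands in for the remote server streaming the script; bash starts executing the first line before the "server" has finished producing the rest.

```shell
# Toy demo of incremental execution: bash runs piped input as it arrives,
# so a server streaming the script can change later lines mid-download.
{
    echo 'echo "first line runs immediately"'
    sleep 1     # the "server" stalls here while bash is already executing
    echo 'echo "second line arrived a second later"'
} | bash
```

Run it and the first message appears a second before the second one, even though both "came from" the same stream.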
Oh that is clever.
Not that I know of, which means I can only assume it’ll be a timing-based attack.
With strategic use of sleep statements in the script you should stand a pretty good chance of detecting the HTTP download blocking while the script execution is paused.
If you were already shipping the kind of script that unpacks a binary payload from the tail end of the file and executes it, it’s well within the realm of possibility to swap it for a different one.
Yep! That’s what the post shows.
I created a live demo file, too, so that you can actually see the difference based on how you request the file.
Hit the nail on the head. Download the file, inspect, then run that local copy.
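Spelling that pattern out: fetch once to disk, read it, then execute the exact local copy you read, and never re-fetch between inspecting and running. A here-doc stands in for the `curl -fsSL -o install.sh <url>` download so this sketch runs offline; the script body is a placeholder.

```shell
# Safer pattern: download once, inspect, run the same bytes you inspected.
cat > install.sh <<'EOF'
echo "installer would run here"
EOF

cat install.sh      # inspect it (use less or your editor for real scripts)
bash install.sh     # run the reviewed local copy, not a fresh fetch
```

The server can still lie about what it serves, but it can only lie once: the bytes you ran are the bytes you read.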
And it’s wild how much even that has been absolutely normalized by all these shitty lazy developers and platforms. Vibe coding it just going to make it worse. All these programs that look nice on the surface and are just slop on the inside. It’s going to be a mess.
The post is specifically about how you can serve a totally different script than the one you inspect. If you use curl to fetch the script via the terminal, the webserver can send a different script to a browser based on the User-Agent. And whether or not you think someone would be mad to do it, it’s still a widespread practice. The article mentions that piping curl straight to bash is already standard procedure for Proxmox helper scripts. But don’t take anyone’s word for it, check it out:
https://community-scripts.github.io/ProxmoxVE/
It’s also the recommended method for PiHole:
The reality is a lot of newcomers to Linux won’t even understand the risks involved; they run it because that’s what they’re told or shown to do. That’s what I did for PiHole many years ago too, I’ll admit.
Users are blameless, I find the fault with the developers.
Asking users to pipe curl to bash because it’s easier for the developer is just the developer being lazy, IMO.
Developers wouldn’t get a free pass for taking lazy, insecure shortcuts in programming, I don’t know why they should get a free pass on this.
I’ve been accused of “gate keeping” when I tell people that this is a shitty way to deploy applications and that nobody should do it.
In addition to the other examples, it’s also the default installation method in the Node.js world - it’s how they tell you to install nvm.
You can’t even blame someone non-technical for falling for this if they haven’t been explicitly informed - it’s being reinforced as completely normal by too many “reputable” projects.
I’m pretty sure brew on mac is the same too
I mean, true, but most of the things I do that with are private scripts that I wrote. I think the main exception to that is Oh-my-zsh.
Also, it’s not really a full pipe…

bash <(curl cht.sh/curl)

That uses process substitution: bash gets the download as a file descriptor path and reads the script from there instead of from stdin. Frankly, the URL I gave you is a very bad example because it is not actually a script, just the help page for curl. It would be better if it wasn’t nested.
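You can see what process substitution hands to bash by echoing $0 from inside the substituted script - on Linux-like systems it’s a /dev/fd path, and stdin is left untouched. This is a quick local sketch, assuming a system with /dev/fd:

```shell
# Process substitution gives bash a /dev/fd/N path to read the script from,
# rather than streaming it over stdin. Run this under bash, not plain sh.
bash <(echo 'echo "running from $0"')
# prints something like: running from /dev/fd/63
```

Note that it’s still a pipe underneath, so it reads incrementally just like `curl | bash` does; the practical difference is that the script keeps a usable stdin (prompts and `read` still work).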
The article isn’t about scripts you wrote yourself. Run your own scripts all you like.
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:
| Fewer Letters | More Letters |
|---|---|
| DNS | Domain Name Service/System |
| HTTP | Hypertext Transfer Protocol, the Web |
| PiHole | Network-wide ad-blocker (DNS sinkhole) |
| SSL | Secure Sockets Layer, for transparent encryption |
| TLS | Transport Layer Security, supersedes SSL |
| VPS | Virtual Private Server (opposed to shared hosting) |
4 acronyms in this thread; the most compressed thread commented on today has 11 acronyms.
[Thread #111 for this comm, first seen 23rd Feb 2026, 04:40] [FAQ] [Full list] [Contact] [Source code]
Good bot