There are common programs you need to install via the terminal, and you can’t even change sound playback quality without editing a conf file, which requires sudo!
There is so much you need the shell for, and until people stop defending it and start focusing on UX, Linux will never be popular with the average user.
There are common programs you need to install via the terminal
Out of interest, which programs do you need to install via terminal that concern the average user?
you can’t even change sound playback quality without editing a conf file, which requires sudo!
What do you consider changing “playback quality”?
Sampling rate? That can be changed in a config file without sudo (~/.config/pipewire/pipewire.conf.d). You shouldn’t, though, because many applications expect 48000 Hz as the sampling rate. Unless you’re doing studio recordings, you want 48000.
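For anyone curious, a minimal sketch of such a drop-in config, assuming PipeWire’s standard context.properties keys (the file name here is arbitrary):

```
# Per-user PipeWire override, no sudo required.
mkdir -p ~/.config/pipewire/pipewire.conf.d
cat > ~/.config/pipewire/pipewire.conf.d/10-sample-rate.conf <<'EOF'
context.properties = {
    default.clock.rate          = 48000
    default.clock.allowed-rates = [ 44100 48000 ]
}
EOF
# On distros where PipeWire runs as a user systemd service:
systemctl --user restart pipewire
```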
There is so much you need the shell
Correct, there is a lot of need for the shell, for power users. I don’t really see anything that the average office-and-browser enjoyer needs to do in the terminal. On most distros you can even game now without opening the terminal once.
Out of interest, which programs do you need to install via terminal that concern the average user?
For example installing the GPU driver for an older GPU. Or installing the driver for an obscure printer, touchpad or other weird hardware.
Average user doesn’t mean total noob. Installing Windows and the relevant drivers is something many users in the “Gamer class” can do. These guys usually don’t touch the command line (except maybe to ping something), but they are comfortable installing and configuring stuff in a GUI.
They understand how to google the driver for their weird hardware, download the .exe or .msi, run it and navigate the install wizard.
On Linux, I’ve had it happen a few times that you have to unload and load kernel modules and such to get a driver working (see the sketch at the end of this comment). Once, the Linux driver for a device was only supplied as source code to be compiled with an ancient version of GCC that wasn’t available through the package manager. So I spent an hour or two fixing compiler errors to get that old source code building with a current GCC.
Getting the same hardware to run under Windows meant downloading the .exe and running it.
And yeah, that’s not something you’ll do on a daily basis, but it is a huge roadblock for someone afraid of white text in a black window.
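To make “unload/load kernel modules” concrete, a generic sketch (some_driver is a placeholder, not a real module name):

```
# Hypothetical module juggling; substitute the actual module name.
lsmod | grep some_driver        # is the module currently loaded?
sudo modprobe -r some_driver    # unload it (fails if it is still in use)
sudo modprobe some_driver       # load it again, e.g. after installing a new build
sudo dmesg | tail               # check the kernel log to see whether the driver came up
```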
I feel like we can cherry-pick situations on other operating systems where you might have to open a terminal window to solve an issue, but I agree that there are roadblocks that many won’t even try to get past. There has been a lot of progress around usability and informational resources for less advanced users over just the last few years. I feel the main barrier to entry is the willingness to learn something new.
Considering the downvotes, I guess many people here can’t take it when someone talks about the issues holding Linux back.
I think that if you like something, it is really important to talk about its issues so that they can be improved. Blaming the users is not going to make more people switch.
These instances I posted weren’t cherry-picked. They were just what I encountered when setting up a single laptop.
I could tell you about the issues I had with my work laptop, where it was pretty difficult to get the VPN solution we use at work running. We use Teams and Outlook for work; neither has an official app on Linux, and the unofficial ones are really buggy.
Getting simple stuff like screen sharing to work under Wayland was basically impossible, which forced me to revert to X11.
And sure, you can say that it’s all edge cases, and that people shouldn’t be running Linux with a GPU, on an old device, with Microsoft tools, or for screen sharing.
But if you say all of these common use cases are rare edge cases that shouldn’t really be done on Linux, then you aren’t talking about a general-purpose desktop OS any more.
Older hardware and software made by companies with hostile or ignorant stances towards FOSS are major contributors to at least some of the issues you mentioned. I can tell you that there is active development around solving some GPU/Wayland issues, but the limitations on what Linux can or can’t do aren’t fully the fault of Linux.
There is definitely room for improvement in Linux. The improvements in just the last three years show that it has been improving at a pretty brisk pace. Free and open community-driven operating systems work toward the active needs of the community, so hopefully you reported any issues or bugs you hit and followed up on them. I am making an assumption here, but if all of the issues you had were extremely common, there would be every incentive to develop solutions to them.
Some level of compromise is needed when using proprietary software, hardware from hostile vendors, or older hardware with Linux. The same goes for company-supplied or company-required hardware and software. Linux might not be for everyone on every piece of hardware right now. The tradeoffs for having control over your hardware and software can sometimes be frustrating.
As for “blaming the users”, I don’t think I did that at all. I just feel like some folks prefer appliances over heavy machinery. That’s personal preference. Sure, Linux should make onboarding as easy as possible, but in my opinion, the active pursuit of being #1 or #2 in desktop OS use is going down the wrong path. There is a certain type of person who chooses to go down the desktop Linux path and catering to their needs seems much more important for the long-term health of the OS.
I honestly don’t see how Ubuntu installing the wrong driver is Nvidia’s fault. Ubuntu has both the current and the legacy driver in their repos. When installing the OS it asks you whether you want to install proprietary drivers, which I agreed to, so it installed the non-legacy driver that does not work with the old GPU.
Also, when installing the correct driver via apt, there is no text prompt or CLI wizard (like many other tools installed via apt have) to actually load that driver into the kernel. I don’t see how this is Nvidia’s fault, since they didn’t create the Ubuntu package.
That’s squarely on the Ubuntu guys. That has literally nothing to do with hostile vendors.
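For reference, the tool that is supposed to pick the right branch ships with Ubuntu (in the ubuntu-drivers-common package); checking what it recommends is a reasonable sanity check:

```
ubuntu-drivers devices        # list detected hardware and the recommended driver package
sudo ubuntu-drivers install   # install that recommendation (older releases: autoinstall)
```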
I can’t speak to Ubuntu or your situation, as I don’t have your issues and I don’t use Ubuntu, but I would advise you to reach out to the Ubuntu community, and if you can’t find a solution, file a bug report. They are a large community with a lot of engagement, so I would think you might have luck either solving your problem or pointing their devs toward fixing the issue on their end. The squeaky wheel gets the grease.
For example installing the GPU driver for an older GPU. Or installing the driver for an obscure printer, touchpad or other weird hardware.
That’s not quite my definition of “common”.
Average user doesn’t mean total noob. Installing Windows and the relevant drivers is something many users in the “Gamer class” can do.
The “Gamer class” is far from the average user. The average user doesn’t even know what a GPU or a driver is, and doesn’t care. As long as the OS installs all drivers by default or the OEM has preinstalled them, all is good.
Getting the same hardware to run under Windows meant downloading the .exe and running it.
Until there are no more drivers for that generation of GPU. The Windows 11 drivers for AMD only go down to the Vega 64; if you have a Fury X or a 7970, you’re out of luck. Not that Windows 11 even lets you install it on a machine that old.
AMDGPU supports everything down to GCN 1.2 by default, and with the legacy flags it goes all the way down to the first GCN cards, which means you can run even a 7970 on a modern Linux OS. Even out of the box, if your distro ships with the legacy flags enabled.
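A sketch of what “legacy flags” means in practice, assuming a GRUB-based, Debian-family distro (the amdgpu.si_support/cik_support module parameters exist upstream, but check your distro’s docs):

```
# Hand the old GCN cards to amdgpu instead of the legacy radeon driver.
# Add these to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub:
#   radeon.si_support=0 radeon.cik_support=0 amdgpu.si_support=1 amdgpu.cik_support=1
sudoedit /etc/default/grub
sudo update-grub   # regenerate the GRUB config (grub-mkconfig elsewhere), then reboot
```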
It would be fantastic if more hardware worked out of the box on Linux, but that’s up to the manufacturers. Until more people switch to Linux they don’t bother, and until they bother, everybody complains that this or that doesn’t work on Linux.
As of right now, the biggest hurdle is Nvidia, whose drivers aren’t included in Linux. Without a distro that takes care of installing them, users are essentially out of luck.
Using a GPU under Linux is not common? And installing Linux on old laptops isn’t either?
As of right now, the biggest hurdle is Nvidia, whose drivers aren’t included in Linux. Without a distro that takes care of installing them, users are essentially out of luck.
I can’t say anything about AMD, since the last time I had an AMD GPU was ~15 years ago.
When I installed an Ubuntu variant on my G580, which has a GeForce 635M, it automatically installed the current driver for GeForce GPUs during OS setup, but that driver doesn’t support the 635M. That one needs a legacy driver, and getting it to work was a major pain.
I first installed the legacy driver via apt, but it didn’t do anything, because apparently installing the driver doesn’t actually load its kernel module. So I loaded the module manually, and it still didn’t do anything. It turned out that uninstalling the original driver hadn’t unloaded its module either. So I had to reinstall the old driver, unload its module, uninstall the old driver, install the legacy driver, and load the legacy module. It took me a few hours to figure all of that out.
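Roughly, the dance looked like this (an illustrative reconstruction, not the exact commands; package and branch names vary by release, e.g. the 390 series for Fermi chips or 470 for Kepler):

```
# Run from a text console, not from inside the running desktop session.
sudo apt install --reinstall nvidia-driver-535      # put the original driver back so it can be removed cleanly
sudo modprobe -r nvidia_drm nvidia_modeset nvidia   # unload the loaded modules
sudo apt purge nvidia-driver-535                    # uninstall the current branch
sudo apt install nvidia-driver-390                  # install the legacy branch
sudo modprobe nvidia                                # load the legacy module
```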
No way someone without CLI experience will be able to do that.
Using a GPU under Linux is not common? And installing Linux on old laptops isn’t either?
Installing drivers for an older GPU, obscure printer, touchpad or other weird hardware is not common.
When I installed an Ubuntu variant on my G580, which has a GeForce 635M, it automatically installed the current driver for GeForce GPUs during OS setup, but that driver doesn’t support the 635M. That one needs a legacy driver, and getting it to work was a major pain.
Which is an issue with Nvidia; they have no drivers for that GPU for Windows 11 either. I’m not saying this is not an issue, but there is absolutely nothing Linux can do to make every legacy GPU work without help from Nvidia. It uses the open-source driver out of the box, which works sometimes, but not for everything and definitely not for gaming.
I’m not saying this is not an issue, but there is absolutely nothing Linux can do to make every legacy GPU work without help from Nvidia.
Yes, they can. They literally have the correct (legacy) driver in the Ubuntu repo. But the autoinstaller installs the wrong driver while installing the OS. And if you try to install the right one manually, there isn’t even a text prompt in the CLI saying “You just installed that driver, do you want to actually use it? (Y/n)”.
They could have even gone so far as to make a CLI wizard (like many other packages do) or even a GUI wizard. But no, the package just installs and does nothing by default.
It uses the open-source driver out of the box, which works sometimes, but not for everything and definitely not for gaming.
That is not correct either. All the *buntu installers ask you during OS installation whether you also want closed-source drivers installed, and then they install the closed-source Nvidia drivers. Just the wrong ones.
Your use of ‘anything’ and ‘everything’ is quite exaggerated.
The average user can do most of their general day-to-day tasks on Linux without touching the terminal.
Even on Windows, you need to use the command line/shell to complete certain tasks, so you can’t escape it fully.
they have no drivers for that GPU for Windows 11 either
https://www.nvidia.com/en-us/drivers/results/180339/
Yes, they do.
OS clouds and proprietary internets might shove another percent or two down the rabbit hole.