I’ve been trying Workstation recently. Python dependency issues caused me to switch to Silverblue for the last 2 years. A new machine with Nvidia got me to try WS. I just had a mystery problem with Python after booting today and that got me looking into Anaconda. I didn’t know it was used under the kernel like this. I’m not sure how I feel about this level of Python integration. I would feel a lot more comfortable with a less accessible precompiled binary but I know I am likely making naïve assumptions saying this. Any thoughts or insights?
Anaconda is just an OS installer program. At least, the Anaconda that you’re referring to. After installation, it’s gone.
There is also Anaconda, the Python distribution/package-management platform. Maybe you’re confusing the two?
https://stackoverflow.com/questions/33683530/is-anaconda-for-fedora-different-from-anaconda-for-python
I should have searched for this first I guess. That is reassuring. I was mostly uncomfortable with the idea of the two being the same.
Still, the Anaconda from RH claims the software is mostly written in Python. That still makes me uneasy. I’ve always thought of C as sitting very near the hardware, like assembly, and of an interpreted language as prioritizing portability, flexibility, and accessibility. I find it far harder to hack around with a binary written in C than with all of the Python stuff I’ve encountered. Maybe I’m just mistaken in my understanding of how this code is running.
I look at C programs as tied to the hardware they are compiled for, with a kind of permanence. I look at Python as a language designed to constantly deprecate itself and its code base. It feels like an extension of proprietary hardware’s planned obsolescence and manipulation. I don’t consider programs written in Python to have permanence or long-term value because their toolchains become nearly impossible to track down from scratch.
You might be even more concerned to find that your Fedora package manager, DNF, is also written in Python: https://github.com/rpm-software-management/dnf
Fact of the matter is that Python is a language that gets used all the time for system level things, and frequently you just don’t know it because there is no “.py” extension.
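One easy way to see this for yourself: a system script declares its interpreter in its shebang line, not in a file extension. A minimal sketch (the helper name is mine; the `dnf` example in the comment assumes a standard Fedora install where `/usr/bin/dnf` is a Python script):

```python
def interpreter_of(path):
    """Return the interpreter named in a script's shebang line,
    or None if the file is a compiled binary (no '#!' prefix)."""
    with open(path, "rb") as f:
        first = f.readline()
    if first.startswith(b"#!"):
        return first[2:].strip().decode()
    return None

# On a typical Fedora box, interpreter_of("/usr/bin/dnf") reports
# a python3 interpreter, even though the filename has no ".py".
```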
I’m not sure I understand your concerns about python…
Anyway, people like the Fedora folks working on anaconda choose a language that makes sense for their purpose. Python absolutely makes sense for this purpose compared to C. It allows for fast development and flexibility, and there’s not much in an installer program that needs high performance.
That’s not to say C isn’t a very important language too. But it’s important to use the best tool for the job.
I mean, I’m playing with offline AI right now and reproducibility sucks. Most of this is Python-based. I think I need to switch to Nix for this situation, where I need to compile most tools from source.

While Python itself may be easily available for previous versions, sorting out the required dependencies is outside what most users are capable of reproducing. I get the impression C is more centralized, with less potential to cause problems. At least when it comes to hobbyist embedded projects, the stuff centered around C has staying power long term. The toolchains are easy to reproduce. Most Python stuff is more trouble to reproduce than it is worth once it is a few years old. IMO it feels something like the Android SDK, where obsolescence is part of the design.

I shouldn’t need to track down all the previous package versions just to reproduce software. It should simply work by default when all of the dependencies are named. There should be something like a central library of deprecated packages that guarantees reproducibility. Even if these packages are available somewhere in the aether, piecing together relevant documentation for them in historical context is nearly impossible. This has been my experience as an intermediate user and hobbyist.
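For what it’s worth, the “name all the dependencies” part can at least be captured up front with nothing but the standard library. A minimal sketch (the `freeze` name is mine; in practice `pip freeze` or a real lock-file tool does the same job more thoroughly):

```python
# Sketch: snapshot the exact versions of every installed package so the
# environment can be replayed later (e.g. pip install -r pins.txt).
from importlib import metadata

def freeze():
    """Return sorted 'name==version' pin lines for all installed distributions."""
    return sorted(
        f"{dist.metadata['Name']}=={dist.version}"
        for dist in metadata.distributions()
        if dist.metadata["Name"]  # skip entries with broken metadata
    )
```

Writing those lines to a file gives pip the named dependencies to replay, though that only helps if the pinned versions stay downloadable, which is really the complaint above.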