

My website’s the one linked in this post: https://snee.la/
My email is at the contact page: https://snee.la/contact/
sneela [at] tugraz [dot] at
InfoSec Person | Alt-Account#2




I’ll be sure to reach out if I find myself being unable to replicate it.
No worries, and good luck! My email can be found on my website if you want it :D
I wasn’t even talking about tikzplotlib. It’s just that the pgf backend is now supported by matplotlib, and you can produce PGF files with it.
Ah… I think I’ve heard of it, but I never really registered that. Thanks for the info :D


I could give you the TikZ source of Fig 2 if you’d like. The patterns and colors of the plots took me almost a day to choose. I wanted to go for a color-blind friendly palette while still keeping it snazzy. (https://github.com/simon-pfahler/colorblind)
I’m familiar with matplotlib -> PGFplots (using the Python tikzplotlib library). Unfortunately, I decided against using it for the paper as it produces quite unmanageable output. Especially when I rerun experiments with new data and later want to change patterns, colors… it was always more of a hassle. I used it for my Master’s thesis.
Instead, Python program -> show plot -> if okay, generate CSV.
In LaTeX, have PGFplots code which reads the CSV file and generates the plot that way. Much, much easier to maintain.
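The Python side of that workflow can be as simple as this sketch (column names and the data here are placeholders of my own, not from the paper):

```python
import csv

# Sketch of the workflow: once the matplotlib preview looks right,
# dump the data to a CSV that PGFplots reads at compile time.
rows = [(1, 0.91), (2, 0.77), (4, 0.58), (8, 0.41)]  # placeholder results

with open("results.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["threads", "runtime"])  # header row, referenced by \addplot table
    w.writerows(rows)
```

On the LaTeX side, something like `\addplot table [x=threads, y=runtime, col sep=comma] {results.csv};` inside a pgfplots axis picks the file up. All styling (colors, patterns, markers) lives in the .tex file, so rerunning experiments only rewrites the CSV.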


Thanks for your words!
Yes! We use TikZ for the diagrams, which can be a nightmare sometimes… but it gets better the more I use it.
Regarding the plots, we use PGFplots. I often use matplotlib for quick plots while running experiments, but the paper itself uses PGFplots with the data in a CSV for that sweet, sweet scaling when you zoom in.
Also fuck off with this attitude man. I’m not attacking you, learn how to speak to people.
Sorry. I get quite triggered when people add pseudo-labels to distributions, mainly Debian being outdated. Looking back, I was quite harsh and I apologize.
However, you’re actively spreading a false narrative by saying Debian’s not good for “general computing” - this is what triggers me. A distribution is nothing but its package manager and some defaults; different distros just ship different defaults and package managers.
Older packages can be difficult for new users who want a computer to “just work”.
The only place this makes a difference is with the latest hardware which OP does not have. I have more recent hardware than OP and Debian 13 + KDE Plasma 6 works out of the box.
It’s fine for general computing, but not great.
Again, I really hate this sentence. I will tone down the rudeness this time in explaining why. I have daily-driven Debian for years with AMD + Intel CPUs and Nvidia GPUs (1070, 3060), with use cases ranging wildly through the years. I cannot fathom what kind of general computing doesn’t work. If you say specialized computing, I would still disagree, as there are always ways to make things work.
Just off the top of my head, here’s where things are iffy with Debian: bat can’t be installed via the package manager - but it isn’t on most distros anyway. There’s a deb package, though, which works. Similar with dust, although more distros have that one in their package manager.
Debian, like you said, is rock-solid stable. In my many years of developing code, university courses, daily work (research), and maintaining servers with wildly different usages, Debian’s “outdated” packages have only let me down once - and that was with a LaTeX package which could be installed via CTAN anyway.
Debian is rock-solid stable, but lacks newer packages. It’s great for a server, not so great for […] general computing.
What the fuck??? I’ve been daily driving Debian for years now on my personal laptops, desktop, mini PC, and multiple servers. I’ve found and reported Linux kernel vulnerabilities on my trusty Debian systems.
What do you mean it’s not so great for general computing? What can’t you do with Debian computing-wise that you can do with other distros? The only issues I’ve ever had were with some LaTeX packages being older versions. You just get those from CTAN and install them manually.
This is such a ridiculous comment. What do you do on a server that’s not general computing? You’re doing a subset of general computing??? How does a fucking distro actively prevent you from doing general computing???


If the reports are somewhat technical (written with LaTeX, for example), check out sioyek: https://sioyek.info/. It’s a PDF reader mainly for academic use.
Sioyek has made reading and reviewing papers SO much easier and it’s really, really convenient… once you get the hang of it. It takes a bit of time to get used to all the things, but it’s worth it. I also review students’ theses with it. Highlighting colors and adding comments is super easy (select text, h+g (green highlight), type comment).
If you want to export your notes and comments, you will need this script though: https://github.com/ahrm/sioyek/blob/main/scripts/embedded_annotations.py
Installed it on my desktop, and the process was painful (my fault) because I ran out of space on my boot SSD (128 GB) while doing the upgrade.
I don’t really have much on my boot SSD, and all my important data is on my laptop, backed up to my servers, or on my desktop’s HDD. I did a fresh install with a KDE live USB stick, and that went smoothly - until something with the Nvidia drivers prevented the display server from launching.
Thankfully, I’ve been through this charade multiple times in the past, and I’m significantly more experienced in dealing with the kernel these days. Adding nvidia-drm.modeset=1 to the kernel command line worked, and my system is now running Debian 13. I’m so happy I have KDE Plasma 6.
Overall, a one-hour process. Could have been faster if I’d had free space on my system lol. I’m a bit more reluctant to upgrade my servers at the moment, but I may in the upcoming months.
One minor thing: they updated the apt sources format (https://repolib.readthedocs.io/en/latest/deb822-format.html, https://unix.stackexchange.com/questions/498021/deb822-style-etc-apt-sources-list#583015). Idk why, but the installer didn’t create & populate the .sources file. After a quick check of the man page, I created the file myself and it worked.
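For reference, a deb822-style sources file looks roughly like this (a sketch for Debian 13 “trixie”; URIs, suites, and keyring path may differ on your setup):

```
# /etc/apt/sources.list.d/debian.sources (sketch)
Types: deb
URIs: http://deb.debian.org/debian
Suites: trixie trixie-updates
Components: main contrib non-free non-free-firmware
Signed-By: /usr/share/keyrings/debian-archive-keyring.gpg
```

A matching stanza pointing at security.debian.org with the trixie-security suite usually goes alongside it. The sources.list(5) man page documents all the fields.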


Can single-branch handle cloning from a particular commit? I know that it’s possible to clone particular branches and particular tags with depth=1, but OP states cloning at a particular commit, not HEAD.


--depth=1? I use this all the time when I clone the kernel.
Edit: reread that you wanted to download code at a particular commit.
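For what it’s worth, you can shallow-fetch a single commit by SHA, provided the server allows it (GitHub and GitLab do; plain git servers need uploadpack.allowReachableSHA1InWant enabled). A self-contained sketch using throwaway local repos:

```shell
set -e
# Demo setup: a throwaway "remote" with two commits.
git init -q src
git -C src -c user.email=a@b.c -c user.name=a commit -q --allow-empty -m first
git -C src -c user.email=a@b.c -c user.name=a commit -q --allow-empty -m second
# Let clients fetch arbitrary reachable commits by SHA.
git -C src config uploadpack.allowReachableSHA1InWant true
sha=$(git -C src rev-list --max-parents=0 HEAD)   # the older commit

# Fetch exactly that commit, depth 1, into a fresh repo.
git init -q dst
git -C dst fetch -q --depth=1 ../src "$sha"
git -C dst checkout -q --detach FETCH_HEAD
```

The resulting repo in dst contains only the “first” commit, with no history behind it.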
I suggest using two different spellings:
Mold is the fungus.
To mould is to shape.
Nvm I’m an idiot. Lol
That seems to be the consensus online. But thanks for that tidbit! It feels even more bizarre now knowing that.
I wonder why a handful of people think the way I presented in the post. Perhaps American/British influences in certain places? Reading books by British authors and books by American authors at the same time? Feels unlikely.


Yes, this would essentially be a detection mechanism for local instances. However, a network trained on all available federated data could still yield favorable results. You may just end up not needing IP addresses and emails; even just upvotes/downvotes across a set of existing comments would help.
The important point is figuring out all the data you can extract and feeding it to an “ML” black box. The black box can deal with things by itself.


My bachelor’s thesis was about comment amplifying/deamplifying on reddit using Graph Neural Networks (PyTorch-Geometric).
Essentially: there used to be commenters who would constantly agree / disagree with a particular sentiment, and these would be used to amplify / deamplify opinions, respectively. I fed a set of metrics [1] into a Graph Neural Network (GNN), and it produced reasonably good results back in the day. Since PyTorch-Geometric came out, there have been numerous advancements in GNN research as a whole, and I suspect the field is significantly more developed now.
Since upvotes are known to the instance administrator (for brevity, not getting into the fediverse aspect of this), and since users’ email addresses are known too, I believe these two pieces of information can be accounted for in order to detect patterns. This would lead to much better results.
In the beginning, such a solution needs to look for patterns first and these patterns need to be flagged as true (bots) or false (users) by the instance administrator - maybe 200 manual flaggings. Afterwards, the GNN could possibly decide to act based on confidence of previous pattern matching.
This may be an interesting bachelor’s / master’s thesis (or a side project in general) for anyone looking for one. Of course, there are a lot of nuances I’ve missed. Plus, I haven’t kept up with GNNs in a very long time, so that should be accounted for too.
Edit: perhaps IP addresses could be used too? That’s one way reddit would detect vote manipulation.
[1] account age, comment time, comment time difference with parent comment, sentiment agreement/disagreement with parent commenters, number of child comments after an hour, post karma, comment karma, number of comments, number of subreddits participated in, number of posts, and more I can’t remember.
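To make the setup concrete, the per-comment metrics above end up as a plain numeric node-feature vector before they reach any GNN layer. A minimal sketch (field names and values are my own illustration, not the thesis code):

```python
from dataclasses import dataclass, astuple

# Hypothetical subset of the metrics in [1], one instance per comment node.
@dataclass
class CommentFeatures:
    account_age_days: float
    secs_after_parent: float   # comment time difference with parent comment
    parent_agreement: float    # sentiment: -1 (disagree) .. +1 (agree)
    children_after_1h: int     # child comments within an hour
    comment_karma: int

def to_vector(f: CommentFeatures) -> list[float]:
    # Flatten into the numeric vector a GNN layer would consume.
    return [float(x) for x in astuple(f)]

vec = to_vector(CommentFeatures(420.0, 95.0, 1.0, 3, 1287))
```

Stacking one such vector per comment, plus the reply-graph edges, gives exactly the (node features, edge index) pair that graph libraries expect.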
Ah, if you messed it up, you can press “e” on the GRUB entry and edit the command line parameters to remove the thing that breaks it. Good luck with your fresh install [and use Debian this time… jk :)]
Make sure to update your grub after you do. I’ve messed that one up before lol 😅
Do you not need the nvidia-drm.modeset=1 in GRUB_CMDLINE_LINUX?
https://www.if-not-true-then-false.com/2015/fedora-nvidia-guide/#262-edit-etcdefaultgrub
Could you show us the kernel command line parameters (in /etc/default/grub)? Is the modeset along with other params enabled? I’m not a fedora user, so I may not be of too much help.
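In case it helps, the line in question looks roughly like this (the other parameters are placeholders; keep whatever your system already has):

```
# /etc/default/grub (sketch - keep your existing parameters)
GRUB_CMDLINE_LINUX="rhgb quiet nvidia-drm.modeset=1"
```

After editing, regenerate the config (sudo grub2-mkconfig -o /boot/grub2/grub.cfg on Fedora, sudo update-grub on Debian) and reboot.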
https://www.linuxjournal.com/article/10754
MINIX originally was developed in 1987 by Andrew S. Tanenbaum as a teaching tool for his textbook Operating Systems Design and Implementation. Today, it is a text-oriented operating system with a kernel of less than 6,000 lines of code. MINIX’s largest claim to fame is as an example of a microkernel, in which each device driver runs as an isolated user-mode process—a structure that not only increases security but also reliability, because it means a bug in a driver cannot bring down the entire system.
In its heyday during the early 1990s, MINIX was popular among hobbyists and developers because of its inexpensive proprietary license. However, by the time it was licensed under a BSD-style license in 2000, MINIX had been overshadowed by other free-licensed operating systems.
Today, MINIX is best known as a footnote in GNU/Linux history. It inspired Linus Torvalds to develop Linux, and some of his early work was written on MINIX. Probably too, Torvalds’ early decision to support the MINIX filesystem is responsible for the Linux kernel’s support of almost every filesystem imaginable.
Later, Torvalds and Tanenbaum had a frank e-mail debate about the relative merits of macrokernels (sic) and microkernels. This early history resurfaced in 2004 when Kenneth Brown of the Alexis de Tocqueville Institution prepared a book alleging that Torvalds borrowed code from MINIX—a charge that Tanenbaum, among others, so comprehensively debunked that the book was never actually published (see Resources).
See also: https://en.wikipedia.org/wiki/Tanenbaum–Torvalds_debate
Just purchased a server license (for life). Not only is this update jam-packed with nice features, but a lot of their updates are. I’ve been self-hosting it (on a VPS) for the past year, and it’s about time I supported them.