

Yeah, I specifically don’t do any AI workloads on my server, that would be stupid slow on my old-ass hardware. But my buddy (who’s a bit impulsive with money) bought two Spark GX10s and we’re likely to get some fun out of them :)


VMs mostly
oh yeah i see how that can be hungry
What are you hosting on Minecraft that isn’t using >=4 gigs?
Just a vanilla server I play on with my son; it’s got 2G and I haven’t noticed anything out of the ordinary. Chunk gen is slow-ish, but I suppose that’s CPU-bound.
BTW I exaggerated in my initial comment; I looked at the machine and it’s sitting just under 8G of used RAM.
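For reference, a vanilla server’s heap is capped at launch time, so 2G is a hard ceiling rather than a suggestion. A minimal sketch of such a launch command, assuming the stock Mojang `server.jar` (the filename is just an example):

```shell
# Pin both the initial and maximum JVM heap to 2G so the server
# never balloons past its budget; nogui skips the server's GUI window.
java -Xms2G -Xmx2G -jar server.jar nogui
```

Setting -Xms equal to -Xmx avoids heap-resize pauses, which matters more than total size for a small two-player server.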
Also ZFS
Jesus Christ 😅 no idea if you’re jesting
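(Context on the “Also ZFS” bit: by default OpenZFS’s ARC read cache will grow to roughly half of physical RAM, and some tools count it as used memory. A sketch for checking and capping it on Linux; the 4 GiB cap is just an example value:)

```shell
# Current ARC size in GiB (exposed via procfs by OpenZFS on Linux)
awk '/^size/ {printf "%.1f GiB\n", $3 / 2^30}' /proc/spl/kstat/zfs/arcstats

# Cap the ARC at 4 GiB (example value); this is a module parameter,
# so it takes effect after a reboot or a zfs module reload
echo "options zfs zfs_arc_max=4294967296" | sudo tee /etc/modprobe.d/zfs.conf
```

The ARC shrinks under memory pressure anyway, so the “used” number is scarier than it is harmful.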


I used to collect books, and the cheapest option at the time was eBay. I would search things like “science fiction bulk” and pick the lot with the best titles. Generally I could find some at around €1 per book; no idea if that’s still realistic.


Serious question: what does RAM help with in the context of self-hosting? I recently bought 32G for my server, and since it’s DDR3 ECC it’s so cheap I could have afforded 64, but I kept wondering what I’d use it for. I rarely go north of 6G usage, and that’s with half a dozen services, a Minecraft server, etc… I just don’t know what kinds of services are RAM-hungry.
Oh man, believe me, I’m all for it. I totally understand taking an approach to engineering that isn’t bankable or tailored to Californian degen culture.
I’m not saying there’s anything wrong with your stance. Just saying it will become an aesthetic niche, like the people who still track music on magnetic tape when it would be vastly faster to use Cubase.
I don’t have your specific axe to grind against AI, but my personal angle is to only use old hardware and make software that runs on it.
Not everything has to be superlative, and self-imposed constraints are great for quality of life.
Exactly. And a commit is a commit. Unless it’s 10 kLOC in one go, you can just read what’s in it and decide for yourself.
At my previous job we used to jokingly (?) tell our engineering manager “no commits, no opinions”. Well, I think it’s kinda like that.
And that’s great for you but I still think you’ll be in a minority. Which is not necessarily bad of course.
Open Source devs mostly come from the industry, and the penetration of agentic coding in the industry has been massive over the last six months. I don’t think I’ve ever seen anything of this scale.
I think disclosure is good and should be tackled as soon as possible, because being transparent in your communication is just good practice in general.
However, I feel this will soon be rendered moot as all projects move to agentic (or otherwise AI-assisted) coding.
Maybe there’ll be a movement of hand-coded FOSS, but realistically they’ll have a hard time. Resources are already tight for most projects, and rejecting productivity in favor of aesthetics is a rich guy’s strategy.
This whole debacle shows that people fundamentally misunderstand how code works. They’re trying to declare code good or bad based on silly heuristics like AI/not-AI, as if it weren’t literal lines of text you can read before you form an opinion and make a fool of yourself.
I see, that’s interesting. I do a lot of transcoding, but offline, so I have no use for such a cache. I’ve tested various storage solutions, but on my setup transcoding is always CPU-bound; even on old-ass HDDs the bottleneck is never I/O.
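One way to sanity-check that intuition (a sketch, assuming ffmpeg is installed; `input.mkv` is a placeholder): encode to a null output so no write I/O is involved, then time a raw read of the same file. If the encode crawls while the raw read flies, the CPU is the bottleneck.

```shell
# Decode + encode with the output discarded: -benchmark prints CPU and
# wall-clock time at the end, and the null muxer means zero output I/O.
ffmpeg -benchmark -i input.mkv -c:v libx264 -f null -

# Raw sequential read of the same file, for comparison with disk throughput.
dd if=input.mkv of=/dev/null bs=1M status=progress
```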