You can run a basic model on pretty mid-range hardware; the smaller ones are only 1-2GB in size.
Duplicati or BackRest, and use any S3-compatible storage such as Backblaze B2, iDrive E2, Wasabi, etc…
Why would that be the case?
Does rclone support Proton Drive? That’d be an option until an official client comes out.
Garage definitely seems better suited for selfhosters and small setups; Minio is just so large and complex, with specific requirements now.
You can also delegate a subdomain to another provider with an API, but yes I see what you mean. Although I feel like getting port 80 open would be difficult as well in those situations.
It does but it’s a bit of a weird way of doing things.
I’d say they’re actually easier, at least in my experience. Since wildcard certs use DNS-01 verification with an API, you don’t need to deal with exposing port 80 directly to the internet.
You shouldn’t have to do anything specific at all; local network stuff works without internet, and Jellyfin doesn’t rely on any internet servers for authentication like Plex does.
I just do full system images for that reason, easier than trying to pick and choose what should be backed up. Used to use Veeam, currently using Synology Active Backup.
For online backups I don’t, due to size, but for local backups it’s just way easier.
To install, at minimum you’ll likely need to shrink existing partitions and create new ones for Linux if you don’t want to wipe the drive; that would be a dual-boot setup with Windows still installed alongside. Or you can just wipe the drive entirely and have only Linux.
Regarding the files, you should already have backups of anything important; if you don’t, set that up ASAP.
Messing with partitions can easily cause data loss if something goes wrong.
You also never know when hardware failure, malware, power surges, lightning strikes, or whatever other disaster will happen and cause data loss. 1 copy of files might as well be 0 copies.
Maybe, the average user really doesn’t understand backups though and would be more likely to ignore it then.
I’m always conflicted on this kind of thing, because for every person annoyed that it keeps asking, there are a bunch complaining that they lost all their photos because their phone was stolen or something like that, and they had no idea they needed to back up their stuff.
How did they end up thinking that everything must be done with terminal while using Ubuntu?
Most guides on installing things or help on fixing things will offer terminal commands, so I can see how that could certainly lead to that feeling as a new user.
Also, depending on the DE and such, certain very basic, obvious settings are not available in the GUI, like fractional scaling on KDE, which has to be enabled by editing a config file first.
Couldn’t tiling just be done with an app like how PowerToys FancyZones does it on Windows? That way anyone could just install it when wanted.
Even 255 bytes with 10 million entries is only ~2.6GB of data you need to store, and if you have 10 million users, the extra ~$1 a month that would cost is perfectly fine.
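The back-of-the-envelope math above can be checked quickly (the entry count and size are just the hypothetical numbers from the comment, not real measurements):

```python
# Rough storage estimate: 10 million entries at up to 255 bytes each.
entries = 10_000_000
bytes_per_entry = 255

total_bytes = entries * bytes_per_entry
total_gb = total_bytes / 10**9  # decimal gigabytes

print(f"{total_gb:.2f} GB")  # 2.55 GB
```

So the worst case rounds to roughly 2.6GB, a trivial amount at object-storage prices.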
I suppose there may be a performance impact too, since you have to read more data to check the hash, but servers are so fast now that it doesn’t seem like that would be significant unless your backend was poorly made.
Interesting, that sounds much more complex than using some backup software to image the drive!
Neat idea. It’s a lot more money than a 7th/8th gen box on eBay, but you get a built-in UPS and a screen for troubleshooting, which is nice.
Odd, I’ve had a Pixel, Oneplus 7 pro, and now a Galaxy S21 and they all pick up my DNS server from DHCP without any issues.
Is that any different from no one checking the code every update?