So, this is a rather odd request for a backup solution, but it’s kinda what I want right now.
I’m still relatively new to Linux and self-hosting in general.
A few years ago, my cousin and I were hosting our own Minecraft server. It had a mod that would create backups of the world folder. It zipped it up, named it “yyyy-mm-dd.zip” and placed it in a backups folder somewhere on the server.
The most important feature that I want is actually the next part. It would allow us to specify how many backups we wanted to keep, and also how frequently we wanted the backup to run.
We set it to back up daily and keep 14 days of backups. After that, it would delete the oldest one and make a new backup.
I would like to replicate that functionality! Specify the frequency, but ALSO how many backups to keep.
Idk if it’s asking too much. I’ve tried doing some research, but I’m not sure where to start.
Ideally I’d like something I can host on docker. Maybe connect to a Google account or something so it can be off-site.
I only want to use it for docker config files, compose files, container folders, etc.
I’ve looked into restic, but it seems it encrypts the backups, and you NEED a working copy of restic to restore? I’d like something simple like a .zip file instead, to be able to just download, unzip, and spin up the compose file and stuff.
Sorry for the wall of text, thanks in advance if you have any suggestions!
P.S. I’m pretty sure the upload to Google or some other service would have to be a separate program, so I’m looking into that as well.
I do something similar with rclone. Most server software has some way of creating backups. Have that software create a backup and use rclone to move the file over to some cloud storage. Rclone also has the option to delete older stuff (rclone delete --min-age 7d). Do all that with a shell script and add it to the crontab.
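In case it helps, here’s a rough sketch of what that could look like as a script. All the paths and the “gdrive:” remote name are placeholders — adjust them to your setup, and set up the remote with `rclone config` first:

```shell
#!/bin/sh
# Hypothetical paths -- change these to your real server/backup folders.
SRC="${SRC:-/tmp/mc-server}"            # folder to back up
BACKUP_DIR="${BACKUP_DIR:-/tmp/mc-backups}"
KEEP_DAYS="${KEEP_DAYS:-14}"

mkdir -p "$SRC" "$BACKUP_DIR"           # demo only; yours already exist

# 1. Create a dated archive, e.g. 2024-01-31.tar.gz
tar -C "$SRC" -czf "$BACKUP_DIR/$(date +%Y-%m-%d).tar.gz" .

# 2. Prune local archives older than KEEP_DAYS days.
find "$BACKUP_DIR" -name '*.tar.gz' -mtime +"$KEEP_DAYS" -delete

# 3. Push to cloud storage and prune old remote copies. Uncomment once
#    you've configured a remote ("gdrive:" is just an example name):
#rclone copy "$BACKUP_DIR" gdrive:mc-backups
#rclone delete --min-age "${KEEP_DAYS}d" gdrive:mc-backups
```

Then a crontab entry pointing at that script gives you the daily schedule plus the keep-N behavior.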
I think this is the best solution. rclone also has built-in crypt.
Edit: built-in crypt if you configure it for use
That sounds like the 2nd part of what I want! The uploading to off-site part! Awesome, I’ll def look into it, thank you!
If you look at my recent post history, I gave out my script using rclone to back up my server. It’s for NixOS, but you can ignore that part, as it’s bash scripting at its core. It has everything you need, like using rclone to delete older backups.
Just saved your comment for reference later, thank you so much!
What you want is a bash script and a cron job that calls it. Most of what you need is likely already installed for you.
“crontab -e” will pull up your crontab editor. Check out “man crontab” first to get an idea of the syntax…it looks complicated at first but it’s actually really easy once you get the hang of it.
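For example, an entry that runs a backup script every night at 3am could look like this (the script path is made up):

```
# min hour day-of-month month day-of-week  command
0 3 * * * /home/you/bin/backup.sh
```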
Your script will call tar to create your backup archive. You’ll need the path to the folder with the files you want to back up, and then something like: tar -C PATH_TO_FILES -czf PATH_AND_NAME_OF_BACKUP.tgz .
That last dot tells it to tar up everything in the current folder. You can also use backticks to call programs in line…like date (man date). So if your server software lives in /opt/server and your config files you want to backup are in /opt/server/conf and you want to store the backups in /home/backups you could do something like:
tar -C /opt/server/conf -czf /home/backups/server_bkup.`date +%Y%m%d`.tgz .
Which would call tar, tell it to change directory (-C) to /opt/server/conf and then create (-c) +gzip (-z) into file (-f) /home/backups/blah.tgz everything in the new current directory (.)
I don’t know if that’s what you’re looking for but that would be the easiest way to do it…sorry for potato formatting but I’m on mobile
No honestly, this was very helpful!
This, in combination with the solutions some others have suggested here already, would be pretty much what I want, just in multiple different parts, instead of 1 program/utility.
I’ll def look into this, and honestly see if I can find a docker image for something like this as well!!
Thank you so much!!!
I would say since you want simple .zip archives, this could be something to script yourself since it would be fairly easy.
Basically:
- Zip the server into a dated zip file
- Check for old zip files and delete
- Upload zip files using rclone to remote storage (gdrive, etc)
- Optionally send a notification to discord, telegram, healthchecks.io, or something like that
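Those steps map to just a few lines of shell. A minimal sketch — the paths, the “gdrive:” remote, and the healthchecks.io URL are all placeholders:

```shell
#!/bin/sh
# Hypothetical locations -- adjust to your setup.
SERVER_DIR="${SERVER_DIR:-/tmp/demo-server}"
ZIP_DIR="${ZIP_DIR:-/tmp/demo-zips}"
KEEP_DAYS=14

mkdir -p "$SERVER_DIR" "$ZIP_DIR"       # demo only

# 1. Zip the server into a dated zip file (python3's zipfile CLI is used
#    here so it works even without the `zip` package installed).
python3 -m zipfile -c "$ZIP_DIR/$(date +%Y-%m-%d).zip" "$SERVER_DIR"

# 2. Delete zips older than KEEP_DAYS days.
find "$ZIP_DIR" -name '*.zip' -mtime +"$KEEP_DAYS" -delete

# 3. Upload with rclone (uncomment once a remote is configured):
#rclone copy "$ZIP_DIR" gdrive:zips

# 4. Optional: ping healthchecks.io so you hear about failures:
#curl -fsS "https://hc-ping.com/YOUR-UUID" >/dev/null
```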
The downside of zipping backups like this is obviously storage space: every backup takes up the full size of the server, since there’s no deduplication or incremental versioning happening.
@MangoPenguin
If you’re scripting it yourself, https://www.complete.org/dar/ gives a few extra niceties over just zip files or tarballs.
Thanks @jgoerzen for the nice summary.
@koinu I think this might be the way I have to go!
I’m really liking Kopia. Nice GUI and some pretty nice settings. But I don’t like the obfuscation. Like you said, I just want the zip files. I think I’ll try Borgbackup, then rclone to drive, but I’ll also look into just scripting it myself!
Thank you so much :)
Borg Backup works great for that; it’s exactly its use case. It’s a command-line tool, but you can use Vorta as a UI if you want that. If you have a NAS, it can back up directly to that.
I have a second cronjob in my setup that syncs the encrypted archive to B2 nightly. Works great
But… The Vorta were the spokespeople for the Dominion. Why would they be working for the Borg?? Why would you do this to me???
Duplicati does this and it’s one of the best backup solutions imo
Awesome, I’ll add it to the list of software to look into! Actually, if it does everything, then it’s gonna be the first one I try! Thank you!
Sounds like a job for logrotate. It does more than just log files; kinda average name, I guess. Check out this Server Fault Q&A for more details: https://serverfault.com/questions/196843/logrotate-rotating-non-log-files
I’ll have to look more into this, because I think I misunderstood, but it seems that it’s half of the backup solution, right? It won’t actually MAKE the backups, but it’ll allow me to “rotate” and only keep the last “x” files?
Yep that’s the one. If you can make a cron job to make the zip file, logrotate could handle keeping the last x files.
It might sound complicated, but the cool thing about *nix environments is that everything is made up of a combo of little tools. You can learn one at a time and slowly build something super complicated over time. First thing would be figuring out the right set of commands to make a zip file from the directory I reckon. Then add that to cron so it happens every day. Then add logrotate into the mix and have that do its thing every day after the backup runs.
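As a rough idea of what that logrotate piece could look like — the path is hypothetical, and the trick is having your cron job write to the same fixed filename each day so logrotate can keep the numbered copies:

```
# /etc/logrotate.d/server-backup (hypothetical)
/home/backups/server.zip {
    daily
    rotate 14
    nocompress
    missingok
}
```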
So I think I’ll try Duplicati for docker next, and if that fails, then I’ll try scripting and cronjobs.
I’m so happy with all the support, thank you! :)
You should be able to achieve this with Kopia
I’m trying this out right now with Kopia for docker, and I’m not the biggest fan of (seemingly) not being able to turn off the obfuscation and have it produce just a single .zip or .tar file or whatever. I’m also having a hard time setting up Drive integration with the GUI, but that’s just my fault. I’m not familiar with rclone or Kopia at all.
You can mount the complete backup as a local file system, which I think would suit your needs. I’m not familiar with their various integrations either; I just back up over SFTP.
But to reassure you, I also needed a bit of trial and error with Kopia, as it’s not the easiest GUI ever to get used to. But I’ve got it running now, and I’m very happy with it. I’ve also used it to successfully restore multiple backups (to test if it worked) and they all worked.
You can look at BackupPC; it has served us well for years now. Offsite, manages incremental and full backups, file deduplication, etc.
So on your Minecraft server, do a daily backup and add the day of the week to it (whatever.7.gz); this way you always have 7 backups on the server and it auto-rotates. Add that folder to BackupPC and the backup server will automatically decrease the number of backups as they get older.

Thank you so much for your suggestion, imma add it to my list :)
@koinu@lemmy.world I suspect you can force restic not to encrypt. Another advantage of restic and similar tools is that you can specify S3, SFTP, and other targets.
I’ll keep digging into it, and probably spin up a container to fully test it out myself.
Thank you!
@koinu@lemmy.world I take it back 😢
Dang. I’m still gonna look into it though. The hardest part was getting the names of different software. I kept finding different ways to do it in CLI, but no docker software or anything.
As someone already said, Duplicati. You can install it using docker and it has a super easy web GUI.
This is what I’m going to try next, as I’m not completely happy with Kopia