So I’m not really a self-hoster per se, but I run two NASes at home and I’m working my way toward it. I don’t really need my stuff open to the internet, so I just use it on my LAN.
However, I do have a lot of data, and I’m constantly backing things up. Here’s my current setup:
- Computer hard drives
- These back up to my NAS
- I have a separate HDD in an enclosure I plug into the NAS directly and copy my data onto every few months to put in my safe.
- Some cloud storage for very important smaller stuff (pictures)
My main question is: what is the best way to copy new data from my NAS (Synology) to my “cold storage” drive without recopying everything every time? Is there a way to detect the files that already exist on both and only copy the new ones? I’ve always had this question when doing backups and it always seems overly complex.
You guys are very knowledgeable so I’m sure someone has dealt with this!
I use an rsync job to do it. By default, rsync uses file metadata (size and modification time) to determine whether a file has changed and only copies what’s new or updated; you can opt to use checksums instead if you’d rather. IIRC, you can set it up as a Synology scheduled task, or just run it yourself on the command line. I’ve got a Jenkins setup to run it so I can gather the logs and not have to remember the command every time (and I use it for other periodic jobs as well), but it’s pretty straightforward on its own.

Rsync is the correct solution. It does exactly what you want and nothing more. A script that uses rsync is future-proof; other backup solutions depend on ongoing maintenance of the software, which could be abandoned, go up in price, or develop vulnerabilities.
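For reference, here’s a minimal sketch of the kind of rsync invocation that does the incremental copy. The paths are placeholders: /volume1/data stands in for your NAS share, and /volumeUSB1/usbshare is where DSM usually mounts an external USB drive, but check your own mount point before running it.

```sh
# -a  archive mode: recurse and preserve permissions, times, ownership
# -v  verbose output, -h  human-readable sizes
# --delete (optional): also remove files from the cold-storage drive that
#          no longer exist on the NAS; leave it off for a purely additive copy
# Trailing slash on the source means "copy the contents of this folder"
rsync -avh --progress /volume1/data/ /volumeUSB1/usbshare/backup/
```

Run it with --dry-run first to see what it would transfer without actually copying anything.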