

I did (am doing) something very similar. I definitely have issues with my indexing, but I'm just ordering it manually by year/date for now.
I'm doing a little extra for parity, though. I'm using 50-100 GB discs for the data, and burning a separate 25 GB disc of dvdisaster parity data for each data disc. Hopefully keeping the parity on its own disc reduces the risk of it becoming unreadable along with the data, and it gives MORE parity coverage without eating into my actual data discs. It's hard enough to break the archives into 100 GB chunks as it is.
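For anyone curious about the mechanics: once a data image is mastered, dvdisaster can write its error correction data to a standalone .ecc file, and that file is what goes on the separate parity disc. Here's a rough sketch of how I drive it (the image name is made up; the flags are the usual RS01 ecc-file invocation, but double-check against your dvdisaster version's man page):

    import subprocess
    from pathlib import Path

    # Hypothetical names: the 50-100 GB data image and the parity file
    # that gets burned to the separate 25 GB disc.
    DATA_IMAGE = Path("archive-2024-disc03.iso")
    ECC_FILE = DATA_IMAGE.with_suffix(".ecc")

    def make_parity(image: Path, ecc: Path) -> None:
        """Create a standalone dvdisaster error-correction file (RS01 mode)."""
        subprocess.run(
            ["dvdisaster", "-i", str(image), "-e", str(ecc), "-mRS01", "-c"],
            check=True,
        )

    def verify(image: Path, ecc: Path) -> None:
        """Verify the data image against its parity file."""
        subprocess.run(
            ["dvdisaster", "-i", str(image), "-e", str(ecc), "-t"],
            check=True,
        )

    if __name__ == "__main__":
        make_parity(DATA_IMAGE, ECC_FILE)
        verify(DATA_IMAGE, ECC_FILE)

If a data disc later develops read errors, the same image + .ecc pair is what dvdisaster uses to repair the recovered image.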
Need to look into Bacula, as suggested by another poster.
To add to this… I've added a layer of protection against accidental deletion and dumb-fingering by making each year of my photo archive a separate ZFS dataset. Then at the end of each year I set that dataset to read-only and create a new one.
Manual, but effective enough. I also have automatic snapshots as protection against dumb-fingering, but the read-only flag covers the mistakes I don't notice before the snapshots expire.
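The year rollover is really just two zfs commands (zfs set readonly=on on the finished year, zfs create for the new one). A minimal sketch of the wrapper, assuming a hypothetical tank/photos dataset root:

    import subprocess
    from datetime import date

    POOL = "tank/photos"  # hypothetical pool/dataset root

    def zfs(*args: str) -> None:
        """Run a zfs subcommand, raising if it fails."""
        subprocess.run(["zfs", *args], check=True)

    def roll_over_year(pool: str = POOL) -> None:
        """Freeze last year's dataset and create this year's.

        e.g. run on 2025-01-01 it does:
          zfs set readonly=on tank/photos/2024
          zfs create tank/photos/2025
        """
        this_year = date.today().year
        last_year = this_year - 1
        zfs("set", "readonly=on", f"{pool}/{last_year}")  # freeze the finished year
        zfs("create", f"{pool}/{this_year}")              # open the new year's dataset

    if __name__ == "__main__":
        roll_over_year()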