Okay, so it's basically a mass checksum for comparing changes. That's certainly useful, given the number of cronjobs I'm going to have in place. It's probably enough of a diversion to warrant its own thread, but this sort of thing is very useful in a recovery scenario as a way of gauging how scunnered your files might be.
I've been using md5deep/hashdeep for years for the same sort of thing (I'm not using a filesystem with full checksumming, so it's a poor man's way of spotting bitrot). You essentially run it over your directory tree and it makes an MD5/SHA/whatever hash of every file within. You can save that out to a file, and then at a later date compare the hashes stored in the file against what the hashes of the files are right now.
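Just to illustrate the save-then-compare idea in miniature: the same workflow can be sketched with plain GNU coreutils (`md5sum`/`find`) if hashdeep isn't to hand. All the paths below are throwaway examples, not real audit locations:

```shell
# Sketch of the save/compare workflow using coreutils only (assumes GNU md5sum).
set -eu
dir=$(mktemp -d)
audit=$(mktemp)   # the "audit file", kept outside the tree being hashed
printf 'hello\n' > "$dir/a.txt"
printf 'world\n' > "$dir/b.txt"

# Baseline: hash every file under $dir and save the list
(cd "$dir" && find . -type f -exec md5sum {} + > "$audit")

# Simulate bitrot: silently change one file's contents
printf 'hEllo\n' > "$dir/a.txt"

# Compare: md5sum -c re-hashes everything and reports whatever no longer matches
(cd "$dir" && md5sum -c --quiet "$audit") || echo "changes detected in $dir"
```

hashdeep does the same job with nicer audit semantics (and multiple threads), but this is the whole idea in a dozen lines.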
Create a list of MD5 hashes for the files under /stuff using 4 CPU threads and save to an audit file:
Code:
pushd /stuff && nice -n 19 ionice -c 3 hashdeep -c md5 -l -r -j 4 * > /var/lib/hashdeep/stuff_2021-03-10.hashdeep
Compare the current files with the previously generated hash list (audit mode):
Code:
nice -n 19 ionice -c 3 hashdeep -r -j4 -c md5 -x -v -k /var/lib/hashdeep/stuff_2021-03-10.hashdeep /stuff
This'll output a list of all the files that have either changed, or didn't exist, when the audit file was made. So whilst it's very useful for static datasets (movies, photos, etc.), it's not ideal for rapidly changing ones.
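And since it only pays off if it's actually re-run on a schedule, the cronjobs I mentioned at the top might look something like this. The times, paths and log file are just assumptions (note crontab needs `%` escaped as `\%`):
Code:
# m h dom mon dow  command
# Regenerate the audit file for /stuff every Sunday at 03:00
0 3 * * 0  cd /stuff && nice -n 19 ionice -c 3 hashdeep -c md5 -l -r -j 4 . > /var/lib/hashdeep/stuff_$(date +\%F).hashdeep
# Audit against the newest saved file every Wednesday at 03:00, appending results to a log
0 3 * * 3  nice -n 19 ionice -c 3 hashdeep -r -j4 -c md5 -x -k "$(ls -t /var/lib/hashdeep/*.hashdeep | head -1)" /stuff >> /var/log/hashdeep-audit.log 2>&1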