Using rsync for incremental backup

I'm hosting a page and have SSH access to the webspace.



The site allows modification by its users. To be able to revert it to an older state, I thought about using rsync to create an incremental backup every 30 minutes, with cron launching the following script.



#!/bin/bash

# Binaries
RSYNC=`which rsync`
LN=`which ln`
MKDIR=`which mkdir`
#TODO: Is this enough to make the script distro independent?

# Other Variables
source="<username>@<provider>:<workspace path>"
target="<local backup path>"
# Date ...
year=$(date +%Y)
month=$(date +%m)
day=$(date +%d)
# ... and time
hour=$(date +%H)
minute=$(date +%M)

# Prepare directories
$MKDIR -p "$target/$year/$month/$day/${hour}_${minute}/"
# TODO: Why is this necessary? The actual backup won't work without this line,
# saying "directory does not exist...".

# Actual backup
$RSYNC -av --delete "$source" "$target/$year/$month/$day/${hour}_${minute}/" --link-dest="$target/latest/"
$LN -nsf "$target/$year/$month/$day/${hour}_${minute}/" "$target/latest"

# End script
exit 0
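
For reference, a crontab entry for the 30-minute interval could look like this (the script path is a hypothetical placeholder):

*/30 * * * * /path/to/backup-script.sh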


The script seems to work so far, but the target path has bloated to roughly three times the actual size of the source path within the last three days.



Incremental backups should only lead to a small increase, right?



What am I doing wrong?



Thanks in advance



Markus

bash ssh backup cron rsync

asked Aug 11 '18 at 9:28, edited Aug 11 '18 at 9:37 – Markus

  • How do you determine the size of your backup directories? I'm asking because I've used a similar script for years without any problems. In my case I have directories like my-backups/2018-04-29, my-backups/2018-04-30, my-backups/2018-05-01, etc. When I do du -hs 2018-04-29, it returns ±200GB (= full backup), and almost the same for du -hs 2018-04-30 ...

    – PerlDuck
    Aug 11 '18 at 15:12

  • (continued) But when I do du -hs 2018-04-29 2018-04-30 (i.e. both directories in one go), it returns ±200GB for the first and just ±5GB (= incremental backup) for the second. I think in that case du notices that most files in …30 are hardlinked with files in …29 and only counts them once.

    – PerlDuck
    Aug 11 '18 at 15:12
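
To see this du behaviour in isolation, here is a small self-contained sketch (throwaway paths, illustrative only):

# create a 100 MB file and hardlink it into a second directory
mkdir -p a b
dd if=/dev/zero of=a/file bs=1M count=100 status=none
ln a/file b/file
du -hs a      # ~100M
du -hs b      # ~100M when measured in a separate run
du -hs a b    # ~100M for a, almost nothing for b: du counts each inode once per invocation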

  • Is your local directory, $target, on an ext4 filesystem? Otherwise the hardlinks won't work and you won't save any space.

    – PerlDuck
    Aug 11 '18 at 15:22

  • Yes, it is an ext4 filesystem, using four physical disks combined with LVM. But that shouldn't affect the characteristics of an ext4 partition, right?

    – Markus
    Aug 12 '18 at 10:33
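
To double-check the filesystem type and hardlink behaviour on the backup target, something along these lines should work ($target as in the script above; the probe file names are placeholders):

# report the filesystem type of the backup target
df -T "$target"
# create a hardlink and confirm both names share one inode (link count 2)
touch "$target/probe" && ln "$target/probe" "$target/probe2"
stat -c '%i %h %n' "$target/probe" "$target/probe2"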

4 Answers

There's actually already a tool, based on rsync, that does exactly this. It's called rdiff-backup; I've used it many times in the past to create incremental backups, and it supports rolling back to previous states. It can also be configured to clean up old backups so that your backup directory doesn't keep growing forever.



Find out more about it and look at the usage examples on the documentation page: http://rdiff-backup.nongnu.org/
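
For a rough idea of the workflow, a minimal session might look like this (paths, host, and retention period are placeholders; check the documentation for the exact syntax of your version):

# mirror the remote webspace into a local rdiff-backup repository
rdiff-backup <username>@<provider>::<workspace path> /backup/site
# restore the state from 30 minutes ago into a scratch directory
rdiff-backup -r 30m /backup/site /tmp/site-30min-ago
# prune increments older than two weeks
rdiff-backup --remove-older-than 2W /backup/site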

– Niklas, answered Aug 11 '18 at 10:09

If your backup media has a Linux format, e.g. ext3 or ext4 (and it probably should, or file attributes won't get backed up), then there is a neat trick you can do with rsync and cp -al, making good use of a feature of the file system: you do an incremental backup, but then create hard links to the files at each backup. This means you only copy the files that have changed, but the backup media holds just one copy of each file, so it doesn't balloon in size. (I can't take the credit for this; it was in a comment on a much earlier question that I could not begin to find again.)

My (daily) backup goes something like:

DEST=/media/$USER/backups         # the name my backup media is mounted under
rsync -av --progress --delete --exclude ".[!.]*" ~/ "$DEST/current"
DATE=$(date -I)
mkdir "$DEST/$DATE"
cp -al "$DEST/current/" "$DEST/$DATE"

This updates "current" with only the files that have changed, but creates a directory named after today's date with hard links to all the files. Thus an ls of each day's backups appears to contain all the files in situ, but there is in fact only one copy on the backup media. The latter point is also the downside: as there's only one copy of each file, you should rotate the media so you have multiple copies, but that is good backup practice anyway.

– B.Tanner, answered Aug 11 '18 at 14:50; edited by PerlDuck, Aug 11 '18 at 15:28

  • Indeed a neat trick, but rsync has a switch --link-dest that already creates hardlinks to another (e.g. yesterday's) directory. Basically you would say rsync -a --delete --link-dest=$previous $source $current.

    – PerlDuck
    Aug 11 '18 at 15:33

  • @PerlDuck thanks, I did not know about --link-dest; I don't find the man page for rsync a masterpiece of clarity! The trouble might be that to do the same thing with --link-dest you need to know the name of the last backup directory. date -I -d yesterday would do it for daily backups but would fail if you missed one, e.g. on a Sunday.

    – B.Tanner
    Aug 11 '18 at 15:56

  • Exactly. You will need to figure out the name of the previous directory yourself. See my approach here. Or use the OP's trick to create a symlink to the previous (aka latest) directory after doing a backup. I like that idea but didn't test how --link-dest behaves when given a symlink.

    – PerlDuck
    Aug 11 '18 at 16:02
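
As a rough sketch of the "figure out the previous directory" idea from these comments (untested; DEST and the date-based naming follow the answer above):

DEST=/media/$USER/backups
# newest existing date-named backup directory, if any
previous=$(ls -1d "$DEST"/????-??-?? 2>/dev/null | tail -n 1)
today=$(date -I)
mkdir -p "$DEST/$today"
# fall back to a plain full copy when no previous backup exists
rsync -a --delete ${previous:+--link-dest="$previous"} ~/ "$DEST/$today/"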

The rsync program already has backup options that do what you want.

This is the script that I use for backups, running as root at 23:45 each day:

#!/bin/bash -e
# This is run as root at the end of the day

( echo ">>>>>>>>>>>>>>>>>>>>>>>" $(date)
today=$(date +%Y-%m-%d)
month=$(date +%Y-%m)
# USB backups
cd /media/ray/Backup-Ray
rsync --archive --one-file-system --delete --backup --backup-dir="../../$today/etc" "/etc/" "mostrecent/etc/"
rsync --archive --one-file-system --delete --backup --backup-dir="../../$today/home" --exclude=".config/google-chrome/" --exclude=".cache/" --exclude=".local/share/zeitgeist/" --exclude="Downloads/" "/home/" "mostrecent/home/"
rsync --archive $today/ $month/
echo "<<<<<<<<<<<<<<<<<<<<<<<" $(date)
) &>>/home/ray/Log/root.out

exit 0

All changed and deleted files are preserved.
It's easy to use the standard Unix tools to examine and recover files:

$ cd /media/ray/Backup-Ray
$ ls -l {,*}/home/ray/public/Log/wait.xhtml
-rw-r--r-- 1 ray ray 14002 Dec 3 21:04 2018-12-16/home/ray/public/Log/wait.xhtml
-rw-r--r-- 1 ray ray 14102 Dec 16 09:28 2018-12-17/home/ray/public/Log/wait.xhtml
-rw-r--r-- 1 ray ray 14202 Dec 17 20:47 2018-12-20/home/ray/public/Log/wait.xhtml
-rw-r--r-- 1 ray ray 14302 Dec 20 15:12 2018-12-25/home/ray/public/Log/wait.xhtml
-rw-r--r-- 1 ray ray 14402 Dec 25 21:21 2018-12-26/home/ray/public/Log/wait.xhtml
-rw-r--r-- 1 ray ray 14402 Dec 25 21:21 2018-12/home/ray/public/Log/wait.xhtml
-rw-r--r-- 1 ray ray 14452 Dec 26 18:43 /home/ray/public/Log/wait.xhtml
-rw-r--r-- 1 ray ray 14452 Dec 26 18:43 mostrecent/home/ray/public/Log/wait.xhtml

Only the "mostrecent" directory is large.

The monthly accumulation directory (2018-12) contains the most recent changes throughout the month. This step isn't necessary, but when I need to save space it allows me to delete all the daily updates for that month. (A year from now I might care what things looked like at the end of December, but not so much how things changed within the month.)

Obviously you'd need to change the frequency, timestamps, etc., and add your portability code, but the same mechanism should do what you want.

– Ray Butterworth, answered Jan 2 at 13:57

Based on B.Tanner's answer, this is a script which checks every 60 seconds whether any file has changed; if so, it creates a backup. You should have two folders, backups/OLD and backups/current:

while true
do
    DEST=/home/$USER/backups # the name my backup media is mounted under
    # rsync -i itemizes changes; non-empty output means something changed
    if [ -n "$(rsync -ai --delete --exclude ".[!.]*" "$(pwd)/" "$DEST/current")" ]; then
        DATE=$(date +"%m-%d-%y")
        TIME=$(date +"%T")
        mkdir -p "$DEST/OLD/$DATE/$TIME"
        cp -al "$DEST/current/" "$DEST/OLD/$DATE/$TIME"
        echo "done:$DATE/$TIME"
    fi
    sleep 60
done

– Alice, answered Jan 2 at 13:21; edited Jan 2 at 15:02