Scheduled folder backup

I'm looking for a way to automatically back up a user's home directory in CentOS 7 to a remote host, a NAS, or just to ~/.snapshot.
In some Linux setups I have seen a .snapshot folder in the user's home directory (~/.snapshot/) that holds hourly, nightly, and weekly backups of that home directory (e.g. ~/.snapshot/weekly1 for a copy of what was in the user's home directory 1 week ago).



I've seen several related posts on Stack Overflow, but so far I haven't seen a guide that explains the complete workflow.



This is what I know so far:



  1. Use rsync to copy the contents of a given folder to the remote host, NAS, or ~/.snapshot/hourly0

  2. Create a shell script to execute the rsync command

#!/bin/bash
sudo rsync -av --progress --delete --log-file=/home/username/$(date +%Y%m%d)_rsync.log --exclude "/.snapshot" /home/username/ /home/username/.snapshot/hourly0

(Note that rsync --exclude patterns are matched relative to the transfer root, so "/.snapshot" here excludes /home/username/.snapshot; an absolute path would never match.)

  3. Change the permissions on the script to make it executable

sudo chmod +x /home/username/myscript.sh

  4. Use crontab to schedule the rsync command at the desired backup interval

  5. Somehow move hourly0 to hourly1 before running the scheduled hourly rsync

  6. Delete the oldest backup once rsync completes successfully


Are there any guides that cover how to do this? I don't understand how to automatically rename the folders as time goes on (i.e. weekly1 to weekly2), or how to delete weekly10 if I decide to keep only weeks up to 9. Is this another cron job?
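For the renaming part in particular, a rotation is usually just a chain of mv commands run before each rsync: delete the oldest slot, shift each remaining one up by one, then sync into slot 0. Here is a minimal, hedged sketch of that idea that runs entirely in a throwaway directory; the paths, the retention count of 3, and the cp -a standing in for the rsync call are all illustrative, not taken from the post.

```shell
#!/bin/bash
# Rotation sketch: hourly3 is dropped, hourly2 -> hourly3, ...,
# hourly0 -> hourly1, then a fresh snapshot lands in hourly0.
# Runs in a temp dir; cp -a stands in for the real rsync call.
WORK=$(mktemp -d)
SRC="$WORK/home"
SNAP="$WORK/snapshot"
mkdir -p "$SRC" "$SNAP"

rotate() {
  rm -rf "$SNAP/hourly3"                      # drop the oldest slot
  for i in 2 1 0; do                          # shift the rest up by one
    if [ -d "$SNAP/hourly$i" ]; then
      mv "$SNAP/hourly$i" "$SNAP/hourly$((i + 1))"
    fi
  done
  cp -a "$SRC" "$SNAP/hourly0"                # new snapshot into slot 0
}

echo "version 1" > "$SRC/file.txt"
rotate
echo "version 2" > "$SRC/file.txt"
rotate

cat "$SNAP/hourly0/file.txt"   # current state: version 2
cat "$SNAP/hourly1/file.txt"   # one rotation ago: version 1
```

In a real setup the cp -a line would be the rsync command from step 2, and the script itself would be the cron job; a weekly rotation is the same pattern with different folder names and a different schedule.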










  • Welcome to U&L! We are happy to assist, but we aren't a scriptwriting service. Please show what you've tried, and explain how it did not work as you expected or intended, and we'll be happy to help guide you.
    – DopeGhoti
    1 hour ago










  • Would you please clarify what you would like to delete. An arbitrary deletion of the backups or just backups older than specific time?
    – Goro
    45 mins ago










  • Er, backing up a user's /home into that directory doesn't seem like an actual backup...
    – jasonwryan
    29 mins ago










  • @jasonwryan: Why not? Taking a snapshot of the contents of a user's home directory at scheduled times seems like the definition of a backup to me.
    – Seth
    5 mins ago














shell-script cron rsync






asked 1 hour ago by Seth
1 Answer






How about this guide:



1) Create your script: make a new file called myrsync.sh and paste in the lines below:



#!/bin/bash
sudo rsync -av --progress --delete --log-file=/home/your-username/Desktop/$(date +%Y%m%d)_rsync.log --exclude "/home/your-username/.folder" /home/data /media/dataBackup_$(date +%Y%m%d_%T)


Meaning of the flags:



-a means archive: copy everything recursively, preserving things like permissions, ownership and time stamps.
-v is verbose, so it tells you what it's doing, either in the terminal or, in this case, in the log file.
--progress gives you more specific info about progress.
--delete checks for changes between source and destination, and deletes any files at the destination that you've deleted at the source.
--log-file saves a copy of the rsync result to a date-stamped file on my desktop.
--exclude leaves out any files or directories you don't want copied; in the command above, the .folder directory.

/home/data is the directory I want copied. Without a trailing slash it copies the directory itself plus its contents; /home/data/ (with a trailing slash) would copy just the contents.

/media/dataBackup is the separate drive. Change this to whatever your backup location is.
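One caveat worth noting (not in the original answer): each dated dataBackup_ directory here is a full copy. Snapshot schemes usually hard-link unchanged files between runs instead, which rsync's --link-dest option does, and which cp -al demonstrates in miniature. A minimal sketch of the hard-link effect in a throwaway directory; all paths are illustrative:

```shell
# Two "snapshots" of the same source: the second is made with cp -al,
# which creates hard links instead of copying file contents.
WORK=$(mktemp -d)
mkdir -p "$WORK/src"
echo "payload" > "$WORK/src/big.bin"

cp -a  "$WORK/src" "$WORK/snap1"   # first snapshot: real copy
cp -al "$WORK/snap1" "$WORK/snap2" # second snapshot: hard links only

# Both snapshots name the same inode, so the data is stored once;
# GNU stat's %h prints the hard-link count.
stat -c %h "$WORK/snap2/big.bin"   # prints 2
```

With this trick, keeping many snapshots costs little more disk space than one full copy plus the files that actually changed between runs.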


2) Save myrsync.sh (here on your Desktop) and make it executable by typing:



sudo chmod +x /home/your-username/Desktop/myrsync.sh


You can now double-click that .sh file and choose Run in Terminal; it will ask for your password, run, and leave a log file on your desktop. Or you can make a cron job do it for you!



3) The cron job



Copy your myrsync.sh file to /root by typing:



sudo cp /home/your-username/Desktop/myrsync.sh /root


Then type:



sudo crontab -e


You'll see that crontab entries use the fields: minute hour day-of-month month day-of-week command



Under that, type:
0 22 * * * /root/myrsync.sh > $HOME/readme.log 2>&1



This means:



The minute (0 to 59)
The hour in 24-hour format (0 to 23)
The day of the month (1 to 31)
The month (1 to 12)
The day of the week (0 or 7 is Sunday, or use the name)
The command to run
So at 22:00 (10pm) every day root will run the shell script, without prompting you for a sudo password (because it's running as root already).


Now save and exit the editor (if it opens in nano: press Ctrl-X, then Y, then Enter).



To delete older backups, one approach is to create a file with a timestamp in it. For example, add the following command after the rsync command:



date +%Y%m%d_%T >> time.txt


Then use the find command to delete backups not newer than that timestamp file, e.g.:



find . -type f ! -newer time.txt -delete


Or



find . ! -newermt "$date" ! -type d -delete

where $date holds a date string such as 2018-09-21 10:00:00.


This will delete backups before a specific date/time.
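The timestamp-file approach can be demonstrated end to end. A minimal sketch in a throwaway directory (the filenames and the GNU touch -d ages are illustrative): files not newer than the stamp are deleted, newer ones survive.

```shell
WORK=$(mktemp -d)
mkdir -p "$WORK/backups"

# An "old" backup (2 days ago), a cutoff stamp (1 day ago), a fresh backup.
echo old > "$WORK/backups/old.log"
touch -d "2 days ago" "$WORK/backups/old.log"
touch -d "1 day ago" "$WORK/stamp"
echo new > "$WORK/backups/new.log"

# ! -newer matches files whose mtime is NOT after the stamp's,
# i.e. everything at or before the cutoff; -delete removes them.
find "$WORK/backups" -type f ! -newer "$WORK/stamp" -delete

ls "$WORK/backups"   # prints only new.log
```

Scheduled from cron after each rsync run, this keeps the backup location pruned to a fixed age window.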




























  • How do you age-off old backups? Like weekly1 to weekly2 or hourly1 to hourly2?
    – Seth
    53 mins ago










  • You don't have. It will create new folder at each sync and it will capture day/month/year and time in the newly created backup folder. See $(date +%Y%m%d) in the code above. the command will sync from /home to /media/HomeBackup_$(date +%Y%m%d) and $(date +%Y%m%d) will give new folder name with every sync.
    – Goro
    51 mins ago











  • Then you need a separate cron job to delete backups older than a set amount of time? Or modify the shell script to delete the oldest backup once the rsync command completes successfully?
    – Seth
    50 mins ago











  • For the hourly backups, I'd like to keep the last three hours (hourly1, hourly2, hourly3) and delete any hourly older than 3. If the user logs out, then pause the hourly backups (so that I really have a backup of the last three hours that the user was active). For weekly, keep the last 2 weeks (weekly1, weekly2) and delete any weekly backups older than 2. Run weekly backups once a day meaning that the weekly1 backup is what was in the users's home folder 7 days ago. If no files have changed since the last backup, then I don't want to create a new backup or delete any of the old ones.
    – Seth
    37 mins ago










  • This needs a lot of coding. but one way of doing this is to create a file with the time-stamp in it. e.g date +%Y%m%d_%T >> /home/your-username/Desktop/time.txt then run the command find find . -type f ! -newer /home/your-username/Desktop/time.txt -delete
    – Goro
    4 mins ago










Your Answer







StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "106"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);

else
createEditor();

);

function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
convertImagesToLinks: false,
noModals: false,
showLowRepImageUploadWarning: true,
reputationToPostImages: null,
bindNavPrevention: true,
postfix: "",
onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);



);






Seth is a new contributor. Be nice, and check out our Code of Conduct.









 

draft saved


draft discarded


















StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2funix.stackexchange.com%2fquestions%2f469864%2fscheduled-folder-backup%23new-answer', 'question_page');

);

Post as a guest






























1 Answer
1






active

oldest

votes








1 Answer
1






active

oldest

votes









active

oldest

votes






active

oldest

votes








up vote
5
down vote













How about this guide:



1) create your script: Create new file and call it myrsync.sh, copy/paste the lines below:



 #!/bin/bash
sudo rsync -av --progress --delete --log-file=/home/your-username/Desktop/$(date +%Y%m%d)_rsync.log --exclude "/home/your-username/.folder" /home/data /media/dataBackup_$(date +%Y%m%d_%T)


Meaning of the flags:



 -av bit: 'a' means archive, or copy everything recursively, preserving things like permissions, ownership and time stamps. 
-'v' is verbose, so it tells you what its doing, either in the terminal, in this case, in the log file.
--progress gives you more specific info about progress.
--delete checks for changes between source and destination, and deletes any files at the destination that you've deleted at the source.
--log-file saves a copy of the rsync result to a date-stamped file on my desktop.
--exclude leaves out any files or directories you don't want copied. In the command above, the .folder directory

/home/data is the directory I want copied. /home/data copies the directory and its contents, /home/data would just copy the contents.

/media/dataBackup is the separate drive. Change this to whatever your backup location is.


2) Save myrsync.sh in your ~$HOME and make it executable by typing:



sudo chmod +x /home/your-username/Desktop/rsync-shell.sh


You can now double click that .sh file, choose Run in Terminal, it will ask you for your password and run, then leave a log file on your desktop. Or, you can make a cron job to do it for you!



3) The cron job



Copy your myrsync.sh file to /root by typing:



sudo cp /home/your-username/Desktop/myrsync.sh /root


Then type:



sudo crontab -e


You'll see a line which reads: minute hour day month year command



Under that, type:
0 22 * * * /root/myrsync.sh > $HOME/readme.log 2>&1



This means:



The hour in military time (24 hour) format (0 to 23)
The day of the month (1 to 31)
The month (1 to 12)
The day of the week(0 or 7 is Sun, or use name)
The command to run
So at 22:00 (10pm) every day root will run the shell script, without prompting you for sudo password (because its running as root already).


Now press Control-X, then type "Y", then press Enter



In order to delete older back ups, one way of doing this is to create a file with the time-stamp in it For example add the following command after the command rsync



date +%Y%m%d_%T >> time.txt


Use the command find to delete backups that matches the time stamp e.g:



find . -type f ! -newer /tmp/timestamp -delete


Or



find . ! -newermt $date ! -type d -delete


This will delete back ups before specific date/time






share|improve this answer






















  • How do you age-off old backups? Like weekly1 to weekly2 or hourly1 to hourly2?
    – Seth
    53 mins ago










  • You don't have. It will create new folder at each sync and it will capture day/month/year and time in the newly created backup folder. See $(date +%Y%m%d) in the code above. the command will sync from /home to /media/HomeBackup_$(date +%Y%m%d) and $(date +%Y%m%d) will give new folder name with every sync.
    – Goro
    51 mins ago











  • Then you need a separate cron job to delete backups older than a set amount of time? Or modify the shell script to delete the oldest backup once the rsync command completes successfully?
    – Seth
    50 mins ago











  • For the hourly backups, I'd like to keep the last three hours (hourly1, hourly2, hourly3) and delete any hourly older than 3. If the user logs out, then pause the hourly backups (so that I really have a backup of the last three hours that the user was active). For weekly, keep the last 2 weeks (weekly1, weekly2) and delete any weekly backups older than 2. Run weekly backups once a day meaning that the weekly1 backup is what was in the users's home folder 7 days ago. If no files have changed since the last backup, then I don't want to create a new backup or delete any of the old ones.
    – Seth
    37 mins ago










  • This needs a lot of coding. but one way of doing this is to create a file with the time-stamp in it. e.g date +%Y%m%d_%T >> /home/your-username/Desktop/time.txt then run the command find find . -type f ! -newer /home/your-username/Desktop/time.txt -delete
    – Goro
    4 mins ago














up vote
5
down vote













How about this guide:



1) create your script: Create new file and call it myrsync.sh, copy/paste the lines below:



 #!/bin/bash
sudo rsync -av --progress --delete --log-file=/home/your-username/Desktop/$(date +%Y%m%d)_rsync.log --exclude "/home/your-username/.folder" /home/data /media/dataBackup_$(date +%Y%m%d_%T)


Meaning of the flags:



 -av bit: 'a' means archive, or copy everything recursively, preserving things like permissions, ownership and time stamps. 
-'v' is verbose, so it tells you what its doing, either in the terminal, in this case, in the log file.
--progress gives you more specific info about progress.
--delete checks for changes between source and destination, and deletes any files at the destination that you've deleted at the source.
--log-file saves a copy of the rsync result to a date-stamped file on my desktop.
--exclude leaves out any files or directories you don't want copied. In the command above, the .folder directory

/home/data is the directory I want copied. /home/data copies the directory and its contents, /home/data would just copy the contents.

/media/dataBackup is the separate drive. Change this to whatever your backup location is.


2) Save myrsync.sh in your ~$HOME and make it executable by typing:



sudo chmod +x /home/your-username/Desktop/rsync-shell.sh


You can now double click that .sh file, choose Run in Terminal, it will ask you for your password and run, then leave a log file on your desktop. Or, you can make a cron job to do it for you!



3) The cron job



Copy your myrsync.sh file to /root by typing:



sudo cp /home/your-username/Desktop/myrsync.sh /root


Then type:



sudo crontab -e


You'll see a line which reads: minute hour day month year command



Under that, type:
0 22 * * * /root/myrsync.sh > $HOME/readme.log 2>&1



This means:



The hour in military time (24 hour) format (0 to 23)
The day of the month (1 to 31)
The month (1 to 12)
The day of the week(0 or 7 is Sun, or use name)
The command to run
So at 22:00 (10pm) every day root will run the shell script, without prompting you for sudo password (because its running as root already).


Now press Control-X, then type "Y", then press Enter



In order to delete older back ups, one way of doing this is to create a file with the time-stamp in it For example add the following command after the command rsync



date +%Y%m%d_%T >> time.txt


Use the command find to delete backups that matches the time stamp e.g:



find . -type f ! -newer /tmp/timestamp -delete


Or



find . ! -newermt $date ! -type d -delete


This will delete back ups before specific date/time






share|improve this answer






















  • How do you age-off old backups? Like weekly1 to weekly2 or hourly1 to hourly2?
    – Seth
    53 mins ago










  • You don't have. It will create new folder at each sync and it will capture day/month/year and time in the newly created backup folder. See $(date +%Y%m%d) in the code above. the command will sync from /home to /media/HomeBackup_$(date +%Y%m%d) and $(date +%Y%m%d) will give new folder name with every sync.
    – Goro
    51 mins ago











  • Then you need a separate cron job to delete backups older than a set amount of time? Or modify the shell script to delete the oldest backup once the rsync command completes successfully?
    – Seth
    50 mins ago











  • For the hourly backups, I'd like to keep the last three hours (hourly1, hourly2, hourly3) and delete any hourly older than 3. If the user logs out, then pause the hourly backups (so that I really have a backup of the last three hours that the user was active). For weekly, keep the last 2 weeks (weekly1, weekly2) and delete any weekly backups older than 2. Run weekly backups once a day meaning that the weekly1 backup is what was in the users's home folder 7 days ago. If no files have changed since the last backup, then I don't want to create a new backup or delete any of the old ones.
    – Seth
    37 mins ago










  • This needs a lot of coding. but one way of doing this is to create a file with the time-stamp in it. e.g date +%Y%m%d_%T >> /home/your-username/Desktop/time.txt then run the command find find . -type f ! -newer /home/your-username/Desktop/time.txt -delete
    – Goro
    4 mins ago












up vote
5
down vote










up vote
5
down vote









How about this guide:



1) create your script: Create new file and call it myrsync.sh, copy/paste the lines below:



 #!/bin/bash
sudo rsync -av --progress --delete --log-file=/home/your-username/Desktop/$(date +%Y%m%d)_rsync.log --exclude "/home/your-username/.folder" /home/data /media/dataBackup_$(date +%Y%m%d_%T)


Meaning of the flags:



 -av bit: 'a' means archive, or copy everything recursively, preserving things like permissions, ownership and time stamps. 
-'v' is verbose, so it tells you what its doing, either in the terminal, in this case, in the log file.
--progress gives you more specific info about progress.
--delete checks for changes between source and destination, and deletes any files at the destination that you've deleted at the source.
--log-file saves a copy of the rsync result to a date-stamped file on my desktop.
--exclude leaves out any files or directories you don't want copied. In the command above, the .folder directory

/home/data is the directory I want copied. /home/data copies the directory and its contents, /home/data would just copy the contents.

/media/dataBackup is the separate drive. Change this to whatever your backup location is.


2) Save myrsync.sh in your ~$HOME and make it executable by typing:



sudo chmod +x /home/your-username/Desktop/rsync-shell.sh


You can now double click that .sh file, choose Run in Terminal, it will ask you for your password and run, then leave a log file on your desktop. Or, you can make a cron job to do it for you!



3) The cron job



Copy your myrsync.sh file to /root by typing:



sudo cp /home/your-username/Desktop/myrsync.sh /root


Then type:



sudo crontab -e


You'll see a line which reads: minute hour day month year command



Under that, type:
0 22 * * * /root/myrsync.sh > $HOME/readme.log 2>&1



This means:



The hour in military time (24 hour) format (0 to 23)
The day of the month (1 to 31)
The month (1 to 12)
The day of the week(0 or 7 is Sun, or use name)
The command to run
So at 22:00 (10pm) every day root will run the shell script, without prompting you for sudo password (because its running as root already).


Now press Control-X, then type "Y", then press Enter



In order to delete older back ups, one way of doing this is to create a file with the time-stamp in it For example add the following command after the command rsync



date +%Y%m%d_%T >> time.txt


Use the command find to delete backups that matches the time stamp e.g:



find . -type f ! -newer /tmp/timestamp -delete


Or



find . ! -newermt $date ! -type d -delete


This will delete back ups before specific date/time






share|improve this answer














How about this guide:



1) create your script: Create new file and call it myrsync.sh, copy/paste the lines below:



 #!/bin/bash
sudo rsync -av --progress --delete --log-file=/home/your-username/Desktop/$(date +%Y%m%d)_rsync.log --exclude "/home/your-username/.folder" /home/data /media/dataBackup_$(date +%Y%m%d_%T)


Meaning of the flags:



 -av bit: 'a' means archive, or copy everything recursively, preserving things like permissions, ownership and time stamps. 
-'v' is verbose, so it tells you what its doing, either in the terminal, in this case, in the log file.
--progress gives you more specific info about progress.
--delete checks for changes between source and destination, and deletes any files at the destination that you've deleted at the source.
--log-file saves a copy of the rsync result to a date-stamped file on my desktop.
--exclude leaves out any files or directories you don't want copied. In the command above, the .folder directory

/home/data is the directory I want copied. /home/data copies the directory and its contents, /home/data would just copy the contents.

/media/dataBackup is the separate drive. Change this to whatever your backup location is.


2) Save myrsync.sh in your ~$HOME and make it executable by typing:



sudo chmod +x /home/your-username/Desktop/rsync-shell.sh


You can now double click that .sh file, choose Run in Terminal, it will ask you for your password and run, then leave a log file on your desktop. Or, you can make a cron job to do it for you!



3) The cron job



Copy your myrsync.sh file to /root by typing:



sudo cp /home/your-username/Desktop/myrsync.sh /root


Then type:



sudo crontab -e


You'll see a line which reads: minute hour day month year command



Under that, type:
0 22 * * * /root/myrsync.sh > $HOME/readme.log 2>&1



This means:



The hour in military time (24 hour) format (0 to 23)
The day of the month (1 to 31)
The month (1 to 12)
The day of the week(0 or 7 is Sun, or use name)
The command to run
So at 22:00 (10pm) every day root will run the shell script, without prompting you for sudo password (because its running as root already).


Now press Control-X, then type "Y", then press Enter



In order to delete older back ups, one way of doing this is to create a file with the time-stamp in it For example add the following command after the command rsync



date +%Y%m%d_%T >> time.txt


Then use the find command to delete backups that are not newer than the time-stamp file, e.g.:



find . -type f ! -newer time.txt -delete


Or



find . ! -newermt "$date" ! -type d -delete


This will delete backups made before a specific date/time.
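A self-contained sketch of this timestamp approach; the file names here are throwaway placeholders, and GNU touch -d is assumed (standard on CentOS) to simulate backups of different ages:

```shell
#!/bin/sh
# Prune backups that are not newer than a reference time stamp.
# Everything runs in temporary directories, so it is safe to try anywhere.
set -e
backups=$(mktemp -d)

# Simulate one stale backup (2 days old) and one fresh backup.
touch -d "2 days ago" "$backups/hourly_old"
touch "$backups/hourly_new"

# Reference time stamp: anything not newer than this file is deleted.
stamp=$(mktemp)
touch -d "1 day ago" "$stamp"

find "$backups" -type f ! -newer "$stamp" -delete

ls "$backups"
```

Note that ! -newer keeps only files modified after the reference file, so hourly_old is removed while hourly_new survives. In a real setup the reference file would be the time.txt written by the backup script, so take care that the fresh backup is written after the time stamp, or it will be pruned too.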







answered 59 mins ago – Goro
  • How do you age-off old backups? Like weekly1 to weekly2 or hourly1 to hourly2?
    – Seth
    53 mins ago










  • You don't have. It will create new folder at each sync and it will capture day/month/year and time in the newly created backup folder. See $(date +%Y%m%d) in the code above. the command will sync from /home to /media/HomeBackup_$(date +%Y%m%d) and $(date +%Y%m%d) will give new folder name with every sync.
    – Goro
    51 mins ago











  • Then you need a separate cron job to delete backups older than a set amount of time? Or modify the shell script to delete the oldest backup once the rsync command completes successfully?
    – Seth
    50 mins ago











  • For the hourly backups, I'd like to keep the last three hours (hourly1, hourly2, hourly3) and delete any hourly older than 3. If the user logs out, then pause the hourly backups (so that I really have a backup of the last three hours that the user was active). For weekly, keep the last 2 weeks (weekly1, weekly2) and delete any weekly backups older than 2. Run weekly backups once a day meaning that the weekly1 backup is what was in the users's home folder 7 days ago. If no files have changed since the last backup, then I don't want to create a new backup or delete any of the old ones.
    – Seth
    37 mins ago










  • This needs a lot of coding. but one way of doing this is to create a file with the time-stamp in it. e.g date +%Y%m%d_%T >> /home/your-username/Desktop/time.txt then run the command find find . -type f ! -newer /home/your-username/Desktop/time.txt -delete
    – Goro
    4 mins ago
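The age-off scheme discussed in these comments (hourly1 → hourly2 → hourly3, dropping the oldest) can also be handled by rotating the directories before each rsync run. A minimal sketch, using a temporary directory as a stand-in for ~/.snapshot and the retention count of 3 from the comment thread:

```shell
#!/bin/sh
# Rotate numbered snapshot directories before taking a new backup:
# hourly3 is discarded, hourly2 becomes hourly3, hourly1 becomes
# hourly2, leaving the hourly1 slot free for the next rsync run.
set -e
root=$(mktemp -d)    # stand-in for ~/.snapshot
keep=3

mkdir -p "$root/hourly1" "$root/hourly2" "$root/hourly3"

rm -rf "$root/hourly$keep"        # drop the oldest snapshot
i=$((keep - 1))
while [ "$i" -ge 1 ]; do
    if [ -d "$root/hourly$i" ]; then
        mv "$root/hourly$i" "$root/hourly$((i + 1))"
    fi
    i=$((i - 1))
done

ls "$root"
```

After the rotation, rsync would write the fresh copy into hourly1; running this in the same cron-driven script as the rsync command keeps the rotation and the sync in lockstep.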