Scheduled folder backup
I'm looking for how to automatically back up a user's home directory in CentOS 7 to a remote host, a NAS, or just to ~/.snapshot.
In some Linux setups, I have seen a .snapshot folder in the user's home directory (~/.snapshot/) that holds hourly, nightly, and weekly backups of their home directory (e.g. ~/.snapshot/weekly1 for a copy of what was in the user's home directory one week ago).
I've seen several related posts on Stack Overflow, but so far I haven't seen a guide that explains the complete workflow.
This is what I know so far:
- Use rsync to copy the contents of a given folder to the remote host, NAS, or ~/.snapshot/hourly0 (see the sketch after this list for the remote-host form)
- Create a shell script to execute the rsync command:
#!/bin/bash
sudo rsync -av --progress --delete --log-file=/home/username/$(date +%Y%m%d)_rsync.log --exclude=".snapshot/" /home/username/ /home/username/.snapshot/hourly0
- Change the permissions on the script to make it executable:
sudo chmod +x /home/username/myscript.sh
- Use crontab to schedule the rsync command at the desired backup interval
- Somehow move hourly0 to hourly1 before running the scheduled hourly rsync
- Delete the oldest backup once rsync completes successfully
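For the remote-host option in the first step, the rsync invocation I have in mind would look roughly like the sketch below; backupuser, backupserver, and the destination path are placeholders rather than a real setup:

# Push the contents of the home directory to a remote host over SSH
# (user, host, and remote path are placeholders).
rsync -av --delete --exclude=".snapshot/" \
      /home/username/ backupuser@backupserver:/backups/username/hourly0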
Are there any guides that cover how to do this? I don't understand how to automatically rename the folders as time goes on (e.g. weekly1 to weekly2), or how to delete weekly10 if I decide to only keep weeks up to 9. Is this another cron job?
shell-script cron rsync
Welcome to U&L! We are happy to assist, but we aren't a scriptwriting service. Please show what you've tried, and explain how it did not work as you expected or intended, and we'll be happy to help guide you.
– DopeGhoti
1 hour ago
Would you please clarify what you would like to delete: an arbitrary deletion of the backups, or just backups older than a specific time?
– Goro
45 mins ago
Er, backing up a user's /home into that directory doesn't seem like an actual backup...
– jasonwryan
29 mins ago
@jasonwryan: Why not? Taking a snapshot of the contents of a user's home directory at scheduled times seems like the definition of a backup to me.
– Seth
5 mins ago
1 Answer
How about this guide:
1) Create your script: create a new file, call it myrsync.sh, and copy/paste the lines below:
#!/bin/bash
sudo rsync -av --progress --delete --log-file=/home/your-username/Desktop/$(date +%Y%m%d)_rsync.log --exclude "/home/your-username/.folder" /home/data /media/dataBackup_$(date +%Y%m%d_%T)
Meaning of the flags:
-a means archive: copy everything recursively, preserving things like permissions, ownership, and time stamps.
-v is verbose, so it tells you what it's doing, either in the terminal or, in this case, in the log file.
--progress gives you more specific info about progress.
--delete checks for changes between source and destination, and deletes any files at the destination that you've deleted at the source.
--log-file saves a copy of the rsync result to a date-stamped file on my desktop.
--exclude leaves out any files or directories you don't want copied; in the command above, that's the .folder directory.
/home/data is the directory I want copied. /home/data copies the directory and its contents; /home/data/ would copy just the contents.
/media/dataBackup is the separate drive. Change this to whatever your backup location is.
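For the layout asked about here (backing up /home/username into ~/.snapshot), a minimal adaptation of the command above might look like the sketch below; the username, log path, and snapshot directory are placeholders, so adjust them to your setup:

#!/bin/bash
# Sketch: back up a user's home directory into its own .snapshot folder.
SRC="/home/username/"                          # trailing slash: copy the contents, not the folder
DEST="/home/username/.snapshot/hourly0"        # the most recent hourly snapshot
LOG="/home/username/$(date +%Y%m%d)_rsync.log"

# Exclude .snapshot itself so the backup does not recurse into older backups.
rsync -av --progress --delete \
      --log-file="$LOG" \
      --exclude=".snapshot/" \
      "$SRC" "$DEST"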
2) Save myrsync.sh on your Desktop (or anywhere in your $HOME) and make it executable by typing:
sudo chmod +x /home/your-username/Desktop/myrsync.sh
You can now double-click that .sh file and choose Run in Terminal; it will ask you for your password, run, and then leave a log file on your desktop. Or, you can make a cron job to do it for you!
3) The cron job
Copy your myrsync.sh file to /root by typing:
sudo cp /home/your-username/Desktop/myrsync.sh /root
Then type:
sudo crontab -e
You'll see a line which reads: minute hour day-of-month month day-of-week command
Under that, type:
0 22 * * * /root/myrsync.sh > $HOME/readme.log 2>&1
This means:
The minute (0 to 59)
The hour in military time (24-hour) format (0 to 23)
The day of the month (1 to 31)
The month (1 to 12)
The day of the week (0 or 7 is Sun, or use the name)
The command to run
So at 22:00 (10 pm) every day, root will run the shell script without prompting you for a sudo password (because it's running as root already).
Now press Control-X, then type "Y", then press Enter
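If you want the hourly and weekly snapshots described in the question instead of a single nightly run, the crontab entries could look like the sketch below; the two script names are placeholders and assume you split the hourly and weekly logic into separate scripts:

# Hourly snapshot at the top of every hour
0 * * * * /root/myrsync-hourly.sh > /root/rsync-hourly.log 2>&1
# Weekly-style snapshot once a day at 22:00
0 22 * * * /root/myrsync-weekly.sh > /root/rsync-weekly.log 2>&1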
In order to delete older backups, one way of doing this is to create a file with a timestamp in it. For example, add the following command after the rsync command:
date +%Y%m%d_%T >> time.txt
Use the find command to delete backups older than that timestamp, e.g.:
find . -type f ! -newer time.txt -delete
Or:
find . ! -newermt "$date" ! -type d -delete
This will delete backups from before a specific date/time.
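If the goal is simply to keep the last N days of date-stamped backup folders rather than comparing against a marker file, a time-based prune is another option; the sketch below assumes backup directories named dataBackup_* directly under /media and a seven-day retention, neither of which is specified above:

# Remove date-stamped backup directories under /media older than 7 days.
# -maxdepth 1 stops find from descending into the backups themselves.
find /media -maxdepth 1 -type d -name 'dataBackup_*' -mtime +7 -exec rm -rf {} +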
How do you age off old backups? Like weekly1 to weekly2, or hourly1 to hourly2?
– Seth
53 mins ago
You don't have to. It will create a new folder at each sync, and it will capture the day/month/year and time in the newly created backup folder. See $(date +%Y%m%d) in the code above. The command will sync from /home to /media/HomeBackup_$(date +%Y%m%d), and $(date +%Y%m%d) will give a new folder name with every sync.
– Goro
51 mins ago
Then you need a separate cron job to delete backups older than a set amount of time? Or modify the shell script to delete the oldest backup once the rsync command completes successfully?
– Seth
50 mins ago
For the hourly backups, I'd like to keep the last three hours (hourly1, hourly2, hourly3) and delete any hourly older than 3. If the user logs out, then pause the hourly backups (so that I really have a backup of the last three hours that the user was active). For weekly, keep the last 2 weeks (weekly1, weekly2) and delete any weekly backups older than 2. Run weekly backups once a day, meaning that the weekly1 backup is what was in the user's home folder 7 days ago. If no files have changed since the last backup, then I don't want to create a new backup or delete any of the old ones.
– Seth
37 mins ago
This needs a lot of coding, but one way of doing this is to create a file with the timestamp in it, e.g. date +%Y%m%d_%T >> /home/your-username/Desktop/time.txt, then run find: find . -type f ! -newer /home/your-username/Desktop/time.txt -delete
– Goro
4 mins ago
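For the numbered hourly0/hourly1/... rotation discussed in the question and in these comments, one approach is to shift the directories just before each run; the snippet below is only a sketch, with the retention depth and paths taken from the comment above rather than from the answer itself:

#!/bin/bash
# Sketch: rotate numbered hourly snapshots before taking a fresh hourly0;
# adjust the depth to match how many snapshots you want to keep.
SNAP="/home/username/.snapshot"

rm -rf "$SNAP/hourly3"                                        # drop the oldest snapshot
[ -d "$SNAP/hourly2" ] && mv "$SNAP/hourly2" "$SNAP/hourly3"
[ -d "$SNAP/hourly1" ] && mv "$SNAP/hourly1" "$SNAP/hourly2"
[ -d "$SNAP/hourly0" ] && mv "$SNAP/hourly0" "$SNAP/hourly1"

# Take the new hourly0 snapshot, excluding the snapshot tree itself.
rsync -a --delete --exclude=".snapshot/" /home/username/ "$SNAP/hourly0"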