From a conversation I had with a SharePoint Administrator:
“Eeee. The client wants site-by-site backup – that’s going to take me a while to sort out. I need to back up 500 sites via the command line, but I want to know how big the sites are. I also need to know how long each backup took. I know SharePoint has PowerShell but I simply don’t have the time to write a script – also, I only want to back up sites which have changed either today or this week, or just back them all up”.
Sounds like a Challenge! Time for a Laff! Time for a new SharePoint automation application, in steps GEPBACKUP…
Ok, tall order, but GEPBACKUP2010 and GEPBACKUP2013 (two routines for different SharePoint versions) are console commands that run on your SharePoint server. Each generates a PowerShell script based on a number of triggers you give it to control what to back up and where to store the backups. The key thing, though, is that it also records key information about each site, along with the time taken to carry out the backup. Additionally, for those monitoring kings, it logs backups to the Windows Application log.
Note. Those of you who have downloaded GEMBACKUP for SharePoint 2007 will be at home with this command, but that version has nowhere near the same level of features as this one. Additionally, this is a 100% rewrite, specifically adding a lot more and targeting PowerShell, not STSADM.
How Does It Work
The command line looks like this:
For SharePoint 2010 -> GEPBACKUP2010 [-u URL of site to check] [-d] [-w] [-a] [-l \\BackupDirectoryShare] [-s PowerShellScriptFile] [-[f/x/w] BackupReportFile]
For SharePoint 2013 -> GEPBACKUP2013 [-u URL of site to check] [-d] [-w] [-a] [-l \\BackupDirectoryShare] [-s PowerShellScriptFile] [-[f/x/w] BackupReportFile]
Ok, let’s go through the switches in turn:
-u
This is the URL of the site. Use this switch if you wish to target a specific site to back up instead of everything (i.e. all sites). It is very useful if you wish to back up a group of sites in a site collection but only want to target changed sites in that site collection. If you do not use this switch, all web applications and site collections will be included (i.e. will be checked).
-d
When this switch is used, only sites whose content has been changed today will be included in the backup. Note that, as detailed for the previous switch, if -u is not used all sites will be checked against this switch.
-w
When this switch is used, only sites whose content has been changed in the last week will be included in the backup. Note that, as detailed for the previous switch, if -u is not used all sites will be checked against this switch.
-a
When this switch is used, all sites will be included in the backup and will not be subject to a date comparison check – i.e. the date of the last content change will be ignored.
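To give an idea of the selection logic behind -d and -w, the generated script effectively filters sites on their last content change date. Here is a minimal PowerShell sketch of that idea – Get-SPSite and the LastContentModifiedDate property are real SharePoint objects, but the exact filtering GEPBACKUP performs internally is an assumption on my part:

```powershell
# Requires the SharePoint snap-in (run in the SharePoint Management Shell,
# or add it manually as below).
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# -d equivalent: sites whose content changed today
$changedToday = Get-SPSite -Limit All |
    Where-Object { $_.LastContentModifiedDate -ge (Get-Date).Date }

# -w equivalent: sites whose content changed in the last 7 days
$changedThisWeek = Get-SPSite -Limit All |
    Where-Object { $_.LastContentModifiedDate -ge (Get-Date).AddDays(-7) }

$changedToday | ForEach-Object { $_.Url }
```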
-l
When this switch is used, backups will be stored in the relevant share or drive/folder. The application makes no check to identify whether the target is valid; however, if nothing is entered as the directory, an error will be thrown and the utility will not proceed.
-s
When this switch is used, the script will be generated with the PowerShell script filename submitted. If this switch is used and no PowerShell script filename is entered, an error will be thrown and the utility will not proceed.
-f / -x / -w
When one of these switches is used, the output report file will be generated as a text file (-f), XML (-x) or HTML (-w). Note that if no filename is supplied with the switch, an error will be thrown and the utility will not proceed.
GEPBackup2010 and GEPBackup2013 have the ability to ignore specific sites. All you need to do is list the URLs of the sites you don’t want backed up in a file called EXCLUSIONS.TXT stored in the same directory. When run, the routine will check for the existence of EXCLUSIONS.TXT and, if found, will check each URL being scanned against the URLs in EXCLUSIONS.TXT. If there is a match, the URL being scanned will not be backed up.
This is very useful if you want to back up some sites in a site collection but not all of them.
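As an illustration, an EXCLUSIONS.TXT might look like this (the URLs are invented examples, and one URL per line is an assumption, since the exact layout of the file isn’t specified here):

```
http://myspsportal/sites/archive
http://myspsportal/sites/testsite
```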
Examples of this command being used are as follows (based on the 2010 version; the 2013 examples are identical, simply swap GEPBACKUP2010 for GEPBACKUP2013 wherever you see it below):
This will create a PowerShell backup script called GEPBACKUP2010.ps1 containing commands to back up each site whose content has been changed today. The output report will be called GEPBACKUP2010.txt and the backup files created by the script will be stored in the same directory the command was run in.
GEPBACKUP2010 -a -f mybackups.txt -l e:\backups -s mybackupscript.ps1
This will create a PowerShell backup script called mybackupscript.ps1 containing commands to back up every site, regardless of when its content was last changed (the -a switch). The report output will be sent to mybackups.txt and the backups created by the script will be stored in the directory e:\backups.
GEPBACKUP2010 -u http://myspsportal -d -f mybackups.txt -l e:\backups -s mybackupscript.ps1
This will create a PowerShell backup script called mybackupscript.ps1 containing commands to back up each site from the web application http://myspsportal whose content has been changed today. The report output will be sent to mybackups.txt and the backups created by the script will be stored in the directory e:\backups.
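To give a feel for what the generated script contains, each included site becomes a Backup-SPSite command. Backup-SPSite with -Identity, -Path and -Force is real SharePoint cmdlet syntax; the exact lines GEPBACKUP emits, including the # file-naming scheme, are a sketch based on the report example later in this article:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# One line like this is generated per site included in the backup.
# -Force overwrites any existing backup file of the same name.
Backup-SPSite -Identity "http://myspsportal/sites/myhomesite" `
    -Path "e:\backups\myspsportal#sites#myhomesite.spb" -Force
```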
How to Use
The routine generates a PowerShell script, and that script is run via a batch file to make things easier. When run, the routine creates the batch file, which targets the PowerShell script. The batch file has one key entry in it, as follows:
Powershell -File thescriptfiletorun.ps1
This line kicks off the PowerShell script created by the routine.
Now, this batch file can be run through the Windows Task Scheduler, or manually – the option is yours.
For example, if you want backups to be generated every day, all you need to do is ensure that the batch file GEPBACKUPRUN.BAT is in the Windows Task Scheduler and set to run at a quiet period of the day (e.g. 1am).
So the process is:
1: Create a batch file with the routine and enter the relevant switches.
2: If you wish to exclude certain sites, ensure they are listed in EXCLUSIONS.TXT (note that only top-level sites will be checked against those listed in the EXCLUSIONS.TXT file).
3: Run the batch file (created in step 1) to execute the routine – this will create another batch file called GEPBACKUPRUN.BAT.
4: Put both of these batch files into the Windows Task Scheduler, with the first set to run, say, half an hour before the second. This gives the first batch file time to execute the application, which scans all the relevant sites as per the switches you entered in the first step. Make sure that these batch files run at a quiet time, as the operation is memory intensive and each site is locked as a precaution to prevent modification of the data.
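The scheduling step can itself be scripted with schtasks, the standard Windows command-line scheduler. The task names, times and paths below are example assumptions – substitute your own:

```
schtasks /Create /TN "GEPBackup-Generate" /TR "c:\gepbackup\rungep.bat" /SC DAILY /ST 00:30
schtasks /Create /TN "GEPBackup-Run" /TR "c:\gepbackup\GEPBACKUPRUN.BAT" /SC DAILY /ST 01:00
```

This creates the script-generation task at 00:30 and the backup run half an hour later at 01:00, matching the process above.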
Note that the backup is designed to overwrite any same-named site backup in the folder, since all commands in the script use -Force. If you wish to alter this behaviour, you will have to modify the PowerShell script.
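For example, one way to keep a history instead of overwriting would be to edit the generated script to include a timestamp in the path – this edited line is an illustration, not part of the tool’s output:

```powershell
# Including a timestamp in the path means nothing is overwritten
# (and -Force becomes unnecessary; without it, Backup-SPSite will
# refuse to overwrite an existing file).
$stamp = Get-Date -Format "yyyyMMdd-HHmm"
Backup-SPSite -Identity "http://myspsportal/sites/myhomesite" `
    -Path "e:\backups\myhomesite-$stamp.spb"
```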
There are three kinds of monitoring. The first two are output files from the routine. The third comes from the PowerShell script – logs are generated during the backup and echoed into the Windows Application event log.
Let’s discuss the report files:
1: -f / -w / -x (The Report File)
When any of the above switches are used, GEPBACKUP2010 creates the report output as a text file, HTML or XML.
An example of the contents of this file is given below:
Capture Started at: 12/10/2010 at 13:32 (UTC)
BACKUP WEB APPLICATION SCAN
Web Application Name: MySharePointWebApp – 9991
SITE COLLECTION URL: http://mysharepointwebapp:9991/sites/myhomesite
Storage Usage: 2287kb / 2mb / 0gb
Storage Quota: 104857600kb / 102400mb / 100gb
Content Database: PORTAL1_SITE
Content Last Updated: 01/08/2010 at 15:00 (UTC)
Security Last Modified: 01/08/2010 at 14:55 (UTC)
This Site Will Be Backed Up To: \\mynetworkshare\SharePointBackups\sps2010sitebackups\mysharepointwebapp:9991#sites#myhomesite.spb
Next Site details etc….
As you can see from the above, each site included in the backup gets information displayed against its URL: the storage being used, the quota assigned to it, its content database, who created the site, bandwidths, hits and visits. The key fields for the backup are Content Last Updated and Security Last Modified – these determine whether the site was last changed today or in the last week.
Finally, if the site is going to be backed up, the target path and filename of the backup file are shown.
2: The LOG file.
The log file shows which sites were scanned and any errors encountered. The log file is called GEPBACKUP2010.LOG or GEPBACKUP2013.LOG.
3: Windows Application Events
Load the Event Viewer on the server after running the PowerShell script that backs up the relevant sites. In it there will be events listed against the source GEPBackup2010 or GEPBackup2013.
The EventID for all of these Windows Application events is 65000. Each event shows the start and the completion time of each backup. Additionally, the completion event shows the estimated time taken to carry out the backup, in hours, minutes, seconds and milliseconds. This is extremely useful in determining backup windows for each of the site backups.
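You can also pull these events from the command line rather than the Event Viewer GUI. Get-EventLog is a standard PowerShell cmdlet, and the source name below matches the one the tool logs under (swap in GEPBackup2013 for the 2013 version):

```powershell
# List GEPBackup events (EventID 65000) from the Application log,
# showing when each backup started and completed.
Get-EventLog -LogName Application -Source "GEPBackup2010" |
    Where-Object { $_.EventID -eq 65000 } |
    Select-Object TimeGenerated, Message |
    Format-Table -AutoSize
```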
The only problems you are likely to hit running this command are to do with permissions. If you do not have, at the very least, owner rights to the site and read access to its content database, the command will fail. If you do not have sufficient rights to the backup share, it will also fail.
Other issues related to backup are concerned with how the PowerShell is defined. A good article explaining the format of PowerShell for backing up SPSites is here and should be read thoroughly before using PowerShell:
This tool is free, but like all my tools on this site, it’s great to get support through a donation!