Who Is Ola Hallengren
Ola Hallengren is a DBA and database developer
who currently works for a large financial company in Sweden. He has been
working with SQL Server since 2001.
Back in 2007, Ola began a project to improve
the database maintenance within a large SQL Server environment. At that time, a
combination of database maintenance jobs created using the SQL Server
Maintenance Plan Wizard, along with a collection of custom Transact-SQL scripts,
was used throughout the organization. His goal was to replace this hodgepodge
collection of jobs with an easy-to-deploy solution that could be
rolled out across a large environment of mission-critical servers.
After the project was completed, Ola realized
that his solution would be useful to other DBAs, so on January 1, 2008,
he released the first version of his database maintenance script, offering it
free to any organization that wants to use it. Since then, he has improved the
script on a regular basis, and it is now used by some of the largest
organizations in the world.
The current version was released on July 29,
2012, and you can download it and read the documentation at his website, ola.hallengren.com.
Overview
Ola’s script is designed to cover the most
common database maintenance tasks, which include:
· The ability to perform full, differential, or log backups (as appropriate) of any system
and user databases, using either the native SQL Server backup command or a
third-party backup product such as SQL Backup or SQL HyperBac.
The script can also automatically remove older backup files from
disk that are no longer needed.
· The ability to perform dynamic index optimization by rebuilding indexes online or offline (depending
on the edition of SQL Server you have), reorganizing indexes, updating
statistics, or doing nothing at all, based on an index’s size and fragmentation
level.
· The ability to perform database integrity checks using DBCC CHECKDB, DBCC CHECKFILEGROUP, DBCC
CHECKTABLE, DBCC CHECKALLOC, and/or DBCC CHECKCATALOG.
· The ability to delete old job and backup history, including the removal of log files created by
maintenance jobs as they run.
· The ability to automatically create jobs that implement each of the above tasks. All you have
to do is decide when to run them and schedule them appropriately.
Alternatively, you can specify that jobs are not automatically created, so you
can create your own.
Each of the above features can be run with
default settings, or various parameters can be changed to customize how the
maintenance tasks run on your SQL Server instances. As each job is run, the
results are logged immediately to log files and/or log tables so you can see
exactly what maintenance is being performed (even as it is happening). In
addition, Ola’s script includes extensive error-checking to make it easier to
troubleshoot any potential problems.
How to Use Ola’s Maintenance Script: The Simple Explanation
If you are a novice DBA, or maybe you are just
in a hurry to get a maintenance plan running on your SQL Server instances, you
can quickly get Ola’s maintenance plan implemented. Here’s how:
1.
Download the MaintenanceSolution.sql script
from ola.hallengren.com.
2.
Near the top of the
script, around line 32, you will see a line of Transact-SQL code that looks
like this:
SET @BackupDirectory = N'C:\Backup' -- <== Change this to your backup directory.
Replace 'C:\Backup' with the pathname of the location where your
backups are to be stored.
3.
Run the script. At
this point, one table, one function, and four stored procedures are created in
the master database, along with eleven new jobs pre-created to perform all the
maintenance tasks described earlier.
4.
Manually schedule the
pre-configured jobs to run at appropriate times.
5.
You are now done.
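If you want to confirm that everything was created before you move on, a quick query of the master database should do it. This is just a sanity check I like to run; the object names below are the ones the script is documented to create, so adjust the list if your version differs:
SELECT name, type_desc
FROM master.sys.objects
WHERE name IN (N'CommandLog', N'CommandExecute', N'DatabaseBackup',
               N'DatabaseIntegrityCheck', N'IndexOptimize');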
As you can see, Ola’s maintenance tasks can be
very simple and quick to implement. But like most DBAs, I don’t like to run
scripts, even professionally written and polished scripts such as Ola’s,
without taking some time to fully understand how they work. In the next
section, we dissect Ola’s script so that we better understand how it works. At
the same time, we will learn how it can be customized for your specific
environment.
Creating the Maintenance Jobs (and Cleanup Jobs)
Assuming
that you specified at the beginning of the MaintenanceSolution.sql script that
you want it to create maintenance jobs, running the script will create a series
of eleven jobs, which include:
· CommandLog Cleanup
· DatabaseBackup - USER_DATABASES - FULL
· DatabaseBackup - USER_DATABASES - DIFF
· DatabaseBackup - USER_DATABASES - LOG
· DatabaseBackup - SYSTEM_DATABASES - FULL
· DatabaseIntegrityCheck - USER_DATABASES
· DatabaseIntegrityCheck - SYSTEM_DATABASES
· IndexOptimize - USER_DATABASES
· Output File Cleanup
· sp_delete_backuphistory
· sp_purge_jobhistory
All of
these jobs have been created with default parameters from the script, but you
can change the parameters, if you want to, directly from within the jobs
themselves. In addition, none of the jobs have been scheduled, because you, as
the DBA, must decide which jobs you want to run, and when, in order to minimize
the performance impact these jobs can have when they execute. While these
eleven jobs have been created, you may not need all of them,
depending on your SQL Server environment. This means that you may end up
modifying or deleting some of these jobs.
Before
we take a quick look at each of the jobs, we first need to briefly discuss how
Ola’s script creates jobs. Once we have this basic understanding, we will
be better prepared to understand how they execute.
All of
the jobs created by Ola’s script use the sqlcmd utility, which allows
Transact-SQL scripts to be executed from the command line. This makes it easy
to create and schedule your jobs. On the other hand, if you have not used the
sqlcmd utility before, it may take you a little time before you fully
understand how it works. Since this is not an article on how to use the sqlcmd
utility, I suggest you look it up in Books Online if you are not familiar with
it.
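If you just want a feel for the syntax before we dig in, a minimal sqlcmd command looks like this (MYSERVER is a placeholder for your own instance name); it makes a trusted connection to the server and runs a single query:
sqlcmd -E -S MYSERVER -d master -Q "SELECT @@SERVERNAME"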
Another
feature that Ola uses in his jobs is tokens, another topic that many DBAs may
not be familiar with. In short, a token is similar to an @@function in SQL
Server. When you place a pre-defined token inside a job step script, SQL Server
Agent replaces the token with its value at run time. For
example, the token $(ESCAPE_SQUOTE(JOBID)) will, at runtime, insert the value
of the jobid of the job into the script. To learn more about tokens, look up
“Using Tokens in Job Steps” in Books Online.
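As a quick illustration (this is a made-up job step, not one of Ola’s), the following Transact-SQL job step does nothing but print the values that SQL Server Agent substitutes for a few common tokens at run time:
-- A hypothetical job step body; SQL Server Agent replaces each token
-- with its actual value before the step runs.
PRINT 'Server:  $(ESCAPE_SQUOTE(SRVR))';
PRINT 'Job ID:  $(ESCAPE_SQUOTE(JOBID))';
PRINT 'Started: $(ESCAPE_SQUOTE(STRTDT)) $(ESCAPE_SQUOTE(STRTTM))';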
For the
moment, if you are not familiar with the sqlcmd utility or tokens, don’t worry,
as I will be showing an example of how they are used as I explain how the first
maintenance job works. While you won’t become an expert on using these from
this one example, you should learn enough from it to understand what is going
on.
Note: In
my first job example, I will spend extra time explaining how the sqlcmd utility
and tokens are used, along with an explanation of how the job works. Since all
the jobs are similar (except for the stored procedure name and the parameters
used), this will be the only job I explain in depth.
Let’s
take a quick look at each of the jobs and see how they work. Once you have done
so, you can decide if you want to use them as is, modify them, or ignore them
altogether and create your own custom jobs.
DatabaseBackup - USER_DATABASES - FULL
As you
can guess from the name, this job’s purpose is to create full
backups of user databases. Generally speaking, I like to make full backups of
all my databases nightly. This job includes a single job step, as you can see
below.
Each job created by the script creates a single step.
When we
click on the Edit button, we can view the job script for this particular job.
Every job created by the script is executed as an
operating system (CmdExec) job.
Each of
the jobs created by the MaintenanceSolution.sql script creates a single step,
and that step executes as an operating system (CmdExec) job. The command is too
small to read in figure 14 above, so here is the text of the command for this
job.
sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = 'USER_DATABASES', @Directory = N'C:\Backup', @BackupType = 'FULL', @Verify = 'Y', @CleanupTime = 24, @CheckSum = 'Y'" -b
The
above command combines both sqlcmd utility and job token syntax. Let’s break
down this command into its constituent parts.
sqlcmd: This is the
operating system command to start the sqlcmd utility.
-E: This is the
sqlcmd syntax that specifies that a trusted connection will be made to the
server.
-S: This is the
sqlcmd syntax that specifies the name of the server to connect to. Normally,
you would see the actual name of the server follow immediately after this
option, but in this case, there is a job token.
$(ESCAPE_SQUOTE(SRVR)): This job token, when the job script is
executed, will substitute in the name of the server. The “SRVR” portion of the
token is the clue to this.
-d: This is the
sqlcmd syntax that is used to refer to the database that is being connected to.
master: This is the
name of the database that is being connected to. Master is being connected to
because this is where the database maintenance stored procedures are located.
-Q: This is the
sqlcmd syntax that says that the following quoted content (Transact-SQL code)
is to be executed.
"EXECUTE
[dbo].[DatabaseBackup] @Databases = 'USER_DATABASES', @Directory =
N'C:\Backup', @BackupType = 'FULL', @Verify = 'Y', @CleanupTime = 24, @CheckSum
= 'Y'":This is the Transact-SQL command that is to be executed. Notice
that several of the 24 available parameters have been specified, but not all of
them. The parameters that are not specified will use their default values. If you want to change or add a
parameter to the stored procedure, this is where you would make this
modification to the Transact-SQL statement.
-b: This is the
sqlcmd option for “on error batch abort”. If an error occurs, sqlcmd exits,
returns to a command prompt, and returns an ERRORLEVEL value, which is
recorded in the log file. Note that a failure in a single operation does not
abort the stored procedure itself; for example, if the backup of one
database fails, the failure is recorded in the log file, and the
stored procedure continues to back up the remaining databases.
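For example, if you also wanted this job to compress its backups, the modified command might look like the following. This is only a sketch: @Compress is one of the stored procedure’s documented parameters, but verify it against the documentation for the version you are running.
sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = 'USER_DATABASES', @Directory = N'C:\Backup', @BackupType = 'FULL', @Verify = 'Y', @CleanupTime = 24, @CheckSum = 'Y', @Compress = 'Y'" -b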
At first
glance, this appears complicated. But once you spend some time reviewing this
code, and the code found in the other jobs, it should start to make sense.
Now that
the job code has been reviewed, we need to take a look at one more part of
the job, and that is the Output File option of the Advanced tab, which is shown
below.
The Output File option is where the maintenance job’s
log file goes.
The
output file for a job is important because it shows you exactly what the job
did (which I suggest you regularly review), which can be useful for
troubleshooting. I want to talk about it here because the script uses a lot of
tokens in the output file path. Since we can’t read the path in figure 14, here
is what it looks like.
C:\Program
Files\Microsoft SQL
Server\MSSQL10.MSSQLSERVER\MSSQL\Log\DatabaseBackup_$(ESCAPE_SQUOTE(JOBID))_$(ESCAPE_SQUOTE(STEPID))_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM)).txt
As with
the job code, this output path may seem overly complicated, but it is actually
quite clever, as it uses tokens to create a unique log filename for every job
that runs, which makes it very easy to find the log files of any jobs you are
interested in viewing.
Here’s a
breakdown of the output path’s constituent parts:
C:\Program
Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\Log\: This is the default location of the
SQL Server log files. When Ola’s script is run, it finds this
location and automatically inserts it here as the job is created.
DatabaseBackup_$(ESCAPE_SQUOTE(JOBID))_: Each
output file begins with the name of the job, which in this case is
“DatabaseBackup”. Next is a token for the job id, which becomes a part of the
output file’s name. An underscore “_” is also added to make the filename easier
to read.
$(ESCAPE_SQUOTE(STEPID))_:
Next, another token is used to include the step id as part of the output file’s
name. Another underscore is also included.
$(ESCAPE_SQUOTE(STRTDT))_:
Next, another token is used to include the date (YYYYMMDD format) as part of the
output file’s name. Another underscore is also included.
$(ESCAPE_SQUOTE(STRTTM)):
Next, a final token is used to include the time (HHMMSS format) as part of the
output file’s name.
.txt: And finally, an
extension of “.txt” is added to the output’s file name.
When the
above job is executed, the output path may produce a filename like this:
DatabaseBackup_0x524D34D2F9A6F94BA347806E52810D4_1_20100707_130527.txt
While
the job id and the step id are obscure, they help to make the filename unique.
When I look at output files, what I focus on is the “DatabaseBackup” part,
which tells me what job was run, and the date and time, so I know when it ran.
If you
plan to use this job, you still have one mandatory step you must take, and
several optional steps. The mandatory step is that you must schedule this job,
as this is not done for you. The optional steps are to set up Alerts and
Notifications, which I highly recommend, so that you are notified in case a job
fails. Of course, you may also want to modify the job in other ways by changing
the stored procedure’s parameters, but I’ll leave that up to you.
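If you prefer to script your schedules rather than click through Management Studio, something along these lines would give this job a nightly 1:00 AM schedule. This is a sketch using the msdb scheduling procedure; the @job_name must match the job name on your instance exactly.
EXECUTE msdb.dbo.sp_add_jobschedule
    @job_name = N'DatabaseBackup - USER_DATABASES - FULL',
    @name = N'Nightly full backup',
    @freq_type = 4,              -- daily
    @freq_interval = 1,          -- every 1 day
    @active_start_time = 010000; -- 01:00:00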
Of
course, you don’t have to use this job. As I mentioned earlier, I prefer to
create my own jobs, but I have used the jobs created by the script as a
template for my own jobs.
Now that
we have a reasonably good knowledge of how jobs created with the script work,
let’s take a brief look at the remaining jobs, but without all the detail.
CommandLog Cleanup
This job
deletes rows from the CommandLog table that are over 30 days old. If you want to
change the 30-day value, you can do so by directly editing the command line for
the job, changing it from 30 to whatever number of days works best for you. You
may choose to run this job daily, weekly, or monthly.
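Under the covers, the job’s command amounts to a simple delete with a date cutoff, something like the sketch below (check the job step on your instance for the exact text):
DELETE FROM [master].[dbo].[CommandLog]
WHERE StartTime < DATEADD(dd, -30, GETDATE());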
DatabaseBackup - USER_DATABASES - DIFF
This job
is designed to create differential backups of your user databases. If you do
create differential backups, be sure that you schedule this job to run after a
full backup of your databases, and schedule the backups appropriately. On the other
hand, if you don’t create differential backups, you can delete this job.
DatabaseBackup - USER_DATABASES - LOG
This job
is designed to create log backups of all user databases that use the full or
bulk logged recovery models. Be sure to schedule this job so that it runs after
a full backup. Generally speaking, I schedule my log backups to occur every
hour (or more often), depending on my needs.
DatabaseBackup - SYSTEM_DATABASES - FULL
This job
creates full backups of the system databases, and generally speaking, this job
should be scheduled to run nightly. There is no transaction log backup job
created by the script for system databases because they use the simple recovery
model, and transaction log backups cannot be taken of them. Some DBAs prefer to
create a single job to perform full backups of user and system databases at the
same time. You can do this if you want, but you will have to modify the jobs
accordingly, as the script does not provide this option.
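If you do decide to combine them, the change can be as simple as pointing one backup job at all databases. A sketch of such a modified command follows; ALL_DATABASES is a documented keyword for the @Databases parameter, but confirm it against the documentation for your version:
sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = 'ALL_DATABASES', @Directory = N'C:\Backup', @BackupType = 'FULL', @Verify = 'Y', @CleanupTime = 24, @CheckSum = 'Y'" -b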
DatabaseIntegrityCheck - USER_DATABASES
This job
performs a DBCC CHECKDB on all the user databases, looking for potential
database corruption. Be aware that this job can be time- and resource-consuming,
so schedule it during a less busy time of the day.
DatabaseIntegrityCheck - SYSTEM_DATABASES
This job
performs a DBCC CHECKDB on all the system databases, looking for potential
database corruption. Because the system databases are generally small, running
this job should not take long. Some DBAs prefer to create a single job that
runs DBCC CHECKDB on both user and system databases at the same time. You can
do this if you want, but you will have to modify the jobs accordingly, as the
script does not provide this option.
IndexOptimize - USER_DATABASES
This job
analyzes all of the indexes in all of your user databases and, based on the
size and fragmentation level of each index, determines whether the index should
be ignored, reorganized, or rebuilt. Of all the jobs created
by the script, this is the one that you may want to alter, changing some of the
IndexOptimize stored procedure parameters so that you have better control over how
this job runs. Of course, the default options will work well for most SQL
Server instances if you don’t have the time or knowledge to tweak the
parameters. Notice that there is no job created for index optimization of the
system databases. This is because index optimization for system databases is
seldom beneficial, although you can create a job to perform this task if you
like.
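To give you an idea of what such tuning looks like, here is a sketch of a customized call. The parameter names come from the script’s documentation, but treat the values as illustrative and confirm everything against the version you download:
EXECUTE [dbo].[IndexOptimize]
    @Databases = 'USER_DATABASES',
    @FragmentationLevel1 = 5,   -- reorganize indexes fragmented 5-30%
    @FragmentationLevel2 = 30,  -- rebuild indexes fragmented above 30%
    @FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @UpdateStatistics = 'ALL';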
Output File Cleanup
This job
simply deletes log files created by the various database maintenance jobs that
are over 30 days old. If you want to change the 30-day value, you can do so by
directly editing the command line for the job, changing it from 30 to whatever
number of days works best for you. You may choose to run this job daily,
weekly, or monthly.
sp_delete_backuphistory
This job
deletes backup history from the msdb database that is over 30 days old using
the system stored procedure sp_delete_backuphistory. If you want to change the
30-day value, you can do so by directly editing the command line for the job,
changing it from 30 to whatever number of days works best for you. You may
choose to run this job daily, weekly, or monthly.
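Behind the scenes, this amounts to calling the system procedure with a cutoff date. A minimal sketch of the equivalent Transact-SQL:
DECLARE @CleanupDate datetime = DATEADD(dd, -30, GETDATE());
EXECUTE msdb.dbo.sp_delete_backuphistory @oldest_date = @CleanupDate;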
sp_purge_jobhistory
This job
deletes SQL Server Agent job history from the msdb database that is over
30 days old using the system stored procedure sp_purge_jobhistory. If you want
to change the 30-day value, you can do so by directly editing the command line
for the job, changing it from 30 to whatever number of days works best for you.
You may choose to run this job daily, weekly, or monthly.
While the script creates separate jobs for the CommandLog Cleanup,
sp_purge_jobhistory, sp_delete_backuphistory, and Output File Cleanup tasks
(as we have just seen), I prefer to combine all four of these maintenance tasks
into a single job, as all four are closely related and can easily be scheduled
to run at the same time. If you want to do this, you will have to alter one of
the four jobs so that it performs all four tasks, and then delete the remaining
three you don’t need, as the script does not provide this option.
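If you do combine them, the three Transact-SQL cleanup tasks fit naturally into a single job step, sketched below; the Output File Cleanup task runs an operating system command, so it would remain a separate step (or job) of its own.
DECLARE @CleanupDate datetime = DATEADD(dd, -30, GETDATE());
EXECUTE msdb.dbo.sp_delete_backuphistory @oldest_date = @CleanupDate;
EXECUTE msdb.dbo.sp_purge_jobhistory @oldest_date = @CleanupDate;
DELETE FROM [master].[dbo].[CommandLog] WHERE StartTime < @CleanupDate;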
Job Summary
Now that
you have had a chance to review the jobs that have been created by the script,
and have learned how they work, you should be in a better position to determine
whether you want to use these jobs as is, to modify them, or to create your own.
Each of these options works, and it is up to you as the DBA to make the choice
that will work best for your environment.
Keeping Up-To-Date
Periodically,
Ola updates his scripts, fixing small bugs or adding new features. If the
current version of Ola’s script works great for you, then you
probably don’t need to upgrade when he updates his script. On the other hand,
if he introduces a new feature you want to use, then upgrading is easy.
Assuming you have not customized his original script, you can run the new
script as if it were a new install.
When you
run the new script, Ola guarantees backward compatibility. If you modified any
of the jobs created by his script, they are left alone by the upgrade process.
If you have created your own custom jobs, they will also continue to work
without modification.
If you
have modified his script directly (other than the five essential settings), you
might want to consider contacting Ola to find out if the customizations you
have made are still needed in the new version.
Summary
Although
this has been a long article, it still does not cover every option in great
depth. I have made the assumption that you understand what the various
parameters do, or at least that you can look them up. I have also assumed that
you understand how to properly schedule your database maintenance jobs. So at
this point, if this article has sparked an interest in Ola’s script, then
download it and install it on a test server and try it out. Once you are
satisfied that it works as you expect, then roll it out to your SQL Server
instances and begin to immediately reap the benefits Ola’s script provides. If
you have specific questions about how Ola’s script runs, or want to suggest new
features, contact him through his website and ask him directly, as he is the
expert on the script.