Tuesday, June 17, 2014

SQL Server 2014 Full-Text Search on PDF Documents


Full-text search is a good way to search for words inside documents stored in SQL Server.

Follow the steps below to configure full-text search.

·         Enable FILESTREAM
·         Enable full-text search
·         Download and install Adobe PDF iFilter
·         Set the PATH environment variable
·         Restart the machine
·         Create the database and table
·         Create the full-text index
·         Start a full population
·         Insert documents into the table
·         Retrieve data from the documents

Enable FILESTREAM:-

Start → MS SQL Server 2014 → Configuration Tools → SQL Server Configuration Manager → select the SQL Server instance → Properties → FILESTREAM → enable all of the following:

Enable FILESTREAM for Transact-SQL access
Enable FILESTREAM for file I/O access
Allow remote clients access to FILESTREAM data

Enable full-text search:-

By default the full-text search service (the SQL Full-text Filter Daemon Launcher) is disabled in SQL Server Configuration Manager. As per your requirements, set its start mode to Automatic and start the service.
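
To confirm that the full-text feature is actually installed on the instance before going further, you can run a quick check from T-SQL (returns 1 when full-text is installed):

SELECT FULLTEXTSERVICEPROPERTY('IsFullTextInstalled') AS IsFullTextInstalled;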



Download and install Adobe PDF iFilter:-

Download the Adobe PDF iFilter (64-bit) installer from Adobe's website and install it.

Set the PATH environment variable:-

·         The full-text indexing service in SQL Server allows PDF files to be indexed, letting you perform full-text searches against the contents of PDF files stored in binary columns. The prerequisite for making this work is the installation of Adobe PDF iFilter.
·         Unfortunately, I have found this to be really buggy and spent the best part of half a day trying to get it to work against a 64-bit version of SQL Server 2008 R2. Everything appears to install properly, but when you actually attempt to search for some PDFs, none are returned. For anyone also experiencing this problem, here is some advice that might help:
·         1. After installing Adobe PDF iFilter, run the following commands:

exec sp_fulltext_service 'load_os_resources', 1;
exec sp_fulltext_service 'verify_signature', 0;

·         2. Add the iFilter DLL folder location to the Windows PATH. By default, this is:

C:\Program Files\Adobe\Adobe PDF iFilter 9 for 64-bit platforms\bin

If you use the ‘Environment Variables’ dialog in ‘System Properties’, make sure to edit the Path variable within ‘System variables’ and not the Path variable for the user (if one exists).
·         3. PDF searching may work now but, most likely, it won’t yet. Try restarting the SQL Server service, and if that fails, reboot the server.
·         4. If it still doesn’t work, try deleting and recreating the full-text indexes.
·         5. If it still doesn’t work, uninstall the iFilter and repeat from step 1 until it does work. In my case, PDF searching began working after uninstalling and reinstalling, despite every setting being identical the second time around.
·         If you encounter this problem, other tips that might be of use to you:
·         1. Try searching against the contents of other PDF files, as it may be the PDF content that’s at fault and not the iFilter.
·         2. When the iFilter is installed properly, PDF files on the file system become searchable with Windows Search. It is worth verifying that this works before delving too deep into SQL Server.
·         3. In SQL Server, it’s worth running the following SELECT and making sure that a row relating to ‘.pdf’ is returned. This row should contain the file location where PDFFilter.dll is installed on the server:

SELECT * FROM sys.fulltext_document_types
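
To check just the PDF entry, you can filter that catalog view; a small helper query (the columns are from sys.fulltext_document_types):

SELECT document_type, path, manufacturer
FROM sys.fulltext_document_types
WHERE document_type = '.pdf';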

Restart the machine:-

Restart the machine once so that the PATH change and the service settings take effect.

Create the database and table:- If you want to continue with an existing database you can do so, but here I’m creating a new database.

Create the database:-

CREATE DATABASE FT_Test

Create Table:-


USE [FT_Test]
GO

/****** Object:  Table [dbo].[FTS]    Script Date: 6/17/2014 4:29:27 PM ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

SET ANSI_PADDING ON
GO

CREATE TABLE [dbo].[FTS](
       [Id] [int] IDENTITY(1,1) NOT NULL,
       [DocName] [varchar](250) NULL,
       [Extension] [varchar](20) NULL,
       [Doc] [varbinary](max) NULL,
 CONSTRAINT [PK_FTS] PRIMARY KEY CLUSTERED
(
       [Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO

SET ANSI_PADDING OFF
GO

Create the full-text index:-

Define the full-text index on the FTS table: in SSMS, right-click the table, choose Full-Text index → Define Full-Text Index, select the Doc column, and set Extension as the type column.

After creating the full-text index, click Start Full Population.
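
If you prefer T-SQL to the wizard, the equivalent commands look roughly like this (a sketch: the catalog name FT_Catalog is my own placeholder, PK_FTS is the primary key created above, and 1033 is the English LCID):

CREATE FULLTEXT CATALOG FT_Catalog AS DEFAULT;

CREATE FULLTEXT INDEX ON dbo.FTS ([Doc] TYPE COLUMN [Extension] LANGUAGE 1033)
KEY INDEX PK_FTS ON FT_Catalog
WITH CHANGE_TRACKING OFF, NO POPULATION;

ALTER FULLTEXT INDEX ON dbo.FTS START FULL POPULATION;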


Now insert data into the table:-


INSERT INTO dbo.FTS (DocName, Extension, Doc)
SELECT '\\192.168.2.169\Share\eBook_Performance_Tuning_Davidson_Ford.pdf', '.pdf', A.BulkColumn
FROM OPENROWSET(BULK '\\192.168.2.169\Share\eBook_Performance_Tuning_Davidson_Ford.pdf', SINGLE_BLOB) AS A

Retrieve data from the document:-


SELECT * FROM dbo.FTS WHERE CONTAINS(Doc, 'simple')
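
CONTAINS supports more than single words; for example, a prefix search or a natural-language FREETEXT search (the search terms here are only illustrations):

SELECT Id, DocName FROM dbo.FTS WHERE CONTAINS(Doc, '"perform*"');
SELECT Id, DocName FROM dbo.FTS WHERE FREETEXT(Doc, 'performance tuning');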

If you are still not able to retrieve data from the PDF file, follow the steps below as well.


To install the Windows TIFF IFilter on Windows Server 2008 R2:

  1. Click Start, click All Programs, click Administrative Tools, and then click Server Manager.
  2. In the console tree of Server Manager, click Features, and then in Features Summary, click Add Features.
  3. Select the Windows TIFF IFilter check box, and then click Next.
  4. Click Install.










Friday, June 13, 2014

SQL Server Tempdb Considerations

SQL Server tempdb is a system database used to store temporary user objects and internal objects. Because tempdb is used heavily by SQL Server, it is important for performance to configure it correctly. By default tempdb uses only one data file; multiple data files can improve I/O performance and reduce allocation contention. A common recommendation is to use between 1/4 and 1/2 of the number of processor cores as the number of data files, so for example on an 8-core server use 2 to 4 files for tempdb. The tempdb data files and its log file should be placed on a fast disk. Create data files that are big enough, all with the same initial size, and rely on autogrowth only as a safety net to prevent out-of-space errors.
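
To see how many cores the instance can use and how many tempdb data files it currently has, a couple of quick helper queries against the standard DMVs:

-- Schedulers visible to SQL Server (roughly the usable core count)
SELECT COUNT(*) AS visible_schedulers
FROM sys.dm_os_schedulers
WHERE status = 'VISIBLE ONLINE';

-- Current number of tempdb data files
SELECT COUNT(*) AS tempdb_data_files
FROM tempdb.sys.database_files
WHERE type_desc = 'ROWS';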

To add data files to your tempdb:
USE master
GO
ALTER DATABASE tempdb ADD FILE
( NAME = tempdbDatafile2,
  FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\tempdbDatafile2.ndf',
  SIZE = 100MB,
  MAXSIZE = UNLIMITED,
  FILEGROWTH = 10%),
( NAME = tempdbDatafile3,
  FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\tempdbDatafile3.ndf',
  SIZE = 100MB,
  MAXSIZE = UNLIMITED,
  FILEGROWTH = 10%)
GO


To move the tempdb files to a new location:
USE master;
GO
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'D:\Temp\tempdb.mdf');
GO
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'D:\Temp\templog.ldf');
GO
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdbDatafile2, FILENAME = 'D:\Temp\tempdbDatafile2.ndf');
GO
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdbDatafile3, FILENAME = 'D:\Temp\tempdbDatafile3.ndf');
GO

Now restart the instance and check that the files are in their new location. Then delete the old files.
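
To verify the locations after the restart, you can query the metadata instead of browsing the disk (size is stored in 8 KB pages, so dividing by 128 gives MB):

SELECT name, physical_name, size / 128 AS size_mb
FROM tempdb.sys.database_files;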

Wednesday, June 11, 2014

SQL Server 2012 SP2 is now available!

Announcing the release of SQL Server 2012 Service Pack 2. It's now available to all customers with existing SQL Server 2012 deployments for download on the Microsoft Download Center.

http://blogs.technet.com/b/dataplatforminsider/archive/2014/06/11/sql-server-2012-sp2-is-now-available.aspx?WT.mc_id=Social_SQL_Announce_DI&WT.mc_id=Social_FB_OutgoingNews_20140611_64123098_Microsoft%20SQL%20Server%20-%20sqlserver&linkId=8485336

What is the Difference Between Rebuild and Reorganize indexes ?


Rebuilding Index:-

  • Rebuilding drops the existing index and creates a new one.
  • Locking and blocking can occur during the rebuild process.
  • The rebuild process generally runs faster than reorganizing an index.
  • More disk space is required for the rebuild operation.
  • Index and column statistics are automatically updated as part of the rebuild process.
  • When fragmentation is more than 30%, it is better to rebuild the index.
  • An index can be rebuilt online or offline (an online example is shown after the code below).
  • Example:

ALTER INDEX IX_IndexName ON dbo.TableName REBUILD;
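
An online rebuild (an Enterprise Edition feature) keeps the table available during the operation; a sketch with placeholder index and table names:

ALTER INDEX IX_IndexName ON dbo.TableName REBUILD WITH (ONLINE = ON);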


Reorganizing Index:-
  • Reorganizing defragments the existing index in place rather than recreating it.
  • Users can access the database during the reorganize process.
  • Reorganizing an index generally takes longer than a rebuild.
  • Less disk space is required.
  • Index statistics are not updated by a reorganize, so update them afterwards.
  • If fragmentation is between 5% and 30%, it is better to reorganize the index.
  • The reorganize process is always performed online.
  • Example:
ALTER INDEX IX_IndexName ON dbo.TableName REORGANIZE;
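
To decide between the two operations, you can measure fragmentation with the sys.dm_db_index_physical_stats DMV; a helper query applying the 5%/30% guidance above:

SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON ips.object_id = i.object_id AND ips.index_id = i.index_id
WHERE ips.avg_fragmentation_in_percent > 5;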




SQL Server Maintenance Solution From Ola Hallengren


Who Is Ola Hallengren
Ola Hallengren is a DBA and database developer who is currently working with a large financial company in Sweden. He has been working with SQL Server since 2001.
Back in 2007, Ola began a project to improve the database maintenance within a large SQL Server environment. At that time, a combination of database maintenance jobs created using the SQL Server Maintenance Wizard, along with a collection of custom Transact-SQL scripts, were used throughout the organization. His goal was to remove the hodgepodge collection of jobs and to create an easy-to-deploy solution that could be easily rolled out across a large environment of mission critical servers.
After the project was completed, Ola realized that his solution would be useful to other DBAs, so starting January 1, 2008, Ola released his first version of his database maintenance script, offering it free to any organization who wants to use it. Since then, he has improved his script on a regular basis, and it is now being used by some of the largest organizations in the world.
His current version was released July 29th, 2012, and you can download it and read the documentation at his website, ola.hallengren.com.
Overview
Ola’s script is designed to cover the most common database maintenance tasks, which include:
·         The ability to perform full, differential, or log backups (as appropriate) of any system and user databases, using either the native SQL Server backup command or third-party backup products such as SQL Backup or SQL HyperBac. The script also has the ability to automatically remove older backup files from disk that are no longer needed.
·         The ability to perform dynamic index optimization by rebuilding indexes online or offline (depending on the edition of SQL Server you have), reorganizing indexes, updating statistics, or doing nothing at all, based on an index’s size and fragmentation level.
·         The ability to perform database integrity checks using DBCC CHECKDB, DBCC CHECKFILEGROUP, DBCC CHECKTABLE, DBCC CHECKALLOC and/or DBCC CHECKCATALOG.
·         The ability to delete old job and backup history, including the removal of log files created by maintenance jobs as they run.
·         The ability to automatically create jobs that implement each of the above tasks. All you have to do is to decide when to run them and schedule them appropriately. Alternatively, you can specify that jobs are not automatically created, so you can create your own.
Each of the above features can be run with default settings, or various parameters can be changed to customize how the maintenance tasks run on your SQL Server instances. As each job is run, the results are logged immediately to log files and/or log tables so you can see exactly what maintenance is being performed (even as it is happening). In addition, Ola’s script includes extensive error-checking to make it easier to troubleshoot any potential problems.


How to Use Ola’s Maintenance Script: The Simple Explanation
If you are a novice DBA, or maybe you are just in a hurry to get a maintenance plan running on your SQL Server instances, you can quickly get Ola’s maintenance plan implemented. Here’s how:
1.     Download the MaintenanceSolution.sql script from Ola’s website and open it up in a query window inside of SSMS.
2.     Near the top of the script, around line 32, you will see a line of Transact-SQL code that looks like this: 
SET @BackupDirectory = N'C:\Backup' -- <== Change this to your backup directory.


Replace 'C:\Backup' with the path of the location where your backups are to be stored.
3.     Run the script. At this point, one table, one function and four stored procedures are created in the master database, along with 11 new jobs pre-created to perform all the maintenance tasks described earlier.
4.     Manually schedule the pre-configured jobs to run at appropriate times.
5.     You are now done.
As you can see, Ola’s maintenance tasks can be very simple and quick to implement. But like most DBAs, I don’t like to run scripts, even professionally polished and written scripts such as Ola’s, without taking some time to fully understand how the script works. In the next section, we dissect Ola’s script so that we better understand how it works. At the same time, we will learn how it can be customized for your specific environment.

Creating the Maintenance Jobs (and Cleanup Jobs)

Assuming that you specified at the beginning of the MaintenanceSolution.sql script that you want it to create maintenance jobs, running the script will create a series of eleven jobs, which include:
·         CommandLog Cleanup
·         DatabaseBackup - USER_DATABASES – FULL
·         DatabaseBackup - USER_DATABASES – DIFF
·         DatabaseBackup - USER_DATABASES – LOG
·         DatabaseBackup - SYSTEM_DATABASES – FULL
·         DatabaseIntegrityCheck - USER_DATABASES
·         DatabaseIntegrityCheck - SYSTEM_DATABASES
·         IndexOptimize - USER_DATABASES
·         Output File Cleanup
·         sp_delete_backuphistory
·         sp_purge_jobhistory
All of these jobs have been created with default parameters from the script, but you can change the parameters, if you want to, directly from within the jobs themselves. In addition, none of the jobs have been scheduled, because you, as the DBA, must decide which jobs you want to run, and when, in order to minimize the performance impact these jobs can have when they execute. And while these eleven jobs have been created, you may not need all of them, depending on your SQL Server environment, so you may end up modifying or deleting some of them.
Before we take a quick look at each of the jobs, we first need to briefly discuss how Ola’s script creates jobs. Once we have this basic understanding, then we will be better prepared to understand how they execute.
All of the jobs created by Ola’s script run using the sqlcmd utility, which allows Transact-SQL scripts to be executed from the command line. This makes it easy to create and schedule your jobs. On the other hand, if you have not used the sqlcmd utility before, it may take you a little time before you fully understand how it works. Since this is not an article on how to use the sqlcmd utility, I suggest you look it up in Books Online if you are not familiar with it.
Another feature that Ola uses in his jobs is tokens, a topic that many DBAs may not be familiar with. In short, a token is similar to an @@function in SQL Server. When you place a pre-defined token inside a job step script, SQL Server Agent replaces the token at run time. For example, the token $(ESCAPE_SQUOTE(JOBID)) will, at runtime, insert the value of the job id of the job into the script. To learn more about tokens, look up “Using Tokens in Job Steps” in Books Online.
For the moment, if you are not familiar with the sqlcmd utility or tokens, don’t worry, as I will be showing an example of how they are used as I explain how the first maintenance job works. While you won’t become an expert on using these from this one example, you should learn enough from it to understand what is going on.
Note: In my first job example, I will spend extra time explaining how the sqlcmd utility and tokens are used, along with an explanation of how the job works. Since all the jobs are similar (except for the stored procedure name and the parameters used), this will be the only job I explain in depth.
Let’s take a quick look at each of the jobs and see how they work. Once you have done so, you can decide if you want to use them as is, modify them, or ignore then altogether and create your own custom jobs.
DatabaseBackup - USER_DATABASES – FULL
As you can guess from the name of this job, its purpose is to create full backups of user databases. Generally speaking, I like to make full backups of all my databases nightly. Each of the jobs created by the MaintenanceSolution.sql script contains a single job step, and that step executes as an operating system (CmdExec) job step. If you open the job step and click the Edit button, you can view the command for this particular job. Here is the text of the command:

sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d master -Q "EXECUTE [dbo].[DatabaseBackup] @Databases = 'USER_DATABASES', @Directory = N'C:\Backup', @BackupType = 'FULL', @Verify = 'Y', @CleanupTime = 24, @CheckSum = 'Y'" -b
The above command combines both sqlcmd utility and job token syntax. Let’s break down this command into its constituent parts.
sqlcmd: This is the operating system command to start the sqlcmd utility.

-E: This is the sqlcmd syntax that specifies that a trusted connection will be made to the server.

-S: This is the sqlcmd syntax that specifies the name of the server to connect to. Normally, you would see the actual name of the server follow immediately after this option, but in this case, there is a job token.

$(ESCAPE_SQUOTE(SRVR)): This job token, when the job script is executed, will substitute the name of the server. The “SRVR” portion of the token should be a clue of this.

-d: This is the sqlcmd syntax that is used to refer to the database that is being connected to.

master: This is the name of the database that is being connected to. Master is being connected to because this is where the database maintenance stored procedures are located.

-Q: This is the sqlcmd syntax that says that the following quoted content (Transact-SQL code) is to be executed.

"EXECUTE [dbo].[DatabaseBackup] @Databases = 'USER_DATABASES', @Directory = N'C:\Backup', @BackupType = 'FULL', @Verify = 'Y', @CleanupTime = 24, @CheckSum = 'Y'":This is the Transact-SQL command that is to be executed. Notice that several of the 24 available parameters have been specified, but not all of them. The parameters that are not specified will use their default values. If you want to change or add a parameter to the stored procedure, this is where you would make this modification to the Transact-SQL statement.

-b: This is the sqlcmd option for “on error batch abort”. If an error occurs, sqlcmd exits, returns to a command prompt, and returns an ERRORLEVEL value, which is recorded in the log file. Note that the stored procedure itself keeps running after an individual failure: for example, if the backup of one database fails, the failure is recorded in the log file and the stored procedure continues to back up the remaining databases.

At first glance, this appears complicated. But once you spend some time reviewing this code, and the code found in the other jobs, it should start to make sense.
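
If you want to experiment with the parameters before touching the job itself, you can run the same stored procedure directly in SSMS; the values below are simply the job’s defaults from the command above:

EXECUTE master.dbo.DatabaseBackup
@Databases = 'USER_DATABASES',
@Directory = N'C:\Backup',
@BackupType = 'FULL',
@Verify = 'Y',
@CleanupTime = 24,
@CheckSum = 'Y';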
Now that the job code has been reviewed, we need to look at one more part of the job: the Output File option on the Advanced tab, which is shown below.

The Output File option is where the maintenance job’s log file goes.
The output file for a job is important because it shows you exactly what the job did (which I suggest you review regularly), and it can be useful for troubleshooting. I want to talk about it here because the script uses a lot of tokens in the output file path. Here is what it looks like:

C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\Log\DatabaseBackup_$(ESCAPE_SQUOTE(JOBID))_$(ESCAPE_SQUOTE(STEPID))_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM)).txt

As with the job code, this output path may seem overly complicated, but is actually quite clever as it uses tokens to create a unique log filename for every job that runs, which makes it very easy to find the log files of any jobs you are interested in viewing.
Here’s a breakdown of the output path’s constituent parts:

C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\Log\: This is the default location where all SQL Server log files are stored. When Ola’s script is run, it finds this location and automatically inserts it here as the job is created.

DatabaseBackup_$(ESCAPE_SQUOTE(JOBID))_: Each output file begins with the name of the job, which in this case is “DatabaseBackup”. Next is a token for the job id, which becomes part of the output file’s name. An underscore “_” is also added to make the filename easier to read.

$(ESCAPE_SQUOTE(STEPID))_: Next, another token is used to include the step id as part of the output file’s name, followed by another underscore.

$(ESCAPE_SQUOTE(STRTDT))_: Next, another token is used to include the start date (YYYYMMDD format) as part of the output file’s name, followed by another underscore.

$(ESCAPE_SQUOTE(STRTTM)): Next, another token is used to include the start time (HHMMSS format) as part of the output file’s name.

.txt: And finally, the extension “.txt” is added to the output file’s name.
When the above job is executed, the output path may produce a filename like this:
DatabaseBackup_0x524D34D2F9A6F94BA347806E52810D4_1_20100707_130527.txt

While the job id and step id are obscure, they help make the filename unique. When I look at output files, I focus on the “DatabaseBackup” part, which tells me which job was run, and on the date and time, so I know when it ran.

If you plan to use this job, you still have one mandatory step you must take, and several optional steps. The mandatory step is that you must schedule this job, as this is not done for you. The optional steps are to set up Alerts and Notifications, which I highly recommend, so that you are notified in case a job fails. Of course, you may also want to modify the job in other ways by changing the stored procedure’s parameters, but I’ll leave that up to you.
Of course, you don’t have to use this job. As I mentioned earlier, I prefer to create my own jobs, but I have used the jobs created by the script as a template for my own jobs.
Now that we have a reasonably good understanding of how jobs created by the script work, let’s take a brief look at the remaining jobs, without all the detail.

CommandLog Cleanup

This job deletes rows from the CommandLog table that are over 30 days old. If you want to change the 30-day value, you can do so by directly editing the command line for the job, changing 30 to whatever number of days works best for you. You may choose to run this job daily, weekly, or monthly.
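
Under the covers, the cleanup is a simple DELETE; a sketch of what the command amounts to (assuming the StartTime column of Ola’s CommandLog table in master):

DELETE FROM master.dbo.CommandLog
WHERE StartTime < DATEADD(dd, -30, GETDATE());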

DatabaseBackup - USER_DATABASES – DIFF

This job is designed to create differential backups of your user databases. If you do create differential backups, be sure that you schedule this job to run after a full backup of your databases, and schedule the backups appropriately. On the other hand, if you don’t create differential backups, you can delete this job.

DatabaseBackup - USER_DATABASES – LOG

This job is designed to create log backups of all user databases that use the full or bulk logged recovery models. Be sure to schedule this job so that it runs after a full backup. Generally speaking, I schedule my log backups to occur every hour (or more often), depending on my needs.

DatabaseBackup - SYSTEM_DATABASES – FULL

This job creates full backups of the system databases, and generally speaking, this job should be scheduled to run nightly. There is no transaction log backup job created by the script for system databases because they use the simple recovery model, and transaction log backups cannot be taken of them. Some DBAs prefer to create a single job to perform full backups of user and system databases at the same time. You can do this if you want, but you will have to modify the jobs accordingly, as the script does not provide this option.

DatabaseIntegrityCheck - USER_DATABASES

This job performs a DBCC CHECKDB on all the user databases, looking for potential database corruption. Be aware that this job can be time- and resource-consuming, so schedule it during a less busy time of day.

DatabaseIntegrityCheck - SYSTEM_DATABASES

This job performs a DBCC CHECKDB on all the system databases, looking for potential database corruption. Because the system databases are generally small, running this job should not take long. Some DBAs prefer to create a single job that runs DBCC CHECKDB on both user and system databases at the same time. You can do this if you want, but you will have to modify the jobs accordingly, as the script does not provide this option.

IndexOptimize - USER_DATABASES

This job analyzes all of the indexes in all of your user databases and, based on the size and fragmentation level of each index, determines whether the index should be ignored, reorganized, or rebuilt. Of all the jobs created by the script, this is the one you may want to alter, changing some of the IndexOptimize stored procedure parameters so you have better control over how the job runs (a sketch follows below). Of course, the default options will work well for most SQL Server instances if you don’t have the time or knowledge to tweak the parameters. Notice that there is no job created for index optimization of the system databases. This is because index optimization for system databases is seldom beneficial, although you can create a job to perform this task if you like.
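
As a sketch of such a customization, here is what a call with explicit fragmentation thresholds might look like; the parameter names follow Ola’s documentation, but verify them against the version you download:

EXECUTE master.dbo.IndexOptimize
@Databases = 'USER_DATABASES',
@FragmentationLow = NULL,
@FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationLevel1 = 5,
@FragmentationLevel2 = 30,
@UpdateStatistics = 'ALL';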

Output File Cleanup

This job simply deletes log files created by the various database maintenance jobs that are over 30 days old. If you want to change the 30 day value, you can do so by directly editing the command line for the job, changing it from 30 to whatever number of days works best for you. You may choose to run this job daily, weekly, or monthly.

sp_delete_backuphistory

This job deletes backup history from the msdb database that is over 30 days old using the system stored procedure sp_delete_backuphistory. If you want to change the 30 day value, you can do so by directly editing the command line for the job, changing it from 30 to whatever number of days works best for you. You may choose to run this job daily, weekly, or monthly.

sp_purge_jobhistory

This job deletes job history (SQL Server Agent job history) from the msdb database that is over 30 days old using the system stored procedure sp_purge_jobhistory. If you want to change the 30-day value, you can do so by directly editing the command line for the job, changing 30 to whatever number of days works best for you. You may choose to run this job daily, weekly, or monthly. While the script creates separate jobs for the CommandLog Cleanup, sp_purge_jobhistory, sp_delete_backuphistory, and Output File Cleanup tasks (as we have just seen), I prefer to combine all four of these maintenance tasks into a single job, as they are closely related and can easily be scheduled to run at the same time. If you want to do this, you will have to alter one of the four jobs so that it performs all four tasks (a sketch follows below), and then delete the remaining three you don’t need, as the script does not provide this option.
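
If you take that route, the T-SQL portion of the combined job boils down to something like this (a sketch: the msdb procedures are standard system procedures, the CommandLog cleanup assumes Ola’s table as above, and the Output File Cleanup portion deletes files on disk, so it would stay an operating system command step):

DECLARE @CutoffDate datetime = DATEADD(dd, -30, GETDATE());

EXECUTE msdb.dbo.sp_delete_backuphistory @oldest_date = @CutoffDate;
EXECUTE msdb.dbo.sp_purge_jobhistory @oldest_date = @CutoffDate;
DELETE FROM master.dbo.CommandLog WHERE StartTime < @CutoffDate;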

Job Summary

Now that you have had a chance to review the jobs that have been created by the script, and have learned how they work, you should be in a better position to determine whether you want to use these jobs as is, modify them, or create your own. Each of these options works, and it is up to you as the DBA to make the choice that works best for your environment.

Keeping Up-To-Date

Periodically, Ola updates his scripts, fixing small bugs or adding new features. If the version of Ola’s script you are using works well for you, then you probably don’t need to upgrade when he updates it. On the other hand, if he introduces a new feature you want to use, then upgrading is easy. Assuming you have not customized his original script, you can run the new script as if it were a new install.
When you run the new script, Ola guarantees backward compatibility. If you modified any of the jobs created by his script, they are left alone by the upgrade process. If you have created your own custom jobs, they will also continue to work without modification.
If you have modified his script directly (other than the five essential settings), you might want to consider contacting Ola to find out if the customizations you have made are still needed in the new version.

Summary

Although this has been a long article, it still does not cover every option in great depth. I have made the assumption that you understand what the various parameters do, or at least that you can look them up. I have also assumed that you understand how to properly schedule your database maintenance jobs. So at this point, if this article has sparked an interest in Ola’s script, then download it and install it on a test server and try it out. Once you are satisfied that it works as you expect, then roll it out to your SQL Server instances and begin to immediately reap the benefits Ola’s script provides. If you have specific questions about how Ola’s script runs, or want to suggest new features, contact him through his website and ask him directly, as he is the expert on the script.