Quest Software – Benchmark Factory – v7

Prelude

Back in May 2014, Kevin Stern of Quest reached out and let me know that they had released a new version of Quest Benchmark Factory.

This was through a comment he posted @ https://danieladeniji.wordpress.com/2014/05/10/technical-quest-software-benchmark-factory-for-databases/.

Since then I have wanted to download the new version and kick it around a bit.

In this post, I will talk a bit about my experience.

 

Database – Preparation

 

Database – Sample

We will use the AdventureWorks database.  Unfortunately, our target is SQL Server Express, and that edition does not ship with sample databases.  And, so we downloaded a copy from Microsoft; specifically https://msftdbprodsamples.codeplex.com/releases/view/125550.

 

Restore Database

USE [master]
GO
RESTORE DATABASE [AdventureWorks2014] 
FROM DISK = N'C:\Downloads\AdventureWorks2014.bak' WITH FILE = 1
  , MOVE N'AdventureWorks2014_Data' TO N'E:\DATA\AdventureWorks2014_Data.mdf'
  , MOVE N'AdventureWorks2014_Log' TO N'F:\LOG\AdventureWorks2014_Log.ldf'
  , NOUNLOAD
  , STATS = 5


 

Database – Objects

As we have an entire pre-written database at our disposal, we need only minimal modifications.
Actually, we will simply add two stored procedures that serve the same purpose, which is to query the Person table.

The stored procedures accept two arguments, firstname and lastname, and filter against the Person.Person table based on the argument contents.

 

 

dbo.usp_PersonFetchFilteredOnName_1

 

Here is SP 1; it relies exclusively on OR clauses.

 

use [AdventureWorks2014]
go

if object_id('dbo.usp_PersonFetchFilteredOnName_1') is null
begin
    exec('create procedure dbo.usp_PersonFetchFilteredOnName_1 as select 1/0 as [shell]')
end
go

alter procedure dbo.usp_PersonFetchFilteredOnName_1
(
      @Firstname nvarchar(50) = null
    , @Lastname  nvarchar(50) = null
)
as
begin

    set nocount on;

    select *
    from   [Person].[Person]
    where
                (
                       (@Firstname is null)
                    or (@Firstname = '')
                    or (Firstname = @Firstname)
                )
            and
                (
                       (@Lastname is null)
                    or (@Lastname = '')
                    or (Lastname = @Lastname)
                )


end
go

 

 

dbo.usp_PersonFetchFilteredOnName_2

 

Here is SP 2; it uses CASE expressions.

use [AdventureWorks2014]
go

if object_id('dbo.usp_PersonFetchFilteredOnName_2') is null
begin

    exec('create procedure dbo.usp_PersonFetchFilteredOnName_2 as select 1/0 as [shell]')
end
go

alter procedure dbo.usp_PersonFetchFilteredOnName_2
(
      @Firstname nvarchar(50) = null
    , @Lastname  nvarchar(50) = null
)
as
begin


    set nocount on;

    select *
    from   [Person].[Person]
    where
            (

                    (
                        Firstname = 
                                    case
                                        when (@Firstname is null) then Firstname
                                        when (@Firstname = '') then Firstname
                                        else @Firstname
                                    end
                    )

                and

                    (
                        Lastname = 
                                    case
                                        when (@Lastname is null) then Lastname
                                        when (@Lastname = '') then Lastname
                                        else @Lastname
                                    end
                    )
            )
end
go

 

Database – Sample Data

To get representative test data, we will use bcp to get data out of our database.

 

bcp "select '\"' + Firstname + '\"' as Firstname, '\"' + Lastname + '\"' as Lastname, Count(*) as Cnt from [AdventureWorks2014].[Person].[Person] group by Firstname, Lastname order by count(*) desc " queryout person.txt -S.\SQLEXPRESS_V2014  -E -T -c -t","

if not exist c:\sqlserver\data mkdir c:\sqlserver\data
xcopy C:\Personal\DanielAdeniji\Blog\QuestSoftware\BenchmarkFactory\v7\DatabaseScripts\GenerateTestData\person.txt  c:\sqlserver\data /Q /y /D
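
Incidentally, the query embedded in the bcp command can be run on its own if you would like to preview the test data before exporting.  A minimal sketch, quoting the names the same way bcp will emit them:

select top (10)
      '"' + Firstname + '"' as Firstname
    , '"' + Lastname + '"' as Lastname
    , count(*) as Cnt
from   [AdventureWorks2014].[Person].[Person]
group  by Firstname, Lastname
order  by count(*) desc;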

 

 

ODBC Data Source

System Data Source

We created a System ODBC Data Source.

 

[Screenshot: Select A Driver]

 

Data Source Name and SQL Server

 

[Screenshot: Data Source Name and SQL Instance]

Database Connection Detail

 

[Screenshot: Database Connection Settings]

Quest Benchmark Factory

 

Script

 

Introduction

Here is a table that shows our Benchmark Factory (BF) Script:

 

Scenario        Parameter             Value
Transaction 1   SQL                   exec AdventureWorks2014.dbo.usp_PersonFetchFilteredOnName_1 ?, ?
                Param1 = @Firstname   $BFFileArray("C:\sqlserver\data\person.txt",SEQUENTIAL,1)
                Param2 = @Lastname    $BFFileArray("C:\sqlserver\data\person.txt",SEQUENTIAL,2)
Transaction 2   SQL                   exec AdventureWorks2014.dbo.usp_PersonFetchFilteredOnName_2 ?, ?
                Param1 = @Firstname   $BFFileArray("C:\sqlserver\data\person.txt",SEQUENTIAL,1)
                Param2 = @Lastname    $BFFileArray("C:\sqlserver\data\person.txt",SEQUENTIAL,2)

 

 

 

Transaction Mix

Here is our Transaction Mix

 

[Screenshot: Edit Job]

 

Transaction SQL

[Screenshot: Stored Procedure – SQL]

 

Transaction Bind Parameters

 

[Screenshot: Stored Procedure – Bind Parameters]

 

User-load

 

Here is what our User-load looks like.

We will start with a single user, and move on to 5, 10, 15, and 20 users.

 

[Screenshot: User Load]

 

 

Agent

 

Before

Here is what the Agent Screen looks like initially.

 

[Screenshot: Agent – Before]

 

 

After

 

[Screenshot: Agent – After]

 

 

The change between the Before and After is that we checked the box next to our computer name to indicate it is part of our test.

 

Warning

If you do not check that box, you will see a warning that reads:

 

Missing Agents To Use For Testing

 

[Screenshot: Missing Agents To Use For Testing]

 

Results

 

Picture

Our test result is pasted below:

 

[Screenshot: Userload 20 Results]

 

Tabulated

 

Metric                 SP-1    SP-2
Avg Response Time      0.014   0.018
Avg Transaction Time   0.014   0.018
Executions             1771    1766
Rows                   8712    8707
Errors                 0       0

 

 

Quick Explanation

 

  • The second stored procedure is a bit slower
  • SP 1 executed 1771 times, while SP 2 executed a slightly smaller 1766; so, given an identical time window, we accomplished slightly more with SP 1
  • As SP 1 iterated a bit more, it returned a few more records

 

 

SQL Sentry Plan Explorer

In all truth, though the second stored procedure is a tad slower, I still could not tell why.

And, so I installed and launched Huntersville’s own SQL Sentry Plan Explorer.  See how little I know: it took a young lady on BART to tell me that Huntersville is a suburb of Charlotte, North Carolina.

 

Estimated Plan

 

[Screenshot: Estimated Plan]

 

 

Actual Plan

 

To get the Actual Plan, I ran the following query.

 

 Test Query:

declare
      @Firstname nvarchar(50) 
    , @Lastname  nvarchar(50) 

select 
      @Firstname = 'Laura'
    , @Lastname  = 'Norman'

exec dbo.usp_PersonFetchFilteredOnName_1
      @Firstname = @Firstname
    , @Lastname  = @Lastname

exec dbo.usp_PersonFetchFilteredOnName_2
      @Firstname = @Firstname
    , @Lastname  = @Lastname

 

[Screenshot: Actual Plan]

 

 

 

Tabulated

 

Metric           SP-1    SP-2
Estimated Cost   6.2%    93.8%
Duration (ms)    14      20
CPU (ms)         15      32
Reads            124     128
Writes           –       –
Estimated I/O    5.5%    94.5%
Est Rows         1       632
Actual Rows      5       5
Key Lookup       1       1
Index Scan       1       1

 

 

In summary, SQL Server thinks the second query is much worse than it ended up being.
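
If you do not have Plan Explorer handy, a rough way to make the same comparison from SSMS is to wrap the two test calls in SET STATISTICS TIME and IO; a minimal sketch:

use [AdventureWorks2014]
go

set statistics io on;
set statistics time on;

-- CPU time, elapsed time, and logical reads for each call appear in the Messages tab
exec dbo.usp_PersonFetchFilteredOnName_1 @Firstname = 'Laura', @Lastname = 'Norman';
exec dbo.usp_PersonFetchFilteredOnName_2 @Firstname = 'Laura', @Lastname = 'Norman';

set statistics io off;
set statistics time off;
go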

 

 

Conclusion

There are so many tools one can use to dig deeper into queries.  Quest Software certainly earns its honorable role and belongs in the tool bag of SQL aficionados.

 

Technical: Quest Software – Benchmark Factory for Databases

 

Technical: Quest Software – Benchmark Factory for Databases – Use Case Scenario – Quick Evaluation of MS SQL Server v2014 – In-Memory Database – Simple Table and Programmable Objects

 

Background

Wanted a quick familiarization with MS SQL Server v2014 In-Memory OLTP tables and programmable objects.

 

Tools

There are a couple of tools on the market.  Our candidates include:

  • Microsoft – RML Utilities – ostress.exe
  • Quest Software – Benchmark Factory for Databases
  • Quest Software – Benchmark Factory for Databases Freeware

As this is for home evaluation, I was happy to find out that Quest has made a freeware version available.

In this post, we will focus our attention on this tool.

 

Download

We downloaded “Benchmark Factory for Databases Freeware” from http://www.toadworld.com/m/freeware/555.aspx.  Please keep in mind that you need a Quest account.  As it is easy and free to get one, please register if you do not have an existing account.

Depending on when you attempt the download, the exact download link might have changed.  If so, I suggest visiting http://www.toadworld.com/products/benchmark-factory/default.aspx, the more general product web site.

 

Environment Setup – Microsoft SQL Server

On an MS SQL Server 2014 instance, let us set up our SQL database objects.

 

Add File Group – In Memory

Check sys.filegroups and see if there are any entries with type = FX. If not, then create one.


set nocount on
set noexec off
go

use [DBLabInMemory]
go

/*
  Check sys.filegroups for type=FX -- HasMemoryOptimizedObjects
*/
if exists
	(
		select top 100 percent
				   tblFileGroup.name
				,  tblFileGroup.[type] 
				, tblFileGroup.type_desc
		from   sys.filegroups tblFileGroup
		where  tblFileGroup.[type] = 'FX'
		order  by tblFileGroup.[type]
	)
begin
	print 'Database - FileGroup - fileGroupInMemoryFileGroup exists!'
	set noexec on
end
go


ALTER DATABASE [DBLabInMemory]
	ADD FILEGROUP fileGroupInMemoryFileGroup
		CONTAINS MEMORY_OPTIMIZED_DATA 
		;

go

set noexec off
go



 

 

Add In-Memory File to File Group (MEMORY_OPTIMIZED_DATA)

Add in-memory file to file group…


set nocount on
set noexec off

use [DBLabInMemory]
go


if not exists
 (
 select *
 from master.sys.sysdatabases
 where name = 'DBLabInMemory'
 )
begin
    print 'Database - DBLabInMemory does not exist!'
    raiserror( 'Database - DBLabInMemory does not exist!', 16, 1)
    set noexec on
end
go


if exists
 (
     select *
     from sys.master_files
     where database_id = db_id() --db_id('DBLabInMemory')
     and (
                (name = 'datafile_InMemory_0001')
             or (physical_name = 
 'E:\Microsoft\SQLServer\DatafilesInMemory\DBLabInMemory\datafile___InMemory__0001.ndf'
                )
          )
   )
begin

 print 'Database - DBLabInMemory -- file already exists!'
 raiserror( 'Database - DBLabInMemory -- file already exists!', 16, 1)
 set noexec on
end
go

ALTER DATABASE [DBLabInMemory]
 ADD FILE 
 (
    name='datafile_InMemory_0001'
 , filename='E:\Microsoft\SQLServer\DatafilesInMemory\DBLabInMemory\datafile___InMemory__0001.ndf'
 ) 
 TO FILEGROUP [fileGroupInMemoryFileGroup]


go

set noexec off
go


 

Create Traditional Table – dbo.CustomerTraditional



USE [DBLabInMemory]
GO


CREATE TABLE [dbo].[CustomerTraditional]
(
	[ID] [int] NULL,
	[Firstname] [varchar](50) NULL,
	[Lastname] [varchar](50) NULL,
	[EmailAddress] [varchar](50) NULL,
	[Country] [varchar](50) NULL,
	[IPAddress] [varchar](20) NULL,
	[Company] [varchar](50) NULL,
	[City] [varchar](50) NULL,
	[PhoneWork] [varchar](50) NULL,
	[StreetAddress] [varchar](50) NULL,
	[Username] [varchar](50) NULL,
	[UniqueID] [varchar](40) NOT NULL,
	 CONSTRAINT [PK_UniqueID] PRIMARY KEY CLUSTERED 
	(
		[UniqueID] ASC
	)WITH (
                  PAD_INDEX = OFF
                , STATISTICS_NORECOMPUTE = OFF
                , IGNORE_DUP_KEY = OFF
                , ALLOW_ROW_LOCKS = ON
                , ALLOW_PAGE_LOCKS = ON
              )
	      ON [PRIMARY]

) ON [PRIMARY]

GO


 

 

Create In-Memory Table – dbo.CustomerInMemory

Create the In-Memory table (dbo.CustomerInMemory).


USE [DBLabInMemory]
GO


CREATE TABLE [dbo].[CustomerInMemory]
(
 [ID] [int] NULL,
 [Firstname] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [Lastname] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [EmailAddress] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [Country] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [IPAddress] [varchar](20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [Company] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [City] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [PhoneWork] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [StreetAddress] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [Username] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [UniqueID] [varchar](40) COLLATE Latin1_General_100_BIN2 NOT NULL,

    PRIMARY KEY NONCLUSTERED HASH 
      (
        [UniqueID]
      ) WITH ( BUCKET_COUNT = 256)

)
WITH  
   ( 
       MEMORY_OPTIMIZED = ON
    , DURABILITY = SCHEMA_AND_DATA 
   )

GO

SET ANSI_PADDING OFF
GO
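
A quick sanity check that the new table really landed as memory-optimized, with the expected durability:

use [DBLabInMemory]
go

-- is_memory_optimized and durability_desc are available beginning with SQL Server 2014
select name
     , is_memory_optimized
     , durability_desc
from   sys.tables
where  name in ('CustomerTraditional', 'CustomerInMemory');
go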
 

 

 

Create Stored Procedure dbo.usp_InsertCustomerTraditional

SP – dbo.usp_InsertCustomerTraditional


USE [DBLabInMemory]
GO

if object_id('[dbo].[usp_InsertCustomerTraditional]') is null
begin
 exec ('create procedure [dbo].[usp_InsertCustomerTraditional] as begin print ''undefined'' end ')
end
GO


ALTER PROCEDURE [dbo].[usp_InsertCustomerTraditional]
(
 @ID int 
 ,@Firstname varchar(50)
 ,@Lastname varchar(50)
 ,@EmailAddress varchar(50)
 ,@Country varchar(50)
 ,@IPAddress varchar(20) = null
 ,@Company varchar(50) = null
 ,@City varchar(50) = null
 ,@PhoneWork varchar(50) = null
 ,@StreetAddress varchar(50) = null
 ,@Username varchar(50) = null
 ,@UniqueID varchar(50) = null
)
with execute as owner
as 
begin

 insert into [dbo].[CustomerTraditional]
 (
 [ID]
 ,[Firstname]
 ,[Lastname]
 ,[EmailAddress]
 ,[Country]
 ,[IPAddress]
 ,[Company]
 ,[City]
 ,[PhoneWork]
 ,[StreetAddress]
 ,[Username]
 ,[UniqueID]
 )
 values
 (
 @ID
 ,@Firstname
 ,@Lastname
 ,@EmailAddress
 ,@Country
 ,@IPAddress
 ,@Company
 ,@City
 ,@PhoneWork
 ,@StreetAddress
 ,@Username
 ,NEWID() --@UniqueID
 )

end

GO 

 

 

Create Stored Procedure dbo.usp_InsertCustomerInMemory

SP – dbo.usp_InsertCustomerInMemory


USE [DBLabInMemory]
GO

if object_id('[dbo].[usp_InsertCustomerInMemory]') is not null
begin
  DROP PROCEDURE [dbo].[usp_InsertCustomerInMemory]
end
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO


create procedure [dbo].[usp_InsertCustomerInMemory]
(
  @ID int 
 ,@Firstname varchar(50)
 ,@Lastname varchar(50)
 ,@EmailAddress varchar(50)
 ,@Country varchar(50)
 ,@IPAddress varchar(20) = null
 ,@Company varchar(50) = null
 ,@City varchar(50) = null
 ,@PhoneWork varchar(50) = null
 ,@StreetAddress varchar(50) = null
 ,@Username varchar(50) = null
 ,@UniqueID varchar(50) = null
)
with native_compilation, schemabinding, execute as owner
as 
begin atomic with
(transaction isolation level = snapshot,
language = N'English')

 
 insert into [dbo].[CustomerInMemory]
 (
  [ID]
 ,[Firstname]
 ,[Lastname]
 ,[EmailAddress]
 ,[Country]
 ,[IPAddress]
 ,[Company]
 ,[City]
 ,[PhoneWork]
 ,[StreetAddress]
 ,[Username]
 ,[UniqueID]
 )
 values
 (
 @ID
 ,@Firstname
 ,@Lastname
 ,@EmailAddress
 ,@Country
 ,@IPAddress
 ,@Company
 ,@City
 ,@PhoneWork
 ,@StreetAddress
 ,@Username
 ,NEWID() --@UniqueID
 )

end

GO
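
Before wiring the procedures into a load tool, a quick smoke test from SSMS is worthwhile.  A minimal sketch; the sample values are made up for illustration:

use [DBLabInMemory]
go

exec dbo.usp_InsertCustomerTraditional
      @ID = 1, @Firstname = 'Laura', @Lastname = 'Norman'
    , @EmailAddress = 'laura.norman@example.com', @Country = 'USA';

exec dbo.usp_InsertCustomerInMemory
      @ID = 1, @Firstname = 'Laura', @Lastname = 'Norman'
    , @EmailAddress = 'laura.norman@example.com', @Country = 'USA';

-- each table should now hold one row
select count(*) as [CustomerTraditional] from dbo.CustomerTraditional;
select count(*) as [CustomerInMemory]    from dbo.CustomerInMemory;
go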



 

 

 

Installation

Installation is straightforward, and it even installs on my old MS Windows 2003 x86 box.

 

Usage

Launch the product.

 

Create Scenario

 

Welcome

Here is the welcome screen:

[Screenshot: Welcome]

 

Select Load Scenario

As we are targeting a “Microsoft SQL Server” database, choose “Microsoft SQL Server” in the Database drop-down.

 

[Screenshot: Select Load Scenario – Microsoft SQL Server]

 

We have a basic choice of Test Types that we can run.  Our choices are:

  • Industry Standard benchmarks
  • Test Database scalability
  • Test SQL for scalability
  • Capture and Replay SQL Server workload

Here is Quest’s explanation for each Scenario:

  • Industry Standard benchmarks: Creates load scenarios from industry standard benchmarks (AS3AP, Scalable Hardware, TPC-C, TPC-B, TPC-D, TPC-E, TPC-H). This includes the steps to create/load all the required benchmark objects to execute the standard benchmark workloads. These synthetic workloads can be used when no real-world data is available to be imported for database load testing.
  • Test Database scalability: Creates load scenarios to test database scalability using OLTP, database warehousing, and mixed workloads. This utilizes benchmark objects and transactions from the TPC-C and the TPC-H benchmarks.
  • Test SQL for scalability: A SQL Scalability test compares the performance of SQL Statement variations under a load. Each transaction will execute individually for each user load for a specified number of executions or length of time.
  • Capture and Replay SQL Server workload: Creates a workload which replays a captured MS SQL workload from a trace file obtained from a shared directory.

 

 

After a quick think, here is our understanding of the various items:

  • “Industry Standard benchmarks” / “Test Database Scalability” – we would choose one of these options if we wanted to run standard tests against our server
    • In our case, we want a very simple test
    • Also, in its initial iteration, Microsoft’s In-Memory OLTP is a hybrid feature and does not offer the option of implicitly defining all objects as in-memory
  • Test SQL for Scalability – supports our need for small, well-defined tables, programmable objects, and SQL queries
  • Capture and Replay SQL Server workload – choose this option if you have captured SQL Server trace files using SQL Server Profiler or a server-side trace

 

Please consider clicking the “Show Advanced” check-box if you would like to see and review additional options.

Again, we will choose “Test SQL for Scalability”.

Define Iteration Length

The next screen depends on which Scenario you chose earlier.  As we chose “Test SQL for Scalability”, we are now in the “Define Iteration Length” window.

[Screenshot: Define Iteration Length]

 

We chose the “Number of Executions per iteration” option and changed the default of 5 to 1000 executions.

This basically means that we will cycle through our workload 1000 times.

We will skip the step of adding new User Scenario / adding SQL Statement at this time.

Measurement Intervals

The “Measurement Intervals” screen basically defines our virtual user workload; in essence, the number of virtual users we will have at each stage, keeping in mind that we ramp up gradually.

Before:

[Screenshot: Measurement Intervals – Before]

 

After:

[Screenshot: Measurement Intervals – After]

 

Select new job to

As always, choose a good, associative name for your job.  We chose “MicrosoftSQLServer-InMemory-SimpleTableAndProgrammableObjects”.

 

 

[Screenshot: Select New Job To]

 

Congratulations

[Screenshot: Congratulations]

 

 

Define Profile

Empty Job View

We are taken to an empty job view screen.  And, we will now define our Job.

[Screenshot: Jobs View]

 

 

Define Profile

Profiles are basically targeted Database Connections.

Let us go define a new one and target our MS SQL Server Instance.

We initiate that process by clicking on the “Add Profile” main menu option.

 

Welcome

[Screenshot: Welcome]

 

Driver Selection

We choose “Microsoft SQL Server (ODBC)”.

[Screenshot: Driver Selection]

 

Connection Information

Select the Data Source Name, and enter the SQL Server username and password.

 

[Screenshot: Connection Information]

 

Profile Name

Please enter a name for your Profile.  In our case, we chose MSSQLInMemory.

[Screenshot: Profile Name]

 

Congratulations

You can go back and review your selections; once you’re good, please click on the Finish button.

[Screenshot: Congratulations]

 

Define Scenario

Let us return to defining our Scenario.

 

Specify Profile

In the “Job Setup” \ “General” tab, choose your target profile.

[Screenshot: Select Profile]

 

Load Setup \ Transactions

Access the “Load Setup” \ “Transactions” tab.  At this stage of our work, you will likely see an empty panel, free of any defined Transactions.

Empty Transactions Panel

[Screenshot: Load Setup – Empty Panel]

 

Add Transactions

Right-click on the empty panel and, from the drop-down menu, choose the “New Transactions” option.

[Screenshot: Add Transaction]

 

SQL Statement:

INSERT INTO [dbo].[CustomerTraditional]
(
 [ID],[Firstname],[Lastname],[EmailAddress]
,[Country],[IPAddress],[Company]
,[City],[PhoneWork],[StreetAddress]
,[Username],[UniqueID]
)
select
  $BFRand(1000000)
, '$BFLeft($BFFirstName(),20)'
, '$BFLeft($BFLastName(),20)'
, '$BFLeft($BFEMail(), 50)'
, '$BFCountry()'
, null
, '$BFLeft($BFCompany(),50)'
, '$BFLeft($BFCity(), 50)'
, '$BFLeft($BFPhone(), 50)'
, '$BFLeft($BFAddress(), 50)'
, '$BFLeft($BFFirstName(),20)'
, NEWID()

 

I suggest that you use “Execute Statement” to execute and validate the SQL.

 

Real World Latencies

Here we specify the latency between each transaction.

[Screenshot: Real World Latencies]

 

Congratulations

Click OK to continue.

[Screenshot: Congratulations]

Add Transactions – Stored Procedure – Invocation of traditional Stored Procedure

Right-click on the newly added Transaction and, from the drop-down menu, choose the “New Transactions” option.

Here we invoke our stored procedure (dbo.usp_InsertCustomerTraditional):

SQL Statement:

exec dbo.usp_InsertCustomerTraditional
  $BFRand(1000000)
, '$BFLeft($BFFirstName(),20)'
, '$BFLeft($BFLastName(),20)'
, '$BFLeft($BFEMail(), 50)'
, '$BFCountry()'
, null
, '$BFLeft($BFCompany(),50)'
, '$BFLeft($BFCity(), 50)'
, '$BFLeft($BFPhone(), 50)'
, '$BFLeft($BFAddress(), 50)'
, '$BFLeft($BFFirstName(),20)'
, null -- uniqueID

Add Transactions – SQL Statement – Insert into In-Memory Table

Right-click on the newly added Transaction and, from the drop-down menu, choose the “New Transactions” option.

 

SQL Statement:

INSERT INTO [dbo].[CustomerInMemory]
(
 [ID],[Firstname],[Lastname],[EmailAddress]
,[Country],[IPAddress],[Company]
,[City],[PhoneWork],[StreetAddress]
,[Username],[UniqueID]
)
select
  $BFRand(1000000)
, '$BFLeft($BFFirstName(),20)'
, '$BFLeft($BFLastName(),20)'
, '$BFLeft($BFEMail(), 50)'
, '$BFCountry()'
, null
, '$BFLeft($BFCompany(),50)'
, '$BFLeft($BFCity(), 50)'
, '$BFLeft($BFPhone(), 50)'
, '$BFLeft($BFAddress(), 50)'
, '$BFLeft($BFFirstName(),20)'
, NEWID()

Add Transactions – Stored Procedure – Invocation of In-Memory Stored Procedure

Right-click on the newly added Transaction and, from the drop-down menu, choose the “New Transactions” option.

Here we invoke our stored procedure (dbo.usp_InsertCustomerInMemory):

SQL Statement:

exec dbo.usp_InsertCustomerInMemory
  $BFRand(1000000)
, '$BFLeft($BFFirstName(),20)'
, '$BFLeft($BFLastName(),20)'
, '$BFLeft($BFEMail(), 50)'
, '$BFCountry()'
, null
, '$BFLeft($BFCompany(),50)'
, '$BFLeft($BFCity(), 50)'
, '$BFLeft($BFPhone(), 50)'
, '$BFLeft($BFAddress(), 50)'
, '$BFLeft($BFFirstName(),20)'
, null -- uniqueID

 

 

Run Job

Once we have defined our transactions and attached our job to a profile (target database), we can run the job.

To do so, we can use the menu (Jobs \ Run) or the top menu icon:

[Screenshot: Run Job]

 

Review Job

We can review our running job via a few lenses:

  • Internally
  • SQL Server Profiler

Benchmark

Within the Jobs view, access the “Run Job” \ “Summary” tab.

As we are starting out, we will see that we are at our initial userload of 1.

Userload of 1

[Screenshot: Review Running Job – Userload 1]

Userload of 10

[Screenshot: Review Running Job – Userload 10]

As we have increased our userload to 10, we have new metrics:

  • Transactions per second (TPS) increased from 0.97 to 9.83
  • Our average time is at 0.007

Userload of 20

[Screenshot: Review Running Job – Userload 20]

 

As we have increased our userload to 20, we have new metrics:

  • Transactions per second (TPS) increased to 17.24
  • Our average time rose a bit to 0.123

 

Database Profiler

SQL Server Profiler

We will use SQL Server’s native profiling tool, SQL Server Profiler, to get an internal view of how our statements are being executed.

Here is what a traditional SQL Insert statement looks like.

[Screenshot: SQL Statement – Insert Into CustomerTraditional]

 

Hopefully you will notice a few things:

  • Our SQL Statements are prepared
  • As the SQL Statements are prepared, they will be a bit faster than unprepared SQL Statements (see the sketch below)
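
For the curious, here is roughly what that prepare/execute pattern looks like when reproduced by hand in T-SQL.  This is an illustration rather than a capture of Benchmark Factory’s exact calls, and the parameter list is abbreviated for readability:

declare @handle int;

-- prepare once: the parameterized statement is compiled and bound to a handle
exec sp_prepare @handle output
    , N'@ID int, @Firstname varchar(50), @Lastname varchar(50)'
    , N'insert into dbo.CustomerTraditional ([ID], [Firstname], [Lastname], [UniqueID]) values (@ID, @Firstname, @Lastname, newid())';

-- ...then execute repeatedly with fresh parameter values, skipping re-compilation
exec sp_execute @handle, 1, 'Laura', 'Norman';
exec sp_execute @handle, 2, 'John', 'Doe';

-- release the handle when done
exec sp_unprepare @handle;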

 

See Results

Depending on the complexity of your transactions, the number of transactions you would like to run, and the workload sizes, it might take time to run your job.

To see current and past results, please access the “Job View” \ “See Results” panel.

To view reports, please choose the report and click on the “View Report” icon.

You can also select the report, right-click on your selection, and choose “Show Test Results”.

 

Results Summary

Our summary shows the best performing transaction is the “SQLStatementSP-InMemory”.

[Chart: Transaction Time (baseline)]

 

Results Summary – Transaction Time

I thought we should dig deeper, and so I double-clicked on the graph above.

[Chart: Result Summary Graph – Transaction Time]

 

The graph above demonstrates that In-Memory transactions have appreciably lower transaction times than traditional transactions.
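
A corroborating check on the database side: since every transaction is an insert, comparing how many rows landed in each table after a run tells a similar story.  A quick count:

select 'Traditional' as [CodePath], count(*) as [Rows] from dbo.CustomerTraditional
union all
select 'InMemory', count(*) from dbo.CustomerInMemory;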

 

Results – Transactions per second

[Chart: Transactions per second]

 

As expected, as we increased the user load, we processed more transactions.

 

Results – Deadlocks

Thankfully, we are experiencing zero deadlocks.

[Chart: Relative Summary Graph – Deadlocks]

 

 

Report – Export To Word

Once you have a report up, you might want to export the report to Word.

To do so, please click on the “W” icon at the top of a report.

[Screenshot: Export Report To Word]

Unfortunately for us, we received an error message:

Textual:

Please make sure the Microsoft Word is installed correctly.  And, the active X control is enabled.

Image:

[Screenshot: Please make sure Microsoft Word is installed correctly, and the ActiveX control is enabled]

 

I downloaded Microsoft Word Viewer.  Upon trying to install it, I ran into more problems.

Textual:

Please wait while Setup finishes determining your disk space requirements.

Image:

[Screenshot: Please Wait]

 

To fix, try:

wordview_en-us.exe /passive /norestart

This means that you will not be prompted for a destination folder, etc.

The install worked.

I returned to the report viewer, but still no go!  I must need a full version of the product.

 

Report – Export To PDF

I downloaded Foxit Reader, installed the application, and was able to return to Quest Benchmark Factory and print the report to the Foxit printer.

[Screenshot: Select Printer]

 

I reviewed the generated PDF file.  But, as it is static, I was unable to dig deep into the charts.

 

Source Control

GitHub

A few of the files have been pushed onto GitHub.

The URL:
https://github.com/DanielAdeniji/QuestBenchmarkFactoryMSSQLInMemorySimple

 

Summary

I am really very impressed with Quest Benchmark Factory.

It has a very rich toolset for generating test data.

And, it has a firm foundation for packaging payloads and applying them to a database.

And, it prepares very well-polished and applicable reports.

Quest – LiteSpeed – Large Log Files

Quest LiteSpeed allows us to drastically reduce the size of our Microsoft SQL Server backup files.  And, so we use it quite a bit.
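
For context, a LiteSpeed backup goes through its extended stored procedure rather than the native BACKUP DATABASE command.  A minimal sketch, assuming LiteSpeed’s xp_backup_database is installed in master; the database name, path, and compression level here are illustrative:

exec master.dbo.xp_backup_database
      @database         = N'AdventureWorks2014'                    -- illustrative database name
    , @filename         = N'E:\Backup\AdventureWorks2014.bkp'      -- illustrative path
    , @compressionlevel = 3;                                       -- check your LiteSpeed version's documentation for the valid range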

It runs well behind the scenes.  But, occasionally there is one setting you might want to keep an eye on.

That setting is the default “ErrorLogPath”.

On one of our machines, the system folder kept filling up.  We really could not find the culprit.  Was it this or that:

  1. Is it the Microsoft SQL Server temp folder for the service account – especially for huge ETL jobs (Excel file loading, bcp, etc.)?
  2. Is it one of the other services – backup?

Nothing that easy to find…

We tried listing “huge” files on the system drive; nothing too big.

Finally, we limited our search to *.log files.  And, then we found it:

So now we know that we have a couple of big files in the “C:\Documents and Settings\All Users\Application Data\Quest Software\LiteSpeed\SQL Server” folder.

The next thought was to determine how we can change the logging location from our system path (C:) to somewhere less “strategic” to system uptime.

Of course, I Googled it.

Can Litespeed log activity to a UNC path?
https://support.quest.com/Search/SolutionDetail.aspx?ID=SOL71230

So launched regedit and navigated to HKEY_LOCAL_MACHINE\SOFTWARE\Imceda\SQLLiteSpeed\Engine

Note that it is still under Imceda and not the new owner “Quest Software”.
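
Incidentally, you can read the current value without leaving SSMS.  A sketch using the undocumented xp_regread extended procedure (key path per the Quest KB article above):

declare @value nvarchar(512);

exec master.dbo.xp_regread
      @rootkey    = N'HKEY_LOCAL_MACHINE'
    , @key        = N'SOFTWARE\Imceda\SQLLiteSpeed\Engine'
    , @value_name = N'ErrorlogPath'
    , @value      = @value output;

select @value as [ErrorlogPath];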

I changed the folder listed in the “ErrorlogPath” entry and stopped all running Quest applications, such as SqlLiteSpeedx32.  The original log files cannot be deleted if the application using\writing to them is not terminated.  Unless you ….

Anyway, I stopped the current SQL backup job and the now-orphaned “app”, as noted in the previous paragraph.

I will re-start the SQL backup job and hope to have Quest use the new folder….

If only I now knew:

  1. Why the Quest Log files are over 14 GB
  2. How to set a wrap around limit on the size of the Quest Log file
  3. How to disable logging

References:

  1. Can Litespeed log activity to a UNC path?
    https://support.quest.com/Search/SolutionDetail.aspx?ID=SOL71230
  2. ENHANCEMENT REQUEST – How to set default path for LiteSpeed log file during the Installation process?
    https://support.quest.com/search/SolutionDetail.aspx?id=SOL59970

Quest – Performance Analysis – Client Installs Causing Havoc

A while ago, we installed Quest / Performance Analysis and left it dormant.  The server has since been decommissioned.  But, unfortunately, a few client installs are still orphaned out there.

It seems that the client caches old authentication (username \ password) data, which at times rears its old head.

I suppose we could backtrack a bit.  I really was not sure that the problem was Quest.  But, a “good” search through the Event Viewer \ Security log revealed the “Process ID” of the application that was causing the failed login…

____________________________________________________

Logon Failure:
Reason: Unknown user name or bad password
User Name: joe
Domain: LAB
Logon Type: 2
Logon Process: Advapi
Authentication Package: Negotiate
Workstation Name: LABCOMP1
Caller User Name: LABCOMP1$
Caller Domain: LAB
Caller Logon ID: (0x0,0x3E7)
Caller Process ID: 4008
Transited Services: –
Source Network Address: –
Source Port: –

____________________________________________________

Thank goodness the process was still running, as we were able to use “Task Manager” to match Process ID 4008 to quest_sc_mw.exe.

And so, I started searching for a way to uninstall Quest or, at the very least, disable it.

I looked through the “Startup” folder and the “Services” applet, but no luck.

I launched Sysinternals “Autoruns” and ordered by ImagePath.  Voila, found it.

Once found, I still did not know how to uninstall it, as it does not show up in “Add\Remove Programs” – probably the install was done via a daemon.  Whatever the case, we will change the start-up mode from Automatic to Disabled using the Services applet.

Quest – LiteSpeed – Installation \ Uninstall – Error – 2203/-2147024629

[Screenshot: Quest / LiteSpeed – Windows Installer – Error Number 2203 / -2147024629]

To Repair:

  1. Start and stop the “Windows Installer” Windows service
  2. Check the environment variable “Temp”
    1. If missing, create it
    2. If the directory it refers to is missing, create it, or change the value of the environment variable