WPFPerf – Performance Profiling Tools for WPF

October 27, 2008
  • Want to understand WPF a little better? 
  • Want to write better WPF applications? 
  • Want to get a better grasp on WPF performance?

Then get the new WPF Performance Profiling Tools.  This suite of tools has been updated in a big way.  The UI and new features are very slick.

These tools are available in x86 and x64 versions, ready for you to use.

[Image: WPFPerf_01]

Links

WPF Performance Profiling Tools Home Page

Latest Documentation

Previous Documentation (some still applies to current release)

Have a great day.

Just a grain of sand on the world's beaches.


Reconciling Initial Use of LINQ to SQL DataContext

April 30, 2008

A few days ago, in my blog post Sample Series – Bench Marking Object Loading, I showed benchmark times for several different methods of loading business entity objects from a SQL Server database.  LINQ to SQL turned in the slowest times.

I received comments and emails, so I went back and wrote a real-world application that ran more than one test; it is posted here: Sample Series – Bench Marking Object Loading Application II.  That application showed that LINQ to SQL is as fast or faster than the other methods used to load objects.

However, I still needed to reconcile the results from the first benchmark test.  Those numbers were not going to go away.

Josh Smith and I will be teaching the WPF Multi-Tier Business Application Track at the Enterprise Developers Guild Code Camp on 17 May 2008 and one of the sessions is on WPF Tools & Performance Testing.

Marlon Grech and I were emailing back and forth about how to find the holdup, and he told me he uses Red Gate Software's Ants Profiler.  So, yesterday I purchased Ants Profiler and got to work.

The Profiling

Using Ants Profiler, I ran two profile tests.  In the first test I ran the LoadLINQ method only once.  This gave me a baseline to compare against a run that called LoadLINQ multiple times.

The times are in seconds.  The run times are longer under the profiler because Ants Profiler does a lot of work to gather information from the application and the CLR classes the application calls.

While the number of seconds to run the program under the profiler is longer than the one-second run times we got in the initial blog post, the results are still relative with respect to method run times.  The red bar graph indicates the run time of each method relative to the other methods.

In the next two images, notice that LoadLINQ runs much faster on the second run.  Ants Profiler will enable us to find out why.

[Image: Ants1]

[Image: Ants3]

In the two images below, the longer-running code is easy to find.  Look at the top line of code in each image: it is the constructor for the DataContext.

We can see that the very first time the constructor was called it took 2.66 seconds.  When I profiled the application the second time, notice that the time to call the constructor twice was only 2.67 seconds; the second call cost almost nothing.

Ants Profiler has shown us exactly which line of code was the initial bottleneck.

[Image: Ants2]

[Image: Ants4]
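Outside the profiler, a quick Stopwatch check gives a rough feel for the same first-use cost.  This is only a minimal sketch: it assumes a Visual Studio generated DataContext class named BenchMarkDataContext and a connection string named BenchMarkData in app.config, so substitute your own generated class and connection string name.

```vbnet
Imports System.Configuration
Imports System.Diagnostics

Module DataContextTimingSketch

    Sub Main()
        ' Assumed connection string name; adjust to your app.config.
        Dim connection As String = ConfigurationManager.ConnectionStrings("BenchMarkData").ConnectionString

        ' The first construction pays the one-time initialization cost.
        Dim watch As Stopwatch = Stopwatch.StartNew()
        Using db1 As New BenchMarkDataContext(connection)
        End Using
        watch.Stop()
        Console.WriteLine("First DataContext:  {0} ms", watch.ElapsedMilliseconds)

        ' Later constructions are much cheaper.
        watch = Stopwatch.StartNew()
        Using db2 As New BenchMarkDataContext(connection)
        End Using
        watch.Stop()
        Console.WriteLine("Second DataContext: {0} ms", watch.ElapsedMilliseconds)
    End Sub

End Module
```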

I used another feature of Ants Profiler to view the CLR method calls and execution times to determine what was happening within the constructor of the DataContext. 

The below image shows the method calls inside the DataContext constructor and that the DataContext.Init method was taking up almost all of the time.  Ants Profiler allows you to “drill into” that method and see what is going on under the covers.  Karl really likes this drilling business;  both Ants Profiler and Mole share this common feature of allowing developers to drill around very easily and inspect under the hood.

[Image: Ants5]

The below image is a profile of the DataContext.Init method.  (The reason for the time differences is that I captured these images on different runs of the profiler.)

[Image: Ants6]

Close

In case you’re wondering, I’m not a salesperson or an employee of Red Gate; I just love their products and support.  I’ve been a long-time SQL Compare customer.

Now I have Ants Profiler in my toolbox and love the fact that I can look under the hood and see what is going on in my applications.

Just a grain of sand on the world's beaches.


Sample Series – Bench Marking Object Loading Application II

April 28, 2008

This is the next sample in the Sample Applications Series. The purpose of the Sample Series is to provide concise code solutions for specific programming tasks. This sample provides a brief description of the problem, the solution and full source code.

After last night's blog post, Sample Series – Bench Marking Object Loading, I received a good number of questions on a WPF Disciples Google Group discussion thread.  If you didn't get a chance to read that blog post, please look at it now, as I won't be repeating the background information.

I thought about all the comments, suggestions, and the possible “one time” initialization cost that various software technologies pay the first time they are used.

I really wanted to get to the bottom of the initial delay that I and others saw when using LINQ to build object collections.  Additionally, I wanted to see what the benchmark would look like in a typical application.

When I was researching the loading of business entity objects from a database, I ran across an awesome article on Code Project entitled Dynamic Method IL Generator, written by Code Project author Herb Randson.  I took the code from his article, translated it into VB.NET, made just a few very minor adjustments, and added a LINQ test to the mix for comparison.

New Benchmark Test

Let me introduce you to the players in this benchmark.

NEW SQL Server Express Database

Included in the download is a NEW BenchMarkData SQL Server Express 2005 database.  If you downloaded and set up the database from the previous sample, please remove that database and attach the new database supplied with the source download below.

To run the tests in the sample download, you will need either SQL Server 2005 or SQL Server 2005 Express installed on your computer.  Attach the included database to either server, then edit the app.config file and change the connection string to match your system.  This database is ultra simple, with one table that has 19,972 rows that I imported from the AdventureWorks database Person.Contacts table.
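For reference, the connection string entry in app.config looks roughly like the following.  The connection string name, server instance, and security settings shown here are placeholders, not the exact values in the download; adjust them to match your own setup.

```xml
<configuration>
  <connectionStrings>
    <!-- Placeholder values: point the data source and catalog at your own server and database. -->
    <add name="BenchMarkData"
         connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=BenchMarkData;Integrated Security=True"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>
```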

This new database has an additional field and stored procedures for all CRUD operations.

NEW Benchmark Loading Objects Sample Application

This new application is a Windows Forms application that runs all the tests and displays the results in a DataGridView.  There are four sets of tests.  Each set has 5 tests, except for the LINQ set, which has 6.  All CRUD operations are performed using stored procedures, and concurrency checks are made using the SQL Server timestamp column in each row.  Each test has been optimized, correctly handles DBNull coming from the database, and correctly sets Nullable(Of Type) properties.

The below list describes the four sets of tests. 

  • Manual – method loads the business objects by iterating a SqlDataReader and manually constructing the classes with optimized code (a rough sketch of this pattern follows the list).  In a real-world application, this requires that a loader method be written for each business entity.
  • Reflection – method loads the business objects by iterating a SqlDataReader and using a cached list of the target object's properties.  In a real-world application, only this code needs to be called and a List of business objects is returned.  More realistically, a method is created that wraps this code, as I have done in this application.
  • Dynamic – method loads the business objects by iterating a SqlDataReader and using some of the most wicked code I've seen in a long time.  In his Code Project article Dynamic Method IL Generator, the author lays out how to generate IL code at runtime and then call that code.  This method results in insanely fast loading of business objects.  In a real-world application, only this code needs to be called and a List of business objects is returned.  More realistically, a method is created that wraps this code, as I have done in this application.
  • LINQ – method utilizes code generated by Visual Studio 2008 and takes advantage of the stored procedures for CRUD operations.
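To make the Manual approach concrete, here is a rough VB.NET sketch of the pattern a manual loader follows: iterate a SqlDataReader, check for DBNull, and set Nullable(Of T) properties accordingly.  The entity, table, and column names are illustrative stand-ins, not the exact code from the download.

```vbnet
Imports System.Collections.Generic
Imports System.Data.SqlClient

' Illustrative entity; the sample's ContactEntity class has more columns.
Public Class ContactEntity
    Public Property ContactID As Integer
    Public Property LastName As String
    Public Property EmailPromotion As Nullable(Of Integer)
End Class

Public Class ManualBuilder

    ' One hand-written loader like this is needed per business entity.
    Public Shared Function LoadContacts(ByVal connectionString As String) As List(Of ContactEntity)
        Dim contacts As New List(Of ContactEntity)

        Using cn As New SqlConnection(connectionString)
            cn.Open()
            Using cmd As New SqlCommand("SELECT ContactID, LastName, EmailPromotion FROM Contact", cn)
                Using dr As SqlDataReader = cmd.ExecuteReader()
                    While dr.Read()
                        Dim c As New ContactEntity()
                        c.ContactID = dr.GetInt32(0)
                        c.LastName = dr.GetString(1)
                        ' DBNull from the database becomes Nothing on the Nullable property.
                        If dr.IsDBNull(2) Then
                            c.EmailPromotion = Nothing
                        Else
                            c.EmailPromotion = dr.GetInt32(2)
                        End If
                        contacts.Add(c)
                    End While
                End Using
            End Using
        End Using

        Return contacts
    End Function

End Class
```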

The below list describes each of the five tests performed by the four above methods.

  • Read – Update One Record Single Connection – this test simulates the retrieval of a single record, the changing of the record by a process and the updating of the record over a single connection to the database.  NOTE:  This is also the very first test in the set and you will notice that it records a longer execution duration than the next test.
  • Read – Update One Record Two Connections – this test simulates a user loading a form with a single record, editing of the record and then updating the record over two separate connections to the database.
  • Read 911 records – this test reads all records in the database where the last name starts with “a.”  911 is the total number of records returned.
  • Read Insert and Delete – this test inserts a record, rereads that record, and deletes that same record.
  • Read and Update 19,972 records using same connection – this test simulates business processing with complex business-layer calculations being applied to a set of data.  All the records are loaded into business objects, the collection of objects is iterated, each object has a field updated using a calculation, and each individual record is updated back to the database.  The same database connection is used to retrieve the records and to write them back to the database.
  • Read and Update 19,972 records using SUBMIT CHANGES same connection – this test is only performed by LINQ and is TestNumber 6.  I did this to show the difference between using a stored procedure and LINQ's SubmitChanges to update a large number of records in a batch (a rough sketch of the SubmitChanges pattern follows the list).
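For a feel of what TestNumber 6 does, the LINQ batch update follows roughly this pattern.  This is only a minimal sketch: the BenchMarkDataContext, Contacts, and EmailPromotion names and the calculation are assumptions standing in for the generated code in the download.

```vbnet
Imports System.Linq

Module LinqBatchUpdateSketch

    Sub RunBatchUpdate(ByVal connectionString As String)
        ' BenchMarkDataContext is an assumed name for the Visual Studio generated DataContext.
        Using db As New BenchMarkDataContext(connectionString)
            ' Load every record into business objects over one connection.
            Dim contacts = db.Contacts.ToList()

            ' Simulate business-layer processing on each object.
            For Each c In contacts
                c.EmailPromotion = (If(c.EmailPromotion, 0) + 1) Mod 3
            Next

            ' A single call writes all pending changes back; LINQ to SQL tracks what changed.
            db.SubmitChanges()
        End Using
    End Sub

End Module
```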

Test Results

All of these tests were run against a production SQL Server in the middle of the day.  The client was a dual-core 2.4 GHz machine with 2 GB of memory running 32-bit Vista.

[Image: ResultTwoII]

This test clearly shows that LINQ to SQL is just as fast or faster than other techniques for building business entity objects from a database query.

It is only in the Update department that LINQ to SQL turns in slower times.  The average time to retrieve 19,972 records, iterate through them, and write individual updates back to the database was 16 seconds.  Pretty impressive.  LINQ using stored procedures took 24 seconds.  Not bad either.  However, when I used the LINQ SubmitChanges command it took 1:28 (one minute, 28 seconds).

[Image: ResultOneII]

Comments

After showing this to others at work, they looked at me and asked: what is the conclusion?  Which technique should we go with?  We can generate all of the code for any of the techniques.  The leanest method is the ManualBuilder method.  It consistently turns in the second fastest time and does not require any caching of precompiled queries, IL execution code, or reflection property lists.  So for my next few projects, I’m sticking with “old school” DataReaders and optimized .NET object loading.

Choosing the ManualBuilder does not prevent me from using LINQ within the application once the objects are loaded.  For object loading, I prefer simpler business objects with less overhead and fewer dependencies.

Close

Your application requirements and development resources will dictate your choices.  Making an informed decision that meets your needs is what this post is about.

I hope this sample gets you to investigate the various options you have for loading business entity objects from a database.

Source Code: After downloading the source code you MUST change the file extension from .zip.DOC to .zip. This is a requirement of WordPress.com.

The source includes a Visual Studio 2008 solution and SQL Server 2005 database for testing.  At your option, you could very easily download the code and point it to any of your current databases and run tests against them.

Download Source and SQL Database 779KB

Hope you can learn just a little bit more about .NET from this article and the Sample Series.

Just a grain of sand on the world's beaches.


Sample Series – Bench Marking Object Loading

April 27, 2008

[Image: BenchMarkResultsConsole]

              Time to create 19,972 business objects

This is the next sample in the Sample Applications Series. The purpose of the Sample Series is to provide concise code solutions for specific programming tasks. This sample provides a brief description of the problem, the solution and full source code.

Josh Smith and I are very busy preparing for the Charlotte, NC May 17th Code Camp, as we will be teaching the WPF Multi-Tier Business Application Track.  This will be a very exciting day for WPF and LOB, and I hope to see you there!

I am also writing the 4th installment of the WPF Business Application Series and hope to deliver the entire application soon and then continue to write articles against it.

During this preparation the topic of loading business entity objects from the database keeps coming up.  Each time I reflect on that topic these questions immediately come to mind:

  • Should I stick with the time-tested and super fast SqlDataReader?
  • Should I move to LINQ to SQL?
  • Should I load objects using reflection?
  • Should I write object loader code for each object?
  • When I add Silverlight into the mix will that affect my above choice?

I did a good bit of research and will present the results. 

When researching I ran across an awesome article on Code Project entitled Dynamic Method IL Generator, written by Code Project author Herb Randson.  I took the code from his article, translated it into VB.NET, made just a few very minor adjustments, and added a LINQ test to the mix for comparison.

I hope this blog post encourages you to perform your own testing before choosing Door #1 or Door #2.

Introduction

I wanted the test to simulate a real-world database request that involved selecting almost 20,000 rows and building a generic collection of objects to process.  Not that any of us would be selecting all our customers and stuffing them into a ComboBox, but the test would simulate building 20,000 instances of a business object class that would be processed, with the results written back to the database.  For this benchmark, I'm not writing results back to the database.  Business applications I write can easily generate this type and volume of traffic when performing business processing, so I elected to use this as my test.  This type of request is also commonly used to preprocess data for a report.

Users couldn't care less about benchmarks.  They judge our applications based on how they appear to run on their desktops in their unique environments.  How long does a form take to appear?  When the customer form opens and the user clicks on the History tab, is there a delay before the 500 records appear in the grid?  When the user runs a report that requires preprocessing, how long does it take?  How long does batch processing take?  Is the UI frozen while processing?

When I design applications, my first priority is delivering documented, maintainable code; my second priority is performance.

As architects we are not only concerned with how responsive and quick our UIs are, but must also take into consideration processing time on our servers.  If we have clients that are connected over the web, almost all processing will take place on the web or application server.  It is imperative that server-deployed code runs as fast and efficiently as possible.

Benchmark Testing

Let me introduce you to the players in this benchmark.

SQL Server Express Database

Included in the download is the BenchMarkData SQL Server Express 2005 database.  To run the tests in the sample download, you will need either SQL Server 2005 or SQL Server 2005 Express installed on your computer.  Attach the included database to either server, then edit the app.config file and change the connection string to match your system.  This database is ultra simple, with one table that has 19,972 rows that I imported from the AdventureWorks database Person.Contacts table.

SQL Server Profiler

I used the SQL Server Profiler to record benchmarks from the SQL Server during processing and have included the results below.

Benchmark Loading Objects Sample Application

This is a very simple console application that demonstrates four different methods of loading business objects.  I tried to make each test as fair as possible.  Each test will open and close the connection to the database.  Each test has been optimized, correctly handles DBNull coming from the database, and correctly sets Nullable(Of Type) properties.

Below is the list of the four methods.  Each method will build a generic List(Of Contact) or List(Of ContactEntity).  LINQ builds the Contact class and the other methods build the ContactEntity class.  I did this so that each technique would build a class like it would in a real-world application.

  • Manual – the LoadManual method loads the business objects by iterating a SqlDataReader and manually constructing the classes with optimized code.  In a real-world application, this would require that a method be written for each business entity to load it.
  • Reflection – the LoadReflection method loads the business objects by iterating a SqlDataReader and using a cached listing of the target object's properties (a rough sketch of this approach follows the list).  In a real-world application, only this code needs to be called and a List of business objects is returned.  More realistically, a method is created that wraps this code.
  • Dynamic – the LoadDynamic method loads the business objects by iterating a SqlDataReader and using some of the most wicked code I've seen in a long time.  In his Code Project article Dynamic Method IL Generator the author lays out how to generate IL code at runtime and then call that code.  This method results in insanely fast loading of business objects.  This code reminds me of the old ADO GetRows method we used on early ASP web sites.  That too was insanely fast compared to any other method of reading data from SQL Server.
  • LINQ – the LoadLINQ method utilizes code generated by Visual Studio 2008, creates a DataContext and with a single line of code, transforms the query results into a List(Of Contact).
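To give a feel for the Reflection approach, below is a rough sketch of a loader that caches the target type's writable properties and matches them to columns by name.  It is a simplified stand-in for the LoadReflection method in the download, not the actual code, and it assumes every writable property has a matching column in the result set.

```vbnet
Imports System.Collections.Generic
Imports System.Data.SqlClient
Imports System.Reflection

Public Class ReflectionBuilder

    ' Cache of writable properties per type, so reflection metadata is gathered only once.
    Private Shared ReadOnly PropertyCache As New Dictionary(Of Type, PropertyInfo())

    Public Shared Function Load(Of T As {Class, New})(ByVal connectionString As String, ByVal sql As String) As List(Of T)
        Dim properties As PropertyInfo() = GetCachedProperties(GetType(T))
        Dim results As New List(Of T)

        Using cn As New SqlConnection(connectionString)
            cn.Open()
            Using cmd As New SqlCommand(sql, cn)
                Using dr As SqlDataReader = cmd.ExecuteReader()
                    While dr.Read()
                        Dim item As New T()
                        For Each p As PropertyInfo In properties
                            ' Assumes each writable property has a matching column name.
                            Dim ordinal As Integer = dr.GetOrdinal(p.Name)
                            ' DBNull is skipped, leaving Nullable(Of ...) properties as Nothing.
                            If Not dr.IsDBNull(ordinal) Then
                                p.SetValue(item, dr.GetValue(ordinal), Nothing)
                            End If
                        Next
                        results.Add(item)
                    End While
                End Using
            End Using
        End Using

        Return results
    End Function

    Private Shared Function GetCachedProperties(ByVal target As Type) As PropertyInfo()
        Dim properties As PropertyInfo() = Nothing
        If Not PropertyCache.TryGetValue(target, properties) Then
            properties = Array.FindAll(target.GetProperties(), Function(p) p.CanWrite)
            PropertyCache(target) = properties
        End If
        Return properties
    End Function

End Class
```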

Test Results

This test was run on my home system, an Intel Core2 Quad 2.4 GHz with 4 GB main memory, 4 MB L2 cache and a single hardware RAID 1 SATA disk array.  This configuration would be common among server class machines except large servers would have more memory and dual SCSI RAID arrays.

[Image: BenchMarkResultsConsole]

From the above image, the stats that stick out are the low and high results (lower time is better).  The LINQ loader took just over one second.  The Dynamic loader took .063 seconds, which is insanely fast.  The other two were longer but still much faster than the LINQ solution.  In a UI, time is perception.  Add in any other latencies, and the user is now waiting.

In the below image, we can see how each of the above techniques interacted with the SQL Server while iterating the SqlDataReader.  The duration is measured in microseconds.

All of the techniques are very efficient in terms of working with the SQL Server.  Each time I ran this, I got different results with respect to CPU, but the Reads and Duration columns remained the same.  To be honest, I do not know why the Reflection method Duration was so long compared to the others, or why the LINQ to SQL method was 2.5 times slower than the Manual or Dynamic methods.  The SQL queries were the same except for the way the LINQ query used fully qualified column names.

[Image: BenchMarkResultsSQLProfiler]

Real World Development

I strongly recommend that you read Herb's article and the comments other developers posted.  Herb's article is about “how” something can be accomplished, and he presents several solutions for accomplishing the same programming task.  I simply added a LINQ to SQL solution to the mix and presented the results here for your inspection and review.

I don't know about you, but I like to adopt new technologies slowly, giving myself and others around me time to prove them from security, development, performance, reliability, scalability, toolset availability, toolset maturity, and maintenance perspectives.

From a code generation standpoint, all four solutions can be code generated.  I don't recommend using the Visual Studio ORM Designer or SqlMetal tools at this time because they do not provide any ability for developers to place either XML comments or attributes on their class properties.  You can use XML mapping files to make this happen, but you have to pay additional performance hits for this ability.  The lack of support for the developer's metadata in the ORM tools has caused me to write all my own code generation tools.  I hope this will be corrected in future releases of the ORM or other Visual Studio RAD tool offerings.  While at the April 2008 MVP Summit I did speak with several teams at Microsoft about this current limitation of metadata awareness.

I have a blog post that raises some questions on ORM tools consuming meta data that you can read here.

The time-tested Manual technique is super fast but requires the most code.  If this code is generated, then this becomes a non-issue.  The Reflection and Dynamic methods offer a very simple, small footprint with great functionality and performance.  The Dynamic method has a drawback in that you can't debug the generated IL code using the standard debugging techniques most developers are comfortable with (a simplified taste of the technique follows).
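To show what "generated IL" means here, below is a heavily simplified sketch of the DynamicMethod technique: emit a setter delegate at runtime so no reflection happens per row.  It is only a taste of the approach Herb's article develops in full; the ContactEntity class and the single LastName property are placeholders.

```vbnet
Imports System.Reflection
Imports System.Reflection.Emit

' Illustrative entity with a single property; the real class has one per column.
Public Class ContactEntity
    Public Property LastName As String
End Class

Module DynamicSetterSketch

    ' Emits IL at runtime for "entity.LastName = CStr(value)" and returns it as a delegate.
    ' Invoking the delegate per row is far cheaper than calling PropertyInfo.SetValue.
    Public Function BuildLastNameSetter() As Action(Of ContactEntity, Object)
        Dim setter As MethodInfo = GetType(ContactEntity).GetProperty("LastName").GetSetMethod()

        Dim dm As New DynamicMethod("SetLastName", Nothing,
                                    New Type() {GetType(ContactEntity), GetType(Object)},
                                    GetType(ContactEntity), True)

        Dim il As ILGenerator = dm.GetILGenerator()
        il.Emit(OpCodes.Ldarg_0)                     ' push the entity instance
        il.Emit(OpCodes.Ldarg_1)                     ' push the value read from the data reader
        il.Emit(OpCodes.Castclass, GetType(String))  ' cast Object to String
        il.Emit(OpCodes.Callvirt, setter)            ' call the LastName setter
        il.Emit(OpCodes.Ret)

        Return CType(dm.CreateDelegate(GetType(Action(Of ContactEntity, Object))),
                     Action(Of ContactEntity, Object))
    End Function

End Module
```

A real loader along these lines builds one such delegate per column, caches them, and calls them while iterating the SqlDataReader, which is why the Dynamic results above are so fast.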

I don’t think I’m ready to jump on the LINQ to SQL bandwagon for business object loading just yet.  However, that does not mean that I can’t load my objects using the other techniques and then use LINQ to work with them.

Close

Your application requirements and development resources will dictate your choices.  Making an informed decision that meets your needs is what this post is about.

I hope this sample gets you to investigate the various options you have for loading business entity objects from a database.

Source Code: After downloading the source code you MUST change the file extension from .zip.DOC to .zip. This is a requirement of WordPress.com.

The source includes a Visual Studio 2008 solution and SQL Server 2005 database for testing.  At your option, you could very easily download the code and point it to any of your current databases and run tests against them.

Download Source 951 KB

Hope you can learn just a little bit more about .NET from this article and the Sample Series.

Just a grain of sand on the world's beaches.

