Tuesday, August 10, 2010

Speed Optimization in ASP.NET 2.0 Web Applications


Overview
Developing a web application that responds to user requests quickly is a target and a challenge for all web developers. ASP.NET was designed with speed optimization and performance improvements in mind, and it introduces enhancements such as page compilation and automatic state storage on the server.
However, these enhancements alone are not enough to protect you from slow performance when a large number of HTTP requests are processed simultaneously by your web application.

To write a well-performing web application you must strike a balance among four common performance measures. These measures are:

Execution Time: The time taken to process a request, measured as the time elapsed between the first byte received from the server and the last.
Response Time: The time between issuing a request and receiving the first byte of the response from the server.
Scalability: A measure of how the performance of a web application improves as resources (memory, processors, or computers) are allocated to it.
Throughput: The number of requests your application can serve in a unit of time, usually a second.
What follows is a list of areas you can explore for techniques to help you develop a high-performance ASP.NET application.
State management
Data access
Page and server controls
Web applications
Coding practices
In what follows we will cover the first two areas. Part 2 covers the last three areas, and caching techniques are explained in depth in Caching optimization.
State Management
The following issues give you some ideas about how to make state management efficient.
Session State
Disable session state when you are not using it. Not all web applications require a session state for each user. When you disable unneeded session state you definitely improve your application's performance.
To disable session state for a complete page in your application, set the EnableSessionState attribute of the @ Page directive to False.


If your page needs access to session variables only to read them, then set the session state for that page to read-only.
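As a minimal sketch, the @ Page directive covers both cases (False disables session state for the page entirely; ReadOnly permits reads but not writes):

```aspx
<%-- Disable session state for this page entirely --%>
<%@ Page Language="C#" EnableSessionState="False" %>

<%-- Or allow read-only access to session variables --%>
<%@ Page Language="C#" EnableSessionState="ReadOnly" %>
```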
To disable session state for the entire application, open the 'web.config' file and set the mode attribute to Off under the sessionState section.
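A minimal web.config sketch that turns session state off for the whole application:

```xml
<configuration>
  <system.web>
    <!-- No session state will be maintained for any page -->
    <sessionState mode="Off" />
  </system.web>
</configuration>
```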
Session State Provider
There is more than one approach for storing your session state data. You can choose the approach you prefer by setting the mode attribute under the sessionState section of your application's 'web.config' file. The supported modes are: in-process mode, state server mode, SQL Server mode, and custom mode. Each mode is suitable in certain circumstances and has its own set of advantages. The in-process session state is by far the fastest one, so it is recommended to use the in-process provider if you want to store small amounts of volatile data in the session state. The state server and SQL Server providers are needed if your application runs on a web farm or web garden, or if you want to retain your session state data when the web server is restarted.
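For example, to keep the fast in-process provider with an explicit timeout (the values here are illustrative):

```xml
<configuration>
  <system.web>
    <!-- InProc is the default and fastest mode; StateServer or SQLServer
         are needed for web farms/gardens or to survive server restarts -->
    <sessionState mode="InProc" timeout="20" />
  </system.web>
</configuration>
```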
Data Access
The following points give you some ideas about how to make data access efficient:
Use the SQL Server provider
Among the available data providers, choose the SQL Server data provider to build a high-performance, scalable web application.
Use SqlDataReader
The SqlDataReader class gives you the ability to read the retrieved records from the database only once, in a forward-only fashion. If this is all you actually need, it is strongly recommended to use a data reader instead of a DataSet or DataTable. This gives your application much higher performance.
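A minimal sketch of forward-only access with SqlDataReader (the connection string and the Products table are placeholders):

```csharp
using System.Data.SqlClient;

// ...
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("SELECT Id, Name FROM Products", conn))
{
    conn.Open();
    // The reader streams rows one at a time; no in-memory DataSet is built
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            int id = reader.GetInt32(0);
            string name = reader.GetString(1);
            // process the row here; it cannot be revisited later
        }
    }
}
```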
Use Stored Procedures
Whenever possible use compiled stored procedures instead of plain-text SQL commands; this gives your application extra speed. SQL Server builds and stores execution plans for previously executed stored procedures.
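A sketch of wrapping a query in a stored procedure (all object names are illustrative):

```sql
-- The execution plan for this procedure is cached and reused
CREATE PROCEDURE dbo.GetProductsByBrand
    @BrandId int
AS
BEGIN
    SELECT Id, Name
    FROM dbo.Products
    WHERE BrandId = @BrandId;
END
GO

-- Invoke it instead of sending the plain-text SELECT each time
EXEC dbo.GetProductsByBrand @BrandId = 1000;
```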
Make use of SqlDataSource's useful properties
Instead of writing a query that includes "Where" or "Order By" clauses, use the caching, filtering, and sorting properties of the SqlDataSource control. You can set the SqlDataSource object's "EnableCaching" property to true to enable caching of the data retrieved by this data source. When you enable data source caching, SqlDataSource performs operations like filtering and sorting on the cached data. You can use the "FilterExpression" property and the "SortParameterName" property to perform filtering and sorting with the SqlDataSource object. This will certainly make your application faster.
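A sketch of a SqlDataSource that caches its results and filters them in memory (the connection string name, query, and filter expression are placeholders; caching and filtering require DataSourceMode="DataSet"):

```aspx
<asp:SqlDataSource ID="ProductsSource" runat="server"
    ConnectionString="<%$ ConnectionStrings:MyDb %>"
    SelectCommand="SELECT Id, Name, Price FROM Products"
    DataSourceMode="DataSet"
    EnableCaching="true"
    CacheDuration="60"
    FilterExpression="Price > 10" />
```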
View State Encryption
You must balance the importance of encrypting your application's view state data, which makes the transferred data more secure, against the cost that encryption imposes on your application's speed. Encryption has a fixed initialization cost plus a variable cost that depends on the size of the data to be encrypted. In brief, it is recommended to avoid unnecessary view state encryption.
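If the view state carries nothing sensitive, encryption can be switched off explicitly; a minimal web.config sketch:

```xml
<configuration>
  <system.web>
    <!-- Auto (the default) encrypts only when a control requests it;
         Never skips view state encryption entirely -->
    <pages viewStateEncryptionMode="Never" />
  </system.web>
</configuration>
```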
Paging
To display data from a given data source, you typically use data controls like the GridView and/or DetailsView controls and bind them to the data source you want to display data from. You may want to break the displayed records across multiple screens by setting the GridView's or DetailsView's AllowPaging property to true. This technique is called UI paging (user interface paging). To perform this service, the GridView control requests the data from the specified data source, then filters out the elements that will be displayed in the current page and simply ignores the rest. It repeats this process each time a new page is required, which is certainly a great waste of resources.
There is another technique for the same task, typically referred to as data source paging. In this technique you use a data source control that implements the DataSourceView class and has its CanPage property set to true, then you feed the GridView or DetailsView control from this data source. In this case the data source object queries only for the elements needed in each page. Thus, data source paging is more efficient than UI paging. The only built-in data source class that supports data source paging is the ObjectDataSource class.
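A sketch of data source paging with ObjectDataSource (the type and method names are illustrative; GetProductsPage is assumed to accept startRowIndex and maximumRows parameters):

```aspx
<asp:ObjectDataSource ID="ProductsSource" runat="server"
    TypeName="ProductRepository"
    SelectMethod="GetProductsPage"
    SelectCountMethod="GetProductCount"
    EnablePaging="true"
    StartRowIndexParameterName="startRowIndex"
    MaximumRowsParameterName="maximumRows" />

<%-- The grid asks the data source for one page of rows at a time --%>
<asp:GridView ID="ProductsGrid" runat="server"
    DataSourceID="ProductsSource"
    AllowPaging="true" PageSize="10" />
```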
Caching
Using caching appropriately can improve the performance of your application more than any other technique. The .NET Framework provides several mechanisms for caching pages and data within your ASP.NET application. You can use caching for pages that exhibit heavy traffic and do not change between requests. You can also cache small pieces of data that are used extensively by your application instead of recomputing them each time.
To use caching effectively, give attention to the following notes:
- Don't assign a small duration to your cached items; items that expire quickly incur extra cost for cleanup and garbage collection.
- Don't cache too many data items; there is a memory cost for each item. It is not necessary to cache items that can easily be recalculated or that are rarely used.
Caching is so important that the ASP.NET framework provides a variety of caching mechanisms you can use to improve your application's performance further. Some of these mechanisms are: caching application data, page output caching, caching web forms, and caching parts of web forms.
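A sketch of caching application data with an absolute expiration (the cache key, the helper method, and the duration are placeholders):

```csharp
// Try the cache first; fall back to the database on a miss
DataTable products = (DataTable)Cache["Products"];
if (products == null)
{
    products = LoadProductsFromDatabase(); // hypothetical helper
    Cache.Insert("Products", products,
        null,                              // no cache dependency
        DateTime.Now.AddMinutes(10),       // absolute expiration
        System.Web.Caching.Cache.NoSlidingExpiration);
}
```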
For further information
Refer to the online copy of Microsoft Developers Network at http://msdn.microsoft.com or use your own local copy of MSDN.

Monday, August 9, 2010

Deleting Duplicate Rows when the table doesn’t have a primary key :: SQL SERVER

Were you ever in need of something like deleting duplicate rows from a table that doesn’t have a primary key? If yes, read on!

This article gives a quick way of deleting duplicate records from a given table that doesn’t have a primary key. Also, we will NOT use cursors, loops, or temporary tables (global or local), but still accomplish the mission. Interesting, isn't it? Please read on!


First let’s create a simple table with two columns and name it ‘DeleteDupsTable’.

Create Table DeleteDupsTable(BrandId int, SKU int)
Let’s add a few records to the above table:



Insert into DeleteDupsTable Values
(1000, Cast(Rand()*100 as int)),
(1000, Cast(Rand()*100 as int)),
(1000, Cast(Rand()*100 as int)),
(1000, Cast(Rand()*100 as int)),
(1000, Cast(Rand()*100 as int)),
(1000, Cast(Rand()*100 as int)),
(1000, Cast(Rand()*100 as int)),
(1000, Cast(Rand()*100 as int)),
(2000, Cast(Rand()*100 as int)),
(2000, Cast(Rand()*100 as int)),
(2000, Cast(Rand()*100 as int)),
(2000, Cast(Rand()*100 as int)),
(2000, Cast(Rand()*100 as int)),
(3000, Cast(Rand()*100 as int)),
(3000, Cast(Rand()*100 as int)),
(3000, Cast(Rand()*100 as int)),
(3000, Cast(Rand()*100 as int))



Notice that we don’t need to write a separate insert statement for each row. This is an enhancement in SQL Server 2008 known as the table value constructor.
Go ahead and run the above SQL.
At this point we have a heap created in the database. A table without a clustered index is called a ‘heap’.
Let’s run the query below to see what we have got, and to ensure we have the necessary data before we go ahead and purge the duplicate records.



Select * from DeleteDupsTable

OK, the necessary setup is done.
Let’s get into the details of how we delete the duplicate records.


Let’s verify how many records we have got per brand. Go ahead and run the query below. It lists the number of records per brand.



Select BrandId, COUNT(BrandId) From DeleteDupsTable Group by BrandId

OK, we have all the data ready for purging.
Now the most important step. I wonder if it’s ever possible to delete duplicate records without a primary key! :)
The following query generates a dynamic ‘DELETE’ statement for every brand in the table. The criterion is that the BrandId should have more than one record (a possible qualifier for duplicate deletes :) ). Each delete statement retains one record per brand; note that because DELETE TOP(n) is used without an ORDER BY, which row survives is not guaranteed, though on a heap it is typically the most recently inserted one.



Select 'Delete Top(' + Cast((COUNT(BrandId)-1) as Varchar) + ') from DeleteDupsTable Where BrandId=' + CAST(BrandId As Varchar)
From DeleteDupsTable
Group by BrandId
Having COUNT(BrandId) > 1

After executing the above statement, the output is a set of delete statements. Each delete statement removes all but one record for every brand that has more than one record.
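With the sample data above (8 rows for brand 1000, 5 for brand 2000, and 4 for brand 3000), the generated statements would look like this:

```sql
Delete Top(7) from DeleteDupsTable Where BrandId=1000
Delete Top(4) from DeleteDupsTable Where BrandId=2000
Delete Top(3) from DeleteDupsTable Where BrandId=3000
```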


Copy the output, paste it into a new query window, and press F5.


Boom! All duplicate records deleted. Mission accomplished.

But how do you know which records were actually deleted? Good question.

Let’s enhance the above query. The primary requirement stays as it is: the dynamic delete queries still delete the records. But at the same time, each one outputs the records that were purged, with no extra select statements (interesting, isn't it?). Meaning, you see on your screen the data that was deleted. This way, even if you have lost the data, you could still insert it back using the output just generated.


Go ahead and run the query below and notice the output it generates:

Select 'Delete Top(' + Cast((COUNT(BrandId)-1) as Varchar) + ') from DeleteDupsTable Output Deleted.* Where BrandId=' + CAST(BrandId As Varchar)
From DeleteDupsTable
Group by BrandId
Having COUNT(BrandId) > 1



In the above query, notice the Output Deleted.* clause. This clause does the trick of displaying the records that were deleted.


Notice, in the following output, that each delete statement removes all but one record for every brand. As noted above, the surviving record is typically the most recently inserted one for each brand, though this is not guaranteed without an ORDER BY.



Output: (screenshot of the deleted rows omitted)





Welcome

Hello Friends!

Welcome to my blog. We will be discussing all programming concerns here.
Hope you enjoy!