code-adventures
Monday, February 6, 2017
Aspect-Oriented Programming: Implementing caching with AOP
Introduction to AOP
Aspect-oriented programming (AOP) is,
according to Wikipedia, “a programming paradigm that aims to increase
modularity by allowing the separation of crosscutting concerns.” It deals with
functionality that occurs in multiple parts of the system and separates it from
the core of the application, thus improving separation of concerns while
avoiding duplication of code and coupling.
The biggest advantage of AOP is that you
only have to worry about the aspect in one place, programming it once and
applying it in all the places where needed.
There are many uses for AOP, such as:
- Implementing logging in your application.
- Using authentication before an operation (such as allowing some operations only for authenticated users).
- Adding caching to certain method calls.
- Global error handling.
- Changing the behavior of some methods.
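Before reaching for a framework, the core idea can be shown with a hand-rolled decorator. The IGreeter type below is a made-up example, not part of this article's code; logging is the crosscutting concern being separated:

```csharp
using System;

// IGreeter and its implementations are hypothetical, used only to illustrate
// how a decorator separates a crosscutting concern (logging) from core logic.
public interface IGreeter
{
    string Greet(string name);
}

public class Greeter : IGreeter
{
    // Core business logic: knows nothing about logging.
    public string Greet(string name) => "Hello, " + name;
}

public class LoggingGreeter : IGreeter
{
    private readonly IGreeter _inner;

    public LoggingGreeter(IGreeter inner)
    {
        _inner = inner;
    }

    public string Greet(string name)
    {
        // The aspect lives here, in one place, wrapped around the real call.
        Console.WriteLine("Calling Greet(\"" + name + "\")");
        var result = _inner.Greet(name);
        Console.WriteLine("Greet returned \"" + result + "\"");
        return result;
    }
}
```

The drawback of writing decorators by hand is that you need one wrapper class per interface; the interception techniques discussed below generate that wrapper for you.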
In the .NET Framework, the most common techniques for implementing AOP are post-processing and code interception. The former is the technique used by PostSharp (postsharp.net), and the latter is used by dependency injection (DI) containers such as Castle DynamicProxy and Unity's Interception feature. These tools usually rely on the Decorator or Proxy design pattern to perform the code interception.
In this two-part blog post I'll take a look at two approaches that don't need a DI container, and finish with my personal conclusions.
Part 1: Implementing a caching decorator with RealProxy class
The RealProxy class provides basic functionality for proxies. It's an abstract class that you inherit from, overriding its Invoke method to add new functionality. It lives in the System.Runtime.Remoting.Proxies namespace.
Below we define a caching decorator using RealProxy.
In the constructor we pass the type of the decorated class to the base class. Next we override the Invoke method that receives an IMessage parameter. It contains a dictionary with all the parameters passed to the original method call. After extracting the MethodInfo we can add the aspect we need before calling the method.
public class CachingProxy<T> : RealProxy
{
    private readonly T _decorated;

    public CachingProxy(T decorated) : base(typeof(T))
    {
        _decorated = decorated;
    }

    public override IMessage Invoke(IMessage msg)
    {
        var methodCall = (IMethodCallMessage)msg;
        var methodInfo = (MethodInfo)methodCall.MethodBase;

        // Look up the cache for this method and try to reuse a previous result.
        MethodResultCache cache = MethodResultCache.GetCache(methodInfo);
        object result = cache.GetCachedResult(methodCall.InArgs);
        if (result == null)
        {
            try
            {
                // Cache miss: call the decorated instance and store the result.
                result = methodInfo.Invoke(_decorated, methodCall.InArgs);
                cache.CacheCallResult(result, methodCall.InArgs);
            }
            catch (Exception e)
            {
                return new ReturnMessage(e, methodCall);
            }
        }
        return new ReturnMessage(result, null, 0, methodCall.LogicalCallContext, methodCall);
    }
}
The caching helper exposes methods for getting cached results and adding results to the cache. Its implementation is not important for now; it will be available for download.
public interface IMethodResultCache
{
    object GetCachedResult(IEnumerable<object> arguments);
    void CacheCallResult(object result, IEnumerable<object> arguments);
}
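For reference, a minimal in-memory sketch of such a helper could look like this. It is my own simplification, not the downloadable implementation: it keys results by the stringified argument values, has no expiration, and omits the per-method GetCache lookup used by the proxy.

```csharp
using System.Collections.Generic;
using System.Linq;

// A simplified sketch: results are kept in a plain dictionary keyed by
// the stringified arguments, with no expiration or per-method buckets.
public class SimpleMethodResultCache : IMethodResultCache
{
    private readonly Dictionary<string, object> _results = new Dictionary<string, object>();

    private static string BuildKey(IEnumerable<object> arguments)
    {
        // Naive key: join the argument values; fine for demo types like int.
        return string.Join("|", arguments.Select(a => a == null ? "<null>" : a.ToString()));
    }

    public object GetCachedResult(IEnumerable<object> arguments)
    {
        object result;
        _results.TryGetValue(BuildKey(arguments), out result);
        return result;
    }

    public void CacheCallResult(object result, IEnumerable<object> arguments)
    {
        _results[BuildKey(arguments)] = result;
    }
}
```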
We will use this caching decorator on a simple repository class with the following contract:
public interface IUsersRepository
{
    List<User> GetAll();
    User GetById(int id);
}
The actual implementation that gets the data from the database is also out of the scope of this article. Imagine an Entity Framework implementation or another basic data access alternative.
To use the decorated repository, we must use the GetTransparentProxy method, which will return an instance of IUsersRepository. Every method of this instance that’s called will go through the proxy’s Invoke method. To ease this process, we create a Factory class to create the proxy and return the instance for the repository:
public class UsersRepositoryFactory
{
    public static IUsersRepository CreateRealProxy()
    {
        var repository = new UsersRepository();
        var dynamicProxy = new CachingProxy<IUsersRepository>(repository);
        return dynamicProxy.GetTransparentProxy() as IUsersRepository;
    }
}
This is what the calling code looks like:
[TestMethod]
public void TestRealProxy()
{
    IUsersRepository repository = UsersRepositoryFactory.CreateRealProxy();
    repository.GetById(1);
    repository.GetById(1);
    repository.GetById(2);
}
And these are the test results for a run; the messages are written to the output by the caching helper's GetCachedResult method:
Instead of using the factory, we could imagine a different scenario: using annotations to mark cacheable classes/members and linking them to the caching proxy. This approach is used by Unity's Interception feature.
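As a sketch of that idea, a hypothetical marker attribute (not part of this article's downloadable code) could let the proxy decide per method whether to cache:

```csharp
using System;
using System.Reflection;

// Hypothetical opt-in marker; the proxy's Invoke could call ShouldCache
// before consulting the cache and fall back to a plain call otherwise.
[AttributeUsage(AttributeTargets.Method)]
public class CacheableAttribute : Attribute
{
}

public static class CachingPolicy
{
    // Returns true when the intercepted method opted in to caching.
    public static bool ShouldCache(MethodInfo method)
    {
        return method.IsDefined(typeof(CacheableAttribute), inherit: false);
    }
}

// Hypothetical contract using the marker: only GetById would be cached.
public interface IDemoRepository
{
    [Cacheable]
    string GetById(int id);

    string Refresh();
}
```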
Part 2: to be continued
Source code link.
Happy coding.
Monday, March 2, 2015
Analyze IIS Web Site performance with LogParser and SQL Server
In this post I'll go through the steps you have to take in order to import IIS logs into SQL Server and provide some queries to get you started on identifying performance problems.
1. Locate your IIS logs
By default, IIS log files are located in the following directories:
- IIS 7 and later: %SystemDrive%\inetpub\logs\LogFiles
- IIS 6 and earlier: %WinDir%\System32\LogFiles
Open the IIS Manager and select Sites, as shown below. This will show you the ID of each website hosted on your server. You will need this ID to determine which W3SVC* directory to analyze.
Open Windows Explorer and navigate to the directory that contains the IIS log files of your website.
Locate your log file based on the naming and copy it somewhere else (e.g. c:\temp\logs).
2. Import the logs into SQL Server with LogParser
You can find LogParser at the following location: Log Parser 2.2
I have used the following command:
C:\Program Files\Log Parser 2.2>logparser "SELECT * INTO iisLogs FROM c:\temp\logs\*.log" -i:iisw3c -o:SQL -server:localhost -database:webLogs -username:sa -password:yourpass -createTable:ON
You will have to fill in your own SQL Server location and credentials.
Running this command will take some time depending on the size of your log file. After it finishes you can see the created table on your DB, with a structure identical to the IIS log file.
Read more about W3C Extended Log File Format (IIS 6.0)
3. Run scripts to identify performance problems
Status codes
SELECT CAST(scstatus AS varchar(3)) + '.' + CAST(scsubstatus AS varchar(3)) AS Status, COUNT(*) AS Hits FROM [dbo].[iisLogs] GROUP BY scstatus, scsubstatus ORDER BY Hits DESC
This query will allow you to compare 200-Success status code responses to:
- cached content: 304 - Not Modified
- error codes: 404 - Not Found, 500 - Internal Server Error
In my case I see a lot of redirects, almost as many as 200 status codes. Something might be wrong with the friendly URLs redirect feature and needs more investigation. The errors, though, are nothing to worry about.
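To dig into those redirects, a query in the same style (against the same iisLogs table) lists which URLs are redirected most often; the 301/302 filter is an assumption about which redirect codes are in play:

```sql
SELECT TOP 20 csuristem AS URL, scstatus AS Status, COUNT(*) AS Hits
FROM [dbo].[iisLogs]
WHERE scstatus IN (301, 302)
GROUP BY csuristem, scstatus
ORDER BY Hits DESC
```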
Slowest 20 pages
SELECT TOP 20 csuristem AS URL, MAX(timetaken) AS Max, MIN(timetaken) AS Min, AVG(timetaken) AS Average, COUNT(*) AS Hits FROM [dbo].[iisLogs] GROUP BY csuristem ORDER BY Average DESC
This query will give you an indicator on where to focus your performance improvement effort.
Requests per hour
SELECT DATEADD(HOUR, DATEDIFF(HOUR, 0, time), 0) AS Hour, COUNT(*) AS TotalRequests FROM [dbo].[iisLogs] GROUP BY DATEADD(HOUR, DATEDIFF(HOUR, 0, time), 0) ORDER BY DATEADD(Hour, DATEDIFF(Hour, 0, time), 0)
This query will display the number of requests per hour intervals during a day. This way you know at what time your website is mostly used.
Average response time per hour
SELECT DATEADD(Hour, DATEDIFF(Hour, 0, time), 0) AS Hour, AVG([timeTaken]) AS AverageResponseTime FROM [dbo].[iisLogs] GROUP BY DATEADD(Hour, DATEDIFF(Hour, 0, time), 0) ORDER BY DATEADD(Hour, DATEDIFF(Hour, 0, time), 0)
This query will return the average response time per hour intervals during a day.
4. Compare data sets
If you need to go in front of managers to explain what happened you can create Excel charts with the data you get from your queries. This will give you a visual representation that is easier to interpret.
For example, I compared the average response time with the user load; the results in my case were a bit surprising.