Saturday, 20 September 2008

Refactoring Techniques - High Performance Code

This post will hold a collection of refactoring and performance techniques.

What is Refactoring?

Refactoring is the process of improving the design of existing code while preserving its intended functionality.

Bad Code Smells:
Refactoring Techniques:
Here, you can find an excellent list of refactoring techniques with examples. Below, I am going to duplicate some:


1) Use Length instead of Count() on arrays

int[] items = new int[10];

Bad: items.Count()
Good: items.Length

For an array, the Length property is faster than the Count() extension method performance-wise: Length is read directly from the array, while Count() goes through an enumerator/interface call.



2) Use a generic List instead of an array

Bad: int[] items = new int[2];

Good: List<int> items = new List<int>();

1. Using a generic list gives you more power and control over the list
2. You do not have to specify the length of the list at the time of its creation
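A small illustration of both points (class and method names are mine): the array's size is fixed at creation time, while the generic list grows on demand and its element type is checked at compile time.

```csharp
using System.Collections.Generic;

class ListVsArrayDemo
{
    public static int CountAfterAdds()
    {
        // Array: the length must be fixed at creation time.
        int[] fixedItems = new int[2];
        fixedItems[0] = 1;
        fixedItems[1] = 2;
        // fixedItems[2] = 3; // would throw IndexOutOfRangeException

        // Generic list: grows on demand, no length up front.
        List<int> items = new List<int>();
        items.Add(1);
        items.Add(2);
        items.Add(3);
        return items.Count; // 3
    }
}
```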

3) Extract Constant

public static double CalcCircumference(double diameter)
{ return 3.14 * diameter; }

public const double PI = 3.14;
public static double CalcCircumference (double diameter)
{ return PI * diameter; }

1. If we need to change the value of PI, we only need to change it in one place
2. Increased readability

4) Extract Method
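No example is given above for Extract Method; here is a minimal sketch (the class, method names, and invoice scenario are mine). A fragment that can be grouped and named is pulled into its own method:

```csharp
class InvoicePrinter
{
    // Before: one method mixes the banner with the invoice details.
    public static string FormatInvoiceBefore(string customer, decimal amount)
    {
        string result = "***** ACME *****\n";
        result += "Customer: " + customer + "\n";
        result += "Amount: " + amount + "\n";
        return result;
    }

    // After: the banner is extracted into its own well-named method,
    // so it can be reused and changed in one place.
    public static string FormatInvoice(string customer, decimal amount)
    {
        string result = Banner();
        result += "Customer: " + customer + "\n";
        result += "Amount: " + amount + "\n";
        return result;
    }

    static string Banner()
    {
        return "***** ACME *****\n";
    }
}
```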

5) Decompose IF Conditional
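A minimal sketch of Decompose IF Conditional (the seasonal-rate scenario and names are mine): the raw boolean expression is moved into a method whose name states what it checks.

```csharp
using System;

class SeasonCharge
{
    // Before: the raw condition is hard to read inline.
    public static decimal ChargeBefore(DateTime date, int quantity,
                                       decimal winterRate, decimal summerRate)
    {
        if (date.Month < 3 || date.Month > 10)
            return quantity * winterRate;
        return quantity * summerRate;
    }

    // After: the condition is extracted into a well-named method.
    public static decimal Charge(DateTime date, int quantity,
                                 decimal winterRate, decimal summerRate)
    {
        if (IsWinter(date))
            return quantity * winterRate;
        return quantity * summerRate;
    }

    public static bool IsWinter(DateTime date)
    {
        return date.Month < 3 || date.Month > 10;
    }
}
```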

6) Extract Variable
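A minimal sketch of Extract Variable (the platform/browser check and names are mine): each sub-expression of a dense condition gets its own explaining variable.

```csharp
class BrowserCheck
{
    // Before: one dense boolean expression.
    public static bool IsMacIEBefore(string platform, string browser)
    {
        return (platform.ToUpper().IndexOf("MAC") > -1) &&
               (browser.ToUpper().IndexOf("IE") > -1);
    }

    // After: each sub-expression gets an explaining variable.
    public static bool IsMacIE(string platform, string browser)
    {
        bool isMacOs = platform.ToUpper().IndexOf("MAC") > -1;
        bool isIE = browser.ToUpper().IndexOf("IE") > -1;
        return isMacOs && isIE;
    }
}
```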

7) Encapsulate Field:
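A minimal sketch of Encapsulate Field (class names and the validation rule are mine): a public field is made private and exposed through a property, giving one place to add validation, logging, or change notification later.

```csharp
using System;

// Before: a public field, no control over reads or writes.
public class AccountBefore
{
    public decimal Balance;
}

// After: the field is private and exposed through a property.
public class Account
{
    private decimal _balance;

    public decimal Balance
    {
        get { return _balance; }
        set
        {
            if (value < 0)
                throw new ArgumentOutOfRangeException("value", "Balance cannot be negative");
            _balance = value;
        }
    }
}
```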

8) Extract Interface:

9) Reorder Method Parameters

10) Rename (type, method, class, namespace, variable)

11) Promote Local Variable to Parameter

1. To inject a dependency (object, type, etc.)

12) Use StringBuilder instead of String
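A minimal sketch of point 12 (class and method names are mine): strings are immutable, so each += in a loop allocates a brand-new string, while StringBuilder appends into an internal buffer.

```csharp
using System.Text;

class StringBuilderDemo
{
    // Bad: each += copies the whole string so far into a new allocation.
    public static string ConcatSlow(int n)
    {
        string s = "";
        for (int i = 0; i < n; i++)
            s += "x";
        return s;
    }

    // Good: StringBuilder mutates an internal buffer in place.
    public static string ConcatFast(int n)
    {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++)
            sb.Append("x");
        return sb.ToString();
    }
}
```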

For another list of refactoring techniques, please see

Refactoring Tools:

Learn More:

Thursday, 11 September 2008

Unit Testing - deleting test data

Imagine a unit test in which you need to insert test data into the database so that the test can run.

After running the test, you need to delete the inserted test data; otherwise you will accumulate junk in the database.

One way, which we used for some time, is to run a stored procedure in the teardown method of the current test fixture to delete all your test data from all the tables.

The problems with this approach are:
  • Concurrency issues when two test cases run at the same time
  • Performance issues:
  • Unnecessary connections to database tables in order to delete the test data
  • Unnecessary use of the transaction log, which leads to a larger .ldf file (all DML statements affect the transaction log)
  • Unnecessary changes to indexes (all DML statements affect indexes)
The best way is to use a transaction in each test case, so that only the tables and data affected by the test are involved. In addition, because the changes are rolled back at the end of the test case, we won't have concurrency issues and the database will remain in a consistent state.

I will write a sample code here later.
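In the meantime, here is a minimal sketch of the idea, assuming NUnit and System.Transactions; the connection string, database, and Customers table are hypothetical stand-ins:

```csharp
using System.Data.SqlClient;
using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class CustomerTableTests
{
    private TransactionScope _scope;

    [SetUp]
    public void SetUp()
    {
        // Start an ambient transaction before each test. Any connection
        // opened inside the test enlists in it automatically.
        _scope = new TransactionScope();
    }

    [TearDown]
    public void TearDown()
    {
        // Dispose WITHOUT calling Complete(): everything the test inserted
        // or updated is rolled back, so no cleanup stored procedure is
        // needed and other tables are left untouched.
        _scope.Dispose();
    }

    [Test]
    public void InsertedCustomer_IsVisibleInsideTheTransaction()
    {
        // Hypothetical connection string and table.
        using (var connection = new SqlConnection(
            "Server=.;Database=TestDb;Trusted_Connection=True"))
        {
            connection.Open();

            var insert = new SqlCommand(
                "INSERT INTO Customers (Name) VALUES ('test-customer')", connection);
            insert.ExecuteNonQuery();

            var count = new SqlCommand(
                "SELECT COUNT(*) FROM Customers WHERE Name = 'test-customer'", connection);
            Assert.AreEqual(1, (int)count.ExecuteScalar());
        }
    }
}
```

Because the rollback happens per test in TearDown, two fixtures running at once never see each other's uncommitted rows, which addresses the concurrency issue above.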