I have been working on a web project (ASP.NET) for around six months. The final product is about to go live. The project uses SQL Server as the database. We have done performance
In the scheme of things, a few million rows is not a particularly large database.
Assuming we are talking about an OLTP database, denormalising without first identifying the root cause of your bottlenecks is a very, very bad idea.
The first thing you need to do is profile your query workload over a representative time period to identify where most of the work is being done (for instance, using SQL Profiler, if you are using SQL Server). Look at the number of logical reads a query performs multiplied by the number of times executed. Once you have identified the top ten worst performing queries, you need to examine the query execution plans in detail.
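If running a Profiler trace is not practical, a rough sketch along these lines (against the plan-cache DMVs, assuming SQL Server 2005 or later) will surface the statements doing the most logical reads; note it only sees plans still in the cache, so run it during or just after a representative workload window:

```sql
-- Rank cached statements by total logical reads
-- (i.e. reads per execution multiplied by execution count).
SELECT TOP 10
    qs.total_logical_reads,                              -- aggregate cost across all executions
    qs.execution_count,
    qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
            WHEN -1 THEN DATALENGTH(st.text)
            ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1)      AS statement_text,
    qp.query_plan                                         -- execution plan to examine in detail
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)  AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY qs.total_logical_reads DESC;
```

You could equally order by total_worker_time or total_elapsed_time, but logical reads is usually the best first cut for spotting queries that are missing indexes or scanning far more data than they need to.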
I'm going to go out on a limb here (because it is usually the case), but I would be surprised if your problem is not either
This SO answer describes how to profile a workload to find its worst-performing queries.