I have been working on a web project (ASP.NET) for around six months, and the final product is about to go live. The project uses SQL Server as the database, and we have done some performance tuning.
At first we were using a fully normalized database, but we have since made it partially normalized due to performance issues (to reduce the number of joins).
As the old saying goes, "normalise till it hurts, denormalise till it works".
It's fairly common in large, heavily used databases to see a degree of denormalisation to aid performance, so I wouldn't worry too much about it now, as long as your performance is where you want it to be and the code that manages the "denormalised" fields doesn't become too onerous.
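If you do keep denormalised fields, it helps to have exactly one well-defined place that keeps them in sync. As a rough illustration only (the Orders/OrderLines tables and the OrderTotal column here are hypothetical, not taken from your schema), a trigger is one way to do it in SQL Server:

    -- Sketch: keeping a denormalised total in sync via a trigger.
    -- Table and column names are hypothetical examples.
    CREATE TABLE Orders (
        OrderId    INT IDENTITY PRIMARY KEY,
        OrderTotal DECIMAL(18, 2) NOT NULL DEFAULT 0  -- denormalised sum of line amounts
    );

    CREATE TABLE OrderLines (
        OrderLineId INT IDENTITY PRIMARY KEY,
        OrderId     INT NOT NULL REFERENCES Orders(OrderId),
        Amount      DECIMAL(18, 2) NOT NULL
    );
    GO

    CREATE TRIGGER trg_OrderLines_SyncTotal
    ON OrderLines
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Recalculate the total for every order touched by this statement.
        UPDATE o
        SET OrderTotal = ISNULL((SELECT SUM(l.Amount)
                                 FROM OrderLines l
                                 WHERE l.OrderId = o.OrderId), 0)
        FROM Orders o
        WHERE o.OrderId IN (SELECT OrderId FROM inserted
                            UNION
                            SELECT OrderId FROM deleted);
    END
    GO

An indexed view, or doing the update in your data-access layer, are perfectly good alternatives; the point is that the rule for maintaining the denormalised value lives in one place.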
What are the possible solutions when the data size becomes very large, as the number of clients increases in the future?
Not knowing much about your application's domain, it's hard to say how you can future-proof it, but splitting recently used and old data into separate tables is a fairly common approach in heavily trafficked databases. If 95% of your users are querying data from the last 30-45 days, then having a "live_data" table containing, say, the last 60 days' worth of data and an "old_data" table for the older stuff can help your performance.
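As a rough sketch of that kind of archiving job (the names live_data, old_data, RecordId, RecordDate and Payload are all hypothetical, and it assumes old_data has the same columns with no identity, triggers or foreign keys), something like this run nightly from a SQL Server Agent job would move everything older than 60 days across:

    -- Sketch: move rows older than 60 days from the live table to the archive table.
    DECLARE @cutoff DATE = DATEADD(DAY, -60, GETDATE());

    BEGIN TRANSACTION;

    -- Delete from the live table and capture the removed rows into the archive
    -- in a single statement.
    DELETE FROM live_data
    OUTPUT deleted.RecordId, deleted.RecordDate, deleted.Payload
    INTO old_data (RecordId, RecordDate, Payload)
    WHERE RecordDate < @cutoff;

    COMMIT TRANSACTION;

For very large tables you'd want to do that in batches (DELETE TOP (n) in a loop) so you don't hold long locks on the live table while users are hitting it.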
A good idea would be to make sure you have extensive performance monitoring set up, so that you can measure your database's performance as the data volume and load increase. If you find a noticeable drop in performance, it might be time to revisit your indexes!
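SQL Server's built-in DMVs are a cheap way to get that monitoring started. For example, this lists the cached statements doing the most logical reads, which is usually a good shortlist for deciding where index work will pay off:

    -- Top statements by logical reads, from the plan cache DMVs.
    SELECT TOP (20)
        qs.total_logical_reads,
        qs.execution_count,
        qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
        SUBSTRING(st.text,
                  (qs.statement_start_offset / 2) + 1,
                  ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset
                    END - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_logical_reads DESC;

Bear in mind these stats reset when the plan cache is cleared (e.g. on a restart), so for proper trending you'd snapshot them into a table on a schedule.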