I read that the maximum length of a URL can be 2,000 characters, so I have a table with a varchar(2000) column to store URLs. But this column cannot be indexed in full; MySQL only lets me index a prefix of it.
Your question leaves a lot to the imagination.
For one thing, we must assume your index's purpose is to serve as a primary key that keeps out duplicates. You won't be developing an application that ever says to a user, "Sorry, there's a mistake in your 1,800-character data entry; it doesn't match, please try again."
For another thing, we must assume these URLs of yours potentially have lots of CGI parameters (?param=val&param=val&param=val) in them.
If these assumptions are true, then here's what you can do.
Make your URL column longer, as a varchar, if you need to.
Add a SHA-1 hash column to your table; a SHA-1 hash is a string of 40 hexadecimal characters.
Make that column your primary key.
When you put stuff into your table, use the MySQL SHA1() function to compute the hash values.
Use MySQL's INSERT ... ON DUPLICATE KEY UPDATE statement to add rows to your table; a sketch follows these steps.
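Here is a minimal sketch of how those pieces fit together, assuming a hypothetical urls table (the table and column names are just for illustration):

    -- The SHA-1 hash of the URL is the primary key, so the long URL
    -- column itself never has to be indexed.
    CREATE TABLE urls (
        url_sha1 CHAR(40)      NOT NULL,   -- SHA1() returns 40 hex characters
        url      VARCHAR(2000) NOT NULL,
        PRIMARY KEY (url_sha1)
    );

    -- Re-inserting the same URL hits the primary key and becomes a
    -- harmless update instead of a duplicate-key error.
    INSERT INTO urls (url_sha1, url)
    VALUES (SHA1('http://example.com/page?a=1&b=2'),
            'http://example.com/page?a=1&b=2')
    ON DUPLICATE KEY UPDATE url = VALUES(url);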
This will let you keep duplicate URLs out of your database without confusion, in a way that scales up nicely.
http://dev.mysql.com/doc/refman/5.1/en/insert-on-duplicate.html