I have an SQLite database full of a huge number of URLs, and it's taking a huge amount of disk space; accessing it causes many disk seeks and is slow. Average URL path length is
You could use your existing body of URLs to compute Huffman codes over all bytes occurring in URLs, weighted by their frequency. You can then store that code table once and encode every URL with it. I think it should give decent compression.
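A minimal sketch of that idea in Python, using only the standard library (`heapq`, `collections.Counter`). All function names and the sample corpus are illustrative, and the codes are kept as `"0"`/`"1"` strings for clarity; a real store would pack them into bits:

```python
import heapq
from collections import Counter
from itertools import count

def build_codes(corpus):
    """Build Huffman codes from byte frequencies across a corpus of URLs."""
    freq = Counter(b for url in corpus for b in url.encode("utf-8"))
    tie = count()  # tiebreaker so the heap never compares tree nodes
    heap = [(f, next(tie), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate corpus: one distinct byte
        return {heap[0][2]: "0"}
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tie), (t1, t2)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):  # internal node: (left, right)
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                        # leaf: a byte value
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

def encode(url, codes):
    """Encode a URL as a string of '0'/'1' bits using the shared code table."""
    return "".join(codes[b] for b in url.encode("utf-8"))

def decode(bits, codes):
    """Decode greedily; works because Huffman codes are prefix-free."""
    rev = {v: k for k, v in codes.items()}
    out, cur = bytearray(), ""
    for bit in bits:
        cur += bit
        if cur in rev:
            out.append(rev[cur])
            cur = ""
    return out.decode("utf-8")

corpus = [
    "http://example.com/index.html",
    "http://example.com/about.html",
    "http://example.org/index.html",
]
codes = build_codes(corpus)
bits = encode(corpus[0], codes)
assert decode(bits, codes) == corpus[0]
```

Because URLs draw from a small, skewed byte alphabet, the weighted average code length comes out well under 8 bits per byte, which is where the compression comes from. Note that the table is fixed once computed, so bytes that never appeared in the training corpus cannot be encoded; reserving an escape code or rebuilding the table periodically handles new characters.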