I am using Delphi 2009. I have a very simple data structure, with 2 fields:
If you often need large datasets like this and have some money to spare, simply put 16 GB (roughly 500-750 Eur) in a machine, build the data structure as a separate process with some 64-bit compiler (*), and query that process over e.g. shared memory or another IPC method.
That way you can keep using the in-memory approach until a 64-bit Delphi finally comes out. Since your data seems to be simple (a map from array of char to array of char), it is easy to export over IPC; a rough sketch follows below.
This is of course only if the approach has any merit for your case (e.g. if it is a cache or the like), which I can't determine from your question.
(*) I recommend FPC of course :-)
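To make the idea concrete, here is a minimal sketch of such a 64-bit helper process in FPC. It is not my original code, just an illustration under some assumptions: the map is held in a TFPGMap with string keys and values, the IPC channel is a local TCP socket (chosen for simplicity instead of shared memory), the port number 5511 and the one-key-per-connection line protocol are made up, and LoadData is a placeholder you would replace with your own loading code.

```pascal
program kvserver;
{ Sketch only: 64-bit process that holds a string->string map in memory
  and answers single-key lookups over a local TCP socket (the IPC channel).
  Port 5511, the protocol and LoadData are illustrative assumptions. }

{$mode objfpc}{$H+}

uses
  SysUtils, Sockets, fgl;

type
  TStrMap = specialize TFPGMap<string, string>;

var
  Map: TStrMap;

procedure LoadData;
begin
  // Placeholder: fill the map from your real data source here.
  Map.Add('example-key', 'example-value');
  Map.Sorted := True;  // enables binary search in IndexOf
end;

function Lookup(const Key: string): string;
var
  i: Integer;
begin
  i := Map.IndexOf(Key);
  if i >= 0 then
    Result := Map.Data[i]
  else
    Result := '';  // empty reply means "not found"
end;

procedure ServeForever;
var
  Srv, Cli: LongInt;
  Addr: TInetSockAddr;
  Buf: array[0..4095] of char;
  Got: SizeInt;
  Key, Reply: string;
begin
  Srv := fpSocket(AF_INET, SOCK_STREAM, 0);
  FillChar(Addr, SizeOf(Addr), 0);
  Addr.sin_family := AF_INET;
  Addr.sin_port := htons(5511);              // assumed port
  Addr.sin_addr := StrToNetAddr('127.0.0.1');
  if fpBind(Srv, @Addr, SizeOf(Addr)) <> 0 then
    Halt(1);
  fpListen(Srv, 5);
  while True do
  begin
    Cli := fpAccept(Srv, nil, nil);
    if Cli < 0 then
      Continue;
    // One request per connection: key in, value out, newline-terminated.
    Got := fpRecv(Cli, @Buf, SizeOf(Buf), 0);
    if Got > 0 then
    begin
      SetString(Key, PChar(@Buf[0]), Got);
      Key := Trim(Key);
      Reply := Lookup(Key) + LineEnding;
      fpSend(Cli, @Reply[1], Length(Reply), 0);
    end;
    CloseSocket(Cli);
  end;
end;

begin
  Map := TStrMap.Create;
  LoadData;
  ServeForever;
end.
```

From the 32-bit Delphi 2009 side you would then connect to 127.0.0.1:5511, write the key followed by a line ending and read one line back, for instance with Indy's TIdTCPClient or plain WinSock; a shared-memory channel would work the same way conceptually, just with less per-request overhead.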
I did this once, scaling up to about 5 million objects and 5 GB of data.
I got permission to open source the container types I made for it, they are at:
http://www.stack.nl/~marcov/lightcontainers.zip (warning: very dirty code)
mghie: to answer with another cliché: there is no silver bullet.
Databases are built around a lot of assumptions of their own, too.
That makes them a relatively poor fit for e.g. caches, load balancers, etc. Of course that only matters if you actually need the speed, but the original question felt a bit speed-sensitive to me.
In a past job my role at a database-oriented firm was to do everything but that, IOW to fix the problems when the standard approach couldn't hack it (or would have required 4-socket Oracle servers for jobs whose budget didn't warrant such expenses). The solution/hack described above was a bit OLAPpy and was connected to hardware (an RFID chip-programming device) that required some guaranteed response time. Two months of programming time, it still runs, and the cost wouldn't even have bought a Windows server + Oracle license.