I'm in a situation where I need to replace some functionality for a client that operates directly on a CSV file, which is used as a config file for a system.
Most of the cases avai
A CSV file is just a sequential text file.
If you want to modify the contents you have to read the entire file into memory, modify it there and write it out again.
Assume a more general case of a file that contains the following:
ABCDEFGHIJKLMNOPQRSTUV
If you want to remove "IJKLM" from the middle, you have to read the remainder of the file ("NOP...") so you can shuffle it up to meet ("...FGH").
If you want to insert "0123456789" between "M" and "N", you need to read "N" through "V" first, otherwise the new characters will simply overwrite "NOPQRSTUV".
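A minimal sketch of that read-modify-write approach in C# (the file name, column layout and the "Timeout" key are assumptions for illustration):

    using System;
    using System.IO;
    using System.Linq;

    class CsvRewrite
    {
        static void Main()
        {
            // Hypothetical config file "settings.csv" with lines like "Key,Value".
            const string path = "settings.csv";

            // Read the whole file into memory...
            var lines = File.ReadAllLines(path).ToList();

            // ...modify it there (here: change the value of the "Timeout" row)...
            for (int i = 0; i < lines.Count; i++)
            {
                var fields = lines[i].Split(',');
                if (fields.Length == 2 && fields[0] == "Timeout")
                {
                    fields[1] = "30";
                    lines[i] = string.Join(",", fields);
                }
            }

            // ...and write the whole thing out again.
            File.WriteAllLines(path, lines);
        }
    }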
The only way to avoid rewriting the whole file in .NET is to move the data from a CSV file into a true database.
You can open a text file as if it were a table in a database. See ConnectionString examples for connecting to it here: http://www.connectionstrings.com/textfile
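As a sketch of that approach (the directory and file name are assumptions; the Jet text driver treats the folder as the database and each CSV file in it as a table, roughly as described at the link above):

    using System;
    using System.Data.OleDb;

    class CsvAsTable
    {
        static void Main()
        {
            // Connection string shape per connectionstrings.com for the Jet text driver.
            var connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;" +
                          @"Data Source=C:\ConfigDir;" +
                          @"Extended Properties=""text;HDR=Yes;FMT=Delimited""";

            using (var conn = new OleDbConnection(connStr))
            using (var cmd = new OleDbCommand("SELECT * FROM [settings.csv]", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("{0} = {1}", reader[0], reader[1]);
                }
            }
        }
    }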
However, even using .NET to treat it as a database won't stop the underlying code from overwriting the whole file when it's updated.
Text files just weren't meant for concurrent users. That's why true databases exist.
The .NET Framework v4.0 comes with a new class for memory-mapped file support; see the MemoryMappedFile class in the System.IO.MemoryMappedFiles namespace.
You have a good chance of making this work for you. You will still be stuck with the problem of changing line lengths. However, if you want random access and occasional in-place updates (replacing "MSFT" with "UNIX", for example), you'll get excellent performance.
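A minimal sketch of that kind of same-length, in-place update (the file name, the byte offset and the equal-length replacement string are assumptions):

    using System;
    using System.IO.MemoryMappedFiles;
    using System.Text;

    class MappedReplace
    {
        static void Main()
        {
            // Assumption for illustration: "quotes.csv" contains the ASCII text "MSFT"
            // at byte offset 42, and the replacement "UNIX" has exactly the same length.
            const string path = "quotes.csv";
            const long offset = 42;

            byte[] replacement = Encoding.ASCII.GetBytes("UNIX");

            using (var mmf = MemoryMappedFile.CreateFromFile(path))
            using (var accessor = mmf.CreateViewAccessor())
            {
                // In-place overwrite; no other bytes in the file move.
                accessor.WriteArray(offset, replacement, 0, replacement.Length);
            }
        }
    }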
You could employ padding strategies to have 'spare room' in each cell/line to overcome the problems with changing field/line lengths.
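A rough sketch of that padding idea (the width, file name, encoding and line ending are assumptions; every record is padded to the same length so any one record can be overwritten in place by seeking to it):

    using System;
    using System.IO;
    using System.Text;

    class PaddedCsv
    {
        // Every line is padded with spaces to the same fixed width,
        // so record N always starts at byte N * (Width + 2) ("\r\n" line ending).
        const int Width = 64;

        static void WriteLineAt(string path, int lineIndex, string csvLine)
        {
            if (csvLine.Length > Width)
                throw new ArgumentException("Line longer than the reserved width.");

            byte[] bytes = Encoding.ASCII.GetBytes(csvLine.PadRight(Width) + "\r\n");

            using (var fs = new FileStream(path, FileMode.Open, FileAccess.Write))
            {
                fs.Seek((long)lineIndex * (Width + 2), SeekOrigin.Begin);
                fs.Write(bytes, 0, bytes.Length);
            }
        }

        static void Main()
        {
            // Hypothetical usage: rewrite record 3 without touching the rest of the file.
            WriteLineAt("settings.csv", 3, "Timeout,30");
        }
    }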
In general, I don't think it is worth the cost for CSV files, but the technique comes up every now and then and is worth mentioning.
Cheers, Seth