Is there any performance difference when retrieving a bit or a char(1)?
Just out of curiosity =]
UPDATE: Supposing I'm using SQL Server 2008!
As Adam says, it depends on how the database implements the data types, but in theory the following holds:
Bit:
Will store 1, 0, or NULL, and only takes a bit to store the value (by definition!). It is usually used for true/false flags, and many programming languages will interpret a bit column as a boolean automatically.
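For illustration, here is a minimal T-SQL sketch (the temp table name #BitDemo and its column are made up) showing the values a bit column accepts and how it is typically read back:

    -- Throwaway table with a single bit column (illustrative only)
    CREATE TABLE #BitDemo
    (
        IsActive bit NULL
    );

    -- A bit column accepts 1, 0, or NULL
    INSERT INTO #BitDemo (IsActive) VALUES (1), (0), (NULL);

    -- Most client libraries surface this column as a boolean type
    SELECT IsActive FROM #BitDemo;

    DROP TABLE #BitDemo;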
Char(1):
A char takes 8 bits, or one byte, so it's 8 times larger when stored. You can store (pretty much) any single character in there, and it will probably be interpreted as a string by programming languages. I think char(1) will always take the full byte, even when empty, unless you use varchar or nvarchar.
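Similarly, a small sketch (again with made-up names) contrasting the two declared types side by side. Note that DATALENGTH reports 1 byte for a single bit value as well, since SQL Server packs up to 8 bit columns of a row into one byte on the page:

    -- Throwaway table holding the same flag as bit and as char(1)
    CREATE TABLE #FlagDemo
    (
        BitFlag  bit     NULL,
        CharFlag char(1) NULL
    );

    INSERT INTO #FlagDemo (BitFlag, CharFlag) VALUES (1, 'Y'), (0, 'N');

    -- DATALENGTH shows the bytes used to represent each value
    SELECT BitFlag,
           CharFlag,
           DATALENGTH(BitFlag)  AS BitBytes,   -- reports 1
           DATALENGTH(CharFlag) AS CharBytes   -- reports 1
    FROM #FlagDemo;

    DROP TABLE #FlagDemo;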