Final Edit: I found a solution to the problem (at the bottom of the question).
I've got an NUnit problem that's causing me grief. Edit:
I know this answer is over a year late, but for anyone reading in the future...
I had a similar problem to yours - attempting to delete test databases in between tests failed because of the SQLite database file remaining open. I traced the problem in my code to a SQLiteDataReader object not being explicitly closed.
SQLiteDataReader dr = cmd_.ExecuteReader();
while (dr.Read())
{
    // use the DataReader results
}
dr.Close(); // <-- necessary for database resources to be released
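An equivalent and arguably safer pattern is a `using` block, which disposes (and therefore closes) the reader even if an exception is thrown inside the loop. This is a sketch built around the same `cmd_` field as the snippet above, assuming the System.Data.SQLite provider where `SQLiteDataReader` implements `IDisposable`:

```csharp
// The using block guarantees dr.Dispose() runs on every exit path,
// so the reader cannot be left holding database resources.
using (SQLiteDataReader dr = cmd_.ExecuteReader())
{
    while (dr.Read())
    {
        // use the DataReader results
    }
} // dr is disposed (and closed) here
```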
What do you use to open your database? Do you use the ADO.NET 2.0 connector from here? I have an application that uses it, and I can open and close multiple connections with it. If you do not use this connector, you might give it a try. What does your Connect() method return?
Call the static method
SqliteConnection.ClearAllPools()
After this call the database file is unlocked, and you can delete it in the [TearDown].
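A minimal [TearDown] sketch putting this together. The `connection_` and `dbPath_` fields are hypothetical placeholders for whatever your fixture uses, and the exact class name depends on your provider (System.Data.SQLite spells it `SQLiteConnection`; Mono.Data.Sqlite spells it `SqliteConnection`):

```csharp
[TearDown]
public void TearDown()
{
    connection_.Close();              // close the test's own connection first
    SqliteConnection.ClearAllPools(); // release pooled connections holding the file

    // Sometimes also needed: finalize any readers/commands that were
    // dropped without being disposed, so their handles are released.
    GC.Collect();
    GC.WaitForPendingFinalizers();

    File.Delete(dbPath_);             // the file is no longer locked
}
```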
Try calling Close on the db connection.
Make sure the SQLite process is terminated.
You can see which process has your file locked with the free Unlocker utility.
This may be a 'known' issue with SQLite; the forum banter suggests closing the connection and disposing the command, and indicates that this will be fixed in a future version (since this behavior is not consistent with other ADO providers).
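The "close the connection and dispose the command" advice can be done deterministically with nested `using` blocks. A sketch assuming System.Data.SQLite, with a hypothetical `dbPath` variable:

```csharp
using (var conn = new SQLiteConnection("Data Source=" + dbPath))
{
    conn.Open();
    using (var cmd = new SQLiteCommand("SELECT count(*) FROM sqlite_master", conn))
    {
        cmd.ExecuteScalar();
    } // cmd is disposed here
} // conn is closed and disposed here, releasing its lock on the file
```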
I'd need to see your filename-creation logic, but it is possible that you are opening the file to create it and not closing it once it is created. I think if you use System.IO.Path.GetTempFileName(), it will just create the file and return its name, with the file already closed. If you are doing your own random name generation and using File.Open, you'll need to make sure the file is closed afterwards.
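To illustrate the difference described above (the paths and content here are purely for demonstration):

```csharp
// GetTempFileName creates a zero-byte file on disk and returns its path;
// the handle is already closed when the call returns.
string tempDb = System.IO.Path.GetTempFileName();

// By contrast, File.Open hands you a live handle that locks the file
// until it is disposed:
using (var fs = System.IO.File.Open(tempDb, System.IO.FileMode.Open))
{
    // ... write initial content ...
} // handle closed here; the file can now be opened elsewhere or deleted
```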
On another note, I heartily recommend pursuing a mocking strategy to abstract out the database rather than reading/writing from an actual database in your unit tests. The point of unit tests is that they should be very fast and file I/O is going to really slow these down.
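As a sketch of that mocking strategy, the code under test can depend on a small interface instead of the database directly; the names here (`IUserRepository`, `FakeUserRepository`) are hypothetical, and in practice a mocking framework could generate the test double for you:

```csharp
// The abstraction the production code depends on.
public interface IUserRepository
{
    string GetUserName(int id);
}

// Test double: no file I/O, so tests stay fast and no
// database file is ever created or locked.
public class FakeUserRepository : IUserRepository
{
    public string GetUserName(int id)
    {
        return "user" + id; // canned response for the test
    }
}
```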