Directory c:\test has 50 or so files in it, no subdirectories.
If IO.Directory.Exists("C:\test") Then
    IO.Directory.Delete("C:\test", True)
End If
I had the same problem. I ended up deleting only the contents of the directory @"C:\test" and copying the new files into it, rather than deleting and recreating the directory itself. That was my problem:
Using Directory.Delete() and Directory.CreateDirectory() to overwrite a folder
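For reference, here is a minimal sketch of the "clear the contents, keep the directory" approach; the ClearDirectory helper name and the DirectoryInfo-based loops are my own illustration, not code from that question:
using System.IO;

static void ClearDirectory(string path)
{
    var dir = new DirectoryInfo(path);

    // Remove files first, then subdirectories, but leave the directory itself in place.
    foreach (FileInfo file in dir.GetFiles())
        file.Delete();
    foreach (DirectoryInfo sub in dir.GetDirectories())
        sub.Delete(true);
}

// Usage: ClearDirectory(@"C:\test"); then copy the new files into it.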
If that's really the application code, note that you are actually trying to delete a directory named "est": in a C# string literal, "\t" is the tab character.
Escape the path as "c:\\test" or use a verbatim string literal, @"c:\test".
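To illustrate (these three calls are just examples of the string forms, not code from the question):
Directory.Delete("c:\test", true);   // WRONG: "\t" is a tab, so the path becomes "c:<TAB>est"
Directory.Delete("c:\\test", true);  // escaped backslash
Directory.Delete(@"c:\test", true);  // verbatim string literal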
After exploring System.IO.Directory with Reflector, it looks like .Delete is just a wrapper around the FindFirstFile, FindNextFile, and RemoveDirectory Win32 API calls. There is nothing threaded or asynchronous about the .NET runtime's invocation of those API calls, or about the API implementations themselves.
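In other words, the recursive delete boils down to something like the following synchronous loop. This is only a sketch: the P/Invoke declaration and the managed enumeration standing in for FindFirstFile/FindNextFile are my simplification, not the actual BCL source.
using System;
using System.IO;
using System.Runtime.InteropServices;

static class RecursiveDeleteSketch
{
    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    static extern bool RemoveDirectory(string lpPathName);

    public static void Delete(string path)
    {
        // Files and subdirectories are removed depth-first, on the calling thread.
        foreach (string file in Directory.GetFiles(path))
            File.Delete(file);
        foreach (string sub in Directory.GetDirectories(path))
            Delete(sub);

        // Finally the directory itself; the call blocks until Windows returns.
        if (!RemoveDirectory(path))
            throw new IOException("RemoveDirectory failed, error " + Marshal.GetLastWin32Error());
    }
}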
Now, supposing it's somehow a TRIM issue, you can disable TRIM by opening an elevated command prompt and running fsutil:
fsutil behavior set disabledeletenotify 1
To re-enable TRIM, run the same command with 0 as the parameter.
To query the current setting, use query as the command argument:
fsutil behavior query disabledeletenotify
I've had problems with this before, and it is not specific to SSD drives. You would be far better off doing a move followed by a delete:
if (Directory.Exists(dirpath))
{
    // Rename first so the original path is free immediately, then delete the renamed copy.
    string temppath = dirpath + ".deleted";
    Directory.Move(dirpath, temppath);
    Directory.Delete(temppath, true);
}
Directory.CreateDirectory(dirpath);
Another way to deal with it is to loop until the delete has actually completed:
if (Directory.Exists(dirpath))
{
    Directory.Delete(dirpath, true);

    // Wait for the directory to disappear, with an upper bound on the number of checks.
    int limit = 100;
    while (Directory.Exists(dirpath) && limit-- > 0)
        Thread.Sleep(0);
}
Directory.CreateDirectory(dirpath);
Yes, it has nothing to do with SSD drives. I had the same problem, but only on one client's laptop. I am using .NET 3.5. In my case the directory contained a single file, and CreateDirectory appears to run internally before the Delete has actually finished.
This code had worked well for three years on many computers. The client switched to a new laptop and the code now fails consistently for him. I cannot reproduce the scenario on development or test machines with the same configuration.
I suspect you've found a race condition in NTFS that was never exposed before because drives simply weren't fast enough to hit it. I don't think TRIM has anything to do with it (though I reserve the right to be wrong!).
Either way, the proper way to handle this is to wrap the operation in a retry loop:
int retries = 3;
while (true) {
    try {
        doTheOperation();
        break;                 // success, leave the loop
    } catch (Exception) {
        retries--;
        if (retries == 0) {
            throw;             // give up after the last attempt
        }
        Thread.Sleep(100);     // brief pause before retrying
    }
}