If you are trying to recursively delete directory a, and directory a\b is open in Explorer, b will be deleted but you will get the error 'directory is not empty' for a, even though it is empty when you go and look. The current directory of any application (including Explorer) retains a handle to that directory. When you call Directory.Delete(path, true), it deletes from the bottom up: b, then a. If b is open in Explorer, Explorer will detect the deletion of b, change directory upwards (cd ..) and clean up its open handles. Since the file system operates asynchronously, the Directory.Delete operation fails because of this conflict with Explorer.
Incomplete solution
I originally posted the following solution, with the idea of interrupting the current thread to allow Explorer time to release the directory handle.
// incomplete!
try
{
    Directory.Delete(path, true);
}
catch (IOException)
{
    Thread.Sleep(0);
    Directory.Delete(path, true);
}
But this only works if the open directory is the immediate child of the directory you are deleting. If a\b\c\d is open in Explorer and you use this on a, this technique will fail after deleting d and c.
A somewhat better solution
This method will handle deletion of a deep directory structure even if one of the lower-level directories is open in Explorer.
/// <summary>
/// Depth-first recursive delete, with handling for descendant
/// directories open in Windows Explorer.
/// </summary>
public static void DeleteDirectory(string path)
{
    foreach (string directory in Directory.GetDirectories(path))
    {
        DeleteDirectory(directory);
    }

    try
    {
        Directory.Delete(path, true);
    }
    catch (IOException)
    {
        Directory.Delete(path, true);
    }
    catch (UnauthorizedAccessException)
    {
        Directory.Delete(path, true);
    }
}
Despite the extra work of recursing on our own, we still have to handle the UnauthorizedAccessException that can occur along the way. It's not clear whether the first deletion attempt is paving the way for the second, successful one, or whether it's merely the timing delay introduced by throwing and catching an exception that allows the file system to catch up.

You might be able to reduce the number of exceptions thrown and caught under typical conditions by adding a Thread.Sleep(0) at the beginning of the try block, as in the sketch below. There is also a risk that under heavy system load you could fly through both of the Directory.Delete attempts and fail. Consider this solution a starting point for more robust recursive deletion.
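
For illustration only, here is a minimal sketch of that variation. The method name DeleteDirectoryYielding is made up for this example, and the required namespaces are noted in a comment; it is the same depth-first delete as above with the Thread.Sleep(0) yield added before the first attempt.

// Sketch only: requires using System.IO; and using System.Threading;
/// <summary>
/// Same depth-first delete as above, but yields the current time slice
/// before the first attempt so Explorer has a chance to release its
/// handle. Not guaranteed to succeed under heavy system load.
/// </summary>
public static void DeleteDirectoryYielding(string path)
{
    foreach (string directory in Directory.GetDirectories(path))
    {
        DeleteDirectoryYielding(directory);
    }

    try
    {
        Thread.Sleep(0);              // let another thread (e.g. Explorer) run first
        Directory.Delete(path, true);
    }
    catch (IOException)
    {
        Directory.Delete(path, true); // second attempt after the exception's own delay
    }
    catch (UnauthorizedAccessException)
    {
        Directory.Delete(path, true);
    }
}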
General answer
This solution only addresses the peculiarities of interacting with Windows Explorer. If you want a rock-solid delete operation, one thing to keep in mind is that anything (a virus scanner, whatever) could have an open handle to what you are trying to delete, at any time. So you have to try again later. How much later, and how many times you try, depends on how important it is that the object be deleted. As MSDN indicates,

Robust file iteration code must take into account many complexities of the file system.

This innocent statement, supplied with only a link to the NTFS reference documentation, ought to make the hairs on your neck stand up.
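
As a sketch only (the name TryDeleteDirectory and its parameters are invented for this example, not part of any library), a bounded retry wrapper around the DeleteDirectory method above might look like this:

// Hypothetical helper: retries the depth-first delete a fixed number of
// times, waiting a little longer after each failure so whatever process
// holds a handle has time to release it.
// Requires using System.IO; and using System.Threading;
public static bool TryDeleteDirectory(string path, int maxAttempts = 5, int delayMs = 100)
{
    for (int attempt = 1; attempt <= maxAttempts; attempt++)
    {
        try
        {
            DeleteDirectory(path);   // the depth-first delete from the previous section
            return true;
        }
        catch (IOException)
        {
            // Another process (Explorer, a virus scanner, ...) still holds a handle.
        }
        catch (UnauthorizedAccessException)
        {
            // The same situation, reported through a different exception type.
        }

        if (attempt < maxAttempts)
        {
            Thread.Sleep(delayMs * attempt);  // back off a little more each time
        }
    }
    return false;   // all attempts failed
}

Returning false instead of throwing leaves the policy decision (log and move on, retry later, or escalate) to the caller, which is where knowledge of how important the deletion is actually lives.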
(Edited a lot: this answer originally contained only the first, incomplete solution.)