We use DFS to keep web farm nodes’ content in sync, and we ran across a problem where a directory with thousands of files and folders had one file missing on a replica. Here’s a quick script we used to find out which files were missing:
$dir1 = "\\server1\c$\folder1"
$dir2 = "\\server2\c$\folder1"
$d1 = Get-ChildItem -Path $dir1 -Recurse
$d2 = Get-ChildItem -Path $dir2 -Recurse
$results = @(Compare-Object $d1 $d2)
foreach ($result in $results)
{
    $result.InputObject
}
Sample output:

    Directory: \\server1\c$\folder1\subfolder

Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---         2/16/2012   4:08 PM       162 ~$README.txt
This will output all of the files and directories that exist in only one location.
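The default output doesn’t say which server an item is missing from. Compare-Object records that in a SideIndicator property ("<=" means the item exists only on the reference side, "=>" only on the difference side), so a small variation on the script makes the direction explicit — a sketch, assuming the same $dir1 and $dir2 paths as above:

# Compare by file name and show which side each item exists on:
# "<=" = only under $dir1 (server1), "=>" = only under $dir2 (server2).
$d1 = Get-ChildItem -Path $dir1 -Recurse
$d2 = Get-ChildItem -Path $dir2 -Recurse
Compare-Object -ReferenceObject $d1 -DifferenceObject $d2 -Property Name |
    Select-Object Name, SideIndicator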
Thanks, Jeff. An elegant tip. I didn’t know the power of Compare-Object.
If you’re only interested in file names, not file attributes or dates, you can speed up the comparison on very large folders by collecting and comparing just the names instead of the full objects. Case-sensitive comparisons are also usually faster in code. Tweaking three lines of your script does it:
$d1 = Get-ChildItem -Path $dir1 -Name -Recurse
$d2 = Get-ChildItem -Path $dir2 -Name -Recurse
$results = @(Compare-Object -CaseSensitive $d1 $d2)
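With -Name, Get-ChildItem emits relative-path strings rather than FileInfo objects, so Compare-Object does cheap string comparisons. Filtering on SideIndicator then lists only the files missing from the replica — a sketch, again assuming the $dir1 and $dir2 paths from the original script:

$d1 = Get-ChildItem -Path $dir1 -Name -Recurse
$d2 = Get-ChildItem -Path $dir2 -Name -Recurse
# "<=" marks names found only under $dir1, i.e. missing from the replica.
Compare-Object -CaseSensitive $d1 $d2 |
    Where-Object { $_.SideIndicator -eq '<=' } |
    ForEach-Object { $_.InputObject }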
I know it’s an old post, but I ran into it while searching for a related answer. I’m trying to compare two folders, but the folders are huge, and running just this command not only takes forever but also tries to download files (OneDrive). Is there an easier way to do this?