Back in June, after a whole seven months in our system, one of our clients hit Microsoft's limit on the maximum number of folders within a single folder. They are on track to do it again a year later.
I lost the first script I wrote to figure out which directories were affected, so I wrote a new one yesterday. The problem directories all live in the same path, named schoolcode\courses. A configuration setting for the school's folder name tells the application where to create a new folder for each new course. When that folder fills up, the solution is to append a number to the name so that a new folder is created and subsequent courses go there. So I looked for all directories fitting this pattern, then looped through the list, counted the objects in each directory, and posted the count to the output.
$logdate = Get-Date -UFormat "%Y%m%d%H%M"
$sites = 'nas1','nas2'
# Loop through each NAS
foreach ($site in $sites) {
    # Compile the list of directories for this NAS
    $dirlist = Get-ChildItem \\$site\share\path\to\client_stuff\*\courses*
    # Loop through found directories
    foreach ($diritem in $dirlist) {
        # Use .FullName so the count is taken against the full path
        $getcount = (Get-ChildItem $diritem.FullName).Count
        # Post to output
        Add-Content \\nas1\share\to\scripts\LOGS\count_directories_$logdate.log "$diritem == $getcount"
    }
}
Obviously, I do not really want to run this script every term, or even every six months, so I need to design a check for our monitoring service. It could report any directory that crosses 50,000 objects. That would give us a cushion of 15,533 to get the configuration changed to start using a new folder. The right threshold is hard to choose. As is, it is slightly larger than the largest term for our largest client, which would ensure no surprises. With something like 5,000, we could get no warning one day and then hit the limit at the start of the next term, as when the client sent 14,795 courses for the Fall 2013 term. But really only that one school has this problem. The next two largest clients had 8,246 and 6,364 courses for Fall 2013 (combined, almost the same as the biggest).
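A minimal sketch of what such a check could look like, reusing the paths from the script above; the 50,000 threshold comes from the discussion, but the warning format is just an assumption since it depends on what the monitoring service consumes:

```powershell
# Threshold chosen to sit above the largest term for our largest client
$threshold = 50000
$sites = 'nas1','nas2'

foreach ($site in $sites) {
    foreach ($diritem in Get-ChildItem \\$site\share\path\to\client_stuff\*\courses*) {
        $count = (Get-ChildItem $diritem.FullName).Count
        if ($count -gt $threshold) {
            # Emit something the monitor can pick up; the exact format
            # (warning stream, log file, SNMP trap) depends on the product
            Write-Warning "$($diritem.FullName) has $count objects (threshold $threshold)"
        }
    }
}
```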
Really, though, the problem with such a check is that PowerShell takes two hours to count this much stuff. So I probably need to record this data to a database and have the monitor check the database instead.
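A rough sketch of that approach is below. Two things here are assumptions: the `dir_counts` table, server, and database names are placeholders, and `Invoke-Sqlcmd` requires the SqlServer (or older SQLPS) module. The counting uses .NET's `EnumerateFileSystemEntries`, which streams entries instead of building a full object for each one the way `Get-ChildItem` does, so it should cut that two-hour runtime considerably:

```powershell
# Hypothetical table: dir_counts(site, path, object_count, recorded_at)
$now = Get-Date
foreach ($site in 'nas1','nas2') {
    foreach ($diritem in Get-ChildItem \\$site\share\path\to\client_stuff\*\courses*) {
        # Streamed enumeration: far cheaper than Get-ChildItem for a pure count
        $count = ([System.IO.Directory]::EnumerateFileSystemEntries($diritem.FullName) |
                  Measure-Object).Count
        # Server, database, and table are placeholders; a real version should
        # also use parameterized SQL rather than string interpolation
        Invoke-Sqlcmd -ServerInstance 'dbserver' -Database 'monitoring' -Query `
            "INSERT INTO dir_counts (site, path, object_count, recorded_at)
             VALUES ('$site', '$($diritem.FullName)', $count, '$($now.ToString('s'))')"
    }
}
```

The monitor check then becomes a cheap SQL query against `dir_counts` instead of a two-hour filesystem walk.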