How you do things matters (PowerShell Performance)
I've run into a couple of things over the last month or so that were interesting from a PowerShell perspective. With relatively small amounts of data, performance really isn't a concern: if you're only working with a few hundred or a few thousand items, PowerShell is quick no matter what you do. When you work with larger datasets, however, how you do things can make a big difference.

Reducing the number of operations

The first thing I want to highlight is that the number of operations affects performance. I was recently building some scripts to create a test environment by copying the AD structure from production. I based all of my work on AD Mirror ( https://gallery.technet.microsoft.com/scriptcenter/AD-Mirror-PowerShell-Module-331b1b12 ). I ran into a performance issue populating group membership. The code from AD Mirror kept the list of group members in a variable and added them one at a time with a foreach loop, roughly like this:

foreach ($m in $newMembers) { Add-ADGroupMember ...
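To make the per-call cost concrete, here is a minimal sketch of that per-member pattern next to a version that collapses it into fewer operations. The variable names ($group, $newMembers) are placeholders and the exact parameters in the AD Mirror code may differ; the -Identity and -Members parameters are from the standard ActiveDirectory module cmdlet, and -Members accepts an array.

    # Per-member pattern (as in the AD Mirror code): one Add-ADGroupMember
    # call per member, so a 5,000-member group means 5,000 round trips to AD.
    foreach ($m in $newMembers) {
        Add-ADGroupMember -Identity $group -Members $m
    }

    # Fewer operations: -Members takes an array, so the whole list can be
    # passed in a single call (or split into a few larger batches).
    Add-ADGroupMember -Identity $group -Members $newMembers

Each cmdlet call carries its own overhead of connecting to and updating the directory, which is why reducing the number of calls matters far more on large groups than on small ones.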