
Wednesday, May 29, 2013

Understand the Script Before You Run It

Seasoned IT people have heard this a million times over: understand the script before you run it. This is a tale of woe and unforeseen overtime that could have been avoided, but for the mistakes of two intrepid IT pros. It's one of the best reasons to learn PowerShell, in my opinion. There are TONS of useful scripts out there to automate just about everything, and knowing even a little lets you step through a script and understand what it's actually doing before you unleash it on, say, Active Directory.

We are in the process of breaking up a gigantic file server (2 TB) into three chunks. A file server this big is an albatross; according to my math, restoring this puppy from backup would take around 36 hours. Longer-term, my plan is to pair the split with some sensible file storage policies and some kind of archiving for static files. Together, these should get our three musketeers down to a more manageable size.

My cohort volunteered to do the after-hours work of moving the file shares and reconfiguring DFS. Being the helpful lad that I am, I gave him a command to make his life easier:

robocopy.exe <source> <target> /COPYALL /MIR

I am infatuated with robocopy. It's such a great little program. /COPYALL ensures that NTFS permissions and timestamps are preserved. /MIR is the key part, though: it makes the destination folder an exact copy of the source. BUT /MIR is a double-edged sword, and it will delete files from the destination to achieve that end. I gave my cohort the command without explaining it. I really regret that, and it's made clear to me that I need to do my part to ensure that people understand the tools I'm giving them; that includes documenting my code better. I'm not horrible about it, but I could do better. In IT there's always a line where you have to assume the other person already knows something, and it's tough to see exactly where that line sits. Telling him how to open the command prompt might seem condescending, right? Where do you start with someone? Misjudging that line is easy to do, and it can be very harmful.
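
Incidentally, robocopy can preview all of this before it touches anything. The /L switch runs in list-only mode, so a dry run along these lines (same placeholders as above, plus a log path of your choosing) shows what would be copied and, more importantly, what /MIR flags as EXTRA and would delete:

robocopy.exe <source> <target> /COPYALL /MIR /L /LOG:C:\Temp\mir-preview.log

Skim the log for the EXTRA entries before you run it for real.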

But I digress. My partner ran the command and moved some stuff one night. Last week, he discovered new stuff in the old "source" folder, so he ran the command again. See the problem? The /MIR switch mirrors the source, and about 50GB of files in the destination were no longer present in the source, so robocopy deleted them. Ruh-roh. I was just heading up to bed when my phone went off. He needed a file restore. A 54GB file restore. Of many small files. Not good. I fired up my trusty Veeam Backup & Replication and started restoring files. Wow, was this thing moving slowly! I was getting throughput of 40KB/sec. A support call fixed that, but I want to tell you about some other really great things I learned along the way:

  1. Veeam Enterprise paid for itself during this process. I was able to boot the VM as it was before the mishap, output a recursive directory listing (Get-ChildItem) to a text file, and copy that file to my hard drive. Then I did the same thing on the production side and used a program called Beyond Compare to diff the two text files and see where my file restore had gone wrong (there's a quick sketch of the listing step after this list). This is the second time I've had to do something like this, and the hours of labor saved have more than paid for the higher-end version.
  2. Veeam doesn't like files whose full path exceeds 260 characters (this is really the Win32 MAX_PATH limit rather than NTFS itself; NTFS happily stores much longer paths, which is how such files exist in the first place), and hitting one will stop a Veeam restore IN ITS TRACKS. Comparing the filesystem of yesterday vs. today helped me see what had been restored and what I had yet to do.
  3. During my support call, it was imparted to me that using the Windows File Level Restore is not a good way to restore a lot of files at once (like 54GB worth of Word and Excel docs, for instance). Veeam takes a few seconds to verify each and every file, which is part of what was slowing me down. The tech showed me that after you mount the backup for the Windows FLR (so you're looking at the browser window) you should open regular old Windows Explorer and navigate to C:\VeeamFLR. Your drives will be mounted here, and you can use Explorer to copy and paste much more quickly.
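As promised in item 1, the listing step boils down to a few lines of PowerShell. The share path and output file below are placeholders; run it once against the booted backup copy and once against production, then hand the two text files to Beyond Compare (or PowerShell's own Compare-Object) to see what's missing:

# Dump every full path under the share to a text file (run on both sides)
Get-ChildItem -Path "D:\Shares" -Recurse |
    Select-Object -ExpandProperty FullName |
    Sort-Object |
    Out-File "C:\Temp\file-list.txt"
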
So, lessons learned:
  • Communicate more better
  • Assume less
  • Veeam Enterprise is gold, baby! (Beyond Compare is well worth the price as well)
  • I need to find a way to comb my servers for really long paths and filenames (a reader-supplied script in the comments below does just that)
  • Use the C:\VeeamFLR folder to copy from backups back to production; it's just easier.
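
For what it's worth, that last copy out of C:\VeeamFLR doesn't have to be drag-and-drop; robocopy works just as well against the mounted volumes. The folder layout under C:\VeeamFLR depends on the VM you mounted, so the paths here are placeholders; note that this one uses /E rather than /MIR, so nothing at the destination gets deleted:

robocopy.exe C:\VeeamFLR\<vm>\<volume>\<folder> \\<fileserver>\<share> /E /COPYALL /LOG:C:\Temp\flr-restore.log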

4 comments:

  1. This won't look nice as a comment, so here is a pastebin: http://pastebin.com/P1SERPXa

    Make sure you have access to all directories, or it will fail.

    # Let's find some long filenames
    $files = Get-ChildItem C:\ -Recurse
    # Change $Outpath to wherever you want the log.
    $Outpath = "C:\NoSync\LongFiles.txt"
    foreach ($file in $files)
    {
        if ($file.PSPath.Substring(38).Length -gt 10)
        {
            if ($file.Attributes -like "Directory*")
            {
                Write-Output ($file.PSPath.Substring(38) + " is " + $file.Name.Length + " Characters Long, and it is a directory! The total path length is " + ($file.PSPath.Substring(38).Length) + ".") >> $Outpath
            }
            else
            {
                Write-Output ($file.PSPath.Substring(38) + " is " + $file.Name.Length + " Characters Long, and it is a file! The total path length is " + ($file.PSPath.Substring(38).Length) + ".") >> $Outpath
            }
        }
    }

    Replies
    1. Also, I used 10 as a test; I expect you'll want to ramp that up to ~150-200 or so.

  2. Thanks, Ben. What is the "38" for in Substring(38)?

  3. $File.PSPath returns a bunch of extra info in front of the path; on my system that prefix was 38 characters long:
    Microsoft.PowerShell.Core\FileSystem::
