Friday, May 31, 2013

Export Device Drivers from a Working System

A while ago I had to rebuild an old server on new hardware, and I couldn't figure out what model the fiber card that connected to our tape library was. I still don't know, but Double Driver helped me out big time: I was able to export the working driver with it and use it on the new system. It worked perfectly!

Wednesday, May 29, 2013

Understand the Script Before you Run it

Seasoned IT people have heard this a million times over: understand the script before you run it. This is a tale of woe and unforeseen overtime that could have been avoided, but for the mistakes of two intrepid IT pros. It's also one of the best reasons to learn PowerShell, in my opinion. There are TONS of useful scripts out there to automate just about everything, and knowing even a little bit helps you step through a script and understand what it's doing before you unleash it on, say, Active Directory.

We are in the process of breaking up a gigantic file server (2 TB) into 3 chunks. A file server this big is an albatross: according to my math, restoring this puppy from backup would take around 36 hours. Longer term, my plan is to pair the split with some sensible file storage policies and some kind of archiving for static files. Together, these should get the three musketeers down to a more manageable size.

My cohort volunteered to do the after-hours work to move the file shares and reconfigure DFS. Being the helpful lad that I am, I gave him a command to make his life easier:

robocopy.exe <source> <target> /COPYALL /MIR

I am infatuated with robocopy. It's such a great little program. /COPYALL ensures that NTFS permissions and timestamps are preserved. /MIR is the key part, though: it ensures that the destination folder becomes an exact copy of the source. BUT /MIR is a double-edged sword, and it will delete files at the destination to achieve that end. I gave my cohort the command without explaining it. I really regret that, and it's made clear that I need to do my part to ensure that people understand the tools I'm giving them; this includes better documenting my code. I'm not horrible about it, but I could do better. There's always that line in the IT world where you have to assume that someone knows something, though, and it's tough to see where that line is sometimes. Telling him how to open the command prompt might seem condescending, right? Where do you start with someone? Misjudging that line is very easy to do, and can be very harmful.
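If you're ever unsure what a robocopy command is going to do, it can tell you before you commit. A quick sketch with the same placeholder paths: the /L switch makes robocopy list every action it would take, deletions included, without actually performing any of them, and swapping /MIR for /E copies everything without ever deleting at the target.

:: Dry run - /L lists what would be copied (and deleted) without doing it
robocopy.exe <source> <target> /COPYALL /MIR /L

:: Non-destructive alternative - /E copies all subfolders (even empty ones) but never deletes
robocopy.exe <source> <target> /COPYALL /E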

But, I digress. So my partner runs the command and moves some stuff one night. Last week, he discovered new stuff in the old "source" folder, so he ran the command again. See the problem? The /MIR switch creates a mirror of the source, and about 50 GB of files were no longer present in the source, so they were deleted. Ruh-roh. I was just heading up to bed when my phone went off. He needed a file restore. A 54 GB file restore. Of many small files. Not good. I fired up my trusty Veeam Backup & Replication and started restoring files. Wow, was this thing moving slowly! I was getting throughput of 40 KB/sec! A support call fixed that, but I want to tell you about some other really great things that I learned:

  1. Veeam Enterprise paid for itself during this process. I was able to boot the VM as it was before the mishap, output a recursive directory listing (Get-ChildItem) to a text file (see the sketch after this list), and copy that file to my hard drive. Then, I did the same thing on the production side and used a program called Beyond Compare to compare the 2 text files to see where my file restore had gone wrong. This is the second time I've had to do something like this, and the hours of labor saved have more than paid for the higher-end version.
  2. Veeam doesn't like files whose combined path and filename run over 260 characters (I believe this is actually the Windows MAX_PATH limit rather than NTFS itself; NTFS happily stores much longer paths, which is exactly how these files come to exist in the first place). Either way, they will stop a Veeam restore IN ITS TRACKS. Comparing the filesystems of yesterday vs. today helped me see what had been restored and what I had yet to do.
  3. During my support call, it was imparted to me that using the Windows File Level Restore is not a good way to restore a lot of files at once (like 54GB worth of Word and Excel docs, for instance). Veeam takes a few seconds to verify each and every file, which is part of what was slowing me down. The tech showed me that after you mount the backup for the Windows FLR (so you're looking at the browser window) you should open regular old Windows Explorer and navigate to C:\VeeamFLR. Your drives will be mounted here, and you can use Explorer to copy and paste much more quickly.
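For the curious, here's roughly what the directory listing from point 1 looks like; a minimal sketch, assuming the data lives under E:\Shares (both paths are placeholders):

# Dump the full path of every file and folder under the share to a text file
Get-ChildItem -Path E:\Shares -Recurse | Select-Object -ExpandProperty FullName | Out-File C:\temp\filelist.txt

Run the same line on the booted backup VM and on production, then point Beyond Compare at the two text files.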
So, lessons learned:
  • Communicate more better
  • Assume less
  • Veeam Enterprise is gold, baby! (Beyond Compare is well worth the price as well)
  • I need to find a way to comb my servers for really long paths+filenames (a first stab is sketched after this list)
  • Use the C:\VeeamFLR folder to copy from backups back to production; it's just easier.
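On that long-path point, here's a first stab at what I have in mind; a minimal sketch with placeholder paths. One wrinkle: Get-ChildItem itself can choke on paths past 260 characters, so a common workaround is to let cmd's dir do the enumeration and filter the results in PowerShell:

# Enumerate every file recursively in bare format, then keep only paths longer than 260 characters
cmd /c dir E:\Shares /s /b | Where-Object { $_.Length -gt 260 } | Out-File C:\temp\longpaths.txt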

Friday, May 17, 2013

Chaining Together Veeam SureBackup Verification Jobs

So my most recent conundrum in implementing Veeam Backup and Replication 6.5 Enterprise was getting SureBackup up and running. I wanted to set aside each Sunday to check all of my backup jobs.

Veeam has wisely allowed backup jobs to be chained: I set my first backup job to start at a certain time, set the next job to start when the first one finishes, and so forth. It's such an elegant way to do things, and I commend Veeam for implementing it. What I don't understand is why they didn't make this feature available for any job that can be scheduled. I don't do replications, so I'm not sure whether it works there, but I know for a fact that you cannot do it with SureBackup jobs.

Therefore, I needed to dust off my chops and head back to starting and running jobs via PowerShell. I got my script syntax and methodology from a great post by v.Eremin in this thread on the Veeam forums.


Add-PSSnapin VeeamPSSnapin
$Job1 = Get-VSBJob -Name "SB_Daily_1"
$Job2 = Get-VSBJob -Name "SB_Daily_2"
$Job3 = Get-VSBJob -Name "SB_Daily_3"

#Start the first job, then put the script to sleep for 5 minutes
Start-VSBJob $Job1
Start-Sleep -s 300

#Check the status of the first job every 5 minutes, and start the second once it's done
If ($Job1.GetLastState() -ne "Working") {Start-VSBJob $Job2}
Else
{
    do
    {
        Start-Sleep -s 300
        $status = $Job1.GetLastState()
    } while ($status -eq "Working")
    Start-VSBJob $Job2
}

#Check the status of the second job every 5 minutes, and start the third once it's done
If ($Job2.GetLastState() -ne "Working") {Start-VSBJob $Job3}
Else
{
    do
    {
        Start-Sleep -s 300
        $status = $Job2.GetLastState()
    } while ($status -eq "Working")
    Start-VSBJob $Job3
}

So, first you load the Veeam PowerShell snap-in, and then you declare your job names as variables for later use. Then, you start the first SureBackup job.

Each subsequent section checks whether the previous job is still working. If it isn't, the script starts the next job immediately. If it is, the script enters a do-while loop: it sleeps for 5 minutes, checks the job's status again, and repeats the loop for as long as the job is still processing. When it finally sees that the job has finished, it starts the next one.

This pattern repeats for each subsequent job.
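If you're chaining more than a few jobs, copying and pasting those blocks gets old fast. Here's a loop-based sketch of the same idea (the job names are mine, and I'm assuming, as the script above does, that Start-VSBJob returns while the job is still running):

Add-PSSnapin VeeamPSSnapin

#Run each SureBackup job in turn, waiting for one to finish before starting the next
foreach ($name in "SB_Daily_1", "SB_Daily_2", "SB_Daily_3")
{
    $job = Get-VSBJob -Name $name
    Start-VSBJob $job
    #Poll every 5 minutes until the job leaves the "Working" state
    do
    {
        Start-Sleep -s 300
    } while ($job.GetLastState() -eq "Working")
}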

Monday, May 13, 2013

Create an Outlook Rule to Act on Emails Received during a Certain TIME Period

I ran into this conundrum while trying to suppress active CPU alarms during our weekly antivirus scans. I don't want to discard CPU activity alarms altogether, and Outlook only has a canned rule for acting on emails received on certain days, not at certain times.

What you need to do is create the rule with the criteria looking at certain text IN THE EMAIL HEADER!

As far as the specific text to look for, I opted to use "2013 23:" (header timestamps use the 24-hour clock). I left the year in there so that the rule wouldn't falsely trigger on other instances of "23:", which is a pretty generic search term on its own.
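To see why that works, here's a made-up example of the sort of Received line you'll find in a message header; the timestamp at the end is what the rule keys on:

Received: from mail.contoso.com (10.0.0.25) by hub.contoso.com; Fri, 31 May 2013 23:14:02 -0500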

Sure, I'll have to edit the rule next year, but this will get me by.

Friday, May 10, 2013

Auto-Decline WSUS Windows Embedded Updates

I decided to take the script to decline Itanium updates that I posted recently to the next level, and tweaked it to also decline Windows Embedded updates. Here is the script that I'm running now:


$WsusServer = "wsusserver.contoso.com"
$UseSSL = $false
$PortNumber = 80
$TrialRun = $false #change this to $true to see what it will affect!

#E-mail Configuration
$SMTPServer = "mailserver.contoso.com"
$FromAddress = "administrator@contoso.com"
$Recipients = "me@contoso.com"
$MessageSubject = "PS Report - Declining Itanium/Embedded Updates"

Function SendEmailStatus($MessageSubject, $MessageBody)
{
$SMTPMessage = New-Object System.Net.Mail.MailMessage $FromAddress, $Recipients, $MessageSubject, $MessageBody
$SMTPMessage.IsBodyHTML = $true
#Send the message via the local SMTP Server
$SMTPClient = New-Object System.Net.Mail.SMTPClient $SMTPServer
$SMTPClient.Send($SMTPMessage)
$SMTPMessage.Dispose()
rv SMTPClient
rv SMTPMessage
}

#Connect to the WSUS 3.0 interface.
[reflection.assembly]::LoadWithPartialName("Microsoft.UpdateServices.Administration") | out-null
$WsusServerAdminProxy = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer($WsusServer,$UseSSL,$PortNumber);

#$itanium = $WsusServerAdminProxy.SearchUpdates('Itanium') | ?{-not $_.IsDeclined}
#$itanium += $WsusServerAdminProxy.SearchUpdates('ia64') | ?{-not $_.IsDeclined}
#Although the above seems faster, it also searches the description of the update, so use the below to search just the title!
$itanium = $WsusServerAdminProxy.GetUpdates() | ?{-not $_.IsDeclined -and $_.Title -match "ia64|itanium"}
$itanium += $WsusServerAdminProxy.GetUpdates() | ?{-not $_.IsDeclined -and $_.Title -match "Embedded Standard 7"}
If ($TrialRun)
{$MessageSubject += " Trial Run"}
Else
{$itanium | %{$_.Decline()}}

$Style = "<Style>BODY{font-size:11px;font-family:verdana,sans-serif;color:navy;font-weight:normal;}" + `
"TABLE{border-width:1px;cellpadding=10;border-style:solid;border-color:navy;border-collapse:collapse;}" + `
"TH{font-size:12px;border-width:1px;padding:10px;border-style:solid;border-color:navy;}" + `
"TD{font-size:10px;border-width:1px;padding:10px;border-style:solid;border-color:navy;}</Style>"

If ($itanium.Count -gt 0)
{
$MessageBody = $itanium | Select `
@{Name="Title";Expression={[string]$_.Title}},`
@{Name="KB Article";Expression={[string]::join(' | ',$_.KnowledgebaseArticles)}},`
@{Name="Classification";Expression={[string]$_.UpdateClassificationTitle}},`
@{Name="Product Title";Expression={[string]::join(' | ',$_.ProductTitles)}},`
@{Name="Product Family";Expression={[string]::join(' | ',$_.ProductFamilyTitles)}},`
@{Name="Uninstallation Supported";Expression={[string]$_.UninstallationBehavior.IsSupported}} | ConvertTo-HTML -head $Style
SendEmailStatus $MessageSubject $MessageBody
}

Again, I can't take credit for the script itself; that honor belongs to... whoever submitted it to this TechNet page.

Monday, May 6, 2013

Lost All Admin Access to a SQL Server; PSTools to the rescue!

So today I was troubleshooting why a local scheduled task to back up a SQL Express database hadn't been running. Oddly, in SQL Server Management Studio (SSMS) I could see that the sa account had been disabled, and that only the BUILTIN\Users group had login rights. BUILTIN\Users didn't have admin rights, either.

Hooray for this post over at mssqltips.com, which shows how to leverage PsExec to open an SSMS session under the NT AUTHORITY\SYSTEM account and then fix the permissions. The command is this:

PsExec -s -i "C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Ssms.exe"

You may need to modify the path, and you need to run it from the SQL Server itself after copying PsTools over to it. Also, this command is all one line, and there is a space between "SQL" and "Server". I don't have the time to fight a formatting war right now....
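Once you're in as SYSTEM you can fix the permissions through the SSMS GUI, or skip the GUI entirely and do it with sqlcmd under the same trick. A sketch, assuming a default SQL Express instance name, that sqlcmd is on the PATH, and a hypothetical CONTOSO\dbadmin account (adjust all three to taste):

PsExec -s sqlcmd -S .\SQLEXPRESS -Q "CREATE LOGIN [CONTOSO\dbadmin] FROM WINDOWS; ALTER SERVER ROLE sysadmin ADD MEMBER [CONTOSO\dbadmin];"

Note that ALTER SERVER ROLE requires SQL Server 2012 or later; on older versions, sp_addsrvrolemember does the same job.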

I highly recommend that admins keep a copy of the various PSTools programs around, as they can be very handy. I even found a GUI front-end for them.

This is also a stark reminder that you can lock down an application as much as you want, but the minute someone gets admin access (or physical access) to the system itself, all bets are off.

Thursday, May 2, 2013

Auto-decline WSUS Itanium Updates

I've been really busy lately. While I haven't had the time to post between work and family (remember to keep balance in your life!), I have a TON of things saved up to post about. Let's hope all of the tech isn't obsolete by the time I get around to it, eh?

In that vein, I'm just going to blast some things out with short posts, in the hope that I'll be able to make the time to actually post.

Also, I'll be putting interesting web pages I come across - mostly news and editorial stuff - in the blog roll on the right. It links to my Tumblr page, called The LAG.

So today's post is about WSUS updates. I really like WSUS. I really hate WSUS. I like it because it patches my stuff in an automatic and (not very) complicated way, and it hasn't changed all that much in a long time. I hate it because it's TOO simple. I know, I know: if I want granularity, I need to plop down the cash for something like System Center. I don't want granularity that badly, though! Case in point: every month I have to go through all of the Patch Tuesday updates and decline the Itanium releases. They waste space on my system and clutter up my view of approved (and actually useful) updates.

That's why I was SO thrilled when a colleague of mine sent me the link to a Microsoft TechNet page with a PowerShell script that declines Itanium updates! I tweaked the script a bit and ran a couple of tests just to make sure it was going to do what I thought (it has a trial run mode, nice!).

After I verified that it was indeed the magical faerie unicorn that I thought it to be, I scheduled it to run every Tuesday night after WSUS synchronizes.
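If you want to do the same, the scheduling itself is a one-liner; a sketch with a hypothetical script path (pick a start time that falls after your WSUS synchronization finishes):

schtasks /create /tn "Decline Itanium Updates" /tr "powershell.exe -File C:\Scripts\Decline-ItaniumUpdates.ps1" /sc weekly /d TUE /st 23:00 /ru SYSTEM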

I'LL NEVER HAVE TO DECLINE ITANIUM UPDATES AGAIN!

You'd think that Microsoft could break these out fairly easily within Products and Classifications, but whatever.