
Friday, June 28, 2013

Running a Powershell Script on a Remote Computer From a Local Powershell Script

Let me tell you about why I needed this:

I will preface this by saying that my use case involves Veeam, but Veeam is just an app that creates files - the same approach works for any files you need to copy.

I have a server named FileServerPROD that serves as my Veeam Repository. At the end of my backup job, I have a robocopy process that synchronizes all of the files on FileServerPROD with FileServerDR at another site. These copy jobs take several hours at night, and an analysis posed the question, "What if something bad happens in the middle of the robocopy job?"

The answer is "Your offsite backup copies are toast, that's what." Uh.... unacceptable.

I came up with a process that creates a "Yesterday" folder and creates a copy of the backup files BEFORE my robocopy process does its thing. This way, I keep a copy of my old files that is untainted should the robocopy go south. The problem I encountered was that if I tried to do this from my side of the WAN, the copied files would have to come all the way to the computer that the copy was initiated from (across the WAN) and then BACK to the destination - across the WAN again! I needed to initiate the copy operations from FileServerDR so the data would all stay local (and speedy).

Powershell doesn't make it trivial to launch Powershell scripts remotely from inside other Powershell scripts - remoting via Invoke-Command exists, but it requires WinRM to be enabled and configured on the remote machine. So I cobbled together this WMI-based workaround.
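(For what it's worth: if WinRM IS enabled in your environment, Powershell remoting can do this directly, and the launch-and-poll approach below becomes unnecessary. A minimal sketch, assuming remoting is already configured on FileServerDR:)

```powershell
# With Powershell remoting enabled (Enable-PSRemoting on the remote box),
# Invoke-Command runs the remote script and blocks until it finishes -
# no process polling required.
Invoke-Command -ComputerName FileServerDR -ScriptBlock {
    & 'C:\ps\Daily-Apps.ps1'
}
```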

First off, here's the script that runs at the end of the Veeam job and lives on FileServerPROD, which I will dissect below:



$starttime = ((Get-Date)|out-string)

#Execute Remote Powershell script to move the current backup files into a "Yesterday" folder and make new copies to run the mirror operation on
$Server="FileServerDR"
$process=[WMICLASS]"\\$Server\ROOT\CIMV2:win32_process"
$result=$process.Create("powershell.exe -File C:\ps\Daily-Apps.ps1")
$ProcessID = $result.processID

#Make sure the process has finished running its course before initiating the robocopy job

#Only when the process initiated above completes will $Running be $null, ending the loop
$Running = 1
While ($Running -ne $null){
    Start-Sleep -Seconds 60 #wait 1 minute
    $Running = (Get-Process -Id $ProcessID -ComputerName $Server -ea 0) #Check the process again
}

$YesterdayCreated = ((Get-Date)|out-string)
#The section below copies the Veeam Jobs, and then sends a report
start-process "c:\robocopy.exe" -ArgumentList 'R:\Backups\Daily-Apps \\FileServerDR\R$\Daily-Apps /MIR /R:1 /W:5 /NP /LOG:C:\Logs\VeeamRobocopy-Daily-Apps.log' -wait

$endtime = ((Get-Date)|out-string)
$To = "me@contoso.com"
$From = "administrator@contoso.com"
$Body = (("Start Time: $starttime")+("Yesterday Folder Created: $YesterdayCreated")+("End Time: $endtime"))
$Subject = "Robocopy Offsite - Veeam Daily-Apps - Job Results"
$SMTPServer = "SMTPServer.contoso.com"
$Attachment = "C:\Logs\VeeamRobocopy-Daily-Apps.log"
Send-Mailmessage -to $To -Subject $subject -From $From -body $body -smtpserver $SMTPServer -attachments $Attachment



And here's the script that runs on FileServerDR:



#Remove the current "Yesterday" folder
remove-Item R:\Daily-Apps-Yesterday -Recurse -Force -ea 0

#Create New Folder
mkdir R:\Daily-Apps-Yesterday

#Copy contents of the old folder to the new folder
copy-Item R:\Daily-Apps\*.* R:\Daily-Apps-Yesterday



Now, let's go through the FileServerPROD script:

$starttime = ((Get-Date)|out-string)
This gets a time marker for information purposes.

#Execute Remote Powershell script to move the current backup files into a "Yesterday" folder and make new copies to run the mirror operation on
$Server="FileServerDR"
$process=[WMICLASS]"\\$Server\ROOT\CIMV2:win32_process"
$result=$process.Create("powershell.exe -File C:\ps\Daily-Apps.ps1")
$ProcessID = $result.processID
The code above launches a Powershell process on FileServerDR, which runs the script remotely. The last line gets the process ID, which we will use next.

#Make sure the process has finished running its course before initiating the robocopy job
#Only when the process initiated above completes will $Running be $null, ending the loop
$Running = 1
While ($Running -ne $null){
    Start-Sleep -Seconds 60 #wait 1 minute
    $Running = (Get-Process -Id $ProcessID -ComputerName $Server -ea 0) #Check the process again
}
THIS IS THE KEY! This While loop puts the script on pause until the remote script finishes by checking whether the process ID still exists. Once the process is gone (i.e., the remote script has finished), the Get-Process query returns $null, ending the loop.
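One caveat: if the remote script ever hangs, this loop waits forever. Wrapping the poll in a little helper with a maximum number of tries guards against that. A sketch (the function name and the defaults are my own invention):

```powershell
#Poll a condition at an interval, giving up after a maximum number of tries
Function Wait-ForCondition {
    param([scriptblock]$Condition, [int]$IntervalSeconds = 60, [int]$MaxTries = 120)
    For ($i = 0; $i -lt $MaxTries; $i++) {
        If (& $Condition) { return $true } #condition met - stop waiting
        Start-Sleep -Seconds $IntervalSeconds
    }
    return $false #gave up waiting
}

#Usage against the script above (assumes $Server and $ProcessID are set):
#$Finished = Wait-ForCondition { (Get-Process -Id $ProcessID -ComputerName $Server -ea 0) -eq $null }
```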

$YesterdayCreated = ((Get-Date)|out-string)
#The section below copies the Veeam Jobs, and then sends a report
start-process "c:\robocopy.exe" -ArgumentList 'R:\Backups\Daily-Apps \\FileServerDR\R$\Daily-Apps /MIR /R:1 /W:5 /NP /LOG:C:\Logs\VeeamRobocopy-Daily-Apps.log' -wait
Here we get another time marker and the script starts the robocopy job.

$endtime = ((Get-Date)|out-string)
$To = "me@contoso.com"
$From = "administrator@contoso.com"
$Body = (("Start Time: $starttime")+("Yesterday Folder Created: $YesterdayCreated")+("End Time: $endtime"))
$Subject = "Robocopy Offsite - Veeam Daily-Apps - Job Results"
$SMTPServer = "SMTPServer.contoso.com"
$Attachment = "C:\Logs\VeeamRobocopy-Daily-Apps.log"
Send-Mailmessage -to $To -Subject $subject -From $From -body $body -smtpserver $SMTPServer -attachments $Attachment
And here we get an "end time" marker, then an email is sent with all of the time markers we've gathered along with the robocopy log as an attachment.




Here's the dissection of the script that runs on the other side, creating that "Yesterday" folder and a safe copy of my data:

#Remove the current "Yesterday" folder
remove-Item R:\Daily-Apps-Yesterday -Recurse -Force -ea 0
This removes the old yesterday folder (the -ea 0 keeps it from erroring if the folder doesn't exist yet)

#Create New Folder
mkdir R:\Daily-Apps-Yesterday
This makes a new yesterday folder

#Copy contents of the old folder to the new folder
copy-Item R:\Daily-Apps\*.* R:\Daily-Apps-Yesterday
Like it says, this copies the contents over. Note that *.* only matches names that contain a dot and doesn't recurse into subfolders; use * and -Recurse if you need either.
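As an aside, the three steps above can likely be collapsed into a single robocopy mirror, since /MIR creates the destination if it's missing and removes anything stale. A sketch, untested in my environment:

```powershell
#Mirror the live folder into the Yesterday folder in one step; /MIR creates
#the destination if needed and deletes files that no longer exist in the source
& robocopy.exe 'R:\Daily-Apps' 'R:\Daily-Apps-Yesterday' /MIR /R:1 /W:5 /NP
```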

Monday, June 24, 2013

Automated WSUS Report - Computers in the Unassigned Computers Container

It seems like one of the other admins is always adding a computer to the domain and not telling me, and I realized that I need to make it a priority to check the "Unassigned Computers" container every so often to ensure there isn't anything in it. This is the sort of thing Powershell shines at!

Before you run this, make sure you have a C:\Temp folder, or change the output file path. I did my explanation in the comments within the script below:



$WsusServer = "WSUSServer.contoso.com"
$UseSSL = $false
$PortNumber = 80
$TempFile = "C:\Temp\Output.txt"

#E-mail Configuration
$SMTPServer = "SMTPServer.contoso.com"
$From = "administrator@contoso.com"
$To = "me@contoso.com"
$Subject = "PS Report - Unassigned Computers in WSUS"

#Connect to the WSUS 3.0 interface.
[reflection.assembly]::LoadWithPartialName("Microsoft.UpdateServices.Administration") | out-null
$Wsus = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer($WsusServer,$UseSSL,$PortNumber);

#Get the FQDN of computers that reside in the Unassigned Computers container
$Unassigned = (($Wsus.GetComputerTargetGroups() | ?{$_.Name -eq 'Unassigned Computers'}).GetComputerTargets() | select FullDomainName)

#Output each to the temp file
Foreach ($Computer in $Unassigned){
    ($Computer.FullDomainName) | out-string | add-content $TempFile
}

#Create the body of the email, inserting <BR> tags between each line.
$Body = (Get-Content $TempFile) -join '<BR>'

#If there are computers in the unassigned computers group, send me an email listing them
If ($Unassigned -ne $null){
    Send-Mailmessage -From $From -To $To -Subject $Subject -SMTPServer $SMTPServer -Body $Body -BodyAsHTML
}

#Delete the temp file
Remove-Item $TempFile -ea SilentlyContinue



One of the challenges I faced here was getting the computer names into the text file. Out-String didn't work directly because of the type of object $Unassigned was, for some reason. I ended up outputting to the temp file, then using -join to put in <BR> tags, then sending the email with the BodyAsHTML option. In HTML, the <BR> tag just represents a line break (go to the next line).
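The -join trick on its own, in case you haven't used it (the sample names are made up):

```powershell
#Join an array of lines with HTML line breaks for use with -BodyAsHTML
$Names = @('PC-01.contoso.com','PC-02.contoso.com','PC-03.contoso.com') #example data
$Body = $Names -join '<BR>'
$Body #PC-01.contoso.com<BR>PC-02.contoso.com<BR>PC-03.contoso.com
```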

Friday, June 21, 2013

Grep for Windows - What a Welcome Tool!

Last week we renamed a file server, and I needed to dig through all of my Powershell scripts and change the name wherever it was referenced. What a tedious chore!

Oh how I wished there was a tool that would search inside of multiple text files and tell me where a string of text appears!

THERE IS! 

It's called Grep for Windows, and it is awesome. It's wizard driven and will even search based on regular expressions.


Wednesday, June 19, 2013

Windows 8 - Mini-Review, and My Ability to Switch Users went Missing!

I'll get to the actual fix towards the bottom, but let me diverge into my mini-review of Windows 8 for a couple of paragraphs first.

This weekend I took the plunge and installed Windows 8 on my home desktop. I took full advantage of that promo that Microsoft had back in January and picked up Windows 8 for $30. I still think Windows 8 is a "failure" for the business segment due to so many business users being, well, fairly clueless when it comes to technology. I also have come to believe that humanity in general despises change and lacks a yearning to learn new things, especially when the old way "works just fine". Okay, enough sociology.

I've been using Windows 8 for a few days now, and let me tell you: It has some great features! While the switch to the Metro interface (Start Screen) is jarring, I like that I have big buttons for my most used apps. One of the fixes in 8.1 is that you can make the wallpaper on your Start Screen the same as your desktop wallpaper, so that should help. The live tiles are less useful on a desktop, but I can see their allure. Some of the built in apps are pretty barebones, but there's a store and I can get more apps. The first thing I found was TuneIn, which lets me listen to terrestrial radio stations from all over the planet - awesome! Half the time when I hit the start button on my keyboard I am typing the thing that I want to open anyway, so my use of the start screen is fairly limited. One thing I overlooked about the big buttons is the ability to "remind" myself that I need to do something, like check my torrents, for instance. When I press start now there's my big green uTorrent button staring at me.

The second thing that I really like is the parental control. I have a 12 year old, and now I can look at where he's going on the web independent of whether he figures out how to clear his history. I hear you now: "But why don't you just set up a Squid proxy or something - you're a geek!" This is an option I have thought of many times, but I don't want to pay the electric bill and deal with the management (at home, no less) of yet another system. I also have him locked down with an application whitelist, so that he can only run things I say that he can ahead of time. The cherry on top is that I can specify when he can log on, so no more worrying about him sneaking down in the middle of the night. I'm pretty sure I could have done this via group policy before, but I like that it's now an easy to find option.

So, I got this thing all configured and then I noticed that I didn't have the ability to "switch user" anymore. What was odd is that it had been working just fine. I have no idea what caused it to go away, since I was doing so many things (it's a new install, after all). After hunting around, I finally found a group policy option that fixed the problem.

1. Hold down your Windows key and press 'R' to open the Run Command box.
2. Type 'gpedit.msc' and press enter. This opens the Local Group Policy Editor.
3. Navigate to Local Computer Policy/Computer Configuration/Administrative Templates/System/Logon
4. Within the Logon folder, you will see "Hide Entry Points for Fast User Switching". Right-click on it and select Edit.
5. Change the setting to "Disabled" and then press OK. (Counterintuitive, but Enabling this policy HIDES the Fast User Switching entry points; Disabling it forces them to show.)
6. Close the Local Group Policy Editor
7. Hold down your Windows key and press 'R' to open the Run Command box.
8. Type 'gpupdate /force' and press enter, then wait for the black box to go away. Or you can reboot.

That's it.
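If you'd rather script the change than click through gpedit, this local policy is (to the best of my knowledge) backed by the HideFastUserSwitching registry value, so something like this from an elevated Powershell prompt should do the same thing - treat it as a sketch, not gospel:

```powershell
#0 = show the Fast User Switching entry points, 1 = hide them
#This is the registry value behind the "Hide Entry Points for Fast User Switching" policy
$Key = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System'
Set-ItemProperty -Path $Key -Name HideFastUserSwitching -Value 0 -Type DWord
```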

Tuesday, June 18, 2013

Windows Updates for Labs Without an Internet Connection

I'm often building lab environments that have no ability to get online. This is usually because some of the VMs I build have the same name as actual production systems. Some of them are even restores of production VMs in a lab environment. Naming conflicts suck, especially in the middle of the day.

I found a great tool called WSUS Offline Update that lets me put all of the WSUS updates I want onto a virtual DVD and update my OS from it. I wish I had it back in the day when I was updating remote offices on a slow and shared internet connection! It's well-built and fairly granular - you can pick and choose OS, Architecture, etc. It will even do Office and .NET!


Friday, June 14, 2013

Running Scheduled Reports in Spiceworks

We love Spiceworks! BUT it's still missing some things, one of which is the ability to set up reports to run automatically and email them to you. Well, it's not missing per se, it's just kind of complicated. The ability is there, and this is how to use it.

The post here within the Spiceworks Community got me started, but I wanted to show the whole picture.

  1. On your Spiceworks server, go to c:\Program Files (x86)\Spiceworks and create a new folder called 'rpt'.
  2. Now go into c:\Program Files (x86)\Spiceworks\pkg\gems\spiceworks-X.X.XXXXX (where the X's represent your version number), and copy the file run_report.rb into the rpt folder you created in step 1.
  3. Open a command prompt and navigate to c:\Program Files (x86)\Spiceworks\rpt and execute the following command to see how to use this ruby script:
    ..\bin\ruby run_report.rb -?
  4. Now, you need to find the report number that corresponds to the report you want to run. In my case, I used the following command:
    ..\bin\ruby run_report.rb -e user@contoso.com -p password -l 
  5. Your Spiceworks login credentials are used in the above command, as well as -l, which lists all reports, along with their report numbers. Write down the report numbers you want.
  6. In a batch file, write a line for each report. Here is an altered copy of my batch file:

    cd "C:\Program Files (x86)\Spiceworks\rpt"

    REM To get a list of tickets run this command:
    REM ..\bin\ruby run_report.rb -e user@contoso.com -p password -f pdf -l

    REM Run Report #72
    ..\bin\ruby run_report.rb -e user@contoso.com -p password -f pdf 72

    REM Run Report #73
    ..\bin\ruby run_report.rb -e user@contoso.com -p password -f pdf 73

    REM Run Report #74
    ..\bin\ruby run_report.rb -e user@contoso.com -p password -f pdf 74

    powershell.exe -NoProfile -File C:\ps\SpiceworksReporting.ps1


  7. Pay attention to the last line, which calls a powershell script that will email you the reports! Here's that file:

    #This file is run at the tail-end of SpiceworksReporting.bat

    $To = "admin@contoso.com"
    $From = "spiceworks@contoso.com"
    $Subject = "Spiceworks Reports"
    $SMTPServer = "smtpserver.contoso.com"
    $Body = "See Attached File(s)"

    #Report 1: 
    $file1 = "C:\Program Files (x86)\Spiceworks\rpt\report-72.pdf"

    #Report 2:
    $file2 = "C:\Program Files (x86)\Spiceworks\rpt\report-73.pdf"

    #Report 3:
    $file3 = "C:\Program Files (x86)\Spiceworks\rpt\report-74.pdf"

    #Send Email
    Send-MailMessage -To $To -From $From -SMTPServer $SMTPServer -Subject $Subject -Body $Body -Attachments $file1,$file2, $file3

    Remove-Item $file1
    Remove-Item $file2
    Remove-Item $file3


  8. Now, go into Task Scheduler and create a task to run the batch file. The batch file creates the reports, which are output as PDFs and stored in the rpt folder we created. Then the batch file calls the Powershell script, which emails these files as attachments and then deletes the PDF files that were created.
Voila! You have automated Spiceworks reports!
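(If you want to script step 8 as well, schtasks can create the task for you. The task name, run time, and batch file path below are placeholders - adjust them to your environment:)

```powershell
#Create a daily 6:00 AM task that runs the report batch file (run from an elevated prompt)
schtasks.exe /Create /TN "Spiceworks Reports" /SC DAILY /ST 06:00 /TR "C:\ps\SpiceworksReporting.bat" /RU SYSTEM
```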

Monday, June 10, 2013

Working with Windows Defender Updates in WSUS

We don't auto-approve anything besides the Windows Defender updates. We do this using a custom Auto-Approve rule within WSUS like so:

So, updates are being approved, and now I would like to auto-decline superseded updates to keep things tidy. Why keep things tidy? I have a third-party patch management system that also lets me pull some pretty nifty reports on client patching progress, and if I don't remove the superseded updates, they pollute my output by showing up as not installed.

So I made a powershell script by adapting my old "Decline Itanium Patches" script.
Here's the script for declining superseded Definitions for Windows Defender updates:

$WsusServer = "WsusServer.contoso.com"
$UseSSL = $false
$PortNumber = 80
$TrialRun = $true

#E-mail Configuration
$SMTPServer = "SMTPServer.contoso.com"
$FromAddress = "administrator@contoso.com"
$Recipients = "me@contoso.com"
$MessageSubject = "PS Report - Declining Superseded Defender Updates"

Function SendEmailStatus($MessageSubject, $MessageBody)
{
    $SMTPMessage = New-Object System.Net.Mail.MailMessage $FromAddress, $Recipients, $MessageSubject, $MessageBody
    $SMTPMessage.IsBodyHTML = $true
    #Send the message via the local SMTP Server
    $SMTPClient = New-Object System.Net.Mail.SMTPClient $SMTPServer
    $SMTPClient.Send($SMTPMessage)
    $SMTPMessage.Dispose()
    rv SMTPClient
    rv SMTPMessage
}

#Connect to the WSUS 3.0 interface.
[reflection.assembly]::LoadWithPartialName("Microsoft.UpdateServices.Administration") | out-null
$WsusServerAdminProxy = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer($WsusServer,$UseSSL,$PortNumber);

$defender = $WsusServerAdminProxy.GetUpdates() | ?{-not $_.IsDeclined -and $_.Title -match "defender" -and $_.IsSuperseded -eq $true}

If ($TrialRun)
{$MessageSubject += " Trial Run"}
Else
{$defender | %{$_.Decline()}}

$Style = "<Style>BODY{font-size:11px;font-family:verdana,sans-serif;color:navy;font-weight:normal;}" + `
"TABLE{border-width:1px;cellpadding=10;border-style:solid;border-color:navy;border-collapse:collapse;}" + `
"TH{font-size:12px;border-width:1px;padding:10px;border-style:solid;border-color:navy;}" + `
"TD{font-size:10px;border-width:1px;padding:10px;border-style:solid;border-color:navy;}</Style>"

If ($defender.Count -gt 0)
{
$MessageBody = $defender | Select `
@{Name="Title";Expression={[string]$_.Title}},`
@{Name="KB Article";Expression={[string]::join(' | ',$_.KnowledgebaseArticles)}},`
@{Name="Classification";Expression={[string]$_.UpdateClassificationTitle}},`
@{Name="Product Title";Expression={[string]::join(' | ',$_.ProductTitles)}},`
@{Name="Product Family";Expression={[string]::join(' | ',$_.ProductFamilyTitles)}},`
@{Name="Uninstallation Supported";Expression={[string]$_.UninstallationBehavior.IsSupported}} | ConvertTo-HTML -head $Style
SendEmailStatus $MessageSubject $MessageBody
}

Running this script with $TrialRun set to $true (as it is initially) will simply email you what the script plans to do; it won't decline anything. Changing the $TrialRun variable to $false will actually decline things.

I set this script to run daily about an hour after my scheduled WSUS Synchronization.

In case you're wondering, the patch management system I'm running is Dameware Third Party Patching by Solarwinds. I'm still learning it and the curve is a bit steeper than I would like. At this time, I can't recommend it, but only because I need to learn more to use it effectively and not because it's a poor product (that I've found). I DO really like the reports it generates and I am successfully patching all Adobe Flash installs through my WSUS Server.

Friday, June 7, 2013

Finding Files Over a Certain Length

So one thing that I've put into place following my Veeam restoration issue with long filenames is a script that runs daily on my fileservers that emails me when it finds any filename+filepath that's over 240 characters. The Windows API limit for a full path (MAX_PATH) is 260 characters, and a Veeam restore will choke if it tries to restore a file that exceeds this length.

Before I get into the script, I'd like to thank Ben for writing the actual action part of the script.
I have made modifications so that it runs where I want it to run and emails me if there are any results. Here it is:

#Let's find some long filenames
$files=Get-Childitem F:\ -Recurse 

#Change Outpath to where you want log.
$Outpath = "C:\Temp\LongFiles.txt"

#MAIN SCRIPT
#Note: Substring(38) strips the "Microsoft.PowerShell.Core\FileSystem::" provider
#prefix (38 characters) that PSPath puts in front of the actual path
foreach ($file in $files)
{
    if ($file.PSPath.Substring(38).Length -gt 240)
    {
        if ($File.Attributes -like "Directory*")
        {
            Write-Output ($File.PSPath.Substring(38) + " is " + $File.Name.Length + " Characters Long, and it is a directory! The total path length is " + ($File.PSPath.Substring(38).Length) + ".") >> $Outpath
        } #End If
        else
        {
            Write-Output ($File.PSPath.Substring(38) + " is " + $File.Name.Length + " Characters Long, and it is a file! The total path length is " + ($File.PSPath.Substring(38).Length) + ".") >> $Outpath
        } #End Else
    } #End If
} #End Foreach

#Email Parameters
$smtpserver = "mailserver.contoso.com"
$From = "administrator@contoso.com"
$To = "me@contoso.com"
$Subject = "PS Report - Long Filenames from <servername>"
$body = (Get-content $outpath | out-string)

#Sending the email
If ((Test-Path $outpath) -eq $True){
Send-Mailmessage -from $From -to $To -subject $Subject -smtpserver $smtpserver -body $body
} #End If

#Delete the log file (if one was created)
Remove-Item $outpath -ea SilentlyContinue

So, a couple of things to mention about this:

  • Change the get-childitem path to the one you wish to scour.
  • Make sure you have a c:\temp path for the log file or change the output logfile's path.
  • Change the email parameters.
  • In the first 'If' statement, I changed his script to output anything over 240 characters in length, because that's all I really care about.
  • Make sure you have access to the files or the script won't search those files/folders.

I then set up a scheduled task to run the script every morning. The "Sending the email" portion tests to see if a logfile exists. If there is no logfile, then there are no files that meet the criteria (over 240 chars).

As is good practice, if there is nothing to say, the script does not email me. I have enough emails to dig through every day....