Friday, May 25, 2012

Uninstalling Google Chrome can really mess with your computer

We had someone at work unwittingly install the Google Chrome browser. They noticed afterwards and uninstalled it. All of a sudden, links in emails bombed with an error message: “This operation has been cancelled due to restrictions in effect on this computer. Please contact your system administrator.”

Well, there's a kick in the pants!

After searching around for a few minutes, it appears that Microsoft offers a fix for this exact problem.

I have never used Microsoft's "Fix it for me" tool before. I'm not a trusting soul when it comes to my computers (or those entrusted to me) so I usually will make the required changes myself. I thought I would give it a shot this time, and it worked like a charm.

What happens is that the Google Chrome install changes entries in your registry having to do with how your computer opens certain files or protocols. Upon uninstall, however, Chrome doesn't put things back the way they were. Bad developers!!! So you either have to fix the entries that it messed up, OR you have to reinstall Chrome, so that Chrome can handle all of those things it changed instead of Internet Explorer/whatever other programs.
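For the curious, those associations live in the registry under HKEY_CLASSES_ROOT. The .reg fragment below is illustrative only (it shows just one of several keys involved, and you should export a backup before touching anything): it repoints the .html extension from Chrome's class back to the stock one.

```reg
Windows Registry Editor Version 5.00

; Illustrative example only: after an uninstall leaves .html pointing at
; "ChromeHTML", repoint it back to the stock "htmlfile" class.
[HKEY_CLASSES_ROOT\.html]
@="htmlfile"
```

Presumably the "Fix it" tool does something along these lines across all of the affected file types and protocols.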

Wednesday, May 23, 2012

Random gripe for today

Random gripe for today: I hate when I have to use a support website that does case management, and the technician posts a reply to my case, whereupon I receive an email telling me to log back in to the website to see what the technician wrote. GRRR!!

Monday, May 21, 2012

Document Document Document!!

So I spent all morning and part of last week documenting all of the connections in the server room. I've done this many times before, and it's my least favorite part of my job. Oh, don't get me wrong, I don't mind generating documentation! Realistically, it's a big part of being in IT. What I do loathe, however, is when the people around you change things (repeatedly) and don't update the documentation. So, the lesson for today is to get everyone on-board with some kind of change management scheme, starting with your manager for enforcement purposes. It doesn't have to be at the level of ITIL, but everyone needs to know that it's their responsibility to update the documentation when changes are made.

For our servers, we use SYDI to automatically document them on a weekly basis. When we fix an issue, we record the resolution in Spiceworks, our helpdesk software, so that we can find out what we did to fix an issue if it recurs. We run a ping sweep over an entire network on a monthly basis and check that against our Internal Network Address database (an excel spreadsheet) to see if our free IP addresses are actually unused. We have some snazzy Visio diagrams of network paths, server rack population, and power cable routing. The only part of this that's hard to do is mapping out what devices are mapped to which switch port. You can look into the switch's management interface and find out which MAC addresses are on that port, but that doesn't mean it's the only device (like if the switch is connected to another switch). Also, it's not easy to identify iSCSI HBAs or other odd devices like IP cameras and Virtual Tape libraries in this manner.

So, it's all down to tracing cables, unless you can get your team to write down what they did whenever they plug/unplug a network cable.
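To give a feel for the bookkeeping in the port-mapping job, here's a rough Python sketch (the port names, MACs, and hostnames are all made up): match each switch port's MAC list against a device inventory, and flag multi-MAC ports as probable uplinks to another switch.

```python
# Sketch of mapping switch ports to devices (all names and MACs made up).
# A port showing more than one MAC address is probably an uplink to
# another switch rather than a single end device.
def map_ports(port_macs, inventory):
    """port_macs: {port: [mac, ...]}; inventory: {mac: device name}."""
    report = {}
    for port, macs in port_macs.items():
        devices = [inventory.get(mac, "unknown:" + mac) for mac in macs]
        report[port] = {"devices": devices, "likely_uplink": len(macs) > 1}
    return report

if __name__ == "__main__":
    ports = {"Gi0/1": ["aa:bb:cc:00:00:01"],
             "Gi0/2": ["aa:bb:cc:00:00:02", "aa:bb:cc:00:00:03"]}
    known = {"aa:bb:cc:00:00:01": "printer-2ndfloor",
             "aa:bb:cc:00:00:02": "rasputin"}
    for port, info in sorted(map_ports(ports, known).items()):
        print(port, info)
```

It won't identify the iSCSI HBAs or IP cameras for you, but it does make the "which ports need tracing" list a lot shorter.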

Tuesday, May 15, 2012

Cloud Storage

I'm using SkyDrive, Google Drive, and Dropbox on my home computer and on my Android. I basically use them as backups, but I do remotely access my files from time to time. Yesterday I ran across an app that allows you to transfer files between Dropbox accounts, and/or via FTP. This is really handy for web development people, because a lot of web admins use FTP to access their web sites. But it got me thinking: I've got a few gigs here and a few gigs there. There are more services that offer cloud storage, but I don't use them because I think three is probably sufficient....

What if there was an app that could combine them all? One unified platform where you could browse all your files and didn't need to remember that all of your pictures are on Dropbox, but your resume is on Google Drive? You could search for "resume", it would bring up the appropriate files for you to choose from, and then you open it/edit it/whatever. ONE APP TO RULE THEM ALL. If you had sufficient space, you could even do some kind of Cloud RAID on them: RAID 0 if you just wanted one big filesystem across all of the platforms, RAID 1 if you wanted mirrors, and RAID 5 if you wanted parity-based redundancy; you'd use one service for parity data (strictly speaking that's RAID 4, but you get the idea)!

Of course, the minute something like this gained any traction, one of the storage vendors would change their code and render it inoperable, but WHAT IF.....
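For fun, the parity half of that WHAT IF is only a few lines of code. Here's a toy Python sketch, ignoring real-world details like recording the original length and actually talking to the services:

```python
# Toy "cloud RAID": stripe data across two stores and keep XOR parity
# on a third, so losing any one store is survivable.
def split_with_parity(data: bytes):
    half = (len(data) + 1) // 2
    a = data[:half]
    b = data[half:].ljust(half, b"\x00")  # pad so the halves line up
    parity = bytes(x ^ y for x, y in zip(a, b))
    return a, b, parity

def recover(surviving_chunk: bytes, parity: bytes) -> bytes:
    # XOR the surviving chunk against the parity to rebuild the lost one.
    return bytes(x ^ y for x, y in zip(surviving_chunk, parity))

if __name__ == "__main__":
    a, b, p = split_with_parity(b"my resume.docx")
    assert recover(a, p) == b  # "service B" vanished
    assert recover(b, p) == a  # "service A" vanished
    print("rebuilt both halves from parity")
```

A real version would also need to store the original length so the padding can be stripped; RAID 0 and RAID 1 are simpler still (plain slicing and plain copies).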

Of course, you could just run your own FTP site from your house, provided that you have a desktop sitting around that you could leave powered on 24/7. I do this as well, using FileZilla Server and Dyndns, so that when my IP address at home changes it will still resolve for me. I also use an obscenely long password, cause hackers love them some FTP sites.

Monday, May 14, 2012

Should I Get Certified?

I see threads in forums all the time wondering whether tech certification is worth the time, money, and trouble. I'm going to talk about my opinion, and what I chose to do in my career.

Basically, when it comes to landing a job in IT, there are three main aspects:
  1. Experience
  2. College Degree
  3. Certification 
Everyone has their own opinions on which is the most important. Sometimes, company policies dictate which ones get the most attention. I'll go through them one at a time and explain the whos, whats, and whys.

Experience
If you have a proven track record, it pretty much trumps everything else, nearly all of the time. The exceptions are places where employees are required to be certified to a certain level. If you're applying to a "Microsoft Certified" consultancy or repair shop, then surprise! You need to be certified. That said, if you have oodles of experience they'll probably pay for you to take the test. Another case where experience may not trump the other two areas is when the company you're applying to requires some sort of degree by company policy.

College Degree
In my opinion, the current state of the "Computer Science" bachelor's degree, for sysadmins, is a joke, which is why I never got one. If you're going into database design, programming, or some other sort of "development" area of the IT industry, then it's worth a bit more. First off, I don't think 4-year universities can move fast enough to incorporate new technology into their curricula. Typically, the professors are focused mainly on theory, instead of the application of those theories in the real world. Also, would someone kindly explain to me why I need to go through Calculus II to be a server admin?

In my view, being a sysadmin is a lot like being a plumber (except plumbers usually make more). You need to know how to do tasks, and you need to know how to troubleshoot to find a problem. The other very handy skills for a good sysadmin are communication and business skills.

I chose to attend the local community college, which offered a hands-on curriculum based on certification. For example, our "Intro to Computers" class used the A+ certification training book. Networking 101 used the Network+ training book. They also offered blocks of classes that would help you get your CCNA or your MCSE (they've since changed the classes to reflect the new MCITP exams). Of course, you still had to go take the test, but at least you were being taught current industry standards and taught to do useful things. I learned how to make networking cable, which was an amazing skill to have.

Certification
Opinions on whether you should certify are usually very strong. Many people have used braindumps (actual questions from the test) to pass exams. They got jobs based solely on their certifications, and of course could not perform very simple tasks once hired. This left a bad taste in a lot of people's mouths. Personally, I like certifications. Despite the title of the certification, the holder should never be considered an 'expert' unless that candidate can point to actual experience in the real world. If they have passed the exam, though, and can speak reasonably in an interview setting about the given topic, then they should be viewed as knowledgeable about it. They can talk the talk, and they can probably do quite a few things with that technology.

There are some certifications out there that ARE worthy of high esteem, without a doubt. These certifications require not only a written exam but also a practical exam. A couple of examples are Microsoft Certified Architect, Cisco Certified Internetwork Expert (CCIE), and Red Hat Certified Engineer. A step below these are the VMware certifications, which don't have a practical but require you to take a VMware-approved (and expensive! $3000!!) class in order to receive the certification.

Another facet of certification is that sometimes HR departments are the ones doing the hiring, and in general, HR departments don't know much about IT certification. So if your resume is being passed through an HR department that came up with a rule that successful applicants need the A+ certification, then you aren't getting through HR without it, even if you have a decade of experience.

So, the final answer? Do them all. Except the 4-year degree. Having certifications will never hurt you. The stuff you learn while studying for an exam is helpful. You need to know it anyway, or at least be exposed to it. Therefore, you shouldn't look at it as a waste of time, either. To some companies, cracking out a certification every year or so proves that you're interested in learning.

Friday, May 11, 2012

Tech Sites I Check Every Day

So, I wanted to write about the tech-related websites that I look at every day. Why should you be reading tech news on a daily basis? Because you don't want to buy that server based on an Intel chip, when the new version of the chip is just around the corner. Also, it helps me learn about technology that I might not even know exists, teaches me new tricks I might be able to leverage to solve problems, and gives me the opportunity to look at problems/issues in a different light.

I use an app called Doggcatcher on my Android to download several podcasts as soon as they're released, which I listen to on my 30-minute commute every day, and sometimes at lunchtime. It's MUCH more enjoyable than radio!

This Week in Tech (TWiT)
Tech News Today
Reporters' Roundtable
Windows Weekly
The 404

My daily bookmarks include:
Ars Technica (and the forums too!) - Ars Technica is a great tech site. Their editorials are top notch as well. The community is a close-knit group of willingly helpful people; most of them are sysadmins, but many are just normal geeks too.
4sysops - This is a blog by sysadmins, for sysadmins. Lots of free software, directions, and news.
ComputerWorld - Run of the mill tech news site.
Instant Fundas - This is a blog that posts neat tricks daily. I've come across some really cool stuff here.
Krebs on Security - Krebs is a security expert.
Lifehacker - lots of great tips on this site.
Mashable - News on the social networking front.
Slashdot - News for nerds, stuff that matters.
TechRepublic - They have quite a few "How-To" videos, and they talk about non-tech topics that affect people who work in tech, such as interview tips and whatnot.
TechSpot - A good source for hardware news
TGDaily - More tech news, but a smattering of other topics that interest techies, like movies, games, etc
Hey, Scripting Guy! Blog - This guy's taught me a lot of Powershell over the past few months.

Wednesday, May 9, 2012

Things that help me be a better Sysadmin

So I'm a network admin, and I have about 300 people and 500 devices to manage. Over time, our IT team has cultivated a very stable environment, and the following software/hardware has really helped me to do my job better:

  • Large monitors - I have 2 24" LCD monitors on my desk. I now have enough screen real estate to watch my system monitoring client, my email, and my IM client while still being able to actually work on things.
  • Heavy duty computer - I'm running a quad-core Intel chip with 8GB of Ram and multiple hard drives. We built them ourselves. My employer also sprang for VMware Workstation, so if I need to get my hands dirty on something I can always create my own domain internal to my system and play around. 
  • VMware Lab - The team shares a 3 host VMware environment on its own network as a heavier-duty lab environment. When we want to test something, we replicate our core infrastructure (Domain controllers, mail server, etc) to the lab and test as if we were doing it in production. It's really nice to be able to test a rollout or an update and document the process from start to finish so that there are no surprises later on.
  • ScriptLogic Desktop Authority - This is a management system that is basically a much better version of Windows Group Policy. It's so much more granular - if one person needs a certain setting, or registry entry I can change it for that one user without having to worry about OU inheritance and whatnot. You can use conditional validation (multiple entries allowed, with And/Or logic), so a setting only gets updated if some condition exists (for example, if the computer is running Windows 7 and 'file.exe' exists in a specific directory). We also have an add-on that allows people to reset their own Windows passwords by answering a few security questions that they create the answers to.
  • SpiceWorks - This crawls our network every day. We use it to find out what's installed where. We also use it as our helpdesk system.
  • Dameware Mini Remote Control - Implementing this cut down on our team's response time drastically. When someone has an issue, we remote in and fix the problem without having to trudge all over the building.
  • PRTG Network Monitor - This monitors our infrastructure. It alerts us when things go down, and also tracks historical metrics like disk space usage so we can plan better.
  • Beyond Compare - This program allows you to compare things, like registry files, text files, or folder structures. Mainly we use this to identify what changes when we configure something a certain way. Once we identify registry values that change, we can make the change on every computer with ScriptLogic (or Powershell even).
  • Veeam Backup and Recovery - We use this to back up our virtual machines. When a server dies we have the ability to restore the entire server from backup just as it was. This also gives us the capability to restore our production servers into our lab environment so we can test and tinker with a clone of our production infrastructure.
  • Royal-TS - this is a Remote Desktop Protocol app that keeps track of all of my servers. It will also auto-fill credentials for me. It's so nice to be able to log in to a server in a matter of seconds to fix a problem.
  •  Folder2ISO - This handy software converts any folder I choose into an ISO file, which I can then mount in VMware. 
  •  SYDI - When I first started, I loved doing system documentation. Over time this grew tedious and time-consuming, so I created an automated process using this software to document the configurations on all of our servers. I still have to document things, of course, but there is less to do manually.
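The Beyond Compare workflow above (snapshot a config before and after a change, diff the two, then push the delta everywhere) boils down to a plain diff. A toy Python version, with made-up registry-export text, just to show the idea:

```python
import difflib

# Toy version of the "diff two registry exports" workflow: return only
# the lines that were added or changed between the two snapshots.
def changed_lines(before: str, after: str):
    diff = difflib.unified_diff(before.splitlines(), after.splitlines(),
                                lineterm="")
    return [ln[1:] for ln in diff
            if ln.startswith("+") and not ln.startswith("+++")]

if __name__ == "__main__":
    before = 'HKCU\\Software\\App\n"Setting"=dword:00000000'
    after = 'HKCU\\Software\\App\n"Setting"=dword:00000001'
    print(changed_lines(before, after))  # → ['"Setting"=dword:00000001']
```

Once the changed values are isolated like this, they're exactly what you'd feed into ScriptLogic or a Powershell loop.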

Tuesday, May 8, 2012

All about my phone!!!

I just got a smartphone about 2 months ago. Before that I was using a Samsung Rant, which is a very nice feature phone with a slide-out physical keyboard. I really loved that thing!!! My wife and I decided to step into the world of smartphones with our income tax return this year. My wife chose an iPhone 4S, because it's cute. After much research I chose a Samsung Galaxy S2 Touch, because I wanted Android and I loved the big screen. I told my wife that if she wanted to go with an iPhone, then she was on her own, because I wasn't learning two systems. I chose Android as my OS because I don't like the idea of someone at Apple deciding which apps can run on my phone; I'm not a big fan of the "walled-garden" management system. I didn't choose a Windows phone, even though I'm big on Microsoft, because there aren't as many developers working on that platform and it just isn't as polished as Android. I could root my phone, but I haven't had a good reason to yet. I just want my phone to work, and I don't want to tinker with the thing all the time. So here's a list of the things I use my phone for the most, along with some app recommendations.

  • Email - I have Gmail, and sync with my work account as well. I have the Yahoo Mail app, but that's just a junk mail account I use for newsletters and cruft.
  • Calendar - The Exchange calendar, plus my wife and I use an app called Cozi to track who's doing what.
  • Trendi - This widget lists the top ten things trending on Twitter right now.
  • Tweetdeck - I use this to post to Twitter and Facebook.
  • Friendcaster - I use this to browse Facebook. It's much faster than their app.
  • Foursquare - I use it when I go out.
  • Goodreads - Lets me keep track of what my friends are reading and what I want to read next. I use this list when I go to the library to grab a new book.
  • Skype - I haven't used it yet, but it's there.
  • LinkedIn - LinkedIn client

  • Alarm Clock Extreme Free - I love using my phone as my alarm clock - I don't ever have to worry about setting it before bed depending on the day, because the app keeps track of different alarm times on different days.
  • GroceryIQ - My wife and I use this to make sure that whoever goes grocery shopping has the current list.
  • Waze - A great navigation app.
  • Inkpad - I keep my to-do list in here. I tried Astrid, but I didn't need something so complex.
  • Firefox - I'm a Firefox user through and through. I love having my bookmarks synced everywhere I go!
  • FindMyCar - This app will keep track of where you parked, and guide you there using GPS.
  • KeyRing - Keeps a copy of all of those pesky rewards cards.

  • Google Reader - This keeps track of RSS feeds. I currently am tracking most of the major news sites, as well as Ars Technica, Slashdot, ComputerWorld, and Lifehacker.
  • Other News Apps - USA Today, BBC News, TMZ, Drudge Report, CNN, etc.

Phone System Stuff:
  • Camera ICS - This is a clone of the camera app from Ice Cream Sandwich (Android 4.0). Sprint hasn't given me the ICS update - I'm still on Froyo. Sad Panda.
  • Just Pictures - I like this photo gallery app a lot better than the one that came with my phone.
  • Pocket - This used to be a service called "Read it Later". Basically, I can mark a web page in my browser anywhere, and it will add it to this list, so I can.... uh... read it later.
  • Vibe - This lets me customize how my phone vibrates when different people call, or for different notifications. I keep my phone silent at all times, so this allows me to know if my wife's calling, because my phone vibrates like a heartbeat, and I don't have to look at the screen.
  • Avast Mobile - Avast had the best mobile antivirus app in a roundup I read recently.

File Access and Media Stuff:
  • DropBox - free storage
  • Google Drive - free storage
  • Cloud Explorer - an app for accessing Microsoft's Skydrive
  • ES File Explorer - This is a file browser, but it has the added ability to allow me to browse LAN shares
  • AndFTP - a full-featured FTP client app
  • Skifta - This is a DLNA client, and handles streaming from my home media server
  • MX Player - the best video player I've found so far
  • Doggcatcher - This is an app that automatically downloads and manages your podcasts. I listen to This Week in Tech, Tech News Today, Lifehacker, CNet Reporters Roundtable, Windows Weekly, The 404, and The VergeCast. It's much better than listening to morning radio on the way to work.
  • DoubleTwist - This thing syncs my iTunes playlists at home with my Droid.
  • Google Music - This will stream my MP3s from the cloud. Google Play allows you to save something like 10,000 songs on their servers.
  • KeePassDroid - I use KeePass to keep track of my logins and passwords. I use DropBox to sync them from home with:
  • Foldersync - can sync on a schedule with my phone and various cloud services.

System Admin Stuff:
  • Teamviewer - I would use this in a pinch to access my computer at work.
  • Pocketcloud - an RDP client
  • AirDroid - Allows me to copy files and stuff through a wireless connection.
  • Hotspotting - uses GPS to map and track hotspots. It's wardriving with a memory, basically.
  • Fing - can scan a network for clients, IP Addresses, etc.
  • WifiAnalyzer - measures wireless networks around you for channel and signal strength info

  • Spark360 - I can keep track of my XBox Live account through this
  • John NES Lite - a Nintendo emulator.
  • Cracked - Love it!
  • FML - for a good laugh.

My droid does more than I ever dreamed it would and has simplified my life in many ways. Once I decide to pull the trigger, I plan on buying:

  • Locale - Manage your phone's setting based on where you are
  • Tasker - Automate anything!!
  • Tapatalk Forum App - an app for browsing web forums. I frequent the Ars Technica forums quite a bit....
  • Ultimate Call Screen - Manage incoming calls better.

Monday, May 7, 2012

My Home Setup and How I'm Backing it Up

So, my main system is a beast I built about 5 years ago. Not only was it to be my main system, but I was also using it to run virtual machines of various Windows servers while working on my MCSE. The important part is that it has four 500GB hard drives and one 1TB hard drive. Why? Because you get better performance running virtual machines off multiple spindles.

C Drive: my system drive; takes up about 150GB.
D Drive: this is where I keep my stuff.
E Drive: the home base for backups, as well as the staging area for any downloads.
F Drive: stores a copy of my nightly backup.
G Drive: contains a mirror of everything on D Drive, as well as a copy of each night's backup.
H Drive: 1 terabyte; contains a mirror of my Screenplay device (see next paragraph).

I have two other computer systems around the house (a desktop and a laptop), and a set-top device called an Iomega Screenplay (which is basically a network-accessible 1TB hard drive that can play stuff on my TV). The Screenplay is constantly mapped as the Z drive on my main system. The house is wired for Gigabit Ethernet for faster transfers, with wireless-G for my wife's laptop. The hostnames:

Megatron - this is my main system
Rasputin - my kid's computer
Wife-PC - my wife's laptop

At 11pm, a nearly identical batch file kicks off on both Rasputin and Wife-PC:

net use T: "\\megatron\f$\backup holding area"
rd /s /q "T:\From Wife-PC"
rd /s /q T:\Quicken
mkdir "T:\From Wife-PC"
mkdir T:\Quicken
robocopy C:\Q-Backup "T:\Quicken" /MIR /NP
xcopy C:\Users\Wife\Desktop "T:\From Wife-PC" /C /Q /E /Y
net use T: /delete

My wife and kid don't do much in the way of content creation or downloading, so this is usually over pretty fast.

At 2AM, the backup process starts on Megatron. You can read what it's doing in the "REM" blocks within the batch file here:

@echo on

date /t

time /t
REM -------------Delete old backup files----------
rd /s /q f:\backup\documents
rd /s /q "f:\backup\From Rasputin"
rd /s /q "f:\backup\From Wife-PC"
rd /s /q f:\backup\Quicken
rd /s /q f:\backup\mirc
rd /s /q "f:\backup\misc text"
rd /s /q "f:\backup\My Music"
rd /s /q f:\backup\pictures
del f:\backup\*.txt
del e:\backup*.rar
REM -----------------------------------------------

REM Removing the old backup stuff
time /t

REM -------------MAKE THE FOLDERS-----------------
mkdir f:\backup
mkdir "f:\backup\From Rasputin"
mkdir f:\backup\Quicken
mkdir "f:\backup\From Wife-PC"
mkdir f:\backup\documents
mkdir "f:\backup\Misc Text"
mkdir "f:\backup\My Music"
mkdir f:\backup\pictures
mkdir f:\backup\mirc
REM -----------------------------------------------

REM Making the new backup folders
time /t

REM -------------Backup Phone Pictures-----------------
robocopy "H:\DropBox\Dropbox\Camera Uploads" "D:\Pictures\__Incoming\Charly's Phone" /MIR /NP /LOG:E:\robocopy.log
REM ---------------------------------------------------

REM This section syncs pictures from my phone which have been automatically uploaded to Dropbox

REM -------------COPY THE FILES--------------------
xcopy "F:\Backup Holding Area\From Wife-PC" "F:\backup\From Wife-PC" /C /Q /E /Y
xcopy "F:\Backup Holding Area\Quicken" F:\backup\Quicken /C /Q /E /Y
copy "d:\mirc 6.2" f:\backup\mirc
copy E:\*.bat "D:\MyDocuments\Scripts\For Home Use"
xcopy "d:\mydocuments" f:\backup\documents /C /Q /E /Y
xcopy "d:\My Music" "f:\backup\My Music" /C /Q /E /Y
xcopy d:\pictures f:\backup\pictures /C /Q /E /Y
REM ------------------------------------------------

REM This section moves my files to their backup folders
time /t

REM -------------Tree Text Creation-----------------
tree /F /A c: > f:\backup\Megatron-C-tree.txt
tree /F /A d: > f:\backup\Megatron-D-tree.txt
tree /F /A e: > f:\backup\Megatron-E-tree.txt
tree /F /A f: > f:\backup\Megatron-F-tree.txt
tree /F /A g: > f:\backup\Megatron-G-tree.txt
tree /F /A h: > f:\backup\Megatron-H-tree.txt
REM ------------------------------------------------

REM This section makes "tree" files so I can recreate my file structures if need be
time /t

REM -------------Run Robocopy Jobs-------------------
del e:\robocopy.log
robocopy D:\CBTs G:\D-Mirror\CBTs /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\Education G:\D-Mirror\Education /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\Icons G:\D-Mirror\Icons /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\ISO G:\D-Mirror\ISO /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\Mp3 G:\D-Mirror\MP3 /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\MyDocuments "G:\D-Mirror\My Documents" /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\Pictures G:\D-Mirror\Pictures /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\Software G:\D-Mirror\Software /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\Technet G:\D-Mirror\Technet /MIR /NP /LOG+:E:\robocopy.log
robocopy "D:\The Library" "G:\D-Mirror\The Library" /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\Wallpaper G:\D-Mirror\Wallpaper /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\Mp3-Wife G:\D-Mirror\MP3-Wife /MIR /NP /LOG+:E:\robocopy.log
REM ------------------------------------------------

REM This section is making a mirror image of folders on D drive that I'm not including in my backup.
time /t

REM -------------Delete Shortcut files--------------
del f:\backup\*.lnk /S
REM ------------------------------------------------

REM I'm removing all the .lnk (shortcuts) from the backup
time /t

REM -------------Compress/split Backup--------------
cd "\Program Files (x86)\WinRAR"
rar a -m5 -o+ -r -t -v4300000k -preallylongpassword e:\backup.rar f:\backup
REM ------------------------------------------------

REM Using Winrar to compress my backup into files that will fit onto single-sided DVDs
time /t

REM -------------Delete Blat log file---------------
del e:\blat.log
REM ------------------------------------------------

REM Removing yesterdays log file
time /t

REM ------------Screenplay Picture sync-------------
del e:\screenplay-picturesnightly.log
robocopy "D:\Pictures" "\\IOMEGA-D714CA5C\ScreenPlay\Pictures" /MIR /NP /R:5 /W:5 /LOG:E:\screenplay-picturesnightly.log
REM ------------------------------------------------

REM Mirroring all of my pictures to the screenplay device
time /t

REM -------------Copy Cloud Files-----
copy D:\MyDocuments\Passwords\*.* H:\DropBox\DropBox\PWD /Y
robocopy D:\MyDocuments H:\DropBox\Dropbox\MyDocs /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\MyDocuments "H:\GDrive\Google Drive\MyDocs" /MIR /NP /LOG+:E:\robocopy.log
robocopy D:\MyDocuments H:\SkyDrive\Documents /MIR /NP /LOG+:E:\robocopy.log

REM ------------------------------------------------
REM Copying "My Documents" to my Google Drive, Skydrive, and Dropbox folders

REM -------------Mirror backup RAR files to F and G Drives-----
copy e:\backup*.rar f:\
copy e:\backup*.rar g:\

REM ------------------------------------------------
REM Copying my backup files to 2 other drives, in case of drive failure

type e:\screenplay-picturesnightly.log
type e:\robocopy.log

REM These commands write my log files to standard output
date /t
time /t

The date and time commands let me know how much time each step in the backup process is taking. It takes about 3 hours for the whole cycle to run through.

The batch file above is called by the command "E:\backup.bat > backuplog.txt" so that the entire output above is saved to backuplog.txt. After this task completes, another batch file runs, which uses a program called blat to email the backup log file to me. If I ever get motivated I'll use Powershell for this, but the process works, so.....
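For what it's worth, the mailing step is only a few lines in any scripting language. Here's a Python sketch purely for illustration (the addresses and SMTP host are made up; blat does the same job on Windows):

```python
# Build the backup-report email that gets mailed each morning.
# Addresses and the SMTP host below are hypothetical.
import smtplib
from email.message import EmailMessage

def build_report(log_text: str) -> EmailMessage:
    msg = EmailMessage()
    msg["Subject"] = "Nightly backup log"
    msg["From"] = "megatron@example.home"
    msg["To"] = "me@example.home"
    msg.set_content(log_text)
    return msg

if __name__ == "__main__":
    report = build_report("robocopy finished, 0 errors")
    # smtplib.SMTP("smtp.example.home").send_message(report)  # real send
    print(report["Subject"])
```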

I try to take a backup off-site once a month, so I'm not wasting a bunch of DVDs (I'm up to four now). Realistically, the only thing that will cause a loss of important data is total destruction of my computer. My Documents are synced to 3 different cloud services, and my pictures are mirrored upstairs to my screenplay.

I have found that this solution works best for me. I have a ridiculously low bandwidth cap; I have the lowest tier plan on Charter cable, which affords me only 100GB per month (upload and download). Also, I don't like to pay a monthly fee, so I don't want to use services like Mozy or Carbonite. I could get 2 external USB drives and just swap them out, keeping one of them offsite, but I have mouths to feed and so on, and I can find better things to do with $300. So, in the end, I chose redundancy as my hedge against hardware failure.

Saturday, May 5, 2012

Powershell Regular Expressions Save the Day

Copy a string from between two other characters/strings (as in a UNC Path).

I had a UNC path like the one below, and I needed to rip out just the server part of that whole mess. I did it with this regular expression (after fooling with it for way too long!!):

$TargetUNC = "\\server\share\folder\folder\folder\file.txt"
$ServerName = [regex]::match($TargetUNC,'[^\\]+').value
#The line above will strip out "server" from the UNC string and save it to variable $ServerName
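The pattern isn't Powershell-specific, by the way; the same first-match trick works in any regex engine. As a sanity check, here it is in Python:

```python
import re

# The first run of non-backslash characters in a UNC path is the server.
unc = r"\\server\share\folder\folder\folder\file.txt"
server = re.search(r"[^\\]+", unc).group()
print(server)  # → server
```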

Friday, May 4, 2012

Automated Reporting with Powershell

One of my favorite uses of Powershell is automating things so that I don't have to remember to do them. This includes checking up on my infrastructure to make sure that things are running smoothly. Here are a few of the things I've automated with Powershell:

Check my VMware environment - I get an email every morning with this report. I'm not going to go through the code on this one, because it's way too long. Basically, I downloaded a superb Powershell script called vCheck and modified it to suit my environment and report on what I need to know. These topics include:
  • Hosts and VMs that are down
  • VMs that have outdated tools
  • VMs that have CD-ROMs or Floppies still attached
  • Any other Host issues or alarms
  • VM alarms
  • vCenter service issues
  • Warnings and errors in the event log
You can find vCheck here.

Run a query on a SQL database - This pulls the version information out of a database for a commonly updated application monthly. We have a test environment and a production environment for this particular application. This report helps us keep them both in line.

Filename: Report-VersionSQLQuery.ps1
Author: Charles Stemaly

add-pssnapin SqlServerCmdletSnapin100
add-pssnapin SqlServerProviderSnapin100

#Adds the snap-ins to Powershell for working with SQL. You must have SQL Server Management Studio installed on the machine from which you run this script.

$DividerLine = "`r`n --------------------------------------------------------------- `r`n"
$ReportDate = Get-Date
$ReportDate = $ReportDate.toshortdatestring()

#This section creates some variables, namely the date and a divider line I'll use later to construct my email.

$ProdHeading = "Production Version:"
$ProdVersion = (Invoke-Sqlcmd -Query "SELECT [version], [description], [create_timestamp] FROM [Prod].[dbo].[version] NGDB JOIN (SELECT [product_id], `
MAX([create_timestamp]) as create_date FROM [Prod].[dbo].[version] GROUP BY [product_id]) TBL ON TBL.[product_id] = NGDB.[product_id] `
AND TBL.[create_date] = NGDB.[create_timestamp]" -ServerInstance "ProdServerName" | Out-String)

#$ProdHeading is the heading for my email.
#Now, I'm not a SQL guy, so I needed some help from our DBA to construct the query above. Once I had the query, I simply had to use the 'Invoke-SqlCmd' commandlet to send the query off to the SQL server and return the results. One neat thing is that it returns an object which I can then further manipulate; using Powershell, I could show only the top two results or sort by any of its properties, for instance. In this case, I convert it to a string so I can email it later.

$TestHeading = "Test Version:"
$TestVersion = (Invoke-Sqlcmd -Query "SELECT [version], [description], [create_timestamp] FROM [Test].[dbo].[version] NGDB JOIN (SELECT [product_id], `
MAX([create_timestamp]) as create_date FROM [Test].[dbo].[version] GROUP BY [product_id]) TBL ON TBL.[product_id] = NGDB.[product_id] `
AND TBL.[create_date] = NGDB.[create_timestamp]" -ServerInstance "TestServerName" | Out-String)

#This is the same command and query as above, modified slightly to pull the information from our test server/database.

$VersionListing = ($ProdHeading + $ProdVersion + $DividerLine + $TestHeading + $TestVersion)
#Here, I am simply "adding" the different strings together to form one big string ($VersionListing)
Send-Mailmessage -from -to -subject "Versions as of $ReportDate" -smtpserver mailserver -body ($VersionListing)

#The subject of my email incorporates the date string I created at the beginning of the script, and uses the $VersionListing super-string as the body of the email.
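To show what I mean about Invoke-Sqlcmd handing back objects you can manipulate, here's a rough sketch. The server name and the simplified query are placeholders, not what I actually run:

```powershell
# Sketch: treat the query results as objects before flattening them for email.
# "SomeServer" and the simplified query below are placeholders.
$rows = Invoke-Sqlcmd -Query "SELECT [version], [create_timestamp] FROM [Prod].[dbo].[version]" `
    -ServerInstance "SomeServer"

# Sort newest-first, keep only the two most recent rows, then flatten to a
# string for the email body:
$ProdVersion = $rows |
    Sort-Object create_timestamp -Descending |
    Select-Object -First 2 |
    Out-String
```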

Shows me computers in Active Directory not seen on our network in over 90 days - This helps me clean up anything that might not have gotten deleted when we decommission computers.

Filename: Report-ComputerNotSeenIn90Days.ps1
Author: Charles Stemaly

add-pssnapin Quest.ActiveRoles.ADManagement
#Adds the Quest ActiveRoles Snap-in

get-qadcomputer -inactivefor 90 | select name | sort name | ConvertTo-html | out-file c:\olderthan90days.html
$body = get-content c:\olderthan90days.html | Out-String
$body1 = "These computers have not been seen in 90 Days by Active Directory"

#Runs a query of computers inactive for 90 days, specifies that I want to see only the name, and sorts them alphabetically. It then writes the output to an html file, which I convert to a string to use in my email body. I then create another piece of the body ($body1).

Send-MailMessage -To "" -Subject "Computers not seen by AD in 90 Days" -Body ($body1 + $body) -BodyAsHtml -From "" -SmtpServer ""
Remove-Item c:\olderthan90days.html

#Here I construct my email, and then I delete the html file I created earlier.
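If you'd rather lean on Microsoft's ActiveDirectory module instead of the Quest snap-in, I believe Search-ADAccount can produce a similar list. A sketch, since I haven't switched my own report over:

```powershell
Import-Module ActiveDirectory

# Computers with no logon in the last 90 days, names only, sorted:
Search-ADAccount -AccountInactive -TimeSpan 90.00:00:00 -ComputersOnly |
    Select-Object Name |
    Sort-Object Name |
    ConvertTo-Html |
    Out-File c:\olderthan90days.html
```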

Shows me the last date that users logged in - I have this list sorted chronologically so I can spot generic accounts we have floating around that aren't being used, and it also guards against situations where HR has forgotten to tell me that so-and-so isn't working here anymore.

Filename: Report-LastLogon.ps1
Author: Charles Stemaly

Add-PSSnapin Quest.ActiveRoles.ADManagement
$now = Get-Date
$daysSinceLastLogon = 1
#These two variables feed the filter below.
Get-QADUser -sizeLimit 0 | where {
  $_.lastlogontimestamp -and
    (($now-$_.lastlogontimestamp).days -gt $daysSinceLastLogon)
} | select-object Name, LastLogonTimeStamp | sort-object LastLogonTimeStamp | convertto-html | out-file c:\report.html

#First, I'm loading the Quest ActiveRoles Snap-in, then running a query of my Active Directory users. I'm piping that to
#'Where-Object' to comb through the result and limit it to only users who have a "lastlogontimestamp" AND where the difference in days between now and their last logon is greater than 1. I'm then selecting only the name and the last logon timestamp, sorting it by date so I get the oldest first, converting the output to html and saving the output to an html file.

$body = get-content c:\report.html | out-string
$body2 = "Last time each user's account logged in to Active Directory `r`n`r`n"
Send-Mailmessage -from -to -subject "Account Info - Last Login" -smtpserver mailserver -body ($body2 + $body) -bodyasHTML
del c:\report.html

#I've constructed the body of my email, sent it out, and then deleted the temp file I created earlier. The `r`n characters you see are escaped characters within the string. Escaped characters perform some specific function; these create a carriage return and a new line, breaking up my string so it looks better in the email.

You may have noticed that I'm not consistent in working with strings, and if I dug into things deeper I probably wouldn't have to juggle between creating html files and then converting them to strings later. Formatting output in Powershell is pretty tricky for me, I must say. Some things I tried that should have worked made a mess of the final product (the email), so I just played with different approaches until one worked. If it were important, I could look up the "best way" to do this, but I have users to help and a network to maintain. As in most things in IT, there are multiple ways to get to the information you want.
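For instance, one way to skip the temp file entirely is to pipe straight into a string. A sketch along those lines, assuming the Quest snap-in is loaded:

```powershell
# Sketch: build the HTML body in memory instead of writing report.html to disk.
$now = Get-Date
$daysSinceLastLogon = 1

$body = Get-QADUser -SizeLimit 0 |
    Where-Object { $_.lastlogontimestamp -and
        (($now - $_.lastlogontimestamp).Days -gt $daysSinceLastLogon) } |
    Select-Object Name, LastLogonTimeStamp |
    Sort-Object LastLogonTimeStamp |
    ConvertTo-Html |
    Out-String
# $body is now ready for Send-MailMessage -BodyAsHtml, no cleanup needed.
```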

Lest ye think that this is a Powershell-only blog, on Monday I'm going to talk about my computer setup at home and how I back it up, which is kind of unconventional.

Thursday, May 3, 2012

Powershell Task Automation

So, after my Powershell abilities progressed a bit, I had an epiphany. Holy Cow! Think of all of the things I'm doing manually that I could automate!!!

First, I went and installed Quest's PowerGUI application. This is a GUI and script editor for Powershell that I find very helpful. They have a ton of other add-ons available for it, but I haven't delved too deeply into all of that yet.

Then, I looked through my IT team's helpdesk logs for common problems we were having. We have a couple of problematic applications in use here, and I noticed many instances where we had to kill a process for our users. Normally, this required us to remote control the user's computer, open Task Manager, and kill the application. There are some stumbling blocks here, too. Sometimes Windows will report back after a few seconds that the application is not responding, so then you have to click the "End Task" button, and again, sometimes this pauses for a couple of seconds. Once in a while Windows isn't very responsive and sort of sputters along, which can increase the time it takes to resolve the issue. Also, it's kind of tedious. I'm going to post the whole script and use Powershell-style comments to describe what it is I'm doing.

Filename: killprocess.ps1
Author: Charles Stemaly

$cpname = read-host "Enter the computer name"
#Prompt the user for the host name of target computer
do {
#Here, I have started a 'Do' loop. This block of code will execute until some condition is met.
$procname = read-host "Enter the process name (i.e. notepad.exe)"
#Prompt the user for the name of the process to kill
get-wmiobject -computername $cpname win32_process|where-object {$_.processname -eq "$procname"}| foreach-object {$_.terminate()}
#Here I'm interacting with WMI on the target computer: get the running processes, then pare down
#the list to only the exe that I specified in the $procname variable, then use the terminate method
#to kill that process. Note that this would kill multiple instances of the program.
$answer = read-host "Press 1 to kill another process, or q to quit"
#Prompt the user to do another or to quit the script. This variable sets the condition of the 'Do'
#loop.
}
#This marks the end of the 'Do' block of code
while ($answer -eq 1)
# This is what looks at the condition of the 'Do' block and decides whether to run the loop again or exit.

Using this script, we cut the response time on these calls down dramatically. We no longer need to fumble through the GUI, waiting for Windows to behave itself. WMI terminates the offending process almost instantly. As an added bonus to the faster resolution time, you look like a fricking magician to the end user.
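If you run this kind of thing often, the same WMI call wraps neatly into a function so you can skip the prompts. A sketch; the function name and the computer name in the usage line are my own inventions, not part of our actual toolkit:

```powershell
# Sketch: the same WMI terminate, as a function with parameters instead of prompts.
function Stop-RemoteProcess {
    param(
        [string]$ComputerName,
        [string]$ProcessName   # e.g. notepad.exe
    )
    Get-WmiObject -ComputerName $ComputerName Win32_Process |
        Where-Object { $_.ProcessName -eq $ProcessName } |
        ForEach-Object { $_.Terminate() }
}

# Hypothetical usage:
Stop-RemoteProcess -ComputerName PC042 -ProcessName notepad.exe
```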

The next type of support call I decided to craft an automated solution for was the Windows password reset.

Filename: ResetPassword.ps1
Author: Charles Stemaly

$UsernameToReset = Read-Host "Username"
# Prompt the user for the username that needs to be reset
$UserQuery = (Get-QADUser $UsernameToReset)
#Query the account once and store the result object; this makes for easier reading later.
If ($UserQuery.AccountIsLockedOut -eq $true){
#This portion detects whether the account is locked out or not.
    Unlock-ADAccount $UsernameToReset
#If the account is locked out, this command unlocks it.
    Write-Host -ForegroundColor Yellow "Account unlocked for $UsernameToReset"
#This writes to the screen telling me that the account was unlocked.
}
#This marks the end of the 'If' block of code.
Set-QADUser $UsernameToReset -UserPassword "password"
#This line resets the user's password to "password". Obviously, I use something different in my
#duties :)
Write-Host -ForegroundColor Yellow "Password was reset for $UsernameToReset to password"
#This line writes to the screen telling me that the password has been reset.

While I love the Active Directory Users and Computers snap-in for the Microsoft Management Console (don't we all?), it's just a lot faster to open Powershell and type a couple of lines. I should point out that the code above requires the Quest ActiveRoles commandlets snap-in to be installed and loaded by Powershell, and that Unlock-ADAccount actually comes from Microsoft's own ActiveDirectory module, so that needs to be loaded too. Also, you obviously need the correct permissions in Active Directory to be resetting passwords.
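If you'd rather not keep a password sitting in the script at all, Microsoft's ActiveDirectory module can take a hidden prompt instead. A sketch, assuming that module is loaded and you have reset rights:

```powershell
Import-Module ActiveDirectory

$UsernameToReset = Read-Host "Username"
# Prompt for the new password without echoing it to the screen:
$NewPassword = Read-Host "New password" -AsSecureString
# -Reset sets the password outright rather than requiring the old one:
Set-ADAccountPassword $UsernameToReset -Reset -NewPassword $NewPassword
Write-Host -ForegroundColor Yellow "Password was reset for $UsernameToReset"
```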

While I won't get into the individual scripts themselves, I also have automated the following situations:
  • Creating, disabling, and deleting users
  • Finding the answer to "Who printed this?" calls
  • Resetting the print spooler on our print server when a print job gets irrevocably stuck
  • Rebooting a remote computer from WMI
  • Finding out which Terminal Server in our farm a particular user is logged in to
  • Notifying users via email when their password is expiring within 7 days. We find that Windows 7 tends to bury this notification in the system tray.
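As one example from that list, finding which Terminal Server a user is on can be done with the PSTerminalServices module. This is a sketch from memory, with placeholder server names and username, so check the module's help for the exact property names:

```powershell
Import-Module PSTerminalServices

# Check each Terminal Server in the farm for a session belonging to jsmith.
# The server names and the username are placeholders.
$farm = "TS01", "TS02", "TS03"
foreach ($server in $farm) {
    Get-TSSession -ComputerName $server |
        Where-Object { $_.UserName -eq "jsmith" } |
        ForEach-Object { Write-Host "jsmith is logged on to $server" }
}
```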
Tomorrow, I will lay out how Powershell is handling automated reporting about our environment.

Wednesday, May 2, 2012

My Powershell Profile

Let me preface this by saying that I'm going to discuss my OLD Powershell profile. I used to use this as the default profile, but it took forever to open a Powershell instance due to all of the modules and snap-ins that had to load. In the end, what I did was have a blank profile, then created a shortcut on my taskbar to run 'powershell.exe -NoExit -File c:\powerprofile.ps1'. I'm going to go through my setup one section at a time.

Variable and Aliases Section

$a = (Get-Host).UI.RawUI
$a.BackgroundColor = "Black"
$a.ForegroundColor = "Green"
new-alias gh get-help
$desktop = "C:\Users\user1\Desktop"
$documents = "O:\My Documents"
$credential = get-credential
$credDomainAdmin = get-credential

The first bit dictates that my Powershell window is green text on a black background. It makes everything so easy to see!! I specify that 'gh' is an alias for the get-help command, since I use it a lot. I also have a couple of variables I can use when I want to refer to my Desktop or My Documents folder paths. Then, I have Powershell prompt me for my credentials, as well as the Domain Administrator credentials. We'll use those later to set up connections to various servers.
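One way to avoid typing those two passwords on every launch is to cache the credential to disk, encrypted for your user account on that machine. I haven't built this into my own profile, but the sketch would be (the file path is a placeholder):

```powershell
# One-time step: save the credential. Export-Clixml encrypts the password
# portion with DPAPI, so only this user on this machine can read it back.
Get-Credential | Export-Clixml C:\Users\user1\cred.xml

# In the profile, load it back instead of prompting:
$credential = Import-Clixml C:\Users\user1\cred.xml
```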

Exchange 2010 Management Shell

$sessionEXCH = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri -Authentication Kerberos -Credential $credDomainAdmin
import-pssession $sessionEXCH | out-null
echo "Exchange 2010 connection loaded"

The Exchange 2010 Management Console is installed locally. This makes the Exchange Powershell commands available and connects me to the mail server. Note that I piped the import-pssession command to out-null; doing this suppresses any output to the screen.

Lync 2010 Management Shell

$sessionLYNC = New-PSSession -ConnectionUri "" -Credential $credDomainAdmin
import-pssession $sessionLYNC | out-null

Again, the Lync 2010 Management Console is installed locally. This section makes the Lync 2010 commands available and connects me to the Lync server.

VMware PowerCLI Connection

add-pssnapin Vmware.VimAutomation.Core
$vmware = "Vmware.VimAutomation.Core"
connect-viserver -server -credential $credential

This loads the VMware PowerCLI (which, again, is installed locally) and connects me to the VMware Virtual Center Server.

Misc Modules and Snap-Ins:

I will go through the rest of these using Powershell-style comments (anything after a # symbol is a comment).

add-pssnapin SqlServerCmdletSnapin100
add-pssnapin SqlServerProviderSnapin100
#SQL Server Management

import-module activedirectory
#The Microsoft Active Directory module

import-module pscx
add-pssnapin Quest.ActiveRoles.ADManagement
import-module WASP
import-module bsonposh
#These modules and snap-ins were discussed in my post yesterday.

import-module psterminalservices
#This is a handy module for managing Microsoft Windows Terminal Servers.

After all of this, I perform the command to change my working directory to our IT Department's script folder, and then clear the screen:

cd Z:\PS
cls

Load time for this configuration is about 20 seconds or so. If I just need to run some commands, it's far faster to open Powershell with a blank profile. I could have gone about this the opposite way: the script above could serve as my default profile, and I could create a shortcut to 'powershell.exe -noprofile'. In weighing the two methods, I decided to always use a blank default so that I could type Powershell at the run command and get going on my work immediately.
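If you want to pin down that load time more precisely, Measure-Command will time the profile script for you. A quick sketch (note that this actually runs the script, credential prompts and all):

```powershell
# Time how long the profile script takes to run, in seconds:
Measure-Command { & C:\powerprofile.ps1 } | Select-Object TotalSeconds
```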

Tomorrow: What kind of helpdesk tasks I've automated.

Tuesday, May 1, 2012

Handy Powershell Modules and Snap-Ins

Today I want to talk about some of my favorite modules and snap-ins that I use to make Powershell really do some heavy lifting.

VMware PowerCLI -- If you have a VMware environment, then this is a handy thing to have. While I don't use it to create VMs or hosts, I do use it to run a report every morning to let me know what the status of my VMware infrastructure is. I also use it if I need to do something repetitive, like reboot a bunch of Virtual Machines.

Quest ActiveRoles Management Shell -- Quest has created this free Powershell snap-in to help you with Active Directory management. There is a Microsoft Active Directory module to use with Powershell, and I use it as well, but for some things I prefer using Quest's.

PSCX -- The Powershell Community Extensions include many commandlets which add tremendously to the utility of Powershell. Some of my favorites include:
  • Send-SmtpMail
  • Out-Speech -- you can have some fun with this one!!
  • Clipboard manipulation -- Set-, Get-, Write-, and Out-Clipboard
  • Join-String
  • Ping-Host
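A couple of one-liners to show the flavor of these, assuming PSCX is loaded:

```powershell
Import-Module pscx

# Make the computer talk (this is the fun one):
"Powershell can speak!" | Out-Speech

# Push the output of a command straight to the clipboard:
Get-Process | Out-String | Set-Clipboard
```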

WASP -- While I haven't done anything useful with it (yet), the ability to control the Windows GUI, as well as initiate mouse and keyboard events, from within Powershell is powerful.

BSonPosh -- This module includes quite a few advanced capabilities for system and network admins. Some of these include:
  • Get-FileMD5
  • Get-FSMO -- list your domain's FSMO role holders in one command!!!
  • Get-Uptime
  • Commandlets for working with File Shares
  • Commandlets for viewing and converting networking information (converting IP Addresses to binary, working with the routing table, etc.)

I Saved the Best For Last

This next script (note that it isn't a module or a snap-in) is an amazing piece of work. Not only has it saved me countless hours running Windows Updates during our monthly maintenance window, but it also marks the first time I saw that Powershell could construct its own GUI interface.

PoshPAIG -- This utility allows you to enter in a list of computers (or feed it the names in a text file). The application will then poll each one to see how many updates are waiting for installation. From there, you can install the updates, find out which servers need a reboot after the installation, and reboot them if need be. If I had a software of the year award, this gem would get it.

Tomorrow: What my Powershell profile looks like.