
Thursday, July 11, 2013

Gathering my Log Files. Every. Day.

This post relates to item #1 on my "What I'm Monitoring" post.

This script is set to execute as a scheduled task on each one of my Windows servers; I've even made it part of my provisioning checklist.
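
For reference, a task like this can be registered from the command line with schtasks.exe. A minimal sketch (the task name, script path, and run time here are assumptions, not my actual settings):

schtasks /create /tn "GatherLogs" /ru SYSTEM /sc daily /st 06:00 /tr "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\Scripts\GatherLogs.ps1"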

I'll allow my commenting within the script to do the 'splaining:

--------------------------------------------------------------------------------------------------------------------

#Get yesterday's date
$Date = ((get-date).adddays(-1))

#Create formatting elements for the email
$SystemHeader = "`r`n+++   System Log Errors and Warnings    +++"
$ApplicationHeader = "`r`n+++ Application Log Errors and Warnings   +++"
$DelimiterLog = "`r`n++++++++++++++++++++++++++++++++++++++++++++"
$Newline = "`r`n"

#Get the hostname
$Hostname = ($env:computername)

#Retrieve and filter the system log - Whenever I get an error that's 'ok' and that I don't care to 
#see anymore, I will add it to this section. This procedure also applies to the next section.
$SystemExclusions = 1111, 3, 4, 8, 1109
$System = (Get-EventLog System -After $Date -EntryType Error,Warning |
    Where-Object { $SystemExclusions -notcontains $_.EventID } |
    Select-Object EntryType, Source, EventID, Message, TimeGenerated |
    Format-List)

#Retrieve and filter the application log
$ApplicationExclusions = 1530, 1524, 1517, 12321, 1008, 2003, 100, 1023
$Application = (Get-EventLog Application -After $Date -EntryType Error,Warning |
    Where-Object { $ApplicationExclusions -notcontains $_.EventID } |
    Select-Object EntryType, Source, EventID, Message, TimeGenerated |
    Format-List)

#Build the email
$Body = $DelimiterLog
$Body += $SystemHeader
$Body += $DelimiterLog
$Body += ($System | Out-String)
$Body += $Newline
$Body += $Newline
$Body += $DelimiterLog
$Body += $ApplicationHeader
$Body += $DelimiterLog
$Body += ($Application | Out-String)

#Send me an email of any errors, but ONLY send the email IF there are errors!
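#Note: $System and $Application hold Format-List output, which is empty when nothing matched, so Measure-Object returns 0 in that case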
If ((($System | Measure-Object).Count) -gt 0 -or (($Application | Measure-Object).Count) -gt 0){
    Send-MailMessage -From "administrator@contoso.com" -To "me@contoso.com" -Subject "Log Errors and Warnings from $Hostname" -SmtpServer SMTPServerName -Body $Body
} #End If

--------------------------------------------------------------------------------------------------------------------

What you see above is the script that runs on 90% of my servers. However, some servers end up needing their own copy due to differences in the filtering section (where I'm excluding certain event IDs from getting through). Examples include my Exchange server and my IIS web servers.
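
If you'd rather avoid per-server copies, the exclusion lists could also be hoisted into script parameters so each server's scheduled task passes its own list. A quick sketch (hypothetical parameter names, not what I actually run):

Param(
    [int[]]$SystemExclusions = @(1111, 3, 4, 8, 1109),
    [int[]]$ApplicationExclusions = @(1530, 1524, 1517, 12321, 1008, 2003, 100, 1023)
)
# Each server's task then invokes, e.g.: powershell.exe -File GatherLogs.ps1 -SystemExclusions 1111,42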

Wednesday, July 10, 2013

What I'm Monitoring and Automating

Now that I'm pretty settled in my new job, I thought it might be helpful to step back and take a look at what I've automated and what I'm monitoring, and why. This will also serve as a roadmap for future content, because you all know how I love to post scripts.

  1. GatherLogs: Every server I have runs a task in the morning that sends me any warning or error entries in the System or Application logs. Of course, Windows event logs contain plenty of errors that occur routinely and really don't mean anything, so my script weeds those entries out. Previous Article.
  2. SQLExpBkup: Servers running SQL Express run a job to back up their databases locally prior to the network backup jobs. Previous Article.
  3. Off-site backups: I mirror quite a bit (using robocopy, mainly) to offsite repositories:
    • Some SMB shares
    • Backup Exec B2D files
    • Veeam Backup Files
    • The content of our IT Fileshare (ISOs, installation files, etc)
    • The contents of our departmental Sharepoint Knowledge Base (a Document Library). We don't want to be without our documentation. Previous Article
  4. SureBackup Open Lab Check: This script checks to see if I have any open Veeam SureBackup virtual labs running before I leave for the day. Previous Article
  5. VMware snapshot check: runs first thing in the morning to notify me of any snapshots that exist in my VMware environment. Article Here
  6. Automatic reporting on my Dell Equallogic SAN replication jobs. EDIT: I'm not replicating with Dell's stuff now. For our needs, doing Veeam BackupCopy was good enough.
  7. Weekend Veeam SureBackup job chain. Previous Article.
  8. Bad Logins on my domain controllers. Previous articles: Part one and part two.
  9. Logon Counts on my domain controllers. EDIT: Not doing this anymore. I had nothing actionable from the data and it takes a lot of CPU to comb through the event logs.
  10. Daily comprehensive DCDIAG on my main domain controller. Previous Article.
  11. Script that checks for file names that exceed the NTFS limits on my file servers. Previous Article.
  12. An emailed list of any PST files on my file server that houses my users' home directories. We're still trying to get rid of these things....
  13. SQL Database mid-day backup, copy to our testing SQL server, mount and DBCC check. Previous Article.
  14. A script that sets up Spiceworks tickets for recurring daily, weekly, and monthly tasks. Article Here
  15. Automatically emailed Spiceworks reports. Previous Article.
  16. List of computers in AD that have not logged in for over 90 days (see the sketch at the end of this post). Previous Article.
  17. List of users in AD that are disabled. Article Here
  18. List of users' last login dates and times. Sorted ascending by date, it shows me accounts that aren't being used. Previous Article.
  19. Scheduled Task Issues - stuck or failed tasks on any server. Article Here
  20. Sensitive Group Audit - who is a member of the Domain Admins group or other powerful groups in your org? Article Here
  21. Website Monitoring - there are probably better ways to do this, but I have a PowerShell script that checks our websites every 5 minutes or so. EDIT: I have PRTG do this now, but I modified my old PowerShell script to ensure that PRTG is running. Monitoring the monitor!
  22. Documentation script that combs through my servers and makes a master list of all of these scheduled tasks - I'm going through it to make this list, as a matter of fact! Article Here
  23. Documentation script that gets drive space from all of my servers used/free, and dumps it into an Excel doc. I have PRTG, which tracks this over time on a per drive basis, but getting all of the data together so I can see per month/per year data growth was easier this way.
  24. Documentation script that goes through all of my SQL servers and creates the docs for them (Databases, recovery models, space usage, versions, etc) Article Here
  25. Script that looks for services that are using things like "Administrator". Still fixing those..... Article Here
  26. Mailbox size reports - who's hogging all of the space on our mail server!? Article Here
  27. I have a batch file that reboots this one server once a week. Our app person insists. In this day and age? The server runs 2008R2, so I don't see why, but whatever. Article Here
  28. WSUS: Auto-declining Itanium patches, Windows Embedded patches, and superseded Windows Defender updates.
  29. Notification of computers in the "Unassigned Computers" WSUS group. Previous Article.
  30. List of computers that are pending reboot or have updates pending. If the number is over ten I start cracking skulls (yeah right).
There. Not only is that a pretty good list of existing content, but a pretty good roadmap for me to follow to start posting some of the scripts I've created.
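
To give a taste of what these scripts look like, here's a minimal sketch of #16, the stale-computer report. This assumes the ActiveDirectory RSAT module and is not the exact script from the linked article:

Import-Module ActiveDirectory
# Anything that hasn't logged on in 90 days is a candidate for cleanup
$Cutoff = (Get-Date).AddDays(-90)
Get-ADComputer -Filter {LastLogonTimeStamp -lt $Cutoff} -Properties LastLogonTimeStamp |
    Select-Object Name, @{N='LastLogon';E={[DateTime]::FromFileTime($_.LastLogonTimeStamp)}} |
    Sort-Object LastLogon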

Monday, July 1, 2013

My Best Script So Far - SQL Backup, Remote Mount, and DBCC

NOTE: After some actual time with SQL, I understand that this is not the most efficient way to do this - you can accomplish most of what I've scripted with Maintenance plans. HOWEVER, I am posting this script anyway, because I'm proud of it, it works, and people might get helpful examples of certain routines out of it.

I've alluded to this script in a couple of posts. It's the script I'm most proud of. It was very difficult to do with my somewhat limited knowledge of SQL and PowerShell at the time, it taught me a lot about both, and it runs multiple times per day on production systems with no issues.

The issue this script solves is as follows: I am an "Accidental DBA". Meaning, I'm the admin who knows the most about SQL (which isn't much), and we don't have the budget for an actual DBA, so I'm it. I back up my SQL servers every night using Veeam Enterprise AND Backup Exec 2010. Veeam is great and all, but Backup Exec can restore my database right back into my SQL server with little fuss; the process is much easier. The databases I inherited use the simple recovery model, so I have no point-in-time recovery capability. I could switch them to the full recovery model, but I don't know all the ramifications of that change: whether the apps themselves support it, the disk space for log growth, the backup configuration changes, and then the backup disk space. It occurred to me that if we lost a database at 3pm, I would have to restore from the previous evening, negating all work done throughout the day. No good. I needed a mid-day backup to mitigate this loss.
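
For what it's worth, the recovery model switch itself is a one-liner; it's everything downstream (log backups, disk space, app support) that makes it a project. A sketch, assuming the SQL snap-ins loaded by the script below:

# Hypothetical example only - I have NOT run this against my production databases
Invoke-SqlCmd -Query "ALTER DATABASE [DB1] SET RECOVERY FULL" -ServerInstance "SQLPROD"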

Like I said, this is probably not the best way to do this, but at the time (with no SQL training, or time to partake in it) it was the best I could come up with.

An outline of the process:
  1. Backup a database
  2. Copy the database to our testing SQL Server
  3. Mount the database
  4. Run a Database Consistency Check (DBCC from here on)
  5. Detach the database
  6. Keep the backup copy until next time
  7. Email me the DBCC report
A couple of notes:
  • PowerShell's SQL support (and this script) doesn't work on SQL 2005.
  • You need SQL Server 2008 R2 Management Studio (2008 might work; I didn't try) installed on the system you run this script from.
The players:
  • SQLTEST - This is the testing SQL Server
  • SQLPROD - This is the production SQL Server
The script, in its entirety, will be posted without edits (save anonymizing my server names) between the two horizontal lines below. I've documented the script very well, so just read the comments within the script for explanations.

Ugh, I'm not going to bother with the formatting either. This script is a beast, and apparently Blogger doesn't like pasting from Notepad++. Here's a link to the script, so you can download it from my Google Drive (no login required).

Here it is:



# HOW TO ADD ANOTHER DATABASE
# 1. Select all code between the two lines that start with # ADDING A NEW DATABASE, and copy it to your clipboard
# 2. Paste the code you copied at the end of the file, between the last # END DATABASE X and # ADD ANY NEW DATABASES ABOVE THIS LINE - THIS SECTION IS THE END
# 3. Renumber the Database Heading (within the * box), also edit the following variables (search for them within your section) to reflect the database number:
# $StartTimeDatabase# (Two Instances)
# $EndTimeDatabase# (Two Instances)
# $ReportForDatabase#
# $srv# (Two Instances)
# 4. Log in to the production database server that houses the database we want to schedule backups and consistency checks for
# 5. Ensure that the location C:\Backups exists (If you use a different drive letter, please edit the appropriate variables)
# 6. Open SQL Server Management Studio, logging in as someone with administrative rights to all databases
# 7. Right Click the chosen database, choose "Tasks", then "Backup"
# 8. Modify Backup Set section "Name" to match the Database Name
# 9. Add the Destination to c:\backups\<DatabaseName.bak>, then remove the other location
# 10. Click Drop-down arrow next to "Script" at the top and choose "Script Action to new Query Window"
# 11. Close the backup window by clicking the 'X' in the upper right hand corner (we will not actually run the backup job)
# 12. Referring to the key below on the REFERENCE LINE,
#     analyze the SQL query syntax of the backup job to assign the appropriate values to the EDITING SECTION within your database's code section
#     that you pasted in earlier at the end of the script. You need four values: $DatabaseName, $SourceBackupFilePath, $BackupFileName, $BackupName
# REFERENCE LINE: BACKUP DATABASE [DB1] TO  DISK = N'C:\Backups\DB1.bak' WITH NOFORMAT, NOINIT,  NAME = N'DB1', SKIP, NOREWIND, NOUNLOAD,  STATS = 10
#                                 [$DatabaseName]       '$SourceBackupFilePath\$BackupFileName'                  N'$BackupName'

# Add Required Powershell Snapins to work with SQL
add-pssnapin SqlServerCmdletSnapin100
add-pssnapin SqlServerProviderSnapin100

# Switch to the root of the C: drive
Set-Location C:\

# Declare Constant Variables
$date = (get-date -Format M/d/yyyy)
$newline = "`r`n"
$dividerline = "==============================================================================="

# Variables dealing with the testing database server
$TestingDBServer = "SQLTEST"
$TestingBackupFilePath = "D:\Databases\Backups"
$TestingBackupFileUNC = "d$\Databases\Backups"
$TestingServerSQLDataFolder = "D:\Databases\Backups"
$TestingServerSQLDataFolderUNC = "d$\Databases\Backups"
$TimestampFolder = (get-date -uformat "%H%M-%m%d%Y")
$ArchiveStoragePath = "\\$TestingDBServer\$TestingBackupFileUNC\$TimestampFolder"
new-item -Path $ArchiveStoragePath -ItemType directory | out-null
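# Loose regex matching the tail of DBCC's summary line, e.g. "0 errors and repaired 0 errors";
# $InsertString later re-adds the word "Found" when the report is built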
$Pattern = "..errors and repaired ........"
$InsertString = "Found"




# ADDING A NEW DATABASE - START COPY ON THIS LINE

#***************************************
#*                                     *
#*             DATABASE 1             *
#*                                     *
#***************************************

#***********************
#*  EDITING SECTION    *
#***********************
# Declare DB Specific Variables
$StartTimeDatabase1 = (get-date) #Change the number to correspond to the Database # in the * Box above
$SourceDBServer = "SQLPROD\MSSQLSERVER" #The Hostname\Instance of the production database server
$DatabaseName = "DB1" #Refer to the key
$SourceBackupFilePath = "C:\Backups" #Refer to the key
$SourceBackupFileUNC = "c$\Backups" #This is the same as $SourceBackupFilePath, but replace the colon with a dollar sign
$BackupFileName = "DB1.bak" #Refer to the key
$BackupName = "DB1" #This is the same as your database name, but refer to the key

# Automatically Built Variables
$SourceBackupFullPathAndName = ($SourceBackupFilePath + "\" + $BackupFileName)
$TestingBackupFullPathAndName = ($TestingBackupFilePath + "\" + $BackupFileName)

# Backup Database
Invoke-SqlCmd -Query "BACKUP DATABASE [$DatabaseName] TO  DISK = N'$SourceBackupFullPathAndName' WITH NOFORMAT, INIT,  `
NAME = N'$BackupName', SKIP, NOREWIND, NOUNLOAD,  STATS = 10" -ServerInstance $SourceDBServer -QueryTimeout 65535

# MOVE (not copy!) the backup file to the Testing Server
move-item "\\SQLPROD\$SourceBackupFileUNC\$BackupFileName" -Destination "\\$TestingDBServer\$TestingBackupFileUNC"

# Restore Database onto the Testing Server
Invoke-SqlCmd -Query "RESTORE DATABASE [$DatabaseName] FROM  DISK = N'$TestingBackupFullPathAndName' WITH  FILE = 1,  NOUNLOAD,  STATS = 10" `
-ServerInstance "$TestingDBServer" -QueryTimeout 65535

# DBCC Check
Invoke-SqlCmd -Query "DBCC CHECKDB ($DatabaseName) WITH ALL_ERRORMSGS" -ServerInstance "$TestingDBServer" -QueryTimeout 65535

# Get DBCC Results from SQL Log and format them
[system.reflection.assembly]::LoadWithPartialName('Microsoft.SQLServer.Smo') | out-null
$srv1 = new-object ('Microsoft.SQLServer.Management.Smo.Server') $TestingDBServer
$Result = ($srv1.readerrorlog(0) | where {$_.text -like "*DBCC*"} | select text -last 1)
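# -match populates the automatic $matches hashtable; $matches[0] holds the full matched text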
$Result -match $Pattern | out-null
$DBCCOutputText = $matches[0]
$DBCCOutputText = ($InsertString,$DBCCOutputText -join " ")



# Delete Database
$DeleteDatabaseQuery = "EXEC msdb.dbo.sp_delete_database_backuphistory @database_name = N'$DatabaseName'
GO
USE [master]
GO
DROP DATABASE [$DatabaseName]
GO"
Invoke-SqlCmd -Query $DeleteDatabaseQuery -ServerInstance "$TestingDBServer" -QueryTimeout 65535

# Move the BAK files that you used for the restores to a timestamped folder within the backup area
# These folders will be deleted after their creation times are over 24 hours
move-item "\\$TestingDBServer\$TestingBackupFileUNC\*.bak" -destination $ArchiveStoragePath


# This section creates the report for this database
$EndTimeDatabase1 = (get-date)
$ReportForDatabase1 = (
"Date: "+$Date+$newline+`
"Database Name: "+$DatabaseName+$newline+`
"DBCC Output: "+$newline+$DBCCOutputText+$newline+`
"Start Time: "+$StartTimeDatabase1+$newline+`
"End Time: "+$EndTimeDatabase1+$newline+`
$dividerline
)

#***************************************
#*                                     *
#*           END DATABASE 1           *
#*                                     *
#***************************************

# ADDING A NEW DATABASE - END COPY ON THIS LINE

#********************************************************************
#*                                                                  *
#* ADD ANY NEW DATABASES ABOVE THIS LINE - THIS SECTION IS THE END  *
#*                                                                  *
#********************************************************************

# CHANGES NEED TO BE MADE TO THIS REPORTING SECTION FOR SUBSEQUENT DATABASES
# Build the master report
$MasterReport = ("Beginning"+$newline+$dividerline+$newline+$ReportForDatabase1+$newline) #When adding a new database, append +$ReportForDatabase#+$newline just inside the end parenthesis

#This section counts instances of the $SearchFor text, and if it equals 1 (the number of databases checked) then the body of the email will begin with "DBCC Check Successful". This allows for an Outlook filtering rule.
$Content = $MasterReport
$SearchFor = 'Found 0 errors and repaired 0 errors'
$MatchedInstances = ($Content | Select-String $SearchFor -AllMatches | Select -Expand Matches | Select -Expand Value)
If (@($MatchedInstances).Count -eq 1){ #@() forces an array so .Count works even when there is a single match
    $MasterReport = ("DBCC Check Successful"+$MasterReport)
} #End If


# For each folder in the archive area on the testing server, get the creation time and delete anything older than 24 hours
$Folders = (Get-ChildItem "\\$TestingDBServer\$TestingBackupFileUNC")
Foreach ($Folder in $Folders){
    $FolderAge = (((Get-Date) - $Folder.CreationTime).TotalHours)
    If ($FolderAge -gt 24){
        Remove-Item $Folder.FullName -Recurse
    }
}

# Section for emailing of the report
Send-MailMessage -From administrator@contoso.com -To me@contoso.com,OtherAdmin@contoso.com -Subject "DB1 Database Consistency Check for $date" -SmtpServer SMTPServer.contoso.com -Body $MasterReport -BodyAsHtml