Wednesday, April 10, 2013

Copying Files from a Sharepoint 2013 Document Library and a Rant Against the Cloud

This week, my team finally had our fill of Spiceworks' Knowledge Base. Spiceworks is a GREAT free network scanning and inventory tool, to be sure. When we evaluated it, it did everything we needed (and more). The one thing we were unsure of was the implementation of its Knowledge Base.

In Spiceworks, the Knowledge Base allows you to write articles, how-to's, etc. You can either keep them linked to your own login, share them with the team, or share them with the entire Spiceworks community (which is a fantastic community - I get lots of help there). The problem is that your documents are not stored on your server. They're in Spiceworks' cloud.

The cloud should not be used to store important information. I just don't understand why all of these businesses are moving important things into it. The thing is, the cloud is only as reliable as your internet connection. So, who do you trust more to keep things running: your IT admin(s), or your internet provider AND the service provider? AT&T doesn't give a damn if your business is without internet.

Anyway, we tried to do it the new way. The cloud way. We put a couple hundred documents into Spiceworks over the course of a few months. Of course, there have been several times where our team was not able to access the Knowledge Base because something was wrong on Spiceworks' end.

Finally, we had had enough, and I built a Sharepoint server in an afternoon. We have datacenter licenses for our VMware hosts, and Sharepoint Foundation 2013 is free, so it cost us nothing. I hooked the back end into our IT SQL server, but Sharepoint will install a SQL Express DB if you need it. Keep in mind that there's a 10GB limit on each database in an Express install, though.
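
If you go the Express route, it's worth keeping an eye on how close your content databases get to that cap. Something like this from the Sharepoint Management Shell should do it (the 8GB warning threshold is just an arbitrary number I picked):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

#List each content database and its size in GB, and warn when one nears the Express cap
Get-SPContentDatabase | ForEach-Object {
    $sizeGB = [math]::Round($_.DiskSizeRequired / 1GB, 2)
    Write-Host "$($_.Name): $sizeGB GB"
    if ($sizeGB -gt 8) { Write-Warning "$($_.Name) is approaching the 10GB Express limit" }
}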

This was my first Windows 2012 server, and, well... meh. We manage things through a remote desktop app called mremote, and RDP makes the new server OS a bit challenging to use. You can't pass the Windows shortcut key, so I actually have to mess with the hot corners to get to the start screen. And since the Windows shortcut key doesn't work through RDP, I can't hit Win+R to run commands (or Win+E to open Explorer). If anyone has a better remote desktop app that will pass Windows keys, I'm all ears. This might not be Microsoft's fault - I probably just need to find a workaround...
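
For what it's worth, the stock Remote Desktop client has a setting for where Windows key combinations get applied (Local Resources > Keyboard in mstsc), and pointing it at the remote computer is supposed to pass the Windows key through. In a saved .rdp file that's a single line, with 1 meaning "on the remote computer":

keyboardhook:i:1

I haven't tested whether mremote honors that setting, so treat it as a lead rather than a fix.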

I digress. So I got Sharepoint all set up and started dumping in our docs. At a previous job, I had attempted to write a script (this was before I learned Powershell) that would copy all of our files out of a Sharepoint repository to a secondary location every day, but it took so long to run that it was ridiculous. I thought I would give it another try. I found a very helpful blog post by Jeffrey B. Murphy at jbmurphy.com that did almost all of the work for me, and then I just added a few bells and whistles to complete it. They must have improved something somewhere (WebDAV perhaps?) because copying the files is MUCH (MUCH) faster. Here's the script:


#Load the Sharepoint cmdlets and note the start time for the summary email
Add-PSSnapin Microsoft.SharePoint.PowerShell
$StartTime = Get-Date
$TempFile = "C:\Temp\SPFiles.txt"

#Connection and path settings
$SiteURL = "http://SharepointServer"
$DocumentLibrary = "Knowledge Base"
$Destination = "\\DRServer\e$\SharepointKB"

#This next command deletes the files that were copied over yesterday - it empties the destination folder
Get-ChildItem $Destination | Remove-Item -Recurse -Force

$spWeb = Get-SPWeb -Identity $SiteURL
$list = $spWeb.Lists[$DocumentLibrary]

#This section creates a list of all of the files that will be copied and outputs it to a text file
$FilesMoved = ($list.Items | Select-Object File | Sort-Object File | Format-Table -Wrap)
$FilesMoved | Out-File $TempFile

#This section actually copies the files.
foreach ($listItem in $list.Items)
{
    #The item URL is site-relative (e.g. "Knowledge Base/Folder/File.docx"), so swapping
    #the library name for the UNC destination and flipping the slashes builds the local path
    $DestinationPath = $listItem.Url.Replace("$DocumentLibrary","$Destination").Replace("/","\")
    write-host "Downloading $($listItem.Name) -> $DestinationPath"

    if (!(Test-Path -path $(Split-Path $DestinationPath -parent)))
    {
        write-host "Creating $(Split-Path $DestinationPath -parent)"
        $dest = New-Item $(Split-Path $DestinationPath -parent) -type directory
    }
    #Download the file's bytes from Sharepoint and write them to the destination path
    $binary = $spWeb.GetFile($listItem.Url).OpenBinary()
    $stream = New-Object System.IO.FileStream($DestinationPath, [System.IO.FileMode]::Create)
    $writer = New-Object System.IO.BinaryWriter($stream)
    $writer.Write($binary)
    $writer.Close()
}
$spWeb.Dispose()

$EndTime = get-date

#This section sends me an email including start and end times as well as the file list I created earlier
$Subject = "Sharepoint KnowledgeBase Copied to DR Server"
$Body = "Start Time: $StartTime `r`n`r`nEnd Time: $EndTime `r`n`r`nList of Sharepoint files copied is attached"
Send-MailMessage -To me@myjob.com -From administrator@myjob.com -Subject $Subject -Body $Body -SmtpServer mailserver.myjob.com -Attachments $TempFile
Remove-Item $TempFile

By the way, you will need to run this script from a server that has the Sharepoint Powershell extensions installed. Your Sharepoint server is the easy choice; I didn't even look into whether I could install the extension elsewhere, because I tried to go down that rabbit-hole with Sharepoint 2010 and spent way too much time fighting with it.
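
Since the script assumes a daily run (it clears out yesterday's copy before downloading), you'll want it in Task Scheduler on that same server. Something along these lines should work - the task name, run time, script path, and account are all placeholders, and the account needs rights to both the site and the DR share:

schtasks /Create /TN "Copy SP KB to DR" /SC DAILY /ST 02:00 /RU MYDOMAIN\spadmin /RP * /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Copy-SPKB.ps1"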
