Friday, May 6, 2011

Warmup and monitoring PowerShell script for Sharepoint

French version here


While I was working for a customer, I wanted to automatically generate a list of AAMs (Alternate Access Mappings) to monitor. There's also the problem of warming up the sites after an application pool recycle (or iisreset). So, why not use the warmup script and SharePoint itself to produce an automatic inventory and status of each website?



This was also a good PowerShell exercise: iterating a SharePoint collection, accessing a list and updating it. It is surprisingly easy, since PowerShell is quite capable of loading the SharePoint assembly and using its object model. I guess it works exactly the same in .NET (I'm an admin, not a coder ;).
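As a preview of what that looks like against the list we'll create in a moment, adding and updating an item really is just a handful of lines (the URL, list and column names are the ones used throughout this post; the test value is of course made up):

    # Minimal sketch: load the SharePoint assembly, then add and update a list item
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null
    $site = new-object Microsoft.SharePoint.SPSite("http://centraladminURL/Lists/ourList")
    $web  = $site.OpenWeb()
    $list = $web.GetList("http://centraladminURL/Lists/ourList")
    $item = $list.Items.Add()
    $item["Title"] = "http://some.site.to.monitor"
    $item.Update()
    $web.Dispose()
    $site.Dispose()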

Simply create a list in Central Administration. We'll use the Title column to hold the URL, which makes a nice unique key. You can integrate it into the UI by editing the home page and simply adding a web part for the list (remember it's got its own view).
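If you'd rather script the list creation too, something like this should do it (a sketch only: the description text and field types are my assumptions, only the column names matter to the monitoring script below):

    # Sketch: create the monitoring list and its two extra columns from PowerShell
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null
    $site = new-object Microsoft.SharePoint.SPSite("http://centraladminURL")
    $web  = $site.OpenWeb()
    $id   = $web.Lists.Add("ourList", "Warmup and monitoring status", [Microsoft.SharePoint.SPListTemplateType]::GenericList)
    $list = $web.Lists[$id]
    $list.Fields.Add("OK?", [Microsoft.SharePoint.SPFieldType]::Boolean, $false) > $null
    $list.Fields.Add("Erreur", [Microsoft.SharePoint.SPFieldType]::Text, $false) > $null
    $list.Update()
    $web.Dispose()
    $site.Dispose()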




If you want to keep a history of the monitoring, simply enable versioning on the list:
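The same switch can be flipped from PowerShell if you prefer; a minimal sketch, assuming the list URL used in the script below:

    # Sketch: keep a version per run by enabling versioning on the list
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null
    $site = new-object Microsoft.SharePoint.SPSite("http://centraladminURL/Lists/ourList")
    $web  = $site.OpenWeb()
    $list = $web.GetList("http://centraladminURL/Lists/ourList")
    $list.EnableVersioning = $true
    $list.Update()
    $web.Dispose()
    $site.Dispose()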





Now, let's write our script:



#
# Code : Emmanuel ISSALY - version 1.0 - 05/05/2011
#
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null;

# Download a page, authenticating with the given credentials
function get-webpage([string]$url, [System.Net.NetworkCredential]$cred = $null)
{
    $wc = new-object net.webclient
    if ($cred -eq $null)
    {
        $cred = [System.Net.CredentialCache]::DefaultCredentials;
    }
    $wc.credentials = $cred;
    return $wc.DownloadString($url);
}

# default credentials (will be set in the job)
$cred = [System.Net.CredentialCache]::DefaultCredentials;
#$cred = new-object System.Net.NetworkCredential("username","password","domain")

# our SharePoint list
$docliburl = "http://centraladminURL/Lists/ourList";
$reportsite = new-object Microsoft.SharePoint.SPSite($docliburl);
$reportweb = $reportsite.OpenWeb();
$reportlist = $reportweb.GetList($docliburl);

# for each AAM of the "Default" zone
set-alias stsadm -value "stsadm.exe"
[xml]$x = stsadm -o enumzoneurls
foreach ($zone in $x.ZoneUrls.Collection) {
    [xml]$sites = stsadm -o enumsites -url $zone.Default;
    foreach ($site in $sites.Sites.Site) {
        write-host $site.Url;

        # Have we already registered it?
        $item = $reportlist.Items | Where { $_["Title"] -eq $site.Url }
        # if not, it's a new one
        if ($item -eq $null) { $item = $reportlist.Items.Add(); }

        $item["Title"] = $site.Url;
        $item["OK?"] = $True;
        $item["Erreur"] = "";

        # on error, flag the item and move on to the next site
        trap
        {
            $item["OK?"] = $False;
            $item["Erreur"] = $error[0].Exception.InnerException.Message;
            continue;
        }

        # hit the site to warm it up, then save the result
        $html = get-webpage -url $site.Url -cred $cred;
        $item.Update();
    }
}
$reportweb.Dispose()
$reportsite.Dispose()

I must confess it's my first code since... 2003, so it could surely be improved (I'm thinking of the farm class here, for example), but it works and does the job. The error handling is dirty. Suggestions welcome :)
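For the curious, replacing the stsadm calls with the object model would look roughly like this; a sketch only, not battle-tested, going through SPWebService rather than parsing stsadm's XML:

    # Sketch: enumerate Default-zone URLs and site collections through the object model
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") > $null
    $contentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
    foreach ($webapp in $contentService.WebApplications) {
        # Public URL of the Default zone for this web application
        $defaultUrl = $webapp.GetResponseUri([Microsoft.SharePoint.Administration.SPUrlZone]::Default)
        write-host "Web application: $defaultUrl"
        foreach ($sc in $webapp.Sites) {
            write-host "  Site collection: $($sc.Url)"
            $sc.Dispose()
        }
    }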

This script should run on a host with PowerShell 1.0 installed (obviously) and at least WSS, so that the Microsoft.SharePoint assembly is available; having stsadm in the path is nice, and the account running it must have read access to the sites.

You only have to set up a scheduled task, using the data access account as the identity (or any account with at least read access granted through a web application policy, for example). Run the script as many times as you like; once per iisreset or pool recycle is adequate.
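For example (the task name, script path and account are placeholders; PowerShell 1.0 has no -File switch, so -Command is used, and a path without spaces keeps the quoting simple):

    # Sketch: register the warmup as an hourly scheduled task
    schtasks /create /tn "SPWarmup" /sc HOURLY `
        /tr "powershell.exe -Command C:\scripts\warmup.ps1" `
        /ru DOMAIN\sp_dataaccess /rp *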

Et voilà! Your Central Administration now has a self-filling list of all the sites and their status, letting you check at a glance whether the sites created by your users answer properly (loopback check and all), whether their application pools are running, and so on.


