Top Link Bar Navigation To XML

Applies To: SharePoint

Continuing my series on SharePoint's Top Link Bar (sometimes called the Tab Bar), I want to show you how to serialize a site collection's Global Navigation Nodes to XML using PowerShell. This isn't quite the same as just using the Export-Clixml cmdlet, since we are interested in very specific properties and their format.

In my previous post, Multi-Level Top Link Bar Navigation (Sub Menus), I showed you how to enable additional sub menus using the native control in SharePoint by editing a simple attribute in the Master Page. However, it quickly became clear that the native navigation editor (Site Actions > Site Settings > Navigation) won't allow you to edit anything deeper than 2 levels, despite the control's support for more. In this post I'll show you how to output a site's navigation to XML. In my next post I'll show how to edit it and then import it to create multiple levels.

The Script

Copy and paste the code below into Notepad and save it as NavigationToXML.ps1

$xmlFile = "d:\scripts\navigation\ExportedNavigation.xml"
$sourceweb = "http://somesite.com"
$keepAudiences = $true

Function ProcessNode($nodeCollection) {
    $parentCollection = @()
    foreach($node in $nodeCollection) {
        if ($node.Properties.NodeType -ne [Microsoft.SharePoint.Publishing.NodeTypes]::Area -and $node.Properties.NodeType -ne [Microsoft.SharePoint.Publishing.NodeTypes]::Page) {
            $nHash = New-Object System.Object
            $nHash | Add-Member -type NoteProperty -name IsNode -value "yes"
            $nHash | Add-Member -type NoteProperty -name Title -value $node.Title
            $nHash | Add-Member -type NoteProperty -name Url -value $node.Url
            $nHash | Add-Member -type NoteProperty -name NodeType -value $node.Properties.NodeType
            $nHash | Add-Member -type NoteProperty -name Description -value $node.Properties.Description
            if($keepAudiences){
                $nHash | Add-Member -type NoteProperty -name Audience -value $node.Properties.Audience
            } else {
                $nHash | Add-Member -type NoteProperty -name Audience -value ""
            }
            $nHash | Add-Member -type NoteProperty -name Target -value $node.Target
            if($node.Children.Count -gt 0){
                $nHashChildren = ProcessNode($node.Children)
                $nHash | Add-Member -type NoteProperty -name Children -value $nHashChildren
            } else {
                $nHash | Add-Member -type NoteProperty -name Children -value @()
            }
            $parentCollection += $nHash
        }
    }
    $parentCollection
}

$sw = Get-SPWeb $sourceweb
$psw = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($sw)

ProcessNode($psw.Navigation.GlobalNavigationNodes) | Export-Clixml $xmlFile

#cleanup
$sw.dispose()

What it Does and How to Use It

There are 3 parameters at the top of the script you will need to change:

  • $xmlFile: The path to use for the XML output
  • $sourceweb: The URL of the site whose navigation you are serializing
  • $keepAudiences: When $true, audience values are serialized; when $false, they are left blank

Once you set those and save, open the SharePoint Management Shell (you'll want to run it as a farm administrator) and navigate to the location of the script. You can run it by typing: .\NavigationToXML.ps1

[Screenshot: running NavigationToXML.ps1 in the SharePoint Management Shell]

The script will create an array of custom objects that correspond to the $sourceweb‘s navigation nodes and then use standard PowerShell serialization to create a simplified XML document at the location specified by $xmlFile. For instance, given the navigation shown here:

[Screenshot: a Top Link Bar with a "Some Sites" heading (containing theChrisKent and Microsoft links) and a Google link]

The XML output will look similar to this:

<Objs Version="1.1.0.1" xmlns="http://schemas.microsoft.com/powershell/2004/04">
  <Obj RefId="0">
    <TN RefId="0">
      <T>System.Object</T>
    </TN>
    <ToString>System.Object</ToString>
    <MS>
      <S N="IsNode">yes</S>
      <S N="Title">Some Sites</S>
      <S N="Url">/personal/ckent/navtest</S>
      <S N="NodeType">Heading</S>
      <S N="Description"></S>
      <S N="Audience"></S>
      <Nil N="Target" />
      <Obj N="Children" RefId="1">
        <TN RefId="1">
          <T>System.Object[]</T>
          <T>System.Array</T>
          <T>System.Object</T>
        </TN>
        <LST>
          <Obj RefId="2">
            <TNRef RefId="0" />
            <ToString>System.Object</ToString>
            <MS>
              <S N="IsNode">yes</S>
              <S N="Title">theChrisKent</S>
              <S N="Url">http://thechriskent.com</S>
              <S N="NodeType">AuthoredLinkPlain</S>
              <S N="Description"></S>
              <S N="Audience"></S>
              <Nil N="Target" />
              <Obj N="Children" RefId="3">
                <TNRef RefId="1" />
                <LST />
              </Obj>
            </MS>
          </Obj>
          <Obj RefId="4">
            <TNRef RefId="0" />
            <ToString>System.Object</ToString>
            <MS>
              <S N="IsNode">yes</S>
              <S N="Title">Microsoft</S>
              <S N="Url">http://microsoft.com</S>
              <S N="NodeType">AuthoredLinkPlain</S>
              <S N="Description"></S>
              <S N="Audience"></S>
              <Nil N="Target" />
              <Ref N="Children" RefId="3" />
            </MS>
          </Obj>
        </LST>
      </Obj>
    </MS>
  </Obj>
  <Obj RefId="5">
    <TNRef RefId="0" />
    <ToString>System.Object</ToString>
    <MS>
      <S N="IsNode">yes</S>
      <S N="Title">Google</S>
      <S N="Url">http://google.com</S>
      <S N="NodeType">AuthoredLinkPlain</S>
      <S N="Description"></S>
      <S N="Audience"></S>
      <Nil N="Target" />
      <Ref N="Children" RefId="3" />
    </MS>
  </Obj>
</Objs>

For those familiar with the Export-Clixml PowerShell command this shouldn’t look too crazy. For the rest of us, I will go into detail about this in my next post. Regardless of how complicated that may look to you, it is much simpler than if we had just called Export-Clixml on the Global Navigation Nodes object for the site.
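
If you want to sanity-check the export, Import-Clixml will rehydrate the file back into the same custom objects. A minimal sketch, using the $xmlFile path from the script above:

$nodes = Import-Clixml "d:\scripts\navigation\ExportedNavigation.xml"
$nodes | Select-Object Title, Url, NodeType    #the top level nodes
$nodes[0].Children | Select-Object Title, Url  #children of the first node (if it has any)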

How it Works

The main code begins after the ProcessNode function definition: the $sourceweb is retrieved, and ProcessNode is called on its GlobalNavigationNodes collection. This generates an array of custom objects that are then serialized to the $xmlFile using Export-Clixml.

The ProcessNode function creates a custom object for every node, capturing the Title, Url, NodeType, Description, Target and, when $keepAudiences is $true, the audience information. Every custom object is also given a Children property, which is filled recursively with an array of custom objects for any child nodes that exist. This has been written to capture any number of levels of children. (The script does, however, skip all Area and Page node types, since these are automatic and shouldn't be edited manually.)

 

So what do we do with this XML document? In my next post I'll show you how to read, edit and ultimately import these nodes back into the site collection. This can be helpful for copying nodes from one site collection to another (although my post on SharePoint Top Link Bar Synchronization provides a much cleaner approach for that). More importantly, this will give you an easy way to have multiple sub menus in the SharePoint Top Link Bar.

SharePoint Top Link Bar Synchronization

Applies To: SharePoint

The Top Link Bar (sometimes called the Tab bar) provides global navigation for a site collection. It’s a great spot to link to subsites, other site collections and more. It’s also relatively easy to use and customize. On a standard team site you can go to Site Actions > Site Settings and then choose the Navigation link under Look and Feel to customize. This all works really well. However, it can be difficult to maintain consistent navigation between multiple site collections. Add in the complication of relative URLs, audiences and attempting to keep those sites in sync with any more than 2 or 3 site collections – suddenly any navigational changes become a full project!

[Screenshot: sample Top Link Bar navigation]

Fortunately, using the PowerShell script below you can easily set up navigation on a single site collection using the out of the box tools and then propagate those changes to as many additional site collections as needed:

The Script

You can copy the script below, paste it into Notepad, and save it as NavigationPropagation.ps1

$sourceweb = "http://somesite.com"
$destwebs = "http://somesite.com/sites/humanresources", "http://somesite.com/panda", "http://someothersite.com/pickles"

$keepAudiences = $true #set to false to have it totally ignore audiences (good for testing)

Function CreateNode($node,$dWeb,$destCollection){
    if ($node.Properties.NodeType -ne [Microsoft.SharePoint.Publishing.NodeTypes]::Area -and $node.Properties.NodeType -ne [Microsoft.SharePoint.Publishing.NodeTypes]::Page) {

        Write-Host "    Node: $($node.Title)"

        #make URLs relative to destination
        $hnUrl = $node.Url
        #Write-Host "O URL: $hnUrl"
        $IsHeading = $false
        if($node.Properties.NodeType -eq [Microsoft.SharePoint.Publishing.NodeTypes]::Heading){
            $IsHeading = $true
        }
        $hnUrl = SwapUrl $hnUrl $sourceweb $dWeb $IsHeading
        #Write-Host "N URL: $hnUrl"

        $hnType = $node.Properties.NodeType
        if($hnType -eq $null){
            $hnType = "None"
        }

        $hNode = [Microsoft.SharePoint.Publishing.Navigation.SPNavigationSiteMapNode]::CreateSPNavigationNode($node.Title,$hnUrl,[Microsoft.SharePoint.Publishing.NodeTypes]$hnType,$destCollection)
        $hNode.Properties.Description = $node.Properties.Description
        if($keepAudiences){
            $hNode.Properties.Audience = $node.Properties.Audience
        }
        $hNode.Properties.Target = $node.Properties.Target
        $hNode.Update()

        if($node.Children.Count -gt 0) {
            foreach($child in $node.Children) {
                CreateNode $child $dWeb $hNode.Children
            }
        }
    }
}

Function SwapUrl([string]$currentUrl,[string]$sourceRoot,[string]$destRoot,[boolean]$isHeading) {
    if ($currentUrl -ne "/") {
        if ($currentUrl.StartsWith("/")) {
            $currentUrl = $sourceRoot + $currentUrl
        } elseif ($currentUrl.StartsWith($destRoot)) {
            $currentUrl = $currentUrl.Replace($destRoot, "")
        }
    } else {
        $currentUrl = [System.String]::Empty
        if(!$isHeading){
            Write-Host "        This is a blank non-heading! Using $sourceRoot" -ForegroundColor Yellow
            $currentUrl = $sourceRoot
        }
    }
    $currentUrl
}

$sw = Get-SPWeb $sourceweb
$psw = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($sw)

foreach ($destweb in $destwebs) {

    Write-Host "***************************************" -ForegroundColor Cyan
    Write-Host "Propagatin' That There $destweb" -ForegroundColor Cyan
    Write-Host "***************************************" -ForegroundColor Cyan

    $dw = Get-SPWeb $destweb
    $pdw = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($dw)

    $nodeCount = $pdw.Navigation.GlobalNavigationNodes.Count
    Write-Host "  Removing Current Nodes ($nodeCount)..." -ForegroundColor DarkYellow
    for ($i=$nodeCount-1; $i -ge 0; $i--) {
        $pdw.Navigation.GlobalNavigationNodes[$i].Delete()
    }

    Write-Host "  Propagating to $destweb..." -ForegroundColor DarkYellow
    foreach($n in $psw.Navigation.GlobalNavigationNodes){
        [void](CreateNode $n $destweb $pdw.Navigation.GlobalNavigationNodes)
    }

    $dw.Dispose()
}

$sw.Dispose()

What it Does and How to Use It

Since this is a script that you will typically run with the same parameters over and over, I just made them variables at the top of the script. There are just 3 of them:

  • $sourceweb: The URL of the site collection to copy the navigation from
  • $destwebs: A comma-separated array of URLs to copy the navigation to
  • $keepAudiences: When $true, audiences are propagated along with the links; when $false, they are totally ignored. Setting this to $false can be really helpful for testing, just to make sure everything is being copied correctly before you add in the complication of audience filtering

Once you set those and save, open the SharePoint Management Shell (you'll want to run it as a farm administrator) and navigate to the location of the script. You can run it by typing: .\NavigationPropagation.ps1

[Screenshot: running NavigationPropagation.ps1 in the SharePoint Management Shell]

The script will cycle through each of the $destwebs and delete all the Global Navigation Nodes from the site. It will then copy all the nodes (excluding Area and Page types, since these are automatic depending on your settings) from the $sourceweb. Each URL is made relative or adjusted to be fully qualified as needed. This is done recursively, allowing any number of levels of child nodes (more on this in an upcoming post).

The typical use case for us is to save the script with all the parameters set; whenever we make a change on the $sourceweb we simply run the script and everything gets synchronized. If you're making frequent updates, you could always use Task Scheduler to run the script automatically.
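
For example, something along these lines would register a nightly run (the task name, time, and script path here are just placeholders, and you'd want the task running under an account with farm permissions):

schtasks /Create /TN "SharePoint Nav Sync" /SC DAILY /ST 02:00 /TR "powershell.exe -NoProfile -File D:\scripts\navigation\NavigationPropagation.ps1"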

How it Works

There are 2 functions in the script: CreateNode and SwapUrl. CreateNode copies a passed node into the specified collection, mapping the necessary properties and calling SwapUrl, which ensures relative URLs are fully qualified and URLs that should be relative are made relative. The main thing to note is that CreateNode is recursive and will copy all child nodes as well.
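
To make SwapUrl concrete, here's roughly how it treats a couple of inputs, using the sample URLs from the variables at the top of the script (the page paths are made up for illustration):

#A source-relative URL is fully qualified so it still resolves from the destination:
SwapUrl "/subsite/default.aspx" "http://somesite.com" "http://somesite.com/panda" $false
#returns http://somesite.com/subsite/default.aspx

#A URL that already points inside the destination is made relative:
SwapUrl "http://somesite.com/panda/pages/home.aspx" "http://somesite.com" "http://somesite.com/panda" $false
#returns /pages/home.aspx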

The main execution of the script follows the function definitions and is pretty straightforward. It retrieves the source web and then loops through the destination webs. For each destination web it deletes all the existing navigation nodes and then calls CreateNode, passing each of the source web's nodes and the destination's node collection.

 

Keeping your navigation consistent across the farm is essential to a good user experience but can be difficult to manage using out of the box tools. Fortunately with the above script this can be relatively simple.

Changing Web Part Properties When the Page is Unavailable

Applies To: SharePoint

The other day we made some changes that caused some issues with how one of our web parts was configured. Unfortunately, I hadn't wrapped the problem code in a try/catch, and my error blew up the whole page. I'm sure I'm the only one that's ever done that. So obviously I've got some code changes to make, but what do I do in the meantime? Fortunately, there's some straightforward PowerShell that lets you change web part settings (even custom properties like mine).

I found the solution to this over on Aarebrot.net where he was using the technique to change a web part that automatically redirected the user. I’ve just reproduced his code here and added some explanation and background.

When I first went to solve this problem I tried the ?contents=1 querystring trick to pull up the Web Part Maintenance page. If you're looking for a quick solution you can add that query string to the end of your page's URL and then delete the web part from the page and start over. But a more elegant solution is to just change the offending property using some easy PowerShell.
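
For example, appending it to the page's URL like http://somedomain.com/sites/someweb/default.aspx?contents=1 takes you straight to the Web Part Maintenance page, where the offending web part can be closed or deleted.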

Using the SharePoint 2010 Management Shell, run the following commands:

$web = Get-SPWeb "http://somedomain.com/sites/someweb"
$page = $web.GetFile("default.aspx")
$page.CheckOut()
$wpm = $web.GetLimitedWebPartManager("default.aspx",[System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
$part = $wpm.WebParts[0]
$part.SomeProperty = "The correct setting!"
$wpm.SaveChanges($part)
$page.CheckIn("Fixed that property")
$page.Publish("Fixed that property")
$web.Close()

What Just Happened?

In line 1 we’re just getting a reference to the web site (SPWeb) where your web part lives using Get-SPWeb. Just replace the URL shown with yours.

Lines 2-3 and 8-9 are only required if the page you're modifying is on a publishing site or check in/out is required. Feel free to skip these (go directly to line 4) if you're just editing a simple page. If your page does require check out to be edited, line 2 simply retrieves the file (SPFile) with the GetFile method, passing the relative location of the page. Then line 3 calls the CheckOut method which, of course, checks out the file.

In line 4, we’re grabbing a reference to the Web Part Manager for the page (SPLimitedWebPartManager) using the GetLimitedWebPartManager method. Just replace the first parameter with the relative location of your page. The second parameter is the PersonalizationScope enumeration and can be User or Shared. You’re going to want to use Shared to affect everybody. The Web Part Manager object is what lets us get access to all the web parts on the page and screw with em.

In line 5, we grab the web part (WebPart) we want by index using the WebParts collection. In the example above I already knew that the web part I wanted was the first one in the collection. You can also pass the UniqueID property of the web part (instead of the index). You can find out both by simply calling the WebParts collection by itself ($wpm.WebParts) and everything will get listed to the screen.

To see all the available properties of the web part you can just type $part and it will list everything out, including any custom properties. Then you can just set them like we do in line 6.
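
If you're not sure which index to use in line 5, a quick sketch like this (reusing the $wpm reference from above) lists each part with its position:

$i = 0
foreach ($wp in $wpm.WebParts) {
    Write-Host "$i : $($wp.Title) ($($wp.ID))"  #index, title, and ID of each part
    $i++
}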

Line 7 uses the Web Part Manager’s SaveChanges method to incorporate all your changes. Lines 8-9 are again only required if your pages library requires check in/out and publishing. If it’s a simple page just skip to line 10. Line 8 uses the CheckIn method which takes a string for a check-in comment. Line 9 uses the Publish method which also takes a string for a comment.

Line 10 just calls the Close method and ensures we clean up all our resources.

That’s it! Now you can wrap that up in a script to loop through multiple pages and change properties on all sorts of web parts or just one-off fix those web parts you might have broken.

Remove Drop Off Libraries from Every Site

Applies To: SharePoint 2010

The other day I came across a problem with some missing features on a few of our sites. Some quick searching told me I needed to enable all my sites to use the SharePoint Server Enterprise Features. This is a good thing to do, especially if you’ve just upgraded from 2007 to 2010 or you were using Foundation or Standard Server.

So, I went to Central Administration and clicked on Upgrade and Migration and chose Enable Features on Existing Sites. I was presented with this screen:

So, I checked the box and pressed OK. It took a few hours, but then everything was good to go! Almost…

Turns out I had people asking me what this new Drop Off Library link was on their sites. A quick check showed that EVERY SITE in EVERY SITE COLLECTION in EVERY WEB APPLICATION now had the Content Organizer feature enabled and a Drop Off Library added.

So I went to Manage Site Features, deactivated the Content Organizer, and went to delete the Drop Off Library. No delete option in Library Settings. Turns out you need to use something like SharePoint Manager or PowerShell to allow the library to be deleted before that link will show up. Regardless, we have hundreds of sites and My Sites, all of which now had an unremovable library and an unnecessary feature activated.

So, time for some PowerShell! I wrote a script (below) that will cycle through all the sites in all the site collections for a given Web Application, disabling the DocumentRouting feature (Content Organizer) and deleting the Drop Off Library. I was inspired by this forum thread and this blog post.

Additionally, I was faced with the complication that I had actually activated this feature previously for a few sites and wanted to make sure they weren't stripped along with everybody else. So I added an ExclusionURLs parameter. I'll explain more about that and how the rest of the script works in a minute.

The Script

Param(
    [parameter(position=0)]
    [String]
        $WebApplicationURL,
    [parameter(position=1)]
    [Boolean]
        $AnalysisOnly = $true,
    [parameter(position=2)]
    [String[]]
        $ExclusionURLs
)

#Display Exclusion URL information
if($ExclusionURLs -and $ExclusionURLs.Count -gt 0) {
    Write-Host "Excluded URLs:" -foregroundcolor green
    $ExclusionURLs | ForEach-Object {
        Write-Host "     $_" -foregroundcolor green
    }
} else {
    Write-Host "No URL Exclusions" -foreground cyan
}

#Display Feature Information
$feature = Get-SPFeature "DocumentRouting"
Write-Host "The Content Organizer feature is called $($feature.DisplayName)" -foregroundcolor cyan

if($AnalysisOnly) {
    Write-Host "ANALYSIS ONLY" -foregroundcolor red
}

#Go Through Every Site
Get-SPWebApplication $WebApplicationURL | Get-SPSite -Limit ALL | Get-SPWeb -Limit ALL | ForEach-Object {

    #Check for Exclusion
    if(!($ExclusionURLs -contains $_.URL)) {
        Write-Host "$_ | $($_.URL)" -foregroundcolor DarkCyan

        #Disable Feature if found
        if ($_.Features[$feature.ID]) {
            Write-Host "  Feature $($feature.DisplayName) Found" -foreground green
            if(!$AnalysisOnly){
                Disable-SPFeature $feature -Url $_.Url -Force -Confirm:$false
                Write-Host "  Feature $($feature.DisplayName) Disabled" -foreground magenta
            }
        } else {
            Write-Host "  Feature $($feature.DisplayName) NOT Found" -foreground yellow
        }

        #Delete Drop Off Library if found
        $list = $_.Lists["DROP OFF LIBRARY"]
        if ($list) {
            Write-Host "  List $list Found" -foregroundcolor green
            if(!$AnalysisOnly){
                $list.AllowDeletion = $true;
                $list.Update()
                $list.Delete()
                Write-Host "  List $list Deleted" -foreground magenta
            }
        } else {
            Write-Host "  Drop Off Library NOT found" -foregroundcolor yellow
        }
    }

}
Write-Host " "
Write-Host "All Done!" -foregroundcolor yellow

You can copy the above script, save it in a text file with a .ps1 extension, and run it from the console. Assuming you've named the file RemoveDropOffLibraries.ps1 you can run it in a few different ways:

Analysis Only:

Since I’m the cautious type, I want to know which sites are going to be affected before I actually pull the trigger. So running it with just the Web Application URL will provide you with a list of all sites. Additionally, you’ll be told if a given site has the DocumentRouting feature enabled and if a Drop Off Library was found.
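
For example (the Web Application URL is a placeholder; substitute your own):

.\RemoveDropOffLibraries.ps1 "http://somewebapp.com"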

Perform Actions:

If you’re comfortable with the results above, just pass the Web Application URL and $false for the AnalysisOnly parameter. This will do the same list of sites and indicate if the DocumentRouting feature is enabled and if a Drop Off Library was found. Each time the feature is found activated, it gets disabled and you’ll see a message. Additionally, each time a Drop Off Library is found, it gets deleted and you’ll see a message.
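
For example:

.\RemoveDropOffLibraries.ps1 "http://somewebapp.com" $false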

Analysis Only with Exclusions:

Exclusions allow you to pass site URLs in a comma-separated list. Running in analysis mode first ensures your exclusions are working. Just specify the Web Application URL, $true for AnalysisOnly, and then 1 or more exclusion URLs separated only by commas.
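
For example (the exclusion URLs are placeholders):

.\RemoveDropOffLibraries.ps1 "http://somewebapp.com" $true "http://somewebapp.com/sites/keepthis","http://somewebapp.com/sites/keepthat"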

Perform Actions with Exclusions:

This uses the same rules for exclusions as above, but instead of just analyzing, it actually does the work.
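
For example:

.\RemoveDropOffLibraries.ps1 "http://somewebapp.com" $false "http://somewebapp.com/sites/keepthis"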

Quick Tip:

If you have a lot of sites and want to be able to see the output easily, use the Start-Transcript and Stop-Transcript cmdlets like so:

start-transcript -path SomeFile.rtf
...Commands and Output...
stop-transcript

Just replace line 2 with one of the command examples from above.

 

Surely no one else will ever end up in this situation, but just in case – there ya go!

Extract Timer Job History Using PowerShell

Applies To: SharePoint 2010, PowerShell

I was tasked with finding all timer jobs that ran in a given time period. Some quick searching turned up a pretty cool solution by Glyn Clough using PowerShell. I took his script and modified it some to account for UTC times and it works great. Although I’m presenting my modified script, the bulk of the work was done by Glyn and I’m really just tweaking it a little.

The Script

Param(
	[parameter(position=0)]
	[DateTime]
		$StartTime,
	[parameter(position=1)]
	[DateTime]
		$EndTime
)

if(!$StartTime) {$StartTime = (Get-Date).Date}
if(!$EndTime) {$EndTime = (Get-Date).AddDays(1).Date}

$StartTime = $StartTime.ToUniversalTime()
$EndTime = $EndTime.ToUniversalTime()

$TZ = [System.TimeZoneInfo]::FindSystemTimeZoneById(((Get-WmiObject win32_timezone).StandardName))

Get-SPWebApplication | foreach {
	$_.JobHistoryEntries |
		where{	($StartTime -le $_.StartTime -and $_.StartTime -le $EndTime) -or
			($StartTime -le $_.EndTime -and $_.EndTime -le $EndTime) } |
		sort StartTime |
		select	JobDefinitionTitle,
			WebApplicationName,
			ServerName,
			Status,
			@{Expression={[System.TimeZoneInfo]::ConvertTimeFromUtc($_.StartTime, $TZ)};Label="Start Time"},
			@{Expression={[System.TimeZoneInfo]::ConvertTimeFromUtc($_.EndTime, $TZ)};Label="End Time"},
			@{Expression={($_.EndTime - $_.StartTime).TotalSeconds};Label="Duration (secs)"}
} | Out-GridView -Title "Timer Job History"

You can copy the above script, save it in a text file with a .ps1 extension, and run it from the console. Assuming you've named the file JobHistory.ps1 you can run it in a couple of different ways:

No Parameters:

Running it this way will return the entire history for the current day starting at midnight.
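
For example:

.\JobHistory.ps1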

Specify Start Time:

This is especially helpful if you’re just trying to find the most recent history since this will give you the full history starting at the specified date/time to now.
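
For example, to see everything from 8 AM on October 15th through now (the date format here is just an example; use whatever your locale accepts):

.\JobHistory.ps1 "10/15/2012 8:00"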

Specify Range:

Doing this will return all history entries for the given range. The date/time parameters can be entered in a variety of ways; since PowerShell converts the string to a date, you can enter the date/time in any format that matches your culture/locale.
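
For example:

.\JobHistory.ps1 "10/15/2012 8:00" "10/15/2012 17:00"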

The Output

The results are funneled to a GridView, which requires the Windows PowerShell Integrated Scripting Environment (ISE) to be installed. On a server this is as simple as opening Server Manager, selecting Features, then Add Features, and choosing the Windows PowerShell Integrated Scripting Environment (this did not require a restart).
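
If you'd rather script the install, the ServerManager module can add the feature as well (this is the feature name on Server 2008 R2; run from an elevated shell):

Import-Module ServerManager
Add-WindowsFeature PowerShell-ISE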

There are many benefits to using the GridView. The best is the filtering, but I also like the sorting and copy/paste functionality. I often filter on job status or sort by duration to catch problem jobs. Then I can copy those rows and paste them directly into Excel if needed.

The above script also outputs histories for every web application (and will immediately show the GridView when the first is done, then slowly add the remaining ones). This could be changed by modifying the above script and adding a parameter, but it was unnecessary for me. You can also use the GridView filter to show only the web application you need.

Obviously yours will show the WebApplicationName and ServerName without my sloppy black bars.

Intermittent “Unable to display this Web Part” messages

Applies To: SharePoint 2010

A few months ago I customized a view in SharePoint Designer to turn the due date red for any past-due items in the list. The end users really liked this, but an obnoxious problem started turning up. Seemingly at random we would get:

Unable to display this Web Part. To troubleshoot the problem, open this Web page in a Microsoft SharePoint Foundation-compatible HTML editor such as Microsoft SharePoint Designer. If the problem persists, contact your Web server administrator.

Correlation ID: Some GUID

Taking a look through our logs didn't reveal anything, and often a refresh or two would solve the problem. So it wasn't really stopping business, but it was pretty annoying. After adjusting the logging settings we finally saw some messages corresponding to the provided Correlation ID and found the issue was "Value did not fall into expected range", often followed by stack overflow exceptions.

Unfortunately the above error message is so generic it was pretty difficult to find anyone else even having the same problem, let alone the solution. Finally I came across this thread on MSDN discussing the exact issue. Instructions for fixing the problem and the background of this issue can be found in this article on Englando's Blog. The solution presented was to get a hotfix from Microsoft. Fortunately, that is no longer necessary, as the fix is included in the February 2012 Cumulative Update from Microsoft.

The problem was introduced in the June 2011 Cumulative Update when Microsoft reduced the timeout for XSLT transformation (used whenever you customize a view in SharePoint Designer) from 5 seconds to 1 second. This is a good idea for public facing farms to help mitigate Denial of Service attacks but pretty unnecessary for internal farms like the one I was working on.

The timeout causes modified XSLTListView Web Parts and XSLTDataView Web Parts to sometimes show the “Unable to display this Web Part” errors. This is especially true if you have several columns (more transformation) or are doing anything of even mild complexity. The issue was “fixed” in the August 2011 Cumulative Update but broken again in the December 2011 Cumulative Update.

To fix this issue we installed the February 2012 Cumulative Update on our farm (More about our experiences with this update to follow). Keep in mind, however, that the update does not change the XsltTransformTimeOut but merely provides you the ability to do so using PowerShell.

To check your current timeout settings, simply use the following PowerShell:

$myfarm = Get-SPFarm
$myfarm.XsltTransformTimeOut

If you're experiencing the above problem, you probably got a 1 back from the above command, indicating that the timeout is currently set to 1 second. To set it to a more reasonable value (we chose the original 5 seconds) just do this (assuming you set the $myfarm object using the above PowerShell):

$myfarm.XsltTransformTimeOut = 5
$myfarm.Update()

That’s it, things are happy again.

Using PowerShell to Document SharePoint 2010 Farm Configuration

Applies To: SharePoint 2010

Business continuity (Disaster Recovery) is an important topic for SharePoint 2010 and Microsoft has provided a helpful book available online here: Business Continuity Management for Microsoft SharePoint Server 2010. The book is full of steps and options for backing up and restoring your farm.

There is an interesting PowerShell script starting on page 132 that takes advantage of the Export-Clixml cmdlet to cycle through your farm's configuration and write everything to XML files. The resulting XML isn't super user friendly, but it is human readable. Even cooler is that you can use the Import-Clixml cmdlet to instantiate those objects back in PowerShell later.

Obviously an actual Farm backup (configuration-only) is more helpful for restoration of your settings, but these XML files can be very useful. For instance, if you don’t want to restore all your configuration but just want to document it somewhere so you can either reference it in whole or in part, this is a great solution. Even better is using it as a guide to pick and choose the various commands when you’re trying to find some information on the fly.
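
The pattern itself is simple. For instance, something like this captures the farm object's configuration to XML and rehydrates a disconnected copy later (the paths are just examples):

Get-SPFarm | Export-Clixml "d:\backup\FarmConfig.xml"
#...later, bring it back to inspect:
$farm = Import-Clixml "d:\backup\FarmConfig.xml"
$farm.BuildVersion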

Unfortunately, the script is all jumbled together and split across multiple pages, making the included suggestion of copying and pasting into a text document more complicated than needed. The newlines are all in the wrong places, the comments run together and overwrite commands, page numbers show up, etc. So I went through the resulting file and formatted it correctly.
