SharePoint Automation Gary Lapointe – Founding Partner, Aptillon, Inc.


ITUnity: SharePoint Online Article Series

You may have noticed that my blog has been kind of quiet lately – the reason is that I’ve been devoting some time to helping hydrate the ITUnity site with some core PowerShell articles. As I write this I have 10 articles published, all part of a continuing series devoted to using PowerShell to manipulate SharePoint Online. I’ve got a couple more articles (which may turn into several more) to complete in order to round out the series, at which point I’ll start covering more varied topics that focus on both SharePoint on-premises and SharePoint Online (as well as some general, non-SharePoint-specific PowerShell tips and tricks).

You can find all the articles I’ve written by viewing my author profile page on the ITUnity site. Here’s the list of articles that are part of the SharePoint Online series I’m currently working on:

  1. Introduction to the SharePoint Online Management Shell
  2. Exploring the SharePoint Online Cmdlets
  3. Working with the SharePoint Online Site Collection Cmdlets
  4. Connecting to SharePoint Online Using the SharePoint CSOM API with Windows PowerShell
  5. Using the SharePoint CSOM API with SharePoint Online and Windows PowerShell
  6. Loading Specific Values Using Lambda Expressions and the SharePoint CSOM API with Windows PowerShell
  7. Completing Basic Operations Using the SharePoint CSOM API and Windows PowerShell
  8. Using the SharePoint REST Service with Windows PowerShell
  9. Using a Custom Windows PowerShell Function to Make SharePoint REST Service Calls
  10. Working with Lists and List Items Using the SharePoint REST Service and Windows PowerShell

Astute readers of my blog may notice that some of the topics listed above are similar to posts I’ve previously published here. We wanted to make sure that any background information about topics such as the SharePoint Online Management Shell or working with the REST API existed on the ITUnity site rather than being dependent on articles on my blog, so I took the opportunity to rework my articles from scratch to provide better continuity between the articles in the series.

I’ll continue to post some things on my blog, but my focus for the near future will be on building up the ITUnity site with as much PowerShell content as I can. If you’re interested in contributing to the effort, please let me know – I’m definitely eager to get more authors involved so that we can have some diversity of content, topics, and opinions.



Moved to GitHub

I’ve been pretty slow to learn how to use GitHub for the management of my open source projects, but I’ve finally taken the time to learn enough to move my main projects over. I don’t know that I’ve got everything set up exactly the way it should be, as I kind of stumbled through it a bit, but I think what I’ve got so far should be a good start for anyone who wants to download the releases or browse and contribute to the source.

You can find all of my repositories by going to my profile page:

As of right now I just have three repositories:

  • PowerShell-MamlGenerator:
    • This is what I use to dynamically generate the help files for my SharePoint cmdlet projects. I took some time to pull this out into its own solution so that the other projects simply have a dependency on the compiled assembly and don’t require you to pull down the source to the project.
  • PowerShell-SPCmdlets:
    • This repository contains the source code for the SharePoint 2010 and SharePoint 2013 cmdlets. On the releases page you can find the downloads for the various Foundation and Server WSPs.
  • PowerShell-SPOCmdlets:
    • This repository contains the source code for my SharePoint Online cmdlets. The releases page contains the download for the installer.

If you have direct links to any of the old downloads, those links will no longer work, as I’ve removed the files to make sure folks are getting the latest version.

I welcome any feedback on how I can better use GitHub with my projects – as I noted, I’m fairly new to this and eager to learn and improve.


Invoke-SPORestMethod: Using the SharePoint Online REST API with PowerShell

Earlier this summer I mentioned that I’d start blogging about some of the scripts I have in my toolbox, and I’m a little behind on that, but better late than never. I’ve had snippets of code for making REST-based calls to SharePoint Online for a while – some were little snippets of PowerShell and some were embedded deep in my custom cmdlets or other applications I’ve created over time. I recently stumbled across an article by Vadim Gremyachev where he provided his script for making a REST call using PowerShell, and I decided to go ahead and provide my version of that same script (which I tweaked to add a couple of minor parameters that Vadim included in his).

Before I walk through the key points of the script let me first share it with you:


The first thing you have to know about working with SharePoint Online REST endpoints is that you have to get connected to the service. This is done by passing your credentials (username and password) to a new instance of the Microsoft.SharePoint.Client.SharePointOnlineCredentials object. This object handles the conversion of our simple username and password into the authorization token and cookies necessary to establish the connection (and it is these additional complexities that make it impossible for us to utilize the native Invoke-RestMethod cmdlet introduced in PowerShell V3). Because I don’t want to provide the username and password every time I run a command, or have to remember which variable I stored the credentials in, I chose to utilize a global variable ($global:spoCred). I check this variable in the Begin block of the function, and if it’s not set I prompt for the credentials and then set the variable. I also created another pair of functions, Set-SPORestCredential and Clear-SPORestCredential, to make it easier to work with this variable (so you can explicitly set it using Set-SPORestCredential or clear it using Clear-SPORestCredential). Using this approach I can set my credentials just once and then run my REST calls over and over without having to worry about reauthorizing.
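Since the full script isn’t reproduced here, a hedged sketch of what those two helper functions look like follows (the exact parameter names in my script may differ slightly):

```powershell
# Sketch of the credential helper functions described above. Requires the
# Microsoft.SharePoint.Client assemblies to already be loaded.
function Set-SPORestCredential {
    param(
        # Optionally pass in an existing PSCredential; otherwise prompt.
        [Parameter(Mandatory = $false)][PSCredential]$Credential
    )
    if ($null -eq $Credential) {
        $Credential = Get-Credential -Message "Enter your SharePoint Online credentials"
    }
    # Convert the simple username/password into the object that handles
    # the SharePoint Online authorization token and cookies.
    $global:spoCred = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials(
        $Credential.UserName, $Credential.Password)
}

function Clear-SPORestCredential {
    # Drop the cached credential so the next call re-prompts.
    $global:spoCred = $null
}
```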

In order to utilize the Microsoft.SharePoint.Client.SharePointOnlineCredentials object you have to either load the Microsoft.SharePoint.Client.dll and Microsoft.SharePoint.Client.Runtime.dll assemblies, or take the approach I did, which is to assume that the Microsoft SharePoint Online cmdlets are installed and simply load the Microsoft.Online.SharePoint.PowerShell module, which results in the types I need being loaded into memory. So if you don’t have the module installed, my function will throw an error stating as much.
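A check along these lines (a sketch, not the exact code from my function) fails fast with a clear error when the module is missing:

```powershell
# Fail fast if the SharePoint Online Management Shell module is not installed;
# loading it brings the Microsoft.SharePoint.Client types into memory.
if ($null -eq (Get-Module -Name Microsoft.Online.SharePoint.PowerShell -ListAvailable)) {
    throw "This function requires the Microsoft SharePoint Online Management Shell to be installed."
}
Import-Module Microsoft.Online.SharePoint.PowerShell -ErrorAction Stop
```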

From there the rest is pretty straightforward: I use a System.Net.WebRequest object and set the appropriate properties and header values to make the request. Once I get a result back I check the expected verbosity (a JSON Light feature) and use the ConvertFrom-Json cmdlet to output the results as a PSObject. Note that this is a fairly simple implementation in that I’m assuming the results that come back are a JSON object and not something like a binary stream resulting from retrieving a file. For a more complete solution that handles binary objects in the response, use my cmdlet of the same name (Invoke-SPORestMethod) included with my SharePoint Online custom cmdlets download. Also, the –Metadata parameter takes a similarly simplistic approach and doesn’t account for some of the more complex scenarios – my custom cmdlet replaces the –Metadata parameter with a –Body parameter (though it’s aliased as Metadata) and handles a wider range of possibilities for posting data to the REST endpoint.
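To make the flow concrete, here is a minimal sketch of the core GET logic described above (the endpoint URL is a placeholder, and the real Invoke-SPORestMethod adds parameters for the HTTP method, metadata, and verbosity):

```powershell
# Assumes $global:spoCred already holds a SharePointOnlineCredentials object
# (for example, via the Set-SPORestCredential helper described earlier).
$url = "https://contoso.sharepoint.com/_api/web"   # placeholder endpoint
$request = [System.Net.WebRequest]::Create($url)
$request.Credentials = $global:spoCred
$request.Method = "GET"
$request.Accept = "application/json;odata=verbose"
# SharePoint Online expects this header when authenticating with
# SharePointOnlineCredentials rather than forms-based auth.
$request.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f")

$response = $request.GetResponse()
try {
    $reader = New-Object System.IO.StreamReader($response.GetResponseStream())
    $json = $reader.ReadToEnd()
} finally {
    $response.Close()
}
# Convert the JSON payload to a PSObject; with odata=verbose the data of
# interest sits under the "d" property.
(ConvertFrom-Json $json).d
```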

Because the SharePoint Online REST API can grow quickly without Microsoft needing to publish updated installs, as is the case with the CSOM library, the need to utilize this API for automating tasks with PowerShell will grow just as quickly, if not more so. Because of this it is extremely valuable to have a quick and easy way to make RESTful calls without having to recreate all this plumbing every time. That is why I created this script and, more specifically, why I added the equivalent yet more robust version of the script as a custom cmdlet. So in summary, I strongly recommend you get used to using the REST API; this script and the equivalent cmdlet will make it that much easier for you to do so.



“_ULS_EXPAND_FORCED_LOGGING_MESSAGES_” Environment Variable Explained

Have you ever been looking through the SharePoint ULS logs trying to troubleshoot one issue or another and come across entries such as this:

[Forced due to logging gap, Original Level: <TraceLevel>] <Some message> {0}

I was recently working with a client who noticed a message like this in the ULS logs and was baffled by the fact that the {0} was not being replaced with the relevant data referred to in the message text. I’d noticed this in the past but never bothered to figure out what was going on, so I decided to crack open Reflector and poke around a bit.

What I found was a method, SendTraceData(), which is part of the Microsoft.SharePoint.Diagnostics.ULS class, and it is within this method that the [Forced due to logging gap…] message is appended to the string that is to be written to the ULS logs (this happens when the time between log writes exceeds the default 50ms). Within the routine that adds this string there is a test to see if the ExpandForcedLoggingMessages property is false; if it is, whatever data was provided is set to null, thereby clearing the relevant information that would be used in any subsequent string.Format() call to add the data to the message. Here’s a screenshot showing that code so you can see what I mean:

private static void SendTraceData()

So the trick to figuring out why the {0} is not being replaced with the appropriate value is to understand how that ExpandForcedLoggingMessages property is set. I did some more digging with the Reflector analyze tool and found that in the CompleteInit() method of the same Microsoft.SharePoint.Diagnostics.ULS class there is code which sets the property based on the value of the _ULS_EXPAND_FORCED_LOGGING_MESSAGES_ environment variable:

private static unsafe void CompleteInit()

This variable is not set by default (and I can’t find one lick of documentation about it), but if we create the variable, set it to a value of true, and reboot the server, we will find that suddenly the values are being populated (or expanded) on that server (for this to take effect farm-wide you need to do this on each server in the farm).

To set the variable you can either do it manually via the computer properties or you can use this simple PowerShell command:

[System.Environment]::SetEnvironmentVariable("_ULS_EXPAND_FORCED_LOGGING_MESSAGES_", "true", "Machine")


Don’t forget to reboot the machine after you set the variable.

Because I didn’t want to wait for a particular event to occur to validate that this actually worked, I decided to create a simple test script which called the SendTraceData() method. Because the containing class is internal I had to use some reflection to make this work. You shouldn’t need to run this code – I only include it here for completeness to show how I validated that the change worked as expected:

$bindings = @("InvokeMethod", "NonPublic", "Instance", "Static", "Public", "GetField", "GetProperty")
$asm = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$type = $asm.GetType("Microsoft.SharePoint.Diagnostics.ULS")
$ulscatType = $asm.GetType("Microsoft.SharePoint.Diagnostics.ULSCat")
$ulscat = $ulscatType.GetProperty("msoulscatUnknown").GetValue($null, $null)
$method = $type.GetMethod("SendTraceData", $bindings)
$method.Invoke($null, @($true, [System.UInt32]0, $ulscat, 20, "This is test data: '{0}'.", @("hello world")))


If we look at the ULS logs using ULSViewer we can see our test event got created and the data was preserved:


And now you know how to get that {0} populated with data. That may or may not be useful, but I personally would rather have more data available when troubleshooting an issue, and I hate the idea of possibly key information being nullified.

Oh, and while you’re looking at the CompleteInit() method you may notice that there’s also a _ULS_MAX_LOGGING_GAP_IN_MILLISECONDS_ environment variable and a _ULS_EXPAND_FORCED_LOGGING_CACHED_MESSAGES_ environment variable. The latter is similar to the one I’ve been talking about and relates to messages that look like this:

[Forced due to logging gap, cached @ <Date>, Original Level: <TraceLevel>] <Some message> {0}

So if you’re going to enable the one you should probably enable the other as well.
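Enabling both uses the same SetEnvironmentVariable approach shown earlier (run from an elevated console, and remember the reboot applies here too):

```powershell
# Enable expansion for both the forced and the forced-cached logging messages
# at machine scope; repeat on each server in the farm, then reboot.
[System.Environment]::SetEnvironmentVariable("_ULS_EXPAND_FORCED_LOGGING_MESSAGES_", "true", "Machine")
[System.Environment]::SetEnvironmentVariable("_ULS_EXPAND_FORCED_LOGGING_CACHED_MESSAGES_", "true", "Machine")
```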

The _ULS_MAX_LOGGING_GAP_IN_MILLISECONDS_ environment variable controls how big, in milliseconds, the maximum logging gap should be; the default is 50 milliseconds. I would recommend not changing this value, but if you want events written more frequently to the ULS logs you can lower it, or if you want events written less frequently you can raise it (note that you won’t lose events; it just affects how often SharePoint writes to the file).

I hope this information helps you with your troubleshooting adventures!




I haven’t really done much blogging lately, so I figured I’d try to get back into it by sharing some scripts from my collection. Over the years I’ve created tons of little scripts to help me solve one problem or another – whether it be some upgrade-related task, a content or build migration, or just a random snippet to help me figure out what the heck is going on with my code or with SharePoint in general – and I’ve got a ton of them scattered all over the place. So at various points this summer I’m going to *try* to grab an arbitrary one here or there, clean it up some, and post it so others can benefit (and also to force me to clean up some of my crap).

The script I want to share today I created a couple of years ago to help me troubleshoot an issue I was having with a custom delegate control registration. The problem ended up being resolved easily enough, as it was simply an issue of scope – the control was registered at one scope but expected at another. Of course, at the time this didn’t really make sense, as the scope that should have been expected was what I was providing; it turns out that there are what I would characterize as some bad design decisions within the SharePoint product, so what I expected wasn’t what SharePoint wanted. Anyway, I digress. To help me isolate the issue I threw together a quick little script which would dump out the control registrations so that I could validate that my control was in fact being registered, and registered at the scope I expected. I could also use the script to see which was the “top” registration – specifically, which control would actually win and get its code called and rendered on the page – and, finally, which other controls mine was boxing out for the win (or loss). The script below is a much cleaned up version of my quick and dirty hack, but it does the same thing, just with comments and a little more flexibility thanks to the use of the SPWebPipeBind type. Note that to do the job you have to use an internal method of the Microsoft.SharePoint.SPElementProvider class, but as it’s just a simple query I’m personally not concerned about the use of reflection to make it work.

So with that, here’s my little Get-SPControlElement function:



<#
.SYNOPSIS
    Retrieves the control elements registered for a given delegate control ID.
.DESCRIPTION
    Retrieves the control elements registered for a given delegate control ID.
.EXAMPLE
    Get-SPControlElement -Web "http://demo" -ControlId "QuickLaunchDataSource" -TopOnly
.EXAMPLE
    Get-SPControlElement -Web "http://demo" -ControlId "QuickLaunchDataSource" -Scope "Web"
#>
function Get-SPControlElement {
    Param (
        # Specify the URL of an existing Site or an existing SPWeb object.
        [Parameter(Mandatory = $true,
                   ValueFromPipelineByPropertyName = $true,
                   ValueFromPipeline = $true,
                   Position = 0)]
        [Microsoft.SharePoint.PowerShell.SPWebPipeBind]$Web,

        # The ID of the delegate control to return instances of.
        [Parameter(Mandatory = $true,
                   ValueFromPipelineByPropertyName = $true,
                   ValueFromPipeline = $false,
                   Position = 1)]
        [string]$ControlId,

        # If specified, query for the top element only. If not specified, show all controls registered for the control ID.
        [Parameter(Mandatory = $false,
                   Position = 2)]
        [switch]$TopOnly,

        # The scope to search for. Valid values are "Farm", "WebApplication", "Site", and "Web". To show controls registered at all scopes omit the parameter or pass in a $null value.
        [Parameter(Mandatory = $false,
                   Position = 3)]
        [string]$Scope
    )

    Begin {
        # Use reflection to create an instance of the internal SPElementProvider class.
        $bindings = @("InvokeMethod", "NonPublic", "Instance", "CreateInstance", "Public")
        $asm = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
        $type = $asm.GetType("Microsoft.SharePoint.SPElementProvider")
        $constructor = $type.GetConstructor($bindings, $null, @(), $null)
        $provider = $constructor.Invoke(@())
    }
    Process {
        $spWeb = $Web.Read()
        if ($TopOnly) {
            $method = $type.GetMethod("QueryForTopControlElement", $bindings)
        } else {
            $method = $type.GetMethod("QueryForControlElements", $bindings)
        }
        $method.Invoke($provider, @([Microsoft.SharePoint.SPWeb]$spWeb, $Scope, $ControlId))
    }
    End {
    }
}

And here’s a simple example demonstrating how to call the function (note that I put the code in a file named Get-SPControlElement.ps1 and then I dot source the file to load the function into memory – you could scope the function globally to avoid the need to dot source if you want):
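Based on the examples in the function’s help, the call looks something like this (the URL and control ID are the sample values from the help, not anything environment-specific):

```powershell
# Load the function into the current session, then query a delegate control,
# returning only the winning (top) registration.
. .\Get-SPControlElement.ps1
Get-SPControlElement -Web "http://demo" -ControlId "QuickLaunchDataSource" -TopOnly
```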


This should work on both SharePoint 2010 and 2013 and I believe with some minor modifications you could make it work with 2007.



Announcing My Custom SharePoint Online Cmdlets

For quite a while now I’ve been pointing out the lack of cmdlets that are available for working with Office 365 and SharePoint 2013 (SharePoint Online) and I’ve mentioned several times that someone should really do something about that and that I’d love to be that person if only I had the time. Well, as it turns out, over Christmas break I managed to find some free time so I went ahead and got started on my own custom cmdlets for SharePoint Online which I’m now officially releasing with this post.

If you refer back to my earlier post where I detailed the SharePoint Online cmdlets provided by Microsoft, you’ll note that there are currently only 30 cmdlets; with my new offering I just about double that number by adding an additional 27. I’m only going to briefly outline my design goals and a couple of simple examples in this article, so I encourage you to visit the downloads page of my site to download the installer and/or source code for the project, and for details about each cmdlet go to the command index page (click the SharePoint Online tab to see the cmdlets).

In creating the cmdlets I had two core goals. First, I wanted to add some basic retrieval functionality for common objects such as SPWeb and SPList objects so that I could do simple discovery-type operations just as I would with the SharePoint on-premises cmdlets. Second, I wanted to make working with the cmdlets, and the objects returned by them, much more like working with the on-premises API so that you don’t have to think about calling the Load and ExecuteQuery methods every time you try to do something. To accomplish these goals I had to not only create the actual cmdlets themselves (which was the easy part) but also create custom wrapper objects for several of the objects in the Microsoft.SharePoint.Client namespace. For example, when you call my Get-SPOWeb cmdlet what you get back is an SPOWeb object, not a Microsoft.SharePoint.Client.Web object. By taking this approach I can avoid the common uninitialized property errors that you’d see when you output the native object to the PowerShell console. These errors make working with the native objects nearly impossible, as they require you to know which properties are initialized and which aren’t; by creating a wrapper object I can hide this issue and make working with the objects very simple. I can also do things like create an Update method which handles calling the ExecuteQuery method as well as refreshing the object, again taking a lot of the nuances of working with the .NET CSOM API out of the picture. For those of you not familiar with the .NET CSOM API (and for those who are), the point to take away from all my technical babbling is that using my cmdlets will make working with SharePoint Online very similar to working with SharePoint on-premises.
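For contrast, this is roughly what the raw .NET CSOM pattern the wrappers hide looks like from PowerShell ($siteUrl, $user, and $securePassword are placeholders); note that every read requires an explicit Load plus ExecuteQuery round trip:

```powershell
# Loading this module brings the Microsoft.SharePoint.Client assemblies into memory.
Import-Module Microsoft.Online.SharePoint.PowerShell

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($user, $securePassword)

$web = $ctx.Web
$ctx.Load($web)       # queue the object's properties for retrieval
$ctx.ExecuteQuery()   # execute the batched request against the server
$web.Title            # only safe to read after ExecuteQuery has run
```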

Let’s take a look at how we can use my cmdlets. First off you need to download and install them – I’ve built the cmdlets so that they work just like the Microsoft-provided SharePoint Online cmdlets, in that I created a PowerShell V3 module manifest which references a binary assembly (DLL) containing all the cmdlets. All the installer does is put the required files (the dependent Microsoft.SharePoint.Client DLLs, my assembly containing the cmdlets themselves, and the PowerShell help and format files) into an appropriate directory and then add the path to that directory to the PSModulePath environment variable. This makes the cmdlets available without having to manually load the module, just like the Microsoft SharePoint Online cmdlets. Note that Microsoft provides a shortcut for the SharePoint Online Management Shell, which is simply a PowerShell console with the module explicitly loaded, but use of the management shell is completely unnecessary so I don’t bother creating an equivalent shortcut. To see that the module is installed correctly, simply open any PowerShell console and run the Get-Module cmdlet with the -ListAvailable parameter. You should see the module listed as shown below:

Get-Module -ListAvailable

With the cmdlets installed you can now connect to a Site Collection using the Connect-SPOSite cmdlet. Note that the majority of my cmdlets are scoped to a single Site Collection and you must establish a connection to that Site Collection before you can do anything else; if you need to work on a different Site Collection you’ll need to call Connect-SPOSite again. In this example I’m connecting to a site by providing my username and the Site Collection to connect to:

Connect-SPOSite -Credential "" -Url ""


I recommend you actually create a variable to store your credential information ($cred = Get-Credential) so that you can reuse it each time you connect to a different Site Collection. Oh, and as a bonus: though the default behavior is to connect to SharePoint Online, if you pass in the –NetworkCredentials switch parameter you can use these cmdlets against SharePoint on-premises – you’re welcome. To kill a connection use the Disconnect-SPOSite cmdlet. I particularly recommend you do this when creating scripts; there’s not much point if you’re just doing something one-off in the console, but if you’re writing a script you want to make sure you kill your context so someone can’t come behind you and hijack your credentials (this is the same pattern the Microsoft SharePoint Online cmdlets follow, so if you’re familiar with them this should not be new to you).
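Putting those recommendations together, a script that touches several Site Collections might follow this pattern (the URLs are placeholders):

```powershell
# Reuse one credential object across connections.
$cred = Get-Credential

$siteUrls = @(
    "https://contoso.sharepoint.com/sites/a",
    "https://contoso.sharepoint.com/sites/b"
)
foreach ($url in $siteUrls) {
    Connect-SPOSite -Credential $cred -Url $url
    try {
        # ...do work against the currently connected Site Collection...
        Get-SPOWeb -Identity "/"
    } finally {
        # Kill the context so the script never leaves a live connection behind.
        Disconnect-SPOSite
    }
}
```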

Once you’ve established a connection you’re ready to start using the other cmdlets. If you’d like to retrieve the root Site (SPWeb) or a child Site you can use the Get-SPOWeb cmdlet as shown:

$rootWeb = Get-SPOWeb -Identity "/" -Detail

$childWeb = Get-SPOWeb -Identity "/test" -Detail


Note the –Detail parameter, which is useful when you want to see as much information as possible. If you’re just looking for a list of Sites, I would omit the parameter so that you bring back less information, which should help it run a little faster (to see all the Sites, omit the –Identity parameter).

When you get back an object I encourage you to use the Get-Member cmdlet (aliased as gm) to see the list of properties and methods available. All of my cmdlets return my own SPO* wrapper objects, and you can always get back to the original Microsoft.SharePoint.Client object via a property on my object. The fact is I couldn’t wrap every single sub-object or every single property and method, so there will be times when you’ll have to use the native object, but hopefully what I did provide covers at least 80% of the common use cases (I hope). The following screenshot shows the members for the SPOWeb object:


Notice the highlighted Web property which you can use to get to the original Microsoft.SharePoint.Client.Web object. If you need to update any properties on the object you can either use the Set-SPOWeb cmdlet or update the property directly on the object and then call the Update() method – no need to call ExecuteQuery and then refresh the object, it’s all handled for you.
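As a sketch, the two update styles just described look like this (the Set-SPOWeb parameter names here are assumptions; Title is used purely as an illustrative property):

```powershell
$web = Get-SPOWeb -Identity "/" -Detail

# Option 1: use the Set-SPOWeb cmdlet (parameter names assumed for illustration).
Set-SPOWeb -Identity "/" -Title "New Title"

# Option 2: set the property on the wrapper object and call Update() -
# the ExecuteQuery call and the object refresh are handled for you.
$web.Title = "New Title"
$web.Update()
```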

I don’t want to spend time going through every cmdlet and object that I’ve created, as I detail each of the cmdlets on my command index page and the object members are easily discovered using the Get-Member cmdlet. So to wrap up, I want to simply point out that this is definitely a work in progress and will hopefully grow over time. Also, I don’t claim to be a .NET CSOM expert, so there may be some areas that can be improved from a performance point of view; if you download the code and see something that can be improved, please share your feedback, and likewise for anything else you spot (note that I’m absolutely horrible when it comes to responding to comments on my blog, but I do eventually get to them all, just not necessarily very quickly).

Have fun and happy PowerShelling!



SharePoint 2013 Version

So today I was doing some SharePoint 2013 app development against my Office 365 SharePoint 2013 tenant and I needed to view the HTTP traffic from the site in order to troubleshoot some issues I was having and I stumbled across something I found very interesting when I looked at the header details in Fiddler:


Yup, that’s right – my tenant is on SharePoint 16 – not 15 which is the current public release of SharePoint. To confirm that this wasn’t just something with the headers I navigated to and there it is again:


(In case you’re wondering, the service.cnf file hard-codes the version information for the current web application; if you have an on-premises version you can see it in the _vti_pvt folder of your local web application’s site – you can also look at _vti_pvt/buildversion.cnf to get the actual build number.)

Another confirmation test I hadn’t thought of but James Love (@jimmywim) pointed out to me was that the page layouts within the site are also referencing assemblies. I downloaded the DefaultLayout.aspx file to confirm and this is what I see:

<%@ Page language="C#"   Inherits="Microsoft.SharePoint.Publishing.PublishingLayoutPage, Microsoft.SharePoint.Publishing,Version=,Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

So you may be asking yourself – what does all this mean? Has Microsoft upgraded my SharePoint Online Farm to the next version of SharePoint? Well, I honestly don’t know but I think all the evidence is pretty clear that they must have. In terms of what is different, I simply couldn’t tell you as I’ve not noticed anything different – perhaps you have? If so post a comment as I’d love to know!

I personally don’t like speculating about what this means and what Microsoft is doing so for me what is important about this information is the fact that I/you might have clients or customers running on bits that are not just different due to patch levels but are in fact on an entirely different version. As I’m working on SharePoint Apps that will be sold in the App Store this type of information is critical when it comes to troubleshooting problems. So until Microsoft releases some public information about what’s up with all of this, the best we can do is keep the information handy as we work on various projects and, hopefully, keep the discussions going so that as folks discover version related differences we communicate that information to each other for our global benefit.



My First Pluralsight Course

Throughout the years I've done numerous presentations related to using PowerShell with SharePoint and through them all I've often found myself wanting more time so that I could share more details. Recently some friends of mine pointed me in the direction of Pluralsight and the opportunities they had for new authors – I saw this as a fantastic way to take many of the presentations I've done in the past and polish them up, add some details, and create a real course out of them. So with that I'm pleased to announce that my first Pluralsight course is now available: Using Windows PowerShell with SharePoint 2010 and SharePoint 2013. The folks at Pluralsight were amazing to work with and going through the authoring process has given me a whole new level of respect for the existing authors and for Pluralsight as a company; they seem to be doing everything right and every Pluralsight employee I've interacted with exemplifies professionalism and dedication to the goal of producing a great product, and I am humbled by the acceptance of my course into their library.

So if you're a developer or an administrator and you're working with SharePoint then I strongly recommend you go through some or all of my course to help you better understand how to use PowerShell, and more specifically, how to use it with SharePoint. You can see the full course outline by following the previous link but I've gone ahead and included the course and module descriptions here for your direct reference:

Using Windows PowerShell with SharePoint 2010 and SharePoint 2013

When it comes to administering and automating SharePoint 2010, 2013, and Office 365, there is no better tool than Windows PowerShell. After going through this course you'll have the skills and knowledge necessary to be productive with PowerShell. In the first two modules you'll get a jump start into PowerShell where you'll learn everything from basic syntax to creating functions and scripts, all within the context of SharePoint. Next you'll discover what's new when it comes to using PowerShell V3 with SharePoint 2013. Administering SharePoint with PowerShell does not mean that you're limited to what you can do directly on the server and in this course you'll learn everything you need to know to manage your Farm remotely, whether you are using Office 365 or an on-premises installation. And finally, sometimes the out of the box cmdlets just aren't enough so we'll teach you how to create your own custom cmdlets that you can deploy to your SharePoint Farm. After completing this course you'll be on your way to becoming a SharePoint superstar as you'll have all the core knowledge you need to start administering and automating SharePoint using Windows PowerShell.

  1. Introduction to PowerShell for SharePoint

    This module focuses on the basics of Windows PowerShell, all with an emphasis on SharePoint. At the conclusion of this module, you should have enough basic knowledge to start working with SharePoint via the SharePoint Management Shell.

  2. Scripting with PowerShell & SharePoint

    This module builds on the foundations presented in the first module and gets beyond what is typically done in the console. During this module you'll start by learning about conditional logic and looping and then move onto creating functions and scripts.

  3. PowerShell V3 + SharePoint 2013

    In this module you'll learn about many of the new features offered by Windows PowerShell V3 with SharePoint 2013.

  4. PowerShell & Office 365

    In this module we'll shift from SharePoint on-premises to SharePoint Online. You'll learn how to connect to your SharePoint Online tenant and what you can and can't do with the available SharePoint Online cmdlets.

  5. PowerShell Remoting

    In this module, we'll switch gears back to on-premises SharePoint installations as we take a look at how you can use PowerShell from your client machine to remotely connect to and manage your on-premises SharePoint Farm.

  6. Creating Custom Cmdlets for SharePoint

    In this module you'll learn how to extend the out of the box SharePoint cmdlets by creating your own custom cmdlets and PipeBind objects.

I already have some ideas for another course to essentially round out the PowerShell + SharePoint side of things – specifically I'm planning on creating a course that assumes you know PowerShell and now you need to learn how to better apply that knowledge to solve specific problems – so my plan for the next course will be to provide more solution focused education (at least, that's the plan, I'll have to see how this first course does before I commit to anything).

I hope that you find my course useful and please provide feedback (positive or negative) as I'm anxious to know what works and what doesn't so that I can continue to improve and bring better and better stuff to the community.


Parallel SharePoint Tasks with PowerShell

Today I was working on a deployment for a client which entailed activating a custom SharePoint Feature on about 1000 Site Collections. This Feature does a fair number of things, and on average it takes about 10-15 minutes to complete in their test environment (which is pretty slow compared to their production environment, which I've not yet deployed to, though I expect close to a 5 minute run time per Site Collection once I go to production). You can do the math and quickly see that it would take somewhere around 10 days to complete if I did one Site Collection at a time. This is just unacceptable, as I personally don't want to be monitoring a Feature activation script for that long. What's worse, when I look at CPU and memory utilization on the servers I can see that they have plenty of resources, so it's not like the operation is actually taxing the system; the operations are just slow. So the solution, for me, is pretty obvious: I need to activate these Features in parallel.

There are two ways that I can achieve this using PowerShell and they depend on which version of PowerShell you're using. In my case I'm running SharePoint 2010 which means that I'm using PowerShell V2; because of this my only option is to use the Start-Job cmdlet with some control logic to dictate how many jobs I'm willing to run at once. If I were using SharePoint 2013 I could use the new workflow capabilities of PowerShell V3 thereby making the whole process a lot easier to understand. I'll show both approaches but I want to first start with what you would do for SharePoint 2010 with PowerShell V2.

Using Start-Job for Parallel Operations

The trick with using the Start-Job cmdlet is knowing when to stop creating new jobs until existing jobs have completed. The key is to use the Get-Job cmdlet and filter on the JobStateInfo property's State property and then, if you have reached your job count threshold, call the Wait-Job cmdlet to block the script until a job completes. The following script is a simple example of what I created for my client and can be used as a template for your own scripts:

$jobThreshold = 10

foreach ($site in (Get-SPSite -Limit All)) {
    # Get all running jobs
    $running = @(Get-Job | where { $_.JobStateInfo.State -eq "Running" })

    # Loop as long as our running job count is >= threshold
    while ($running.Count -ge $jobThreshold) {
        # Block until we get at least one job complete
        $running | Wait-Job -Any | Out-Null
        # Refresh the running job list
        $running = @(Get-Job | where { $_.JobStateInfo.State -eq "Running" })
    }

    Start-Job -InputObject $site.Url {
        $url = $input | %{$_}
        Write-Host "BEGIN: $(Get-Date) Processing $url..."

        # We're in a new process so load the snap-in
        Add-PSSnapin Microsoft.SharePoint.PowerShell

        # Enable the custom feature
        Enable-SPFeature -Url $url -Identity MyCustomFeature

        Write-Host "END: $(Get-Date) Processing $url."
    }

    # Dump the results of any completed jobs
    Get-Job | where { $_.JobStateInfo.State -eq "Completed" } | Receive-Job

    # Remove completed jobs so we don't see their results again
    Get-Job | where { $_.JobStateInfo.State -eq "Completed" } | Remove-Job
}

# Wait for any remaining jobs to complete and dump their results
Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job

If you run this script and open up task manager you'll see that it's created a powershell.exe process for each job. You might be able to get away with more processes running at once but I'd recommend starting smaller before you bump it up too high and risk crippling your system.

Using PowerShell V3 Workflow

With PowerShell V3 we now have the ability to create a workflow within which I can specify tasks that should be run in parallel. I actually detailed how to do this in an earlier post so I won't spend much time on it here. I do want to show the code again for the sake of comparison as well as to point out one core difference (I recommend you read the Workflow section of my aforementioned post for more details). First though, here's a slightly modified version of the code so you can compare it to the V2 equivalent:

workflow Enable-SPFeatureInParallel {
    param([string[]]$urls, [string]$feature)

    foreach -parallel ($url in $urls) {
        InlineScript {
            # Write-Host doesn't work within a workflow
            Write-Output "BEGIN: $(Get-Date) Processing $($using:url)..."
            # We're in a new process so load the snap-in
            Add-PSSnapin Microsoft.SharePoint.PowerShell
            # Enable the custom feature
            Enable-SPFeature -Identity $using:feature -Url $using:url
            Write-Output "END: $(Get-Date) Processing $($using:url)."
        }
    }
}
Enable-SPFeatureInParallel (Get-SPSite -Limit All).Url "MyCustomFeature"

The first thing you should be asking yourself when you look at this is: how many items will be processed simultaneously? With the V2 version we could set the limit to whatever arbitrary value made sense for our situation. With this approach, however, we're limited to only five concurrent processes. You can see this if you run the code and open up Task Manager where, as with the Start-Job approach, you'll see a powershell.exe process for each item being processed (note that it's not the workflow that creates the powershell.exe processes, it's the call to the InlineScript activity; the InlineScript call just makes it easy to see that you'll never have more than five at once).


So, even though we're limited in the number of processes, and there are some downsides in how we output information (we can't use Write-Host, and output generated by one run can be intermixed with output from another), I think the V3 approach is much cleaner and easier to use. That said, you could make the Start-Job approach generic so that you pass in a script block to run along with an array of input values, letting it be reused without having to look at the details of what's happening.
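As a sketch of that generic version (a rough outline, not production code; Invoke-ParallelJobs and its parameter names are my own invention, shown here with plain values so it runs without SharePoint):

```powershell
# A reusable Start-Job throttler. Note it operates on all jobs in the
# session (Get-Job), so run it in a session with no other background jobs.
function Invoke-ParallelJobs {
    param(
        [object[]]$InputValues,
        [scriptblock]$ScriptBlock,
        [int]$JobThreshold = 10
    )
    foreach ($value in $InputValues) {
        # Throttle: block until we're under the job limit
        while (@(Get-Job -State Running).Count -ge $JobThreshold) {
            Get-Job -State Running | Wait-Job -Any | Out-Null
        }
        Start-Job -InputObject $value -ScriptBlock $ScriptBlock | Out-Null

        # Drain and remove any jobs that have already finished
        Get-Job -State Completed | Receive-Job
        Get-Job -State Completed | Remove-Job
    }
    # Wait for the stragglers and collect their output
    Get-Job | Wait-Job | Receive-Job
    Get-Job | Remove-Job
}

# Example: double some numbers in parallel, two jobs at a time
Invoke-ParallelJobs -InputValues 1, 2, 3 -JobThreshold 2 -ScriptBlock {
    $value = $input | %{ $_ }
    $value * 2
}
```

For the Feature activation scenario you'd pass (Get-SPSite -Limit All).Url as the input values and the Enable-SPFeature script block from the earlier example as the script.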


Provisioning Search on SharePoint 2013 Foundation Using PowerShell

There was recently a Twitter conversation between @cacallahan, @toddklindt, and @brianlala discussing whether it is possible to provision Search on SharePoint Foundation, and somewhere during the conversation it was suggested that I might know how to do this (sorry guys for not responding immediately). Unfortunately, I hadn’t actually done any work with SharePoint 2013 Foundation yet, so I didn’t know the answer (I knew there were issues and suspected a workaround was possible, but I didn’t have a server built to test anything). Well, last night and today I managed to find some free time, so I figured I’d take a look at the problem to see if my guess about a workaround was correct.

Before I get to the results of my discovery, let’s first look at what the blocking issue is. It’s actually quite simple: the product team, for various reasons, decided that SharePoint 2013 Foundation can have only one Search Service Application and that you shouldn’t be able to modify the topology of that Service Application. This means that when you provision Search using the Farm Configuration Wizard, it creates a default topology for you in which all roles are on a single server. To enforce these rules, they made the PowerShell cmdlets refuse to provision the service application or run any method or cmdlet that would otherwise allow you to modify an existing topology (so you can’t change the topology created by the wizard). I totally get the reasoning for the restriction: if you need enterprise-scale topologies, then pony up the money and get off the free stuff. That said, I think they took the lazy way out by simply blocking the cmdlets when they could have easily put in other restrictions that would have achieved their goals while still allowing users to use PowerShell to provision the environment.

If you’re curious as to what happens when you try to provision the service using PowerShell on SharePoint Foundation 2013 here’s a screenshot which shows the error that is thrown:


This error is thrown by a simple piece of code in the InternalValidate() method of the cmdlet which checks to make sure you are on Standard or Enterprise before allowing the cmdlet to execute (and any other cmdlets or methods that would otherwise affect the topology likewise perform this check).

To solve the problem I decided to start from the perspective of code run via the browser and drill down to see what I could find. Using Reflector, I located the class and associated methods called by the Farm Configuration Wizard; this quickly led me to the public Microsoft.Office.Server.Search.Administration.SearchService.CreateApplication() static methods. I did a quick test calling one of these methods and was happy to find that the Search Service Application created perfectly, though with one minor problem: the topology was empty. At first glance I figured this wouldn’t be an issue; I could simply clone the topology and add my components. Unfortunately, this is where I learned that they applied the SKU check to the methods and cmdlets that manipulate the topology. (On a side note, using these methods on Standard or Enterprise is potentially a great alternative to the New-SPEnterpriseSearchServiceApplication cmdlet: it lets you specify the names of databases that you can’t specify via the cmdlet; because it creates an initially empty topology there’s less cleanup and manipulation of the cloned topology, assuming you don’t want what the cmdlet creates; and it provisions slightly faster because it does less.) So at this point I figured I’d hit the real roadblock: I could create the service application, but it was useless because I couldn’t manipulate it.

This left me with only one option: use reflection to call the internal method that the Farm Configuration Wizard calls to provision the service application. Now, before I get to the code that demonstrates how to do this, I need to share a word of caution: using reflection to call internal methods is totally not supported. So what does this mean? Will Microsoft no longer support your Farm? My understanding (and folks in the know, please correct me if I’m wrong) is that Microsoft will continue to support you; you will simply have to remove unsupported code before they will help you troubleshoot issues. In this case it’s a one-time operation, so there’s nothing really to remove; I figure the worst-case scenario is that they’ll tell you to recreate the service application using the Farm Configuration Wizard and then help you with your issue. But let’s take the question of supportability out of the equation for a second and look at it from a completely practical standpoint: if you were to look at the code that the Farm Configuration Wizard calls, you’d see that, outside of some error checking, data validation, and variable initialization, there are effectively just two lines of code that provision the service. I believe the probability of getting it wrong is pretty low, and the fact is Search will either work or it won’t, so if it doesn’t work then try again or just use the dang wizard. So, with all that said, if you decide to use any of this code you need to weigh the risks yourself and make an informed decision with those risks in mind. Alright, enough of that; you want to see the code, so let’s get to it.

To keep the PowerShell itself nice and simple, I decided to derive this example from a script that Todd Klindt provides on his blog (the script I actually use is considerably more complex, as it handles changing service options like the index folder and the service and crawl accounts, to name a few, and I don’t want the point of this post to be lost in all those details). Just to make sure the full chain of credit is provided, I should note that Todd’s script is itself a derivative of what Spence Harbar provides on his blog, but I wanted to reference Todd’s post specifically as it’s a bit shorter and more focused on this topic. Okay: background info, check; disclaimer, check; attribution, check. Looks like it’s time for some code, so here you go:

#Start the service instances
Start-SPEnterpriseSearchServiceInstance $env:computername
Start-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance $env:computername

#Provide a unique name for the service application
$serviceAppName = "Search Service Application"

#Get the application pools to use (make sure you change the value for your environment)
$svcPool = Get-SPServiceApplicationPool "SharePoint Services App Pool"
$adminPool = Get-SPServiceApplicationPool "SharePoint Services App Pool"

#Get the service from the service instance so we can call a method on it
$searchServiceInstance = Get-SPEnterpriseSearchServiceInstanceLocal
$searchService = $searchServiceInstance.Service

#Use reflection to provision the default topology just as the wizard would
$bindings = @("InvokeMethod", "NonPublic", "Instance")
$types = @([string], [Type], [Microsoft.SharePoint.Administration.SPIisWebServiceApplicationPool], [Microsoft.SharePoint.Administration.SPIisWebServiceApplicationPool])
$values = @($serviceAppName, [Microsoft.Office.Server.Search.Administration.SearchServiceApplication], [Microsoft.SharePoint.Administration.SPIisWebServiceApplicationPool]$svcPool, [Microsoft.SharePoint.Administration.SPIisWebServiceApplicationPool]$adminPool)
$methodInfo = $searchService.GetType().GetMethod("CreateApplicationWithDefaultTopology", $bindings, $null, $types, $null)
$searchServiceApp = $methodInfo.Invoke($searchService, $values)

#Create the search service application proxy (we get to use the cmdlet for this!)
$searchProxy = New-SPEnterpriseSearchServiceApplicationProxy -Name "$serviceAppName Proxy" -SearchApplication $searchServiceApp

#Provision the search service application
$searchServiceApp.Provision()

Basically there are two things that need to be done. First, we use reflection to get the MethodInfo object for the CreateApplicationWithDefaultTopology() method of the Microsoft.Office.Server.Search.Administration.SearchService class and use that object to invoke the actual method, passing in the parameter types and values (and yes, the cast of the SPIisWebServiceApplicationPool objects is necessary; otherwise you’ll get an error about trying to convert PSObjects to SPIisWebServiceApplicationPool types). Second, after the service application is created, we create the service application proxy and then call the Provision() method on the search service application that we previously created (if you miss this step you’ll get errors about things like the admin component not being started and whatnot).
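The GetMethod-plus-Invoke pattern itself is general purpose, so if it's new to you, here's a minimal illustration that runs anywhere (no SharePoint required; the type and method below are just stand-ins, resolving a public method instead of an internal one):

```powershell
# Use the same five-argument GetMethod() overload as above to resolve a
# specific overload of String.Contains by its parameter types
$bindings = [System.Reflection.BindingFlags]"Public,Instance"
$types = [type[]]@([string])
$methodInfo = [string].GetMethod("Contains", $bindings, $null, $types, $null)

# Invoke() takes the target instance followed by an array of argument values
$methodInfo.Invoke("PowerShell", @("Shell"))   # returns True
```

The search script does exactly this, except with "NonPublic" in the binding flags, the SearchService instance as the target, and the service application name, type, and application pools as the arguments.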

Once completed you’ll get a fully functional, PowerShell provisioned search service application. If you navigate to the search administration page you should see something that looks just like this (just like if you used the wizard):


So there you have it – it is indeed possible to provision the service using PowerShell – I’ll let you determine whether you should or not :)

Happy PowerShelling!