infopath, sharepoint

InfoPath vs SharePoint 2013 Web Services

If, like me, you have spent the past couple of hours struggling to get past 401 Unauthorized errors when attempting to call SharePoint web services via InfoPath, take a look at the following blog post from Susan Hernandez.

There are a lot of crazy posts out there explaining that InfoPath doesn’t understand the fully claims-based model of authentication that comes as standard in SharePoint 2013. Some of them also go on to describe some fairly dubious approaches to getting round the problem. Susan has done a fantastic post that explains how you can use a combination of a Secure Store entry and a data connection file to get yourself authenticated with SharePoint 2013 web services. The approach described seems far more sensible than some of the others:

I haven’t fully tested the ins and outs of this approach. I’m not entirely sure if it can be used to call a web service where the calling user’s identity is important to the service result.

For web service calls where the caller’s identity is not relevant, though, this approach did the trick for me.
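For reference, the Secure Store side of this is wired up through the authentication section of a Universal Data Connection (.udcx) file. A minimal sketch of the relevant element, assuming a hypothetical Secure Store target application ID of `ContosoSSO` (your application ID and credential type will differ):

```xml
<!-- Inside the udc:ConnectionInfo element of the .udcx data connection file -->
<udc:Authentication>
  <!-- AppId is the Secure Store target application ID; ContosoSSO is a placeholder -->
  <udc:SSO AppId="ContosoSSO" CredentialType="NTLM" />
</udc:Authentication>
```

With this in place, InfoPath retrieves the stored credentials from the Secure Store rather than attempting to delegate the caller’s claims identity to the web service.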

sharepoint, sharepoint 2013

SharePoint 2013: New Document and Edit Document Buttons Not Working

Hi all

If you get a situation in a SharePoint 2013 document library where your New Document or Edit Document buttons seem to do nothing, check if you have the Google Toolbar installed.

In my case, even though I stopped using the toolbar quite a while ago, it was still there. After disabling the toolbar, the document library buttons started behaving themselves again.

It’s possible the issue is down to the popup blocker in the Google Toolbar, so if you have your heart set on keeping the toolbar, you may want to have a poke around those settings.

Hope that helps


app fabric, sharepoint

Fix: Faulting application name: DistributedCacheService.exe

You might start getting issues with your SharePoint farm’s Distributed Cache if you resize a VM in the Cache cluster. We recently upped the VM memory and core count for a couple of machines that were acting as SharePoint 2013 front ends. As soon as we did this the App Fabric Service would crash immediately on start up throwing errors such as:

Faulting application name: DistributedCacheService.exe, version: 1.0.4632.0, time stamp: 0x4eafeccf
Faulting module name: KERNELBASE.dll, version: 6.3.9600.17415, time stamp: 0x54505737
Exception code: 0xe0434352
Fault offset: 0x0000000000008b9c

Or the following:

AppFabric Caching service crashed with exception {System.ArgumentException: An entry with the same key already exists.
at System.Collections.Generic.TreeSet`1.AddIfNotPresent(T item)
at System.Collections.Generic.SortedDictionary`2.Add(TKey key, TValue value)

I’m not going to relist the solution here, but the answer in our case was to export and slightly modify the cluster configuration as described by this life saver of a post:
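The general shape of the fix was to round-trip the cluster configuration through an XML file. A rough sketch of the AppFabric PowerShell involved, from memory (file path is a placeholder, and you should verify the exact cmdlet parameters against the AppFabric Caching cmdlet reference before running anything on a live cluster):

```powershell
# Run in the Caching Administration Windows PowerShell console on a cache host
Use-CacheCluster
Stop-CacheCluster

# Export the current cluster configuration to a file, edit it, then re-import it
Export-CacheClusterConfig -Path "C:\Temp\cacheclusterconfig.xml"
# ... hand-edit the exported XML as described in the linked post ...
Import-CacheClusterConfig -Path "C:\Temp\cacheclusterconfig.xml"

Start-CacheCluster
```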

Like the original poster, I’m still not quite clear on why the service crashed or why tweaking those seemingly innocuous parts of the config would suddenly bring it back to life.

I will be bearing this in mind though, as it seems there is a small “tax” to be paid whenever you change certain types of virtual hardware on a machine running the App Fabric Service.

.net, iis, sharepoint

Microsoft SharePoint is not supported in 32-bit process. Please verify that you are running in a 64-bit executable.

You might get the following error if you attempt to create a 32-bit application that references the 64-bit SharePoint DLLs, such as Microsoft.SharePoint.dll:

Microsoft SharePoint is not supported in 32-bit process. Please verify that you are running in a 64-bit executable.

In a console or Windows application the fix for this is fairly straightforward – just make sure your Visual Studio project targets Any CPU or force it to have a Platform Target of x64.
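In MSBuild terms, that amounts to setting the platform target property in the project file. A minimal fragment (property group conditions will vary by project):

```xml
<!-- In the .csproj, inside the relevant PropertyGroup -->
<PropertyGroup>
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>
```

If you stay on Any CPU instead, make sure the “Prefer 32-bit” option is unchecked, or the process will still load as 32-bit on a 64-bit OS.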

Resolving this during Visual Studio development with IIS Express is a little more challenging. As far as I can tell, IIS Express runs as a 32-bit process by default. This means it really won’t like them there 64-bit SharePoint DLLs.

The fix is to force IIS Express to run as a 64 bit process by making sure the following registry key is set to 1:

IIS Express as 64 bit
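For reference, the setting in question is the `Use64BitIISExpress` value under the Visual Studio web projects key. A sketch as a .reg fragment, assuming Visual Studio 2013 (version 12.0 – adjust the version number for your install):

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\12.0\WebProjects]
"Use64BitIISExpress"=dword:00000001
```

Restart Visual Studio after setting this so IIS Express is relaunched as a 64-bit process.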

I hope that saves someone the 30 mins I just spent!

sharepoint, sharepoint 2013

Recursively Copy Document Libraries from SharePoint 2010 to SharePoint 2013

We’ve recently been looking at migrating quite a large amount of content from our legacy SharePoint 2010 Farm to a new clean SharePoint 2013 installation. For a variety of reasons we didn’t want to go the whole content database->site collection upgrade approach, not least of all because we really wanted a clean break from what has been quite a naughty SharePoint 2010 farm. The thing is creaking, it hates all forms of human life and is quite possibly haunted. Because of this we wanted to bring the binary file data across and essentially nothing else.

Migrating documents around would seem to be one of those things that an Enterprise-class document management platform should have nailed. However, it turns out it’s not quite as straightforward as you might think.

Perhaps I’m asking too much, but there didn’t seem to be any simple way of saying, “recursively copy all that, and place it there”.

I came across scores of roll-your-own scripts, C# apps and seriously expensive bits of software that would charge me by the gigabyte. For various reasons most of the approaches I tried didn’t pan out. We either had to spend a ton of cash on a pro piece of software to (presumably) do what we needed, or we could spend a lot of time writing our own recursive code via either PowerShell or C# and attempt to work around the fact that there is no real recursive copy function in the SharePoint APIs – at least not one that I found.

The approach I ended up going with is undoubtedly brute force, but has been extremely effective. The process is essentially a PowerShell script that does the following:

  1. Loop through all the target document libraries at the source site
  2. Dynamically create a *mapped drive* to both the Source and Destination locations
  3. Robocopy the files between the two mapped folders – taking advantage of the fact that robocopy doesn’t care that this is SharePoint behind the scenes and handles recursive copy like a walk in the park
  4. Wonder why that wasn’t considerably easier

There is one caveat here – because we were rearranging our library structure, I had already pre-created the destination document libraries. You may well not be in this scenario, in which case you will need to tweak the script to potentially create a document library at the correct location as you go. This would probably be a one line change.
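If you do need to create the destination libraries on the fly, a sketch of that one-line change (using the server object model’s `SPListCollection.Add` overload; the naming convention here is an assumption – I had pre-created mine):

```powershell
# Create the destination library if it doesn't already exist
$libraryName = $currentSourceLibrary.Title
if ($destinationWeb.Lists.TryGetList($libraryName) -eq $null) {
    # Add(title, description, template) returns the new list's Guid; discard it
    $destinationWeb.Lists.Add($libraryName, "",
        [Microsoft.SharePoint.SPListTemplateType]::DocumentLibrary) | Out-Null
}
```

This would slot in at the top of the foreach loop, before the destination drive mapping is created.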

The following script will need to be tweaked to get your source and destination library locations lining up. It won’t work right off the bat, but I did want to provide it as a sample to demonstrate that PowerShell + robocopy can be used to migrate a large amount of content, as it took me waaaayy too long to get to this point (thanks MS).


$currentSourceDriveMapping = ""
$currentDestinationDriveMapping = ""

$sourceWeb = Get-SPWeb "http://mysourcelocation"
$destinationWeb = Get-SPWeb "http://mydestinationsite"

Write-Host "Connected to $sourceWeb"
Write-Host "Connected to $destinationWeb"

$sourceLibraries = $sourceWeb.Lists

Write-Host "$($sourceLibraries.Count) document libraries found at source location"

$processedCounter = 1

foreach($currentSourceLibrary in $sourceLibraries){
	Write-Host "Mapping source library to S drive"
	$currentSourceDriveMapping = "http://mysourcelocation" + $currentSourceLibrary.RootFolder.ServerRelativeUrl
	Write-Host $currentSourceDriveMapping
	# net use will create a mapped drive using the SharePoint location provided
	net use s: "$currentSourceDriveMapping"
	Write-Host "Mapping destination library to T drive"
	# NOTE: Voodoo here that won't apply to you - this made sense in my environment due to restructuring that we were doing
	$currentDestinationDriveMapping = $destinationWeb.Url + "/" + $currentSourceLibrary.Description + "/Tech Notes"
	Write-Host $currentDestinationDriveMapping
	net use t: "$currentDestinationDriveMapping"
	# Robocopy all folders apart from the SharePoint-specific Forms folder at the root of the source library
	# Note that robocopy is ideal for this as it already implicitly handles recursive copying
	robocopy S: T: /e /xd s:\Forms
	Write-Host "Releasing mapped drives"
	net use s: /delete
	net use t: /delete
	Write-Host "$processedCounter libraries processed successfully"
	$processedCounter++
}

I’ve used this script to migrate about 25GB worth of content in 2 hours. It’s not lightning fast, but it got the job done.

There are likely numerous caveats that may apply to you but didn’t to me. Amongst them would be the fact that any relevant permissions, version history and properties will almost certainly not be copied over.

If you just need a fairly straight recursive dump of content, though, this may give you some pointers.


We’re having a problem opening this location in File Explorer

If you get the following on a SharePoint 2013 install when trying to use the “Open in Explorer” feature from the server:

We’re having a problem opening this location in File Explorer. Add this web site to your Trusted Sites list and try again

You may need to add the “Desktop Experience” feature as this is not installed on a server OS by default.

You can follow the instructions below:
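On Windows Server 2012 / 2012 R2 the feature can also be added from PowerShell rather than through Server Manager (a reboot is required, hence the flag below):

```powershell
# Installs the Desktop Experience feature; the server will restart to complete it
Install-WindowsFeature Desktop-Experience -Restart
```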

It should go without saying that you probably don’t want to do this on a production server, but you may need/want to on a development box.


Hang whilst configuring farm in SharePoint 2013

I’ve recently been getting a new production SharePoint 2013 farm set up at work. The following cost me a couple of hours so I thought I’d share in case it saves anyone else some time.


  • Provisioning a new Central Admin Site, either via PowerShell or the SharePoint Configuration Wizard, seems to hang. The SharePoint Configuration Wizard will tend to hang at Step 3
  • There are no obvious error messages but nothing seems to be happening – low CPU/Mem/Disk usage on all servers
  • You may find the following placed in the SharePoint logs over and over again:
    • “Not running in high contrast, so we will paint the background with our trademarked image”

This was actually being caused by the Behavior Monitoring feature in our Sophos Antivirus. We had followed the MS guidance on excluding various directories from the virus scans and thought we had it covered. Not so – Sophos was silently crippling performance through its behavior analysis module.

I’m not certain, but there was some anecdotal evidence from watching some SQL Server activity that Sophos was limiting the number of simultaneous connections the SharePoint Configuration Wizard was being allowed to open at one time.

With Sophos on, only 2-3 queries would complete per second on the SQL Server. With Sophos uninstalled, performance jumped to tens or hundreds of queries per second. If you are finding similar behaviour, where query throughput during large portions of the install is in the toilet, have a look into any antivirus or threat prevention tools you might be running.

We haven’t yet experimented with just disabling the behaviour analysis feature rather than fully uninstalling Sophos.