infopath, sharepoint

InfoPath vs SharePoint 2013 Web Services

If, like me, you have spent the past couple of hours struggling to get past 401 Unauthorized errors when attempting to call SharePoint web services from InfoPath, take a look at the following blog post from Susan Hernandez.

There are a lot of crazy posts out there explaining that InfoPath doesn’t understand the fully claims-based model of authentication that comes as standard in SharePoint 2013. Some of them also go on to describe some fairly dubious approaches to getting round the problem. Susan has done a fantastic post that explains how you can use a combination of a Secure Store entry and a data connection file to get yourself authenticated with SharePoint 2013 web services. The approach described seems far more sensible than some of the others:

https://suehernandez.wordpress.com/2013/10/11/sharepoint-2013-infopath-claims-getuserprofilebyname

I haven’t fully tested the ins and outs of this approach. I’m not entirely sure if it can be used to call a web service where the calling user’s identity is important to the service result.

For web service calls where the caller’s identity is not relevant, this approach did the trick for me.
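For reference, the moving part that makes this work is the authentication element in the .udcx data connection file, which points the connection at a Secure Store target application. A minimal sketch of that fragment – the AppId value and credential type are placeholders for whatever you configure in Secure Store:

<!-- Fragment of a UDC (.udcx) data connection file. The udc:Authentication
     element tells SharePoint to pull credentials from the Secure Store.
     "MySecureStoreAppId" is a placeholder for your target application ID. -->
<udc:Authentication>
  <udc:SSO AppId="MySecureStoreAppId" CredentialType="NTLM" />
</udc:Authentication>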

app fabric, sharepoint

Fix: Faulting application name: DistributedCacheService.exe

You might start getting issues with your SharePoint farm’s Distributed Cache if you resize a VM in the cache cluster. We recently upped the VM memory and core count for a couple of machines that were acting as SharePoint 2013 front ends. As soon as we did this, the AppFabric service would crash immediately on startup, throwing errors such as:

Faulting application name: DistributedCacheService.exe, version: 1.0.4632.0, time stamp: 0x4eafeccf
Faulting module name: KERNELBASE.dll, version: 6.3.9600.17415, time stamp: 0x54505737
Exception code: 0xe0434352
Fault offset: 0x0000000000008b9c

Or the following:

AppFabric Caching service crashed with exception {System.ArgumentException: An entry with the same key already exists.
at System.Collections.Generic.TreeSet`1.AddIfNotPresent(T item)
at System.Collections.Generic.SortedDictionary`2.Add(TKey key, TValue value)

I’m not going to relist the solution here, but the answer in our case was to export and slightly modify the cluster configuration as described by this life saver of a post:

http://codebender.denniland.com/sharepoint-server-2013-issue-appfabric-distributed-cache-service-crashes/
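For context only, the fix follows the standard AppFabric export/edit/import cycle. A rough sketch, run from an elevated PowerShell session on one of the cache hosts – the file path is a placeholder, and the actual XML edits are the ones described in the post above:

# Rough sketch of the AppFabric export/edit/import cycle - see the linked
# post for the actual XML changes to make
Import-Module DistributedCacheAdministration
Use-CacheCluster

# Dump the current cluster config to an XML file you can edit by hand
Export-CacheClusterConfig "C:\temp\cacheclusterconfig.xml"

# The cluster must be stopped before the modified config can be imported
Stop-CacheCluster

# ...edit the XML as described in the linked post...
Import-CacheClusterConfig "C:\temp\cacheclusterconfig.xml"

Start-CacheCluster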

Like the original poster, I’m still not quite clear on why the service crashed or why tweaking those seemingly innocuous parts of the config would suddenly bring it back to life.

I will be bearing this in mind though, as it seems there is a small “tax” to be paid whenever you change certain types of virtual hardware on a machine running the AppFabric service.

.net, iis, sharepoint

Microsoft SharePoint is not supported in 32-bit process. Please verify that you are running in a 64-bit executable.

You might get the following error if you attempt to create a 32-bit application that references the 64-bit SharePoint DLLs such as Microsoft.SharePoint.dll:

Microsoft SharePoint is not supported in 32-bit process. Please verify that you are running in a 64-bit executable.

In a console or Windows application the fix is fairly straightforward – just make sure your Visual Studio project targets Any CPU, or force it to have a Platform Target of x64.
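For reference, the equivalent change in the .csproj is a one-line MSBuild property (shown in its own PropertyGroup for brevity – it normally lives inside each build configuration’s PropertyGroup):

<!-- Forces 64-bit output so the 64-bit SharePoint assemblies can load -->
<PropertyGroup>
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>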

Resolving this during Visual Studio development with IIS Express is a little more challenging. As far as I can tell, IIS Express runs as a 32-bit process by default. This means it really won’t like them there 64-bit SharePoint DLLs.

The fix is to force IIS Express to run as a 64 bit process by making sure the following registry key is set to 1:

(Screenshot: IIS Express as 64 bit)
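In case the screenshot doesn’t survive: as far as I can tell the value in question is Use64BitIISExpress under your Visual Studio version’s WebProjects key. A hedged one-liner to set it – the “12.0” assumes Visual Studio 2013, so adjust for your version:

# Assumes VS 2013 ("12.0") - adjust the version number for your install
New-ItemProperty -Path "HKCU:\Software\Microsoft\VisualStudio\12.0\WebProjects" -Name "Use64BitIISExpress" -PropertyType DWord -Value 1 -Force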

I hope that saves someone the 30 mins I just spent!

sharepoint, sharepoint 2013

Recursively Copy Document Libraries from SharePoint 2010 to SharePoint 2013

We’ve recently been looking at migrating quite a large amount of content from our legacy SharePoint 2010 Farm to a new clean SharePoint 2013 installation. For a variety of reasons we didn’t want to go the whole content database->site collection upgrade approach, not least of all because we really wanted a clean break from what has been quite a naughty SharePoint 2010 farm. The thing is creaking, it hates all forms of human life and is quite possibly haunted. Because of this we wanted to bring the binary file data across and essentially nothing else.

Migrating documents around would seem to be one of those things that an Enterprise-class document management platform should have nailed. However, it turns out it’s not quite as straightforward as you might think.

Perhaps I’m asking too much, but there didn’t seem to be any simple way of saying, “recursively copy all that, and place it there”.

I came across scores of roll-your-own scripts, C# apps and seriously expensive bits of software that would charge me by the gigabyte. For various reasons most of the approaches I tried didn’t pan out. We either had to spend a ton of cash on a pro piece of software to (presumably) do what we needed, or spend a lot of time writing our own recursive code in either PowerShell or C# and work around the fact that there is no real recursive copy function in the SharePoint APIs – at least not one that I found.

The approach I ended up going with is undoubtedly brute force, but it has been extremely effective. The process is essentially a PowerShell script that does the following:

  1. Loop through all the target document libraries at the source site
  2. Dynamically create a *mapped drive* to both the Source and Destination locations
  3. Robocopy the files between the two mapped folders – taking advantage of the fact that robocopy doesn’t care that this is SharePoint behind the scenes and handles recursive copy like a walk in the park
  4. Wonder why that wasn’t considerably easier

There is one caveat here – because we were rearranging our library structure, I had already pre-created the destination document libraries. You may well not be in this scenario, in which case you will need to tweak the script to create a document library at the correct location as you go. This would probably be a one-line change, sketched below.
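A rough idea of what that tweak might look like, using the server object model and the variable names from the script below (hypothetical – test it before trusting it), placed at the top of the loop:

# Hypothetical addition for the top of the loop: create a matching document
# library at the destination if one doesn't already exist
if ($destinationWeb.Lists.TryGetList($currentSourceLibrary.Title) -eq $null) {
    $destinationWeb.Lists.Add($currentSourceLibrary.Title, $currentSourceLibrary.Description, [Microsoft.SharePoint.SPListTemplateType]::DocumentLibrary)
}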

The following script will need to be tweaked to get your source and destination library locations lining up. It won’t work right off the bat, but I did want to provide it as a sample to demonstrate that PowerShell + robocopy can be used to migrate a large amount of content, as it took me waaaayy too long to get to this point (thanks MS).


cls

# Requires the SharePoint snap-in - run from the SharePoint 2013 Management
# Shell, or add the snap-in manually as below
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$sourceWeb = Get-SPWeb "http://mysourcelocation"
$destinationWeb = Get-SPWeb "http://mydestinationsite"

Write-Host "Connected to $sourceWeb"
Write-Host "Connected to $destinationWeb"

# Only process document libraries - $sourceWeb.Lists on its own would also
# return ordinary lists that we don't want to copy
$sourceLibraries = $sourceWeb.Lists | Where-Object { $_.BaseType -eq "DocumentLibrary" }

Write-Host "$($sourceLibraries.Count) document libraries found at source location"

$processedCounter = 1

foreach($currentSourceLibrary in $sourceLibraries){

	Write-Host "Mapping source library to S drive"

	$currentSourceDriveMapping = "http://mysourcelocation" + $currentSourceLibrary.RootFolder.ServerRelativeUrl

	Write-Host $currentSourceDriveMapping

	# net use will create a mapped drive using the SharePoint location provided
	net use s: "$currentSourceDriveMapping"

	Write-Host "Mapping destination library to T drive"

	# NOTE: Voodoo here that won't apply to you - this made sense in my
	# environment due to restructuring that we were doing
	$currentDestinationDriveMapping = $destinationWeb.Url + "/" + $currentSourceLibrary.Description + "/Tech Notes"

	Write-Host $currentDestinationDriveMapping

	net use t: "$currentDestinationDriveMapping"

	Write-Host "Copying files with robocopy"

	# Robocopy all folders apart from the SharePoint-specific Forms folder at
	# the root of the source library. Robocopy is ideal for this as it already
	# implicitly handles recursive copying (/e includes subdirectories, even
	# empty ones; /xd excludes the named directory)
	robocopy S: T: /e /xd s:\Forms

	Write-Host "Releasing mapped drives"
	net use s: /delete
	net use t: /delete

	Write-Host "$processedCounter libraries processed successfully"

	$processedCounter++

}

I’ve used this script to migrate about 25GB worth of content in 2 hours. It’s not lightning fast, but it got the job done.

There are likely numerous caveats that may apply to you but didn’t to me. Amongst them is the fact that any relevant permissions, version history and properties will almost certainly not be copied over.

If you just need a fairly straight recursive dump of content though, this may give you some pointers.

sharepoint

We’re having a problem opening this location in File Explorer

If you get the following on a SharePoint 2013 install when trying to use the “Open in Explorer” feature from the server:

We’re having a problem opening this location in File Explorer. Add this web site to your Trusted Sites list and try again

You may need to add the “Desktop Experience” feature as this is not installed on a server OS by default.

You can follow the instructions below:

http://blogs.technet.com/b/rmilne/archive/2013/07/11/install-desktop-experience-on-windows-server-2012.aspx
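In short, on Windows Server 2012 it boils down to a single feature install and a reboot – something like:

# Installs the Desktop Experience feature on Server 2012 / 2012 R2 and restarts
Install-WindowsFeature Desktop-Experience -Restart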

It should go without saying that you probably don’t want to do this on a production server, but you may need/want to on a development box.

azure, sharepoint

Creating a SharePoint 2013 Farm in Azure in Under 15 Minutes

This is going to be a very quick post that I hope helps a few people who want to be able to spin up a new SharePoint 2013 farm without it becoming an almighty ordeal involving SQL Servers, Domain Controllers and dodgy PowerShell scripts downloaded from the internet that don’t actually work 😉

One of the coolest features of the new Azure portal Microsoft has been trialing is the ability to generate entire multi-server environments, preconfigured from a template. Unless I’d missed it, this wasn’t a feature of the old Azure portal – at least not as far as SharePoint was concerned. We could absolutely spin up individual machines quite quickly, but they were very much isolated, and all the networking, DC configuration and product installation needed to be done by hand. This wasn’t much fun and pretty much missed out on a huge potential win for IaaS.

Check out the new portal here: portal.azure.com

I’ll let you figure out the UI. It took a bit of clicking around but I now much prefer it to the old portal.

You want to find the new “SharePoint Server Farm” option. This simple menu item belies an incredible time saving in setting up a typical SharePoint OnPrem instance for test or demo purposes.


The only thing I want to highlight specifically is the “Enable High Availability” option. That option is both very cool and very expensive. Tick it and it will create a fully resilient 9-server farm in under 15 minutes, complete with multiple domain controllers, front ends and SQL servers. I’m not shitting you – just leave your credit card behind the bar though.

Leave it unticked and you will get the still perfectly serviceable but much more cost-effective 3-server option: SharePoint front end + DC + SQL.

Whip through the wizard and you will very quickly have a full SharePoint 2013 setup that is fully accessible via the internet. The DC, DNS, SQL Server and Private/Public networking will be preconfigured for you and both Central Admin and a single web app will be waiting for you when you log in for the first time.

A couple of tips if you are trying it for the first time:

  • Some of the machine sizings seem a little off to me. I’d encourage you to go through the wizard options carefully and tweak the size of the machines to what you actually need – some seemed a little small, whilst others seemed overspec’d. Choose carefully as it will have cost implications.
  • The first run through of the wizard actually failed for me – something to do with a timeout when creating the SQL Server. I got the impression there is a lot of Azure PowerShell flying about to create and configure these machines. Perseverance paid off, however: once I had nuked the first set of machines, it worked like a charm on the second go.

Really looking forward to more complex templates coming to Azure in the future!

approval, nintex, sharepoint

Resending Nintex Approval Emails

We’ve had a couple of situations recently where our Nintex approval emails have gone AWOL and never made it to their intended recipient. Because these approvals can relate to important and time-sensitive activities, it’s important to keep the approval mechanism running smoothly.

I’m not a Nintex expert, but as far as I could tell there is no built-in way to resend the approval request email. As a slight workaround, however, you can use the Delegate feature shown below to resend the email by “delegating” the workflow task to the user that already owns it.

Note there is no need to delegate the workflow task to yourself and then back to the intended approver.

(Screenshot: Delegate Approval)