A Clever-workaround for Saving Photos to SharePoint from PowerApps


At the time of writing, a common request for PowerApps is to be able to upload photos to SharePoint. It makes perfect sense, especially now that it’s really easy to make a PowerApp that is bound to a SharePoint list. Sadly, although Microsoft have long acknowledged the need in the PowerUsers forum, a solution has not been forthcoming.

I have looked at the various workarounds, such as using the OneDrive connector or a custom web API, but I found these fiddly. Thanks to ideas from John Liu, I’ve come up with a method that is more flexible and less fiddly to implement, provided you are okay with a bit of PowerShell, and (hopefully) with PnP PowerShell. One advantage of this method is that it handles an entire gallery of photos in a single transaction, rather than just a single photo at a time.

Now in the old days I would have meticulously planned out a multi-part series of posts on a topic like this, because I have to pull together quite a lot of conceptual threads into a single solution. But since the pace of change in the world of Office365 is so rapid, my solution may be out of date by the time I publish it. So instead I offer a single summary post of my solution and leave the rest to you to figure out. Sorry followers, it’s just too hard to do epic multi-part articles these days – times have changed.

What you need

  1. An understanding of JSON and a basic idea of web services
  2. An Azure subscription
  3. Access to Azure Functions
  4. The PnP PowerShell cmdlets
  5. A Swagger file (don’t worry if this makes no sense now)
  6. To be signed up to PowerApps

How we are going to solve this…

In a nutshell, we will create an Azure function that uses PowerShell to receive photos from PowerApps and upload them to a SharePoint library. Here is my conceptual diagram that I spent hours and hours drawing…

Snapshot

To do this we will need to do a few things.

  1. Customise PowerApps to store photo data in the way we need
  2. Create and configure our Azure function
  3. Write and test the PowerShell code to upload to SharePoint
  4. Create a Swagger file so that PowerApps can talk to our Azure function
  5. Create a custom PowerApps connection/datasource to use the Azure function
  6. Test successfully and bask in the glory of your awesomeness

Step 1: Customise PowerApps to store photo data in the way we need

Let’s set up a basic proof of concept PowerApp. We will add a camera control to take photos, a picture gallery to view the photos and a button to submit the photos to SharePoint. I’ll use the PowerApps desktop client rather than the web page for this task and create a blank app using the Phone Layout.

image

From the Insert menu, choose a Camera control from the Media dropdown to add it to the screen. Leave it up near the top…

image

From the Insert Menu, add a Gallery control. For my demo I will use the vertical gallery. Move it down below the camera control so it looks like the second image below.

image  image

From the Insert Menu, add two buttons below the gallery. Set the text property on one to “Submit” and the other to “Clear”. I realise the resulting layout will not win any design awards but just go with it. Use the picture below to guide you.

image    image

Now let’s wire up some magic. Firstly, we will set it up so clicking on the camera control will take a photo and save it to PowerApps storage. To do this we will use the Collect function. Assuming your Camera control is called “Camera1”, select it, and set the OnSelect property to:

Collect(PictureList,Camera1.Photo)

image

Now when a photo is taken, it is added to an in-memory PowerApps data-source called PictureList. To see this in action, preview the PowerApp and click the camera control a couple of times. Exit the preview and choose “Collections” from the left hand menu. You will now see the PictureList collection with the photos you just took, stored in a field called Url. (The reason it is called Url and not “Photo” will become clear later.)

image

Now let’s wire up the clear button to clear out this collection. Choose the button labelled “Clear” and set its OnSelect property to:

Clear(PictureList)

image

If you preview the app and click this button, you will see that the collection is now empty of pictures.

The next step is to wire up the Gallery to the PictureList collection so that you can see the photos being taken. To do this, select the gallery control and set the Items property to PictureList as shown below. Preview this and you should be able to take a set of photos, see them added to the gallery and be able to clear the gallery via the button.

image

image

Now we get to a task that will not necessarily make sense until later. We need to massage the PictureList collection to get it into the right format to send to SharePoint. For example, each photo needs a filename, and in a real-world scenario, we would likely further customise the gallery to capture additional information about each photo. For this post I will not do this, but I want to show you how you can manipulate data structures in PowerApps. To do this, we are going to now wire up some logic to the “Submit” button. First I will give you the code before I explain it.

Clear(SubmitData);
ForAll(PictureList,Collect(SubmitData, { filename: "a file.jpg", filebody: Url }))

image

In PowerApps, it is common to add multiple statements to controls, separated by a semicolon. Thus, the first line above initialises a new Collection I have called “SubmitData”. If this collection already had data in it, the Clear function will wipe it out. The second line uses two functions, ForAll and the previously introduced Collect. ForAll([collection],[formula]) will iterate through [collection] and perform tasks specified in [formula]. In our case we are adding records to the SubmitData collection. Each record consists of two fields and is in JSON format – hence the curly braces. The first field is called filename and the second is called filebody. In my example the filename is a fixed string, but filebody grabs the Url field from the current item in PictureList.

To see the effect, run the app, click submit and then re-examine the collections. Now you will see two collections listed – the original one that captures the photos from the camera (PictureList), and the one called SubmitData that now has a field for filename and a field called filebody with the photo. I realise that setting a static filename of “a file.jpg” is not particularly useful to anybody, and I will address this a little later; the point is we now have the data in the format we need.

image

By the way, behind the scenes, PowerApps stores the photo using the Data URI scheme. This is essentially a Base64 encoded version of the image with a descriptor at the start, and it can be embedded directly in HTML. When you think about it, this makes sense in some situations because it reduces the number of HTTP round trips between browser and server. For example, here is a small image encoded and embedded directly in HTML using this technique.

<img src="data:image/png;base64,iVBORw0KGgoAAA
ANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4
//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU
5ErkJggg==" alt="Red dot" />

The implication of this format is when PowerApps talks to our Azure function, it will send this sort of JSON…

[ {  "filename": "boo.jpg",
"filebody": "data:image/jpeg;base64,/9j/4AAQSkZJR [ snip heaps of Base64 ] T//Z" },
{  "filename": "boo3.jpg",
"filebody": "data:image/jpeg;base64,/9j/TwBAQMEBA [ snip heaps of Base64 ] m22I" } ]

In the next section we will set up an Azure Function and write the code to handle the above format, so save the app in its current state and give it a nice name. We are done with PowerApps for now…

2. Create and Configure Azure Function

Next we are going to create an Azure function. This is the bit that is likely new knowledge for many readers. You can read all about them on their Azure page, but my quick explanation is that they allow you to take a script or small piece of code and turn it into a fully fledged webservice. As you will soon see, this is very useful indeed (as well as very cost effective).

Now like many IT Pros, I am a PowerShell hacker and I have been using the PowerShell PnP libraries for all sorts of administrative purposes for quite some time. In fact if you are administering an Office365 tenant and you are not using PnP, then I can honestly say you are missing out on some amazing time-saving tools and you owe it to yourself to skill up in this area. Of course, I realise that many readers will not be familiar with PowerShell, let alone PnP, and I expect some readers have not done much coding at all. Luckily the code we are going to use is just a few lines and I think I can sufficiently explain it.

But we are getting ahead of ourselves, so let’s create the Azure function and then revisit PowerShell. Assuming you have an Azure subscription, visit functions.azure.com and log in (if not, sign up for the free account first). Then create a function app to host the function. I called mine MyFunctionsDemo but yours will have to be something different. This will take a minute or two to complete and you will be redirected to the Azure Functions portal.

image   image

Once the web application to host your functions is created, click the + next to the Functions button to create a new function. PowerShell is still in preview, so you have to click the option to create a custom function. On the next screen, in the Language dropdown, choose PowerShell.

image  image

Our function is going to be triggered from PowerApps when a user clicks the submit button. PowerApps will make an HTTP request so this is an HttpTrigger scenario. Click the HttpTrigger-PowerShell template, give it a name (I called mine PhotoSendSP) and click the Create button. If all goes to plan you will be presented with a screen with some basic PowerShell code… essentially a “Hello World” web service.

image   image

Let’s test this newly minted Azure function before we customise it. If you look to the right of the screen above, you will see a vertical “Test” label. Click it and you will be presented with a screen that allows you to craft some data to send to your shiny new function. You can see that the test is going to be an HTTP POST by default. As you can see below, there is a basic JSON entry with a single name/value pair “name”: “Azure”. Change the Azure string to something else and then click the Run button. The result will be displayed below the JSON as shown below.

image   image

Now let’s take a quick look at the PowerShell code provided to you by default. Only lines 2, 3 and 11 matter for our purposes. What lines 2 and 3 show is that all of the details posted to this webservice are stored in a variable called $req. Line 2 reads that posted JSON and converts it into an object stored in a variable called $requestbody. Line 3 then asks $requestbody for the value of “name”, which, if you look at the screenshots above, is what you set in the test. Line 11 then writes the output to $res, which is the response sent back to the caller of this webservice. In this case you can see it returns “Hello “ with $name appended to it.

image
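For reference, here is roughly what that default HttpTrigger-PowerShell template looks like. I am reproducing it from memory, so the exact wording in your portal may differ slightly, but the line numbers referred to above line up with this layout:

# POST method: $req
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$name = $requestBody.name

# GET method: each querystring parameter is its own variable
if ($req_query_name)
{
    $name = $req_query_name
}

Out-File -Encoding Ascii -FilePath $res -inputObject "Hello $name"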
Now that we have seen the PowerShell code, let’s now update it with code to receive data from PowerApps and send it to SharePoint.

3. Write and test the PowerShell code that uploads to SharePoint

If you recall with PowerApps, the data we are sending to SharePoint is one or more photos. The data will look like this…

[ {  "filename": "boo.jpg",
"filebody": "data:image/jpeg;base64,/9j/4AAQSkZJR [ snip heaps of Base64 ] T//Z" },
{  "filename": "boo3.jpg",
"filebody": "data:image/jpeg;base64,/9j/TwBDAQMEBAUEBQ  m22I" } ]

In addition, for the purposes of keeping things simple, I am going to hard code various things like the document library to save the files to and not worry about exception handling. Below is my sample code with annotations at the end…

1.  Import-Module "D:\home\site\wwwroot\modules\SharePointPnPPowerShellOnline\2.15.1705.0\SharePointPnPPowerShellOnline.psd1" -Global
2.  $requestBody = Get-Content $req -Raw | ConvertFrom-Json
3.  $username = "paul@tenant.onmicrosoft.com"
4.  $password = $env:PW;
5.  $siteUrl = "https://tenant.sharepoint.com"
6.  $secpasswd = ConvertTo-SecureString $password -AsPlainText -Force
7.  $creds = New-Object System.Management.Automation.PSCredential ($username, $secpasswd)
8.  Connect-PnPOnline -url $siteUrl -Credentials $creds
9.  $ctx = get-pnpcontext
10. $doclib = $ctx.Web.Lists.GetByTitle("Documents")
11. ForEach ($item in $requestbody)
12. {
13.    $filename = $item.filename
14.    $rawfiledata = $item.filebody
15.    $rawfiledata = $rawfiledata -replace 'data:image/jpeg;base64,', ''
16.    $bytes = [System.Convert]::FromBase64String($rawfiledata)
17.    # uses comma notation related to .net reflection http://piers7.blogspot.com.au/2010/03/3-powershell-array-gotchas.html
18.    $memoryStream = New-Object System.IO.MemoryStream (,$bytes)
19.    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
20.    $FileCreationInfo.Overwrite = $true
21.    $FileCreationInfo.ContentStream = $memoryStream
22.    $FileCreationInfo.URL = $filename
23.    $Upload = $doclib.RootFolder.Files.Add($FileCreationInfo)
24.    $ctx.Load($Upload)
25.    $ctx.ExecuteQuery()
26. }

 

  • Line 1 loads the PnP PowerShell module. Without this, commands like Connect-PnPOnline and Get-PnPContext will not work. I’ll show how this is done after explaining the rest of the code.
  • Lines 3-7 are all about connecting to my SharePoint Online tenant. Line 4 contains a variable called $env:PW. The idea here is to avoid storing passwords in the code in clear text. The password is instead pulled from an environment variable that I will show later.
  • Lines 8-10 connect to the site collection and then get a reference to the default document library within it.
  • Line 11 takes the data sent from PowerApps and loops through each image/filename pair.
  • Lines 13-18 grab the file name and file data, and convert the file data into a MemoryStream, which is a way to represent a file in memory.
  • Lines 19-25 then upload the in-memory image to the document library in SharePoint, using the filename provided. (Note for any PnP gurus wondering why I did not use Add-PnPFile: it was because this cmdlet did not properly handle the memorystream and the images were never proper binary – they always came out broken.)
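For the curious, the one-liner alternative referred to in that last point would look something like the line below, assuming your PnP build exposes the -Stream parameter set. As noted above, it did not handle the memory stream correctly in my testing, hence the CSOM approach in the code.

# The simpler approach that did not work reliably for me with in-memory streams
Add-PnPFile -FileName $filename -Folder "Shared Documents" -Stream $memoryStream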

So now that we have seen the code, let’s sort out some final configuration to make this all work. A lot of the next section I learnt from John Liu and from watching the excellent Office Patterns and Practices Special Interest Group webinar he recently did with my all-time SharePoint hero, Vesa Juvonen.

Installing PnP PowerShell Components

First up, none of this will work without the PnP PowerShell module deployed to the Azure Function App. The easiest way to do this is to install the PnP PowerShell cmdlets locally and then copy the entire installation up to the Azure Functions environment. John Liu explains this in the aforementioned webinar but in summary, the easiest way is to use the Kudu tool that comes bolted onto Azure Functions. You can find it by clicking the function app name (“MyFunctionsDemo” in my case) and choosing the “Platform Features” menu. From here you will find Kudu hiding under the Development Tools section. When the Kudu tab loads, click the Debug console menu and create a CMD or PowerShell console (it doesn’t matter which).

image  image  image

We are going to use this console to copy up the PnP PowerShell components. You can ignore the debug console and focus on the top half of the screen. This is showing you the top level folder structure for the Azure function application. Click on site and then wwwroot folders. This is the folder where all of your functions are stored (you will see a folder matching the name of the function we made in step 2). What we will do is install the PnP modules at this level, so they can be used by other functions that you are sure to develop.

image  image

So click the + icon to create a folder here and call it Modules. From here, drag and drop the PnP PowerShell install location to this folder. In my case C:\Program Files\WindowsPowerShell\Modules\SharePointPnPPowerShellOnline\2.15.1705.0. I copied the SharePointPnPPowerShellOnline\2.15.1705.0 folder and all of its content here as I want to be able to maintain multiple versions of PnP as I develop functions over time.
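As an aside, if you do not already have the PnP cmdlets installed locally, one way to grab a copy of the module folder to upload is sketched below (this assumes PowerShellGet is available on your machine, and the destination path is just an example):

# Saves a local copy of the PnP module, including its version-numbered subfolder,
# which you can then drag and drop into the "Modules" folder created in Kudu
Save-Module -Name SharePointPnPPowerShellOnline -Path C:\temp\PnPModules

# The result (C:\temp\PnPModules\SharePointPnPPowerShellOnline\<version>\...) mirrors
# the folder structure that the Import-Module line in the function expects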

image  image

Now that you have done this, the first line of the PowerShell script will make sense. Make sure you update the version number in the Import-Module command to the version of PnP you uploaded.

Import-Module "D:\home\site\wwwroot\modules\SharePointPnPPowerShellOnline\2.15.1705.0\SharePointPnPPowerShellOnline.psd1"

Handling passwords

The next thing we have to do is address the issue of passwords. This is where the $env:PW comes in on line 4 of my code. You see, when you set up an Azure Functions application, you can create your own settings that can drive the behaviour of your functions. In this case, we have made an app setting called PW in which we will store the password used to access this site collection. This keeps clear text passwords out of the code, but unfortunately it is a security-by-obscurity scenario, since anyone with access to the Azure function can review the environment variable and retrieve it. If I get time, I will revisit this via App Only Authentication and see how I go. I suspect though that this problem will get “properly” solved when Azure Functions supports the Azure Key Vault.
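Each app setting you define surfaces inside the function as an environment variable, so you could keep other environment-specific values out of the code the same way. For example (the SPUSER setting below is hypothetical and only works if you actually create a matching app setting):

$password = $env:PW       # the PW app setting created below
$username = $env:SPUSER   # a hypothetical extra setting for the account name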

In any event, you will find this under the Application Settings link in the Platform Features tab. Scroll down until you find the “App settings” section and add your password in as shown in the second image below.

image  image

Testing it out…

Right! At this point, we have all the plumbing done. Let’s test to see how it goes. First we need to create a JSON file in the required format that I explained earlier (the array of filename and filebody pairs). I crafted these by hand in Notepad as they are pretty simple. To remind you, the format was:

[ {  "filename": "boo.jpg",
"filebody": "data:image/jpeg;base64,/9j/4AAQSkZJR [ snip heaps of Base64 ] T//Z" },
{  "filename": "boo3.jpg",
"filebody": "data:image/jpeg;base64,/9j/TwBDAQMEB [ snip heaps of Base64 ] m22I" } ]

To generate the filebody elements, I used the covers of my two books (they are awesome – buy them!) and called them HG2BP and HG2M respectively. To create the Base64 encoded images in the Data URI scheme, I went to https://www.base64-image.de and generated the encoded versions. If you are lazy and want to use a pre-prepared file, just download the one I used for testing.
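If you would rather script this step than use a website, something along these lines will build the payload from local image files (a rough sketch: the paths are placeholders, and ConvertTo-Json needs PowerShell 3.0 or later):

# Build the JSON payload from a couple of local JPEG files (paths are placeholders)
$files = @("C:\temp\HG2BP.jpg", "C:\temp\HG2M.jpg")

$payload = foreach ($file in $files) {
    $bytes = [System.IO.File]::ReadAllBytes($file)
    $base64 = [System.Convert]::ToBase64String($bytes)
    [PSCustomObject]@{
        filename = [System.IO.Path]::GetFileName($file)
        filebody = "data:image/jpeg;base64,$base64"
    }
}

# Emits the array of filename/filebody pairs in the format shown above
$payload | ConvertTo-Json | Out-File "C:\temp\testpayload.json" -Encoding ascii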

h2bph2m image

To test it, all we need to do is click the right hand Test link and paste the JSON into it. Click the Run button and hope for the best! As you can see in my example below, the web service returned a 200 status which means hunky dory, and the logs showed the script executing successfully.

image

Checking my document library and they are there… woohoo!!

image
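As an aside, if you would rather test the deployed function from your own machine than from the portal, something like the following should do the trick (a sketch: the URL reflects my function app name, and the key is the default function key found under the function’s Manage menu, which we will come back to in step 6):

# Rough local test of the deployed function (values below are placeholders)
$functionUrl = "https://myfunctionsdemo.azurewebsites.net/api/PhotoSendSP"
$functionKey = "<paste the default function key from the Manage screen here>"
$json = Get-Content "C:\temp\testpayload.json" -Raw

# The key is passed via the standard "code" query string parameter
$uri = $functionUrl + "?code=" + $functionKey
Invoke-RestMethod -Uri $uri -Method Post -Body $json -ContentType "application/json"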

So basically we have a lot of the bits in place. We have proven that our Azure function can take a JSON file with encoded images, process that file and then save it to a SharePoint document library. You might be thinking that all we need to do now is to wire up PowerApps to this function? Yeah, so did I too, but little did I realise the pain I was about to endure…

4. Create a Swagger file so that PowerApps can talk to our Azure function

Now we come to the most painful part of this whole saga. We need to describe our Azure function using a standard called Swagger (or OpenAPI). This provides important metadata so that PowerApps can make our function easy for users to consume. This will make sense soon enough, but first we have to create it, which is a royal pain in the ass. I found the online documentation for Swagger to be lacking and it took me a while to understand enough of the format to get it working.

So first up, to make things simpler, let’s reduce some of the complexity. Our Azure function is a simple HTTP POST. We have not defined any other type of request, so let’s make this formal as it will generate a much less ugly Swagger definition. Expand your function and choose the “Integrate” option. On the next screen, under Triggers, you will find a drop down with the label “Allowed HTTP methods”. Change the default value to “Selected methods” and then untick all HTTP methods except for POST. Click Save.

image  image  image

Now click back to your function app, and choose “API Definition” from the top menu. This will take you to the screen where you create/define your Swagger file.

image

On the initial screen, set your API definition source to function if asked, and you should see a screen that looks somewhat like this…

image

Click the Generate API definition template button as suggested by the comment in the code box in the middle. This will generate a Swagger file, and on the right side of the screen that file will be used to generate a summary of your API. You can see the URL of your Azure app, some information about an API key (which we will deal with later) and below that, the PhotoSendSP function exposed as a webservice (/api/PhotoSendSP).

image

Now at this point you are probably thinking “okay so it’s unfamiliar, but this is pretty easy”, and you would be right. Where things got nightmarish for me was working out how to understand and customise the Swagger file, as the template is currently incomplete. All it has done is define our function (note the paths section in the above screenshot – can you see all those empty square brackets? That’s what you need to now fill in).

For the sake of brevity, I am not going to describe the ins and outs of this format (and I don’t fully know it yet anyway!). What I can tell you is that getting this right is a painful and time-consuming combination of trial and error, reading the swagger spec and testing in PowerApps. Let’s hope my hints here save you some frustration.

The first step is to create a definition for the format of data that our Azure function accepts as input. If you look closely above, you will see a section called definitions. Paste the following into that section so it looks like the screenshot below. Note: if you see any symbol apart from a benign warning message next to the “Photos:” line, then you do not have it right!

definitions:
   Photos:
      type: object
      required:
         - filename
         - filebody
      properties:
         filename:
            type: string
            example: image.jpg
         filebody:
            type: string

image

So what we have defined here is an object called Photos which consists of two required properties, filename and filebody. Both are assumed to be string format, and filename also has an example to illustrate what is expected. Depending on how this swagger file is consumed by another application that supports swagger, one can imagine that example showing up on online help or intellisense when our function is being called.

Now let’s make use of this definition. Paste this into the parameters and description sections. Note the “schema:” section. Here we have told the Swagger file that our function is expecting an array of objects based on the Photos definition that we created earlier.

parameters:
  - name: photocollection
    in: body
    description: "The encoded files"
    required: true
    schema:
       type: array
       items:
          $ref: '#/definitions/Photos'
description: "A collection of photos and filenames"

image

Finally, let’s finish off by defining that our Azure function consumes and produces its data in JSON format. Although our sample code is not producing anything back to PowerApps, you can imagine situations where we might do something like send back a JSON array of the SharePoint URLs of each photo.

produces:
- application/json
consumes:
- application/json

image

Okay, so we are all set. Now to be clear, there is a lot more to Swagger, especially if you wanted to call our function from Flow, but for now this is enough. Click the Save button, and then click the “Export to PowerApps + Flow” button. You will be presented with a new panel that explains the process we are about to follow. Feel free to read it, but the key step is to download the Swagger file we just created.

image

Okay, so if you have made it this far, you have a swagger JSON file and you are in the home stretch. Let’s now head back to PowerApps!

5. Create a custom PowerApps connection/datasource to use the Azure function

Back in PowerApps, we need to make a new custom connection to our Azure function. Click the Connections menu item and you will be redirected to web.powerapps.com. Click “Manage Custom Connections” and then click “Create a custom connector”. This will take you to a wizard.

image   image

image

The first step is to upload the Swagger file you created in step 4, and before you do anything else, rename your connector to something short and sweet, as this will make it easier to spot in the PowerApps list of connections.

image

Scroll to the bottom of the page and click the “Continue” button. Now you are presented with some security options for your connector. For context, the default for the PowerShell Azure function template we chose in step 2 is to use an API key, so you can leave all of the defaults here, although I like to set the label to the more meaningful “API Key” (this will make sense soon). Click Continue.

image

PowerApps has now processed the swagger file and found the PhotoSendSP POST action we defined. It has also pulled some of the data from the swagger file to prepopulate some fields. Note this screen has some UI problems – you need to hover your cursor over the forms in the middle of the screen to see the scrollbar, so there is more to edit than what you initially see…

image

For now, do not enter anything into the summary field and scroll down to look at some of the other settings. The Visibility setting does not really matter for PowerApps, but remember that other online services can call our function. This visibility stuff relates more to Microsoft Flow, so you can ignore it for now. Scroll further down and you will see how our swagger schema has been processed by PowerApps. You can explore this but I suggest leaving it well alone. Below I have used all my MSPaint skills to make a montage to show how this relates to your Swagger file…

image

Finally, click Create Connector to wire it up. If all has gone to plan, you will see something resembling the following in the list of custom connectors in PowerApps. If you get this far, congratulations! You are almost there!

image

6. Test successfully and bask in the glory of your awesomeness

Now that you have a custom connection, let’s try it out. Open your PowerApp that you created in step 1. From the Content menu, click Data sources, click Add Datasource and then click New Connection. Scroll through the list of connections until you find the one you created in step 5. Click on it and you will be prompted for an API key (now you see why I added that friendly label during step 5).

image  image  image

Where to find this key? Well, it turns out that it was automagically generated for you when you first created your function! Go back to the Azure Functions portal and find your function. From the function sub menus, click Manage and you will be presented with a “Function Keys” section with a default key listed. Copy this to the clipboard, paste it into the PowerApps API Key text box and click the Create button. Your datasource that connects to your Azure function is now configured in PowerApps!! (yay!)

image

image  image

Now way back in step 1 (gosh it seems like such a long time ago), we created a button labelled Submit, with the following formula.

Clear(SubmitData);
ForAll(PictureList,Collect(SubmitData, { filename: "a file.jpg", filebody: Url }))

The problem we are going to have is that all files submitted to the webservice will be called “a file.jpg”, as I hardcoded the filename parameter for simplicity. Now if I fully developed this app, I would add a textbox to the gallery so that each photo has to be named prior to being submitted to SharePoint. I am not going to do that here as this post is already too long, so instead I will use a trick I saw here to build a filename from the date plus two random characters. I know it is not truly unique, but for our demo it will suffice.

Here is the updated Submit button formula in all its ugliness.

Clear(SubmitData);
ForAll(PictureList,Collect(SubmitData, { filename: Concatenate(Text( Now(), DateTimeFormat.LongDate ), Mid("0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ", 1 + RoundDown(Rand() * 36, 0), 1), Mid("0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ", 1 + RoundDown(Rand() * 36, 0), 1), ".jpg"), filebody: Url }))

Yeah I know… let me break it down for you…

  • Text( Now(), DateTimeFormat.LongDate ) produces a string containing today’s date
  • Mid("0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ", 1 + RoundDown(Rand() * 36, 0), 1) – selects a random letter/number from the 36-character string
  • Concatenate takes the above date, random letters and adds “.jpg” to it.

PowerApps does allow for line breaks in code, so it looks less ugly…

image

Let’s quickly test this before the final step of sending it off to SharePoint. Preview the app, take some photos and then examine the collections. As you can see, the SubmitData collection now has unique filenames assigned. (By the way, yes I took these in a car and no, I was not driving at the time! 🙂)

image

We are now ready for the final step. We need to call the Azure function!

Go back to your submit button and add the following code to the end of the existing code, taking into account the name you gave to your connection/datasource. You should find that PowerApps uses intellisense to help you add the line because it has a lot of metadata from the swagger file.

'myfunctionsdemo.azurewebsites.net'.apiPhotoSendSPpost(SubmitData)

image

Important! Before we go any further, I strongly suggest you use the PowerApps web based authoring environment and not the desktop application. I have seen a problem where the desktop application does not encode images using the Data URI format, whereas the web based tool (and the PowerApps clients on Android and iOS) works fine.

So log into PowerApps on the web, open your app, preview it and cross your fingers! (If you want to be clever, go back to your function in Azure and keep an eye on the logs.)

image

As the above screen shows, things are looking good. Let’s check the SharePoint document library that the script uploads to… YEAH BABY!! We have our photos!!!

image

Conclusion

Now you finally get to bask in your awesomeness. If you survived all this plumbing and have gotten this working, then congratulations! You are well on your way to becoming a PowerApps, PowerShell, PnP (and Flow) guru. This means you can now enhance your apps in all sorts of ways. For the non-developers who got this far, you can truly call yourself a citizen developer and design all sorts of innovative solutions via these techniques.

Even though a lot of this stuff is fiddly (especially that god-awful swagger crap), once you have gone through it a couple of times and understand the intent of the components we have used, this is actually quite an easy solution to put together. I also found the Azure function side of things in particular really easy to debug and see what was going on.

In terms of where we could take this, there are several avenues that I can immediately think of, but the possibilities are endless.

  • We could update the PowerShell code so it takes the destination document library as a parameter (see the rough sketch after this list). We would need to update the Swagger definition and then update our datasource in PowerApps, but that is not too hard a task once you have the basics working.
  • Using the same method, we could design a much more sophisticated form and capture lots of useful metadata with the pictures, and get them into SharePoint as metadata on the picture.
  • We could add in a lot more error handling into the script and return much better detail to PowerApps, such as detailed failure information.
  • Similarly, we could return detailed information to PowerApps to make the app richer. For example, we could generate unique filenames in our PowerShell function rather than PowerApps and return those names (and the URL to the image) in the reply to the HTTP Post, which would enable PowerApps to display images inline.
  • We could also take advantage of recent PowerApps enhancements and use local caching, i.e. when an internet connection is available, call the API, but if not, save to local storage and call the API once a connection is available.
  • We could not only upload images, but update lists and any other combination of SharePoint functions supported by PnP.
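To illustrate the first of those ideas, here is a rough sketch of how the upload loop might change if PowerApps sent a hypothetical “library” field with each record (the Swagger definition and the SubmitData collection would need matching updates, so treat this as a starting point rather than working code):

# Hypothetical change: each submitted record nominates its destination library
ForEach ($item in $requestbody)
{
    $doclib = $ctx.Web.Lists.GetByTitle($item.library)   # e.g. "Site Photos"
    # ... the rest of the file creation and upload logic stays the same as before ...
}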

I hope that this post has helped you to better understand how these components hang together and I look forward to your feedback and how you have adopted/adapted and enhanced the ideas presented here.

 

Thanks for reading

 

Paul Culmsee

www.hereticsguidebooks.com

 


Rediscovering my curiosity at Creative Melbourne


As I write this I am somewhere over the middle of Australia, flying back to Perth after participating in a 3 day event that was fun, challenging and highly insightful. The conference was Creative Melbourne, and I am proud to say I was one of the inaugural speakers. If they want me back again, I will do it in a heartbeat, and I hope a lot of you come along for the ride.

CreativeMelbourne-1

The premise: practical co-creation…

First the background… I have known the conference organiser, Arthur Shelley, for a few years. We first met at a Knowledge Management conference in Canberra and though I have no recollection of how we got talking, I do recall we clicked fairly quickly. At the time I was starting to explore the ideas around ambiguity, which eventually formed my second book. Back then I had a chip on my shoulder about how topics like complexity, Design Thinking and collaboration were being taught to students. I felt that the creative and fun parts glossed over the true stress and cognitive overload of wicked problems. This would produce highly idealistic students who would fall flat on their face once they hit a situation that was truly wicked. I therefore questioned whether anything was being built into students’ mental armoury for the inevitable pain to come.

Now for some people who operate and teach in this space, making such a statement immediately and understandably gets their defenses up. But not Arthur – he listened to everything I had to say, and showed me examples of how he structured his courses and teachings to deal with this challenge. It was impressive stuff: every time his students thought they had a handle on things, Arthur would introduce a curveball or a change they were not anticipating. In other words, while teaching the techniques, he was building their capacity for handling ambiguous situations. Little did I know his conference was about to do the same to me…

One thing about Arthur that blows me away constantly is his incredible network of practitioners in this space. Arthur has long had a vision for bringing a constellation of such practitioners together and he hand-picked a bunch of us from all over the world. The premise was to create an event that had a highly practical focus. He wanted practitioners to help attendees “Discover creative techniques to enhance performance and engage your team back at the office to increase productivity.”

Now where did I leave my curiosity?

While I am a sensemaking practitioner, I’ll admit straight up that I get irritated at the “fluffiness” and rampant idealism in this space. A good example in this respect is Design Thinking. While I like it and apply ideas from it to my practice, I dislike it when Design Thinking proponents claim it to be suited to wicked problems. The reality is the examples and case studies often cited are rarely wicked at all (at least in the way the term was originally conceived). When I see this sort of thing happening, it leaves me wondering if proponents have truly been in a complex, contingent situation and had the chance to stress test their ideas.

Now I don’t apologise for critically examining the claims made by anyone, but I do apologise for the unfortunate side effect – becoming overly contrarian. In my case, after all these years of research, reading and practice in this field, I am at the point where I see most new ideas as not actually new, but rediscoveries of past truths. Accordingly, it has been a long time since I felt that sense of exhilaration from having my mental molecules rearranged by a new idea. It makes sense right? I mean, the more you learn about something, the more your mental canvas has been painted on. In my case I already have a powerful arsenal of useful tools and approaches that I call upon when needed and more importantly, I was never on a spiritual quest for the one perfect answer to the mysteries of organisational life anyway.

In short, I have what I need to do what I do. The only problem is somewhere along the line I lost the very sense of curiosity that started me along the path in the first place. It took Arthur, fellow presenters like Stuart French, Jamie Bartie, Jean-Charles Cailliez, Meredith Lewis, Brad Adriaanse, Vadim Shiryaev and a diverse group of participants to help me rediscover it…

Disrupting the disruptor…

Imagine someone like me participating in day 1, where we did things like build structures out of straws, put on silly hats, used the metaphor of zoo animals to understand behaviors, arm-wrestled to make a point about implicit assumptions and looked at how artists activate physical space and what we could learn from it when designing collaborative spaces. There was some hippie stuff going on here and my contrarian brain would sometimes trigger a reflexive reaction. I would suddenly realise I was tense and have to tell myself to relax. Sometimes my mind would instinctively retort with something like “Yeah right… try that in a politicised billion dollar construction project…” More than once I suppressed that instinct, telling myself “shut up brain – you are making assumptions and are biased. Just be quiet, listen, be present and you might learn something.”

That evening I confided to a couple of people that I felt out of place. Perhaps I was better suited to a “Making decisions in situations of high uncertainty and high cognitive overload” conference instead. I was a little fearful that I would kill the positive vibe of day 1 once I got to my session. No-one wants to be the party pooper…

Day 2 rolled around and when it was my turn to present, I held back a little on the “world according to Paul” stuff. I wanted to challenge people but was unsure of their tolerance for it – especially around my claims of rampant idealism that I mentioned earlier. I needn’t have worried though, as the speaker after me, Karuna Ramanathan from Singapore, ended up saying a lot of what I wanted to say and did a much better job. My talk was the appetizer to his “reality check” main course. He brilliantly articulated common organisational archetypes and why some of the day 1 rhetoric often hits a brick wall. It was this talk that validated I did belong in this community after all. Arthur had indeed done his homework with his choice of speakers.

That same afternoon, we went on a walking tour of Melbourne with Jamie Bartie, who showed us all sorts of examples of cultural gems in Melbourne that were hiding in plain sight. The moral of the story was similar to day 1… that we often look past things and have to challenge ourselves to look deeper. This time around my day 1 concerns had evaporated and I was able to be in the moment and enjoy it for what it was. I spoke to Jamie at length that evening and we bonded over a common childhood love of cult shows like Monkey Magic. I also discovered another kung-fu movie fan in Meredith Lewis, who showed me a whole new way to frame conversations to get people to reveal more about themselves, and develop richer personal relationships along the way.

Petcha Kucha – Getting to a point…

Day 3 was a bit of a watershed moment for me for two reasons. Months prior, I had accepted an invitation from Stuart French to participate in his Petcha Kucha session. At the time I said “yes” without really looking into what it entailed. The gist is you do a presentation of 20 slides, with 20 seconds per slide, all timed so they change whether you are ready or not. This forces you to be incredibly disciplined with delivering your talk, which I found very hard because I was so used to “winging it” in presentations. Despite keynoting conferences with hundreds of people in the room, doing a Petcha Kucha to a smaller, more intimate group was much more nerve-racking. I had to forcibly switch off my tangential brain because as soon as I had a thought bubble, the slides would advance and I would fall behind and lose my momentum. It took a lot of focus for me to suppress my thought bubbles but it was worth it. In short, a Petcha Kucha is a fantastic tool to test one’s mental muscles and enforce discipline. I highly recommend that everyone give it a go – especially creative types who tend to be a bit “all over the place”. It was a master-stroke from Stuart to introduce the technique to this audience and I think it needs to be expanded next time.

I presented the first Petcha Kucha, followed by Stuart and then Brad Adriaanse, who described the OODA Loop philosophy. OODA stands for observe, orient, decide, and act, providing a way to break out of one’s existing dogma and reformulate paradigms, allowing you to better adapt to changing circumstances. Dilbert cartoons aptly show us that we all have incomplete (and often inconsistent) world views which should be continually refined and adapted in the face of new observations. Brad put it nicely when he said OODA was about maintaining a fluid cognitive state and that assumptions can be a straightjacket and dogma can blind us. This really hit home for me, based on how I reacted at times on day 1. Brad also said that the OODA loop can be internalised by adopting a lifelong learning mindset, being curious and becoming more and more comfortable with ambiguity.

It was at this exact moment where I rediscovered my latent curiosity and understood why I felt the way I did on day 1 and 2. It was also at this moment that I realised Arthur Shelley’s genius in why he made this event happen, who he brought together and what he has created in this event. All attendees need to be disrupted. Some need their idealism challenged, and some, like me, need a reminder of what started us on this path in the first place.

I have returned a better practitioner for it… Thankyou Arthur

 

Paul Culmsee

p.s Arthur Shelley is still a giant hippie


Teddies, Fetishes and the Management Consulting Scam


jediteddy.jpg

What if I told you that the key to becoming a successful management consultant was to become a Teddy Bear?

What if I also told you that it involves fetishes? You might be re-checking the URL to make sure you are on the right site!

Fear not, this article is definitely not “50 Shades of Management Consulting Grey”. Nor is it about donning a cuddly animal suit as a mascot for a football team. To borrow from the much loved children’s TV show “Playschool,” there’s definitely a bear in there, but not the one you might be thinking!

You see, for many people, modern corporate life is now at a point where the pace of change is accelerating, unrelenting and fatiguing. In my home state of Western Australia, businesses are reeling from unprecedented levels of disruption and uncertainty, be it the end of the commodity boom, the impact of global competition or disruptive, technology-enabled innovation. It is now difficult to think of any industry that has not had the ground shift beneath it in some way — except, perhaps, for Management Consulting.

Management Consulting thrives in an environment of fear, ambiguity and doubt, principally because its business model is based on the presumption that consultants can make it go away. It’s lucrative too — ambiguity is such a powerful force that executives will part with copious amounts of cash in attempts to escape it…

read the full article at medium.com


Explaining the new book in 3 minutes…


The power of video as a means to convey a message and engage an audience should not be underestimated. For my new book, The Heretics Guide to Management, we decided to record a video after Kailash had played around with VideoScribe for some of his blog articles. My daughter Ashlee is a talented artist, and she and I fleshed out a basic script to explain the book with some imagery ideas.

The net result is the video below. The narrator is my son Liam, and I think the theme of “teddies for grown ups” really works when narrated by a child.

Both Ashlee and Liam did an amazing job and we were absolutely stoked with the result. For the record, the tools used were Camtasia for the recording, VideoScribe for the visuals and Ashlee’s Intuos Comic touch tablet.

Hope you enjoy the video… Let me know what you think!


My new book about Teddies and fetishes is out…


cover7

 

Hi all

I am pleased to announce that my new business book, The Heretic’s Guide to Management: The Art of Harnessing Ambiguity, is now available in ebook format (the print edition is still a couple of weeks away). Once again I wrote this with Kailash Awati, and it is a loose sequel to our first book, The Heretics Guide to Best Practices.

Many reviewers liked the writing style of our first book, which combined rigour with humour. This book continues in the same vein, so if you enjoyed the first one we hope you might enjoy this one too. The new book is half the size of the first one and, I would say, less idealistic too. In terms of subject matter, I could probably just say “Ambiguity, Teddy Bears and Fetishes” and leave it at that. I’m sure someone would think that we have moved into erotic fiction.

Unfortunately for those looking for some titillation, I’m afraid we did not write a management version of Fifty Shades of Grey. Instead, we aim to help readers understand how ambiguity affects human behaviour and, more importantly, how it can be harnessed in positive ways. We noticed that most management techniques (eg strategic planning, project management or operational budgeting) attempt to reduce ambiguity and provide clarity. Yet in a great irony of modern corporate life, they often end up doing the opposite: increasing ambiguity rather than reducing it.

On the surface, it is easy enough to understand why: organizations are complex entities and it is unreasonable to expect management models, such as those that fit neatly into a 2*2 matrix or a predetermined checklist, to work in the real world. In fact, expecting them to work as advertised is like colouring a paint-by-numbers Mona Lisa and expecting that you can recreate Da Vinci’s masterpiece. Ambiguity remains untamed, and reality reimposes itself no matter how alluring the model is…

It turns out that most of us have a deep aversion to situations that involve even a hint of ambiguity. Recent research in neuroscience has revealed the reason for this: ambiguity is processed in the parts of the brain which regulate our emotional responses. As a result, many people associate ambiguity with feelings of anxiety. When kids feel anxious, they turn to transitional objects such as teddy bears or security blankets, providing them with a sense of stability when situations or events seem overwhelming. In this book, we show that as grown-ups we don’t stop using teddy bears – it is just that the teddies we use take a different, more corporate, form. Drawing on research, we discuss how management models, fads and frameworks are actually akin to teddy bears. They provide the same sense of comfort and certainty to corporate managers and minions as real teddies do to distressed kids.

base teddy

Most children usually outgrow their need for teddies as they mature and learn to cope with their childhood fears. However, if development is disrupted or arrested in some way, the transitional object can become a fetish – an object that is held on to with a pathological intensity, simply for the comfort that it offers in the face of ambiguity. The corporate reliance on simplistic solutions for the complex challenges faced is akin to little Johnny believing that everything will be OK provided he clings on to Teddy.

When this happens, the trick is finding ways to help Johnny overcome his fear of ambiguity (as well as your own).

jediteddysithteddy

Ambiguity is a primal force that drives much of our behaviour. It is typically viewed negatively – something to be avoided or to be controlled. The truth, however, is that it is a force that can be used in positive ways too. The Force that gave the Dark Side their power in the Star Wars movies was harnessed by the Jedi in positive ways. This new management book shows you how ambiguity, so common in the corporate world, can be harnessed to achieve outstanding results.

The book should be available via most online outlets.

Paul


The ASS Scale. The best 2*2 management model ever!


So today I was inspired to come out of blogging hibernation because I saw possibly the worst dodgy 2*2 management matrix ever. The piece below was something that was originally going to be part of my next book with Kailash – as we spend some time on why models like this are so popular. Unfortunately this piece never made it, but Craig Brown told me I had to release it or he would. Thus, I feel it is now appropriate to unveil the greatest 2*2 dodgy management model ever! Without further ado I present to you the ASS Scale…

Does your team kick ass?

Want to improve team performance? Do you want your teams to be more agile, resilient, flexible, strategic, emergent, dynamic and follow orders without question?

The Agile Synergy Scale (ASS)™ is a cutting edge team diagnostic tool that provides a typology of team states. This provides CEOs and other people who control the budget with a sure-fire way to bring the best out of your people, help them reach their full potential and Kick Ass!

The Agile Synergy Scale draws on several beers worth of research into all the latest literature from Wikipedia and Social Media, such as Big Data Analytics, Neuroscience, Holocracy, Transdisciplinary Intelligence, Innovation Ideation, Neurolinguistic Complexity Theory, Tasseography, Graphology, Craniosacral Therapy and 3D Printing. It explores the relationship between people, motivation and intelligence and unlocks an entirely new way of thinking about all forms of organisational awesomeness.

The framework consists of 4 domains – or “ASS cheeks” as shown below. There is a fifth domain – but we will get to that in a moment. These domains are illustrated in the diagram below.

assscale

The X axis represents team ability from low to high – and incorporates all of the sheer talent and expert knowledge necessary to probe for outstanding achievement for team and organisational excellence. The vertical scale represents team desire – the lube of synergy that is the difference between accommodating maximum motivation versus constricted performance.

Let’s examine each ass-cheek in more detail and see where you and your team sit.

High Desire, High Skills: Kick Ass!

You and your team are as awesome as the Avengers. Perfectly balanced between brain, brawn and beauty, there is no challenge too tough for you and a Nobel prize in the category of legendaryness is a foregone conclusion.

High Desire, Low Skills: Kiss Ass

You and your team so want to be awesome, you all read the clickbait pearls of wisdom on your LinkedIn feed and therefore “talk the talk” with the best of them, but when the rubber hits the road and pressure is on, there is nothing under the hood. A dangerous sub-variety of kiss-asses are scary-asses (those who think they are kick-asses but are blind to their skill deficiencies.)

Low Desire, High Skills: Slack-ass (or “Can’t be assed”)

You all know your stuff as good as anybody, but nevertheless, you all withhold your discretionary effort (loafing). This is likely because the psychological needs of your team and individual members are not being met – either that or you are all whiny bitches.

Low Desire, Low Skills: Suck-ass

This quadrant has two sub-types. Rational suck-asses and stupid suck-asses. Rational suck-asses have the self-awareness to know they suck-ass and remedial action can be undertaken. Stupid suck-asses unfortunately have their head so far up their asses that they have little awareness of how much they suck-ass.

The toxic hole of chaos

There is a fifth domain (in the middle of the diagram): The toxic hole of chaos, which is the state of not knowing what sort of ASS cheek your team aligns with. It is extremely important you avoid this area in the long term as prolonged exposure can stifle and suffocate your team.

How to measure your ASS

We measure your team’s ASS by administering a Rate of Extrinsic Collaboration and Team Agile Leadership Exam. This psychometric instrument can be administered by one of our certified Agile Synergy Scale PROfessional Business Excellence Reviewers. Our ASS PROBERS have gone through an extensive vetting process via a comprehensive multi-choice exam, and can administer a RECTAL exam with minimum discomfort.

So what are you waiting for? Sign your team up for a RECTAL exam today and measure your ASS.

 

Paul Culmsee

www.hereticsguidebooks.com


Glyma is now open source!


Hi all

If you are not aware, my colleagues and I have spent a large chunk of the last few years developing a software tool for SharePoint called Glyma (pronounced “Glimmer”). Glyma is a very powerful knowledge management solution for SharePoint 2010/2013 that deals with knowledge that is highly valuable, yet difficult to capture in writing – all that hard earned knowledge that tends to walk out the door in organisations.

Glyma was born from Seven Sigma’s Dialogue Mapping skills and it represents a lot of what we do as an organisation, and the culmination of many years of experience in the world of complex problem facilitation. We have been using Glyma as a consultancy value-add for some time, and our clients have gained a lot of benefit from it. Clients have also deployed it in their environments for reasons such as capture of knowledge, lessons learnt, strategic planning, corporate governance as well as business analysis, critical thinking and other knowledge visualisation/knowledge exchange scenarios.

image

I am very pleased to let people know that we have now decided to release Glyma under an open source license (Apache 2). This means you are free to download the source and use it in any manner you see fit.

You can download the source code from Chris Tomich’s GitHub site or you can contact me or Chris for the binaries. The install/user and admin manuals can be found on the Glyma web site, which also has a really nice help system, tutorial videos and advice on how to build good Glyma maps.

This is not just some sample code we have uploaded. This is a highly featured, well architected and robust product with some really nice SharePoint integration. In particular for my colleague, Chris Tomich, this represents a massive achievement as a developer/product architect. He has created a highly flexible graph database with some real innovation behind it. Technically, Glyma is a hypergraph database that sits on SQL/SharePoint. Very few databases of this type exist outside of academia/maths nerds and very few people could pull off what he has done.

image

For those of you that use/have tried Compendium software, Glyma extends the ideas of Compendium (and can import Compendium maps), while bringing it into the world of enterprise information management via SharePoint.

Below I have embedded a video to give you an idea of what Glyma is capable of. More videos exist on YouTube as well as the Glyma site, so be sure to dig deeper.

 

I look forward to hearing how organisations make use of it. Of course, feel free to contact me for training/mentoring and any other value-add services 🙂

 

Regards

Paul Culmsee


So what is this newfangled apps model anyway and why do I care? (part 3)

This entry is part 3 of 3 in the series Apps Model

Hiya

This is the third post in a series of articles aimed at helping strategic or business-focused users understand the SharePoint 2013 and Office365 “apps model”, and what it means for the future of SharePoint. In part 1 of this series, I outlined the opportunities and challenges that Microsoft are currently trying to address. They were:

  • Changing perceptions to cloud technologies and increased adoption; which enables…
  • The big scary bogeyman known as Google with a viable alternative to SharePoint, Office and Exchange in the form of Google Apps; as well as…
  • An increasing number of smaller cloud-based “point solution” players who chip away at SharePoint features with cheaper and easier to use offerings; while suffering from…
  • A serious case of Apple envy and in particularly the rise of the app and the app marketplace; while dealing with…
  • Customers unable to handle the ever increasing complexity of SharePoint, leading to delaying upgrades for years

These reasons prompted Microsoft to take a strongly cloud-driven strategy, and they have really been transforming their business to deliver it. They have transitioned from a software provider to an application hosting provider, and in terms of SharePoint, the “apps” model is now the future of customisation.

Now “apps” is a multifaceted topic and the word has unfortunately been overused. So in part two we started to unpack the apps model by channelling the kids’ TV show Playschool to show the idea of SharePoint customisations being hosted on separate servers, while presenting a seamless experience for users.

If you haven’t read parts 1 and 2, I seriously suggest you do. To whet your appetite, here are some pretty diagrams to highlight what you missed out on!

image image  image

The main point I made was the notion that custom SharePoint components run on separate, non-SharePoint servers and are embedded into SharePoint via iframes. These remote apps then communicate securely with SharePoint (eg read or write data from lists) via web services.

I concluded part 2 by showing the benefits of the apps model from Microsoft’s perspective. Among other things, this model of developing custom SharePoint solutions can be supported on Office365 and on-premises. For on-premises customers, SharePoint servers remain pristine and free of the muck and clutter of 3rd party code, making service packs and cumulative updates much less complex and costly. It also enables Microsoft to offer an app store, where 3rd party vendors can maintain cloud based services that can be embedded and consumed by on-premises and online SharePoint installs. If you go back to the 5 key strategic threats I started this post with, it addresses each one nicely.

So Microsoft’s intent is good, and there is nothing wrong with good intentions… or is there?

image

Digging deeper

So where does one start with unpacking the apps model? Let’s make this a little less of a dry read by channelling Big Bang Theory to find out. First up, Penny wants to know where app data is stored, given that the app runs on a different server to SharePoint as shown below. (If you think about it, from the app’s perspective, SharePoint is the remote server).

image

Sheldon’s answer? (Of course Sheldon invented the apps model, right?)…

image

Er… come again? Let’s see if Leonard can give a clearer explanation…

image

Leonard’s explanation is a little better. Ultimately the app developer has the choice of where app data is stored. For example, let’s say someone writes a survey app for SharePoint and it renders on the home page via an iframe. When users fill in this poll, the results could be stored on the server that hosts the app and not in SharePoint at all (which in SharePoint terms is referred to as the remote web). Alternatively, the app could store the survey results in a list on the SharePoint site from which the app is being invoked (this is called the host web). Yet another alternative is something called the app web, which I will return to in a moment.

First, let’s look at the pros and cons of the first two options.

Option A: Store App Data in SharePoint Lists

If the app developer chooses this approach, the app reads/writes to SharePoint lists in the site where the app is deployed (henceforth called the host web). In this approach, when a site administrator chooses to add an app to the site, the app has to specify the level of access it requires for the site, and it asks the site admin to authorise this access. In the image below, you can see that this Kodak app is requesting the ability to edit or delete items in document libraries and lists on this site, as well as access user profile information. If the site administrator clicks “Trust it”, the app now has the access it needs.

image

The pro to this approach is that all app data resides in regular old SharePoint, where it is searchable and can take advantage of all of the goodies that lists and libraries give you like versioning, information management policies and workflow. Additionally, multiple apps can access these lists, so this allows for the development of componentised solutions that work with a single authoritative data source.
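To make option A a little more concrete for the technically inclined, here is a rough sketch (not production code, and not from any actual app) of how a remote survey app might write a response into a host web list using SharePoint’s REST API. The list name, the column and the access token are assumptions for illustration; how an app actually earns the right to make this call is a topic for the next post.

```typescript
// Illustrative sketch only: a remote app writing a survey answer to a
// hypothetical "SurveyResponses" list in the host web via the REST API.
// Assumes the app has already obtained an OAuth access token covering the
// permissions the site admin granted when they clicked "Trust it".
async function saveSurveyResponse(
  hostWebUrl: string,   // e.g. "https://myintranet/sites/web1"
  accessToken: string,  // token representing the app's granted permissions
  answer: string
): Promise<void> {
  const endpoint = `${hostWebUrl}/_api/web/lists/getbytitle('SurveyResponses')/items`;

  const response = await fetch(endpoint, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      Accept: "application/json;odata=verbose",
      "Content-Type": "application/json;odata=verbose",
    },
    body: JSON.stringify({
      // The item type name depends on the list; this is the sort of name
      // SharePoint would generate for a list called "SurveyResponses".
      __metadata: { type: "SP.Data.SurveyResponsesListItem" },
      Title: answer,
    }),
  });

  if (!response.ok) {
    throw new Error(`SharePoint rejected the write: ${response.status}`);
  }
}
```

The point to take away is that the write lands in an ordinary SharePoint list, with all the versioning, workflow and search goodness that implies.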

The potential cons (or implications to be aware of) to the approach are:

  • SharePoint lists and libraries are not always appropriate data stores for some types of data. Most people are well aware now that a SharePoint list is most definitely not a relational database and it has performance issues when misused (among other things). SharePoint also has in-built thresholds that kick in when lists get big (list queries that generate a result set of 5000 or more items will fail by default). Microsoft state in their SharePoint 2013 and SharePoint Online solution packs documentation: “If your business needs require you to work with large data sets and query result sets, this approach won’t work”.
  • If the app has been deployed to many sites and site collections (and uses lists on each), then things can get painful if a new version of the app requires a new or modified set of columns on the list in the host web.
  • If you delete the app from the host web, the list data remains on the site as the lists will likely not be deleted. Sometimes this is a good thing as the data might be important or used in other ways, but if the app developer is storing configuration data here, deleting the app would leave orphaned data in the site.

Also think about what happens when you have an on-premises SharePoint server but the app is hosted by a 3rd party outside your firewall. How is the remote app even able to get to your SharePoint box in the first place? You are likely going to need to talk to your network/security people, because you will need some funky firewall/reverse proxy infrastructure to allow that to happen. Additionally, some organisations might be uncomfortable that an app from a 3rd party on the interweb can have the ability to read and write data inside an internal SharePoint server anyway.

So what alternatives are there here?

Option B: Store data in the remote app

The other option for the app developer is to store the app data on the same server the app is running from (called the remote web). In this approach, no data is stored in SharePoint at all. The pro of this option is that it alleviates two of the issues from option A above: developers can use any data storage system they want (eg SQL Server, a GIS system or a graph database) and you do not have any of those pesky firewall issues with the app connecting back to your SharePoint server. The app renders in its iframe and does its thing.

Unfortunately this also has some cons.

  • There are potential data sovereignty issues. Where is the remote app hosted? Are you sure you trust the 3rd party app provider with your data? Consider that a 3rd party might host an app for many organisations. Do they have adequate precautions for keeping your data isolated and secure (ie Is your stuff stored in a separate database to everyone else?) Are they adequately backing up that data consistent with your internal standards? If you uninstall the app, is the data also uninstalled at the app provider end?
  • This data is very likely not searchable or easily usable by SharePoint for other purposes as it is not necessarily directly accessible by SharePoint.
  • Chances are that the app will have to talk to SharePoint in some way, so you really don’t get out of dealing with your network/security people to make it all work if the remote web is outside your firewall.
  • Also consider data integrity. If, say, you needed to restore SharePoint because of a data loss or data corruption issue, what does this mean for the data stored in the remote app? Will things get out of sync?

These (and other questions) bring us to the next option that is a bit of a mind-bending middle ground.

Option C: Store data in the App Web

Now here is where things start to mess with your head quite a bit. Most people can understand the idea behind options A and B, but what the heck is this thing known as an app web? The short answer: it is a special SharePoint sub-site that is used for certain apps. It usually gets created when a site administrator adds an app to a site and, importantly, removed when a site administrator removes the app. If you consider the diagram below, we can see our mythical SharePoint 2013 homepage with 3 apps on it, all running on separate servers as explained in part 2. If we assume each of these apps has deployed an app web on our SharePoint site, there are now three SharePoint sub-sites created underneath it as shown below. These are the app webs.

image

Now app webs are no ordinary subsites in the way you might know them now. For a start, if you tried to access them from your SharePoint host web, you would not be able to at all. Through some trickery, SharePoint puts these sites on a completely different URL than the site where the apps were installed. For example, if you had a site called http://myintranet/sites/web1 and you added a survey app, a subsite exists but it would absolutely not be http://myintranet/sites/web1/survey or anything like that. It would instead look something like:

http://app-afb952fd75de4a.apps.mycompany/sites/web1/surveyapp/

Now, for the business audience reading this, trust me that there are good reasons for this apparent weirdness, related to security, which I will speak to in the next post.
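As an aside for the curious: how does an app tell the host web and its own app web apart, given the strange URLs? When SharePoint launches an app it appends a set of standard tokens (SPHostUrl and SPAppWebUrl among them) to the app’s URL, and the app simply reads them back. A hedged little sketch of that, reusing the example URLs above, looks like this:

```typescript
// Illustrative only: when SharePoint launches an app it appends standard
// tokens such as SPHostUrl and SPAppWebUrl to the app's query string.
// The app reads them back so it knows where the host web and its own
// app web actually live.
function getQueryStringParameter(name: string): string | null {
  return new URLSearchParams(window.location.search).get(name);
}

const hostWebUrl = getQueryStringParameter("SPHostUrl");
// e.g. "http://myintranet/sites/web1"

const appWebUrl = getQueryStringParameter("SPAppWebUrl");
// e.g. "http://app-afb952fd75de4a.apps.mycompany/sites/web1/surveyapp"
```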

But if this makes sense so far, then in some ways one can argue that an app web is a weird cross between options A and B, in that it is a subsite on your SharePoint farm, yet SharePoint treats it as if it were a remote data store. This means the app web is a special isolated storage area for app developers to put stuff like data, configuration, CSS, JavaScript and whatever other functionality their app needs to do its thing.

This approach has some advantages:

  • The app web is technically a SharePoint site, so app developers can create whatever structure they need (think lists, libraries and columns) to store their stuff like images, CSS, JavaScript and other goodies. This allows for much more flexible data structures than writing to the odd list in a host web (option A above).
  • The app web lives on your SharePoint server, so it means some app components (and data) can be stored here instead of on a remote web on a server far away from you. When you back up your content databases, the app web is backed up like everything else. (Less data integrity risk than option B).
  • The app web facilitates clean install/uninstall of an app, since the app web is removed if an app is removed. In other words, no more orphaned data lying around

But if you think all of those points through, you might see several important implications:

  • If the app developer decides to store critical app data in the app web, when the app is uninstalled that data is lost (or, put better, developers have to write special uninstall code to copy the data somewhere else, which means yet more code)
  • Just like option A, SharePoint lists and libraries are not always appropriate data stores for some types of data. (Remember the list item threshold I mentioned in option A? It applies here too.)
  • Apps cannot share app webs between them. In other words, apps cannot reach in and access data from other app webs. Therefore you can’t easily use the information stored in app webs with other apps and applications. In fact, if you want to do this you pretty much have to choose option A: store the shared data the apps need in the host web and have both apps access that data.
  • You may end up with many, many app webs. If you take my example above there are 3 app webs to handle 3 apps. What if this was a project site template and your organisation had hundreds of projects? That means you would have thousands of app webs, all with potentially interesting data that is trapped in mini information silos.
  • SharePoint search cannot index the app webs
  • A critical but often overlooked one is that developers can’t update the library/list metadata schema in an app web (think columns, content types, libraries, etc) without updating and redeploying the app. As you will see in a future post, this gets real ugly real fast!
  • SharePoint App Webs are created with special templates which block SharePoint Designer (that’s probably a good thing given the purpose of an app web)

Conclusion

So if you are a non-developer reading this post, consider that none of the above options is, on its own, likely to give you a complete solution. For each option, there seem to be just as many flaws as advantages. The reality is that many apps will use at least two, or even all three options. Things like images, CSS and JavaScript might be loaded from the app web, some critical reusable SharePoint content from the host web, and the remote web used for some heavy-duty data manipulation.

If you think that through, that means as SharePoint administrators and governance teams, you will likely end up dealing with all of the cons of each of the options. Imagine asking a developer to conceptually draw an app that uses each of the options and consider how many “moving parts” there are to it all. Then when you add the fact that most organisations still have a legacy of full trust solutions to deal with, you can start to see how complex this will be to manage.

Now this really is just the tip of the iceberg. In the next post I am going to talk a little about how all this stuff is wired up from an authentication and security standpoint. I am also going to focus on the application lifecycle management implications of this model. If you think about the picture I have painted here with all of the potential moving parts, how do you think an upgrade to an app would fare?

Thanks for reading

Paul Culmsee


A free IT Unity Webinar: Rewriting the Rulebook for Managing Knowledge


Hi all

Just to let you know that I’ll be presenting a webinar with Christian Buckley on the topic of Glyma, Knowledge Management and SharePoint next week. If you have an interest in things like workforce planning, knowledge capture, project lessons learnt, strategic planning, policy analysis, etc then it will be well worth your time to take a look. It is called Rewriting the Rulebook for Managing Knowledge. Here is the synopsis…

“Managing knowledge in a business has always been tough, and these days it has become even tougher. The amount of information available has skyrocketed and so too have the formats, places and channels through which it is received. Complexity around how we gather, organize and effectively use information has magnified – and to deal with this complexity, we need a new approach.

In this webinar, Australian-based information management strategist, SharePoint guru, and award-winning author Paul Culmsee will be joined by Office 365 MVP and well-known SharePoint and social strategist Christian Buckley to help participants rewrite the rulebook on managing their intellectual capital. Find out about new approaches, techniques and tools that can be used within your organization to help better leverage your existing knowledge stores and intellectual capital all using SharePoint and featuring Glyma.”

Key topics that we will cover include:

  • Avoiding the “what happens when Jeff leaves” brain drain crisis
  • Tapping into “what’s in the head” and turning it into usable assets for the business
  • Bringing a halt to the revolving cycle of “re-inventing the wheel”
  • Stopping the nonsensical repetition of costly mistakes

Details

  • Date : Tuesday, January 27, 2015
  • Time: 4:00pm Eastern (EST) / 1:00pm Pacific (PST)
  • Duration: 1 hour
  • Cost: Free

Hope to see you there!

Paul Culmsee


So what is this newfangled apps model anyway and why do I care? (part 2)

This entry is part 2 of 3 in the series Apps Model

Hiya

This is the second post in a series of articles aimed at demystifying the SharePoint “apps” model for the strategic or business-focused user. In case you are not aware, Microsoft have gotten a serious case of “app fever” in recent times, introducing the terminology not only into SharePoint, but Office as well. While there are very good reasons for this happening, Microsoft used the “app” terminology in multiple ways, making their message rather confusing. As a result, Microsoft have not communicated their intent particularly well and customers often fail to understand why they make the changes that they do.

Things are definitely getting better, but I nevertheless see a lot of confusion around the topic. So in part 1 of this series, I explained the reasons Microsoft have adopted the strategy that they have done. To recap, they are trying to respond to five major disruptive forces that challenge their market position:

  • Changing perceptions to cloud technologies and increased adoption; which enables…
  • The big scary bogeyman known as Google with a viable alternative to SharePoint, Office and Exchange in the form of Google Apps; as well as…
  • An increasing number of smaller cloud-based “point solution” players who chip away at SharePoint features with cheaper and easier to use offerings; while suffering from…
  • A serious case of Apple envy and in particularly the rise of the app and the app marketplace; while dealing with…
  • Customers unable to handle the ever increasing complexity of SharePoint, leading to delaying upgrades for years

Microsoft’s answer to this has been to go all-in with cloud, as this is the only way to beat the cloud providers at their own game, while reducing the complexity burden on their customers. This of course comes in the form of Office365, OneDrive and an ever-increasing set of cloud-oriented tools like Delve and Project Online.

But in SharePoint land, this has turned traditional development upside down. More than a decade of customisation “best practices” are no longer best – in fact they are no longer usable in many circumstances. The main reason is that the most common method of customisation normally applied to SharePoint (full-trust server-side code) is not permitted in the cloud. Microsoft couldn’t risk untrusted, 3rd party custom code on their servers. What happens if one client’s dodgy code affects everybody else sharing the service? This would threaten performance, uptime and Microsoft’s ability to upgrade their service over time.

So things had to change. Microsoft’s small army of product architects commandeered a whiteboard and started architecting innovative solutions to deal with these challenges, and the apps model is the result. So let’s examine some core bits of the apps model by channelling a much-loved children’s TV show.

There is a bear in there…

Now at this point I have to warn the developers or tech people reading this post. I am going to give a simplified version of the apps model intended for a decision-making audience. I will omit many details I don’t deem necessary to make my key points. You have been warned…

image

Any parent of small children in most countries might be familiar with Playschool – a show for toddlers that has been around for eons. It is well known for its theme song starting with the line “There is a bear in there…and a chair as well”. When trying to come up with a suitable way to explain the SharePoint apps model, using Playschool as a metaphor turned out to work brilliantly. You see in each episode of Playschool, there was a segment where viewers were taken through the “magic window” to faraway lands. In the show, the presenter would pick one of the three windows and we would zoom into it, resulting in a transition to another segment. In our case, we have to pick the square window for two reasons. Firstly, a good many apps are in effect windows to somewhere else. Secondly, and much more importantly, it perfectly matches the new Microsoft corporate logo. Perfect metaphor or what eh? 🙂

Like the Playschool magic window, browsers have a similar capability to enable you to visit strange and magical lands… Not only is there a bear in there and a chair as well, but there are plenty of other things like YouTube videos and Yammer discussions. I have drawn this conceptually below. Note the black window in the SharePoint team site on the left, which can be filled with YouTube or Yammer content.

image

You have no doubt visited web sites that have embedded content like YouTube videos or SlideShare slides in them (this blog site has lots of embedded ads that make me no money!). Essentially, it is possible for browsers to include content from different sites together into a single “page” experience. Users see it all as one page, even though content can come from all over the place. This is really useful, because it means you can leverage the capability of other sites to enhance the functionality of your own sites.

This, my friends, is one of the core tenets of the current SharePoint 2013 apps model. Instead of running on the SharePoint server, many apps now run separately from SharePoint, embedded in SharePoint pages so that they look like they are part of SharePoint. In the example mock-up below, we have a SharePoint team site. In it, we have a remote web site that displays some pretty dashboard data. By loading that remote content into our magical square window, it now appears to be part of SharePoint.

image       image

Going back to Microsoft’s core pain points, this helps things a lot. For a start, it means no custom code has to be installed onto the SharePoint server. Instead, SharePoint simply embeds the external content on the page. In the leftmost image above, you can see the SharePoint server (labelled as “Your server”), rendering a page with a placeholder in it. It then retrieves content from a remote server (labelled “my server”) and displays it in the placeholder to render the complete page (the rightmost image above).

So what feat of Microsoft innovation and general awesomeness enabled this to happen?

Everybody meet “Mr IFrame”.

Inline Frames (IFrames) are windows cut into your webpage that allow your visitor to view content on another site without reloading the entire page. The concept was first implemented in Microsoft Internet Explorer way back in 1997. Yep – you heard right… 1997. So iframes are not a new concept at all – in fact they are positively ancient when you count time in internet years. For this reason, when developers find this out, their reaction is usually something like this…

image

But there is more than meets the eye…

Now if the apps model were just iframes alone, then you might wonder what the big deal is with apps. In fact, iframes have been used this way in SharePoint for years via the Page Viewer Web Part. For years, companies with SharePoint deployments have embedded stuff like Twitter, YouTube or Facebook widgets via iframes.
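To make the mechanics concrete, here is roughly all an iframe embed amounts to. The remote URL and the placeholder element are made up for this example, and there is nothing SharePoint-specific about it.

```typescript
// A bare-bones illustration of an iframe embed. The URL and the
// "app-placeholder" element are invented for this example.
const frame = document.createElement("iframe");
frame.src = "https://my-remote-app.example.com/dashboard"; // the remote web
frame.width = "100%";
frame.height = "400";
frame.style.border = "none";

// Drop the frame into a placeholder on the (SharePoint) page; the browser
// fetches and renders the remote content inside that window.
document.getElementById("app-placeholder")?.appendChild(frame);
```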

So of course, there is more to it…

Let’s revisit the “your server” and “my server” diagram used above and consider the question: what if these remote applications displayed inside an iframe could interact with SharePoint? In other words, what if the remote application running on my remote server were able to connect to your SharePoint server and read/write data? In the diagram below I have illustrated the idea. The top half of the diagram represents a SharePoint server that could be on-premises or an Office365 tenant. On the left is a Products list that is somewhere inside this SharePoint server. At the bottom is my application, running on my server, that creates a pretty dashboard. What if my remote application queried the SharePoint Products list to create the dashboard? Now we have an application that, while not running inside SharePoint, can nevertheless utilise live data from SharePoint to create a seamless experience for users.

image
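For the technically curious, the sort of call my dashboard application would make against your SharePoint server looks something like the sketch below. The Products list, its columns and the access token are illustrative assumptions; the security plumbing that makes such a call legitimate is exactly what the next post covers.

```typescript
// Illustrative sketch: the remote dashboard app reading a "Products" list
// from the SharePoint server via the REST API. List/column names and the
// access token are assumptions for the example.
interface Product {
  Title: string;
  UnitsSold: number; // assumed column on the Products list
}

async function loadProducts(
  sharePointUrl: string, // the on-premises or Office365 SharePoint site
  accessToken: string    // token representing the app's granted permissions
): Promise<Product[]> {
  const endpoint =
    `${sharePointUrl}/_api/web/lists/getbytitle('Products')/items?$select=Title,UnitsSold`;

  const response = await fetch(endpoint, {
    headers: {
      Authorization: `Bearer ${accessToken}`,
      Accept: "application/json;odata=verbose",
    },
  });

  if (!response.ok) {
    throw new Error(`Could not read the Products list: ${response.status}`);
  }

  const data = await response.json();
  return data.d.results as Product[]; // rows used to draw the dashboard
}
```

Note that this is my code running on my server; all SharePoint sees is a web service request for list items.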

If we now add 3 iframes to a page, the implications should start to become clearer. We can build hybrid solutions combining the best of what SharePoint can do with the best of what other platforms can do. To the user, these are still SharePoint sites, but the reality is that we are now viewing a page that has been delivered by various different platforms. Each can interact with SharePoint data in different ways to deliver a seamless experience. Because these remote apps are not SharePoint at all, developers can write any application they want to, using the platform and tools of their choice. But to the user it is still a SharePoint page… neat huh? I’m sure the Microsoft product team thought that this was a brilliant conceptual masterpiece when they dreamt it up.

image

A beautiful model…

I don’t know if you have ever watched developers come up with APIs, but there tends to be a lot of excitement around a whiteboard as they revel in the glory of their elegant solution designs. So let’s quickly re-examine the benefits of this remotely hosted app approach from Microsoft’s perspective and see how we are going so far…

image

First and foremost, we now have a SharePoint customisation approach that can be fully supported in Office365. Microsoft don’t have to put custom code on their online servers, yet can still support extensibility. Now they are much more evenly matched with Google, while at the same time reducing their SharePoint support costs because they have isolated 3rd party code out of SharePoint. If any problems are encountered with a remote app, SharePoint will keep humming along and Microsoft can now legitimately tell clients “no really, it is not SharePoint causing your issue – go see your friendly neighbourhood app developer”.

More importantly, apps can also be used in on-premises SharePoint deployments, meaning both Microsoft and their customers now have pristine SharePoint servers free of the muck and clutter of 3rd party code. Therefore service packs and cumulative updates should no longer strike fear into admins. Microsoft also now nails Google’s ass because Google has no real concept of on-premises at all in the way Microsoft does. Thus, when hybrid scenarios come up in conversation, Microsoft has a much stronger story to tell.

But there is a more important implication than all of that. Microsoft can now do the app store thing. Vendors can maintain cloud-based services that can be embedded and consumed by on-premises and online SharePoint installs. This means 3rd parties can tap into the customer ecosystem with a captive marketplace, and customers can browse the store to examine what options are out there to extend SharePoint functionality. In theory, this should enable hundreds of vendors to make some slight modifications to their existing web-based applications and incorporate them into the SharePoint ecosystem.

image

But reality is not what’s on the whiteboard…

At this point, I hope I have painted a pretty good picture of the advantages offered by this new paradigm, and you can probably appreciate the Microsoft nerds completely falling in love with this conceptual model of future SharePoint customisations. The Microsoft strategy dudes probably loved it too, because it elegantly dealt with all of the challenges they were seeing. Unfortunately though, as with most conceptual models, reality is a very different beast from the convenient fiction of the model.

So in the next post, we are going to dig a little deeper. For example, how can a remote app even have permissions to talk to SharePoint in the first place? Do you really want code running on some untrusted 3rd party server to be fiddling with data in your SharePoint lists and libraries? How does that even work anyway in an on-premises scenario, when a cloud-hosted app has to access data behind your firewall?

Fear not though – the Microsoft guys thought of this (and more) when they were drawing their apps model concept on the big whiteboard. So in the next post, we are going to look at what it takes to bring this conceptual masterpiece into reality.

 

Thanks for reading

Paul Culmsee
