A Sample OpenAPI/Swagger file for PowerApps


Ever since I posted a video on how to use Flow to upload photos to SharePoint from PowerApps, I get a lot of requests for help with the most mysterious bit – the swagger/openAPI file…

To save you all much pain and suffering, here is a sample file that you can use to get started.

In this post I am going to assume you have watched the video and understand the intent. Here I will simply annotate the file with some notes that will help you customise and extend it for your own purposes.

Note 1: This only works for an HTTP request trigger in Flow

Flow is capable of being called like any other web service. To do so, you have to use the following trigger.

image

Note that the trigger states clearly “URL will be generated after save”, so the first thing to do is generate that URL…

image

Once you have done so, it will look like this:

image

If we break the URL down, you will see:

  • A domain something like <location>.logic.azure.com.
  • A URL path of “/workflows/<instance ID>/triggers/manual/paths/invoke”, which identifies your specific workflow ID. Take note of this as you will need it.
  • A parameter called api-version with a (at the time of writing) value of “2016-06-01” – eg api-version=2016-06-01
  • A parameter called sp with an encoded value of “/triggers/manual/run” – eg sp=%2Ftriggers%2Fmanual%2Frun
  • A parameter called sv with a value of 1.0 – eg &sv=1.0
  • A parameter called sig with a random string value. eg sig=PeZc-ljtjmJhsz00SD78YrwEohTqlUxpQuc95BQQuwU

In combination, the URL looks as follows with the important bits in bold…

https://<location>.westus.logic.azure.com:443/workflows/<workflow instance ID>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<signature>

The bits in bold you will need to know, because they need to be added to the OpenAPI file. Why? Because this file is what PowerApps uses to construct an HTTP call to your flow.

So let’s look at the Swagger File…

Note 2: Host, basePath and Path

Open the Swagger file and look for the section called “host”… Replace the section labelled [enter Flow URL here] with the URL from the flow Trigger I mentioned above. eg:

prod-23.westus.logic.azure.com:443 or prod-01.australiasoutheast.logic.azure.com:443

From this…

image

To this…

image

Now find the section labelled [enterid here]. This is where the workflow ID goes… so from this:

image

To this..

image
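In case it helps to see it without the screenshots, here is a rough sketch of how those sections hang together in a Swagger 2.0 file. Treat this purely as an illustration – the exact split between host, basePath and the path in the sample file may differ slightly, and the host and workflow ID shown here are placeholders:

"host": "prod-23.westus.logic.azure.com:443",
"basePath": "/workflows",
"schemes": [ "https" ],
"paths": {
  "/<workflow instance ID>/triggers/manual/paths/invoke": {
    "post": { ... }
  }
}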

Note 3: Double check the sv, api-version and sp parameter sections.

All of the parameters expected by the Flow are specified in the OpenAPI file. You will see them in the “parameters” subsection of the “post” section… eg

image

Now for reference, each parameter section has:

  • name: The name of the parameter as it appears on the URL
  • in: specifies whether this parameter is in the query string, header or body. All of the default flow parameters are in the query string.
  • default: This is the value to check!! If Flow is updated in future it is very likely this parameter will reflect it. Please do not come to me for support if you have not checked this!
  • required: States that this parameter MUST be passed. PowerApps will not allow you to call this Flow without specifying this parameter
  • x-ms-visibility: this basically says “use the default value and don’t show the user”. So in effect, the above “required” condition is met, but PowerApps will not ask the user to enter it.
  • type: is self-explanatory. It tells PowerApps that the parameter is a string.

Note: For more detail on these parameters go and read the OpenAPI 2.0 standard and Microsoft’s documentation.
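To make that concrete, here is roughly what the api-version entry looks like (the sp and sv entries follow the same pattern, with defaults of /triggers/manual/run and 1.0 respectively – always check these values against your own trigger URL, as they may have changed since this was written):

{
  "name": "api-version",
  "in": "query",
  "default": "2016-06-01",
  "required": true,
  "x-ms-visibility": "internal",
  "type": "string"
}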

Note 4: Update the sig parameter…

The sig parameter is like an API key or a password. You need to paste it as the default value in your file like so…

image   image
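For illustration, the sig entry ends up looking something like this – the default value shown here is just a placeholder for your own signature:

{
  "name": "sig",
  "in": "query",
  "default": "<paste your sig value here>",
  "required": true,
  "x-ms-visibility": "internal",
  "type": "string"
}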

Note: It is possible to set this up in PowerApps so that it has to be entered when a user adds a datasource. However I am not covering that here.

Note 5: Add (and remove) your own parameters…

This swagger file makes the assumption that PowerApps is going to send a file name for the photo, as well as a description, latitude and longitude. Note that all fields are set to required, but none have default values and the x-ms-visibility parameter is not specified, meaning that PowerApps will prompt the user to enter them.

Using the examples as a guide, add or remove parameters as you see fit.

image
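As a guide, a user-entered parameter such as Latitude would look something like the sketch below – no default and no x-ms-visibility, so PowerApps will prompt for it. (Whether "in" is query or something else depends on how your Flow trigger expects to receive the value, so check the sample file rather than taking this verbatim.)

{
  "name": "Latitude",
  "in": "query",
  "description": "Latitude of the photo",
  "required": true,
  "type": "string"
}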

Note 6: Set your function call names appropriately…

Going back to the top of the file, update the description to suit the task you are performing. Pay special attention to “Title” and “operationId”, as PowerApps uses these. For example, based on the image below, you will be calling a function called PhotoHandler.UploadPhoto() from PowerApps.

image
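For reference, the relevant bits for the PhotoHandler.UploadPhoto() example would look roughly like this – your own title, description and operationId will reflect whatever task your flow performs:

"info": {
  "version": "1.0.0",
  "title": "PhotoHandler",
  "description": "Uploads photos from PowerApps to SharePoint via Flow"
}

and, further down inside the "post" section:

"operationId": "UploadPhoto"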

At this point you should be able to save your file and register it as a custom connection and call it from PowerApps.

Note 7: Do not use the word “SharePoint” in your custom connector name

Believe it or not, if you name your custom connector with the word “SharePoint” it will confuse PowerApps completely. For example, consider this custom connector:

Now look what happens when you try to use it… you get the message “The data did not load correctly. please try again”, with a sub message of “Resource not found”…

The solution? Name your connector anything, so long as the word SharePoint is not there 🙂

Parting notes…

If you intend to send data back from Flow to PowerApps, you will also have to define the schema for what Flow returns in the responses section. I have not added any custom schema in the sample swagger file and discussing it is outside the scope of this article. But in case you are interested, to get you started, below is an example of calling Microsoft’s QnAMaker chatbot service REST API and sending the results back to PowerApps.

 

"responses": {
  "200": {
    "description": "OK",
    "schema": {
       "type": "object",
       "properties": {
          "answers": {
          "type": "array",
          "uniqueItems": true,
          "minItems": 1,
          "items": {
             "required": [
                "answer",
                "score"
             ],
             "type": "object",
             "properties": {
                "answer": {
                   "type": "string",
                   "minLength": 1
                },
                "questions": {
                   "type": "array",
                   "items": {
                      "type": "string"
                   }
                 },
                 "score": {
                   "type": "number"
                 }
              }
           }
        }
      },
       "required": [
         "answers"
       ]
    }
 }

 

Thanks for reading

Paul Culmsee


From Rick Astley to Fidget Spinners: A slew of PowerApps and Flow video tutorials


Hiya

I have been recording various videos over time of some advanced PowerApps and Flow concepts/solutions. All of these are either workarounds for current limitations in PowerApps and Flow, or work I have done with my daughter, Ashlee. I have listed each here with explanations…

How to Save Photos from PowerApps to SharePoint via Flow

This video outlines a robust and flexible method for uploading photos from PowerApps to SharePoint. At the time of writing, it is the best option despite having to create OpenAPI files.

 

Calling Cognitive Services Vision API from PowerApps via Flow

This video demonstrates a simple receipt tracker that uses the OCR capability of Microsoft cognitive services to find price information from a scanned receipt.

 

How to set SharePoint list permissions using Flow

This video shows the high level view on how Flow can be used to set SharePoint permissions, much like an app step that is used in SharePoint Designer. It also demonstrates the idea of breaking up flows into reusable chunks – called service flows.

 

It’s not a Flow, nor a Proxy… It’s a Floxy!!

This is an example of utilising flow to display document library content in PowerApps. I also wrote a detailed post about this one…

 

How To Rickroll Your Friends Using PowerApps

A funny app with some very clever design considerations. This was actually done by my daughter, Ashlee. She explains how she did it below…

 

Paul and Ashlee on PowerApps

More nerdy fun with my daughter, who is already an accomplished PowerApps coder as you will soon see. In this video, she builds me a sophisticated audit/checklist app using Microsoft PowerApps and Flow. This app demonstrates offline support, calling external APIs and photo handling.

  

 

and finally….the famous fidget spinner…

Build a FidgetSpinner using PowerApps

Demonstrating the power of the PowerApps platform for citizen developers, Ashlee won a contest from Microsoft to create a fidget spinner using PowerApps. In this video, Ashlee explains to me how she built the app and shames me for my dodgy high school maths…

P.S. Don’t miss the Solar System PowerApp by MVP Daniel Christian, who was inspired by Ashlee’s fidget spinner. Amazing stuff…

I think these videos highlight the flexibility and power of this platform. Let me know if you would like me or Ashlee to record others or expand on them!

Paul Culmsee


Incredible! Download a PowerApp that determines your IQ from a selfie…


Hiya

Some of you might have seen my post where I explained how my daughter, Ashlee (18), won a competition from Microsoft to develop a fidget spinner app. Well if you think that was impressive, check this out…

image

Inspired by the positive feedback from the community, Ashlee decided to up her game and investigated some of the various online services that PowerApps and Flow can connect to. The net result is an app which has capabilities that are quite extraordinary. By connecting to Microsoft Cognitive Services in Azure, Ashlee has created an app that will determine your IQ from your facial structure. In other words, you download this app, take a selfie, and via cloud-based machine learning that has processed millions of photos, you can find out your IQ without doing a traditional IQ test!

Even better, Ashlee has kindly donated it to the community as a learning tool, so you can not only learn about these ground-breaking techniques, but you can download/share it with your colleagues too.

So far this app has been used in various global organisations including Microsoft. It has also been demonstrated to rapturous applause at the recent Digital Workplace Conference in Sydney.

 

So DO NOT MISS OUT on this, especially when using this app involves just a simple set of easy-to-follow steps!

 

Ready… Steady…

 

Wait… I forgot to mention, this app is optimised for phones, and utilises Microsoft speech recognition, so be sure to turn up the volume so you can hear the instructions…

 

Right are you ready?

 

Oh… don’t forget… you need an Office365 subscription for this so make sure PowerApps is enabled on your tenant…

 

Okay, so do the following right now!

 

1. Download and install PowerApps studio onto your PC.

2. Download Ashlee’s Guess your IQ MSAPP (rename the file extension from .zip to .msapp) and open it in PowerApps Studio

3. In PowerApps studio, save a copy of the App to your tenant (“the cloud”) via the File menu

image

4. Now install PowerApps on your iOS or Android phone (you will find it in the app store and it’s free)

5. Run PowerApps on your phone and sign in with the same user account you used in step 3

6. Select the app called Guess Your IQ. It might take a little while to load, but don’t worry – there is a lot of awesome functionality.

7. Enter your name when prompted…

image

8. Now use your phone camera to take a selfie…

image

9. Click the “Find my IQ” button and watch the magic of machine learning…

image  image

10. Seriously, you WILL NOT BELIEVE what happens next!… but you have to try for yourself!

 

Make sure you let me know what your IQ is! Smile

 

Paul Culmsee

www.hereticsguidebooks.com


How’s the weather? Using a public API with PowerApps (part 2)

This entry is part 3 of 3 in the series OpenAPI

Introduction

Hi again

This is the second half of a two-part look at using the OpenWeatherMap API in PowerApps. The business scenario is around performing inspections. In the first post I gave the example of a park ranger or plant operator, both conducting inspections where weather conditions can impact the level of danger or the result of the inspection. In such a scenario it makes sense to capture weather conditions when a photo is taken.

PowerApps has the ability to capture location information such as latitude and longitude, and public weather APIs generally allow you to get weather conditions for a given location. So the purpose of these posts is to show you how you can not only capture this data in PowerApps, but then send it to SharePoint in the form of metadata via Flow.

In Part 1, we got the painful stuff out of the way – that is, getting PowerApps to talk to the OpenWeather web service via a custom connector. Hopefully if you got through that part, you now have a much better understanding of the purpose of the OpenAPI specification and can see how it could be used to get PowerApps to consume many other web services. Now we are going to actually build an app that takes photos and captures weather data.

App prerequisites…

Now to get through this post, we are going to leverage a proof-of-concept app I built in a separate post. This app was also an inspection scenario, allowing a user to take a bunch of photos, which were then sent to SharePoint via Flow, with correctly named files. If you have not read that post, I suggest you do so now, because I am assuming you have that particular app set up and ready to go.

Go on… its cool, I will wait for you… Smile

Seriously now, don’t come back until you can do what I show below. On the left is PowerApps, taking a couple of photos, and on the right are the photos now in a SharePoint document library.

image  image

Now if you have performed the tasks in the aforementioned article, not only do you have a PowerApp that can take photos, you also have a connection to Flow and are ready to go (yeah, the pun was intended).

First up, let’s recap two key parts of the original app.

1. Photo and file name…

When the camera was clicked, a photo was taken and a file name was generated for it. The code is below:

Collect(PictureList,{
    Photo: Camera1.Photo,
    ID: Concatenate(AuditNumber.Text,"-",Text(Today(),"[$-en-US]dd:mm:yy"),"-",Text(CountRows(PictureList)+1),".jpg")
} )

This code created an in-memory table (a collection named PictureList) that, when a few photos are taken, looks like this:

image

2. Saving to Flow

The other part of the original app was saving the contents of the above collection to Flow. The Submit button has the following code…

UpdateContext( { FinalData : Concat(PictureList, ID & "|" & Photo & "#") } );
UploadPhotostoAuditLib.Run(FinalData)

The first line takes the PictureList collection above and munges it into a giant string called FinalData. Each row is delimited by a hash “#” and each column delimited by a pipe “|”. The second line then calls the Flow and passes this data to it.

Both of these functions are about to change…

Getting the weather…

The first task is to get the weather. In part 1 we already created the custom connector to the service. Now it is time to use it in our app by adding it as a data source. From the View menu, choose Data Sources. You should see your existing data source that connects to Flow.

image  image

Click Add data source and then New connection. If you got everything right in part 1, you should see a data source called OpenWeather. Click on it, and you will be asked to enter an API key. You should have this key from part 1 (and you should understand exactly why you were asked for it at this point), so go ahead, add it here and click the Create button. If all goes to plan, you will now see OpenWeather added as a data source.

image  image  image  image

Now that we are connected to the API, let’s test it by modifying the code that takes a photo. Instead of just capturing the photo and generating a file name, let’s also grab the latitude and longitude from PowerApps, call the API and collect the current temperature.

First here is the code and then I will break it down…

image

UpdateContext( { Weather: OpenAPI.GetWeather(Location.Latitude,Location.Longitude,"metric") } );
Collect(PictureList,
{
    Photo: Camera1.Photo,
    ID: Concatenate(AuditNumber.Text,"-",Text(Today(),"[$-en-US]dd:mm:yy"),"-",Text(CountRows(PictureList)+1),".jpg"),
    Latitude: Location.Latitude,
    Longitude: Location.Longitude,
    Temp: Weather.main.temp
} )

 

The first line is where the weather API is called: OpenAPI.GetWeather(Location.Latitude,Location.Longitude,"metric"). The parameters Location.Latitude and Location.Longitude come straight from PowerApps. I want my temperature in Celsius so I pass in the string "metric" as the 3rd parameter.

My API call is then wrapped into an UpdateContext() function, which enables us to save the result of the API call into a variable I have called Weather.

Now if you test the app by taking photos, you will notice a couple of things. First up, under variables, you will now see Weather listed. Drill down into it and you will see that a complex data structure is returned by the API. In the example below I drilled down to Weather->Main to find a table that includes temperature.

image

image  image
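For reference, the Weather.main portion holds values like the following (these are from the Perth example in part 1 – your numbers will obviously differ):

"main": {
  "temp": 5.37,
  "pressure": 1023,
  "humidity": 100,
  "temp_min": 5,
  "temp_max": 6
}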

The second line of code (actually I broke it across multiple lines for readability) is the Collect function which, as its title suggests, creates collections. A collection is essentially an in-memory data table and the first parameter of Collect() is simply the name of the collection. In our example it is called PictureList. The second parameter is a record to insert into the collection. A record is a comma delimited set of fields and values inside curly braces, eg: { Title: "Hi", Description: "Greetings" }. In our example, we are building a table consisting of:

  • Photo
  • File name for Photo
  • Latitude
  • Longitude
  • Temperature

The last parameter is the most interesting, because we are getting the temperature from the Weather variable. As this variable is a complex data type, we have to be quite specific about the value we want. I.e. Weather.main.temp.

Here is what the PictureList collection looks like now. If you have understood the above code, you should be able to extend it to grab other interesting weather details like wind speed and direction.

image

Getting ready for Flow…

Okay, so now let’s look at the code behind the Submit button. The change made here is to include the additional columns from PictureList in my variable called FinalData. If this is not clear then I suggest you read this post or even Mikael Svenson’s work where I got the idea…

image

UpdateContext( { FinalData : Concat(PictureList, ID & "|" & Photo & "|" & Latitude & "|" & Longitude & "|" & Temp & "#") } );
UploadPhotosToAuditLib.Run(FinalData)

So in case it is not clear, the first line munges each row and column from PictureList into a giant string called FinalData. Each row is delimited by a hash “#” and each column delimited by a pipe “|”. The second line then calls the Flow and passes it FinalData.
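To make that concrete, here is a hypothetical (and heavily truncated) example of what FinalData might contain for two photos – the audit number, date, coordinates and temperature are made-up values, and the base64 photo data is snipped:

1234-05:07:17-1.jpg|data:image/jpeg;base64,/9j/4AAQ...|-31.95|115.86|5.37#1234-05:07:17-2.jpg|data:image/jpeg;base64,/9j/TwBA...|-31.95|115.86|5.37#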

At this point, save your changes to PowerApps and publish as you are done here. Let’s now make some changes to the SharePoint document library where the photos are currently being uploaded to. We will add columns for Temperature, Latitude and Longitude and I am going to assume you know enough of SharePoint to do this and will paste a screenshot of the end in mind…

image  image

Right! Now it is time to turn our attention to Flow. The intent here is to not only upload the photos to the document library, but update metadata with the location and temperature data. Sounds easy enough right? Well remember how I said that we got rid of most of the painful stuff in part 1?

I lied…

Going with the Flow…

Now with Flow, it is easy to die from screenshot hell, so I am going to use some brevity in this section. If you played along with my pre-requisite post, you already had a flow that more or less looks like this:

  1. A PowerApps Trigger
  2. A Compose action that splits the photo via hash: @split(triggerbody()['ProcessPhotos_Inputs'],"#")
  3. An Apply to each statement with a condition inside: @not(empty(item()))
  4. A Compose action that grabs the file name: @split(item(),'|')[0]
  5. A Compose action that grabs the file contents and converts it to binary: @dataUriToBinary(@split(item(),'|')[1])
  6. A SharePoint Create File action that uploads the file to a document library

The image below illustrates the basic layout.

image

Our task is to now modify this workflow to:

  1. Handle the additional data sent from PowerApps (temperature, latitude and longitude)
  2. Update SharePoint metadata columns on the uploaded photos with this new data.

As I write these lines, Flow has very poor support for doing this. It has been acknowledged on UserVoice and I know the team are working on improvements. So the method I am going to use here is essentially a hack and I actually feel a bit dirty even suggesting it. But I do so for a couple of reasons. Firstly, it helps you understand some of the deeper capabilities of Flow and secondly, I hope this real-world scenario is reviewed by the Flow team to help them understand the implications of their design decisions and priorities.

So what is the issue? Basically the flow actions in SharePoint have some severe limitations, namely:

  • The Create File action provides no way to update library metadata when uploading a file
  • The Create Item action provides access to metadata but only works on lists
  • The Update Item action works on document libraries, but requires the item ID of the document to do so. Since Create File does not provide it, we have no reference to the newly created file
  • The Get Items function allows you to search a list/library for content, but cannot match on File Name (actually it can! I have documented a much better method here!)

So my temporary “clever” method is to:

  1. Use Create File action to upload a file
  2. Use the Get Items action to bring me back the metadata for the most recently created file in the library
  3. Grab the ID from step 2
  4. Use the Update Item action to set the metadata on the recently uploaded image.

Ugh! This method is crude, and I fear what happens if a lot of flow or file activity is happening in this library. I really hope the next section is soon redundant…

Okay so let’s get started. First up let’s make use of some new Flow functionality and use Scopes to make this easier. Expand the condition block and find the Compose action that extracts the file name. If you dutifully followed my pre-req article it will be called “Get File Name”. Above this, add a Scope and rename it to “Get File and Metadata”. Drag the “Get File Name” and “Get File Body” actions to it as shown below.

image  image  image

Now let’s sort out the location and temperature data. Under “Get File Body”, add a new Compose action and rename it to “Get Latitude”. In the compose function add the following:

  • @split(item(),'|')[2]

Under “Get Latitude”, add a new Compose action and rename it to “Get Longitude”. In the compose function add the following:

  • @split(item(),'|')[3]

Under “Get Longitude”, add a new Compose action and rename it to “Get Temperature”. In the compose function add the following:

  • @split(item(),'|')[4]

This will result in the following:

image  image

Now click on the Get File and Metadata scope and collapse it to hide the detail of metadata extraction (now you know what a scope is for Smile)

image

So now we have our metadata, we need to apply it to SharePoint. Under the “Create File” action, add a new scope and call it “Get Item ID”. This scope is where the crazy starts…

Inside the scope, add a SharePoint – Get Items action. Enter the URL of your site collection and the name (not URL) of your document library. In the Order By field, type in Created desc and set the Maximum Get Count to 1. Basically this action is calling a SharePoint list web service, and “Created desc” says “order the results by Created Date in descending order (newest first)”.

Actually what you do is set Filter Query to FileLeaf eq ‘[FileName]’ as described in this later post!

Now note the plural in the action: “Get Items”. That means that by design, it assumes more than 1 result will be returned. The implication is that the data will come back as an array. In JSON land this looks like the following:

[ { “Name”: “Value” }, { “Name”: “Value2” }, { “Name”: “Value3” } ]

and so on…

Also note that there is no option in this action to choose which fields we want to bring back, so this action will bring back a big, ugly JSON array from SharePoint containing lots of information.

Both of these caveats mean we now have to do some data manipulation. For a start, we have to get rid of the array, as many Flow actions cannot handle arrays as data input. Also, we are only interested in the item ID for the newly uploaded photo. All of the other stuff is just noise. So we will add 3 more flow actions that:

  1. clear out all data apart from the ID
  2. turn it from an array back to a regular JSON element
  3. extract the ID from the JSON.

For step 1, under the “Get items” action just added, add a new Data Operations – Select action. We are going to use this to select just the ID field and delete the rest. In the From textbox, choose the Value variable returned by the Get Items action. In the Map field, enter a key called “ID” and set the value to be the ID variable from the “Get Items” action.

image

For step 2, under the “Select” action, add a Data Operations – Join action. This action allows you to munge an array into a string using a delimiter – much like what we did in PowerApps to send data to Flow. Set the From text box to be the output of the Select action. The “Join with” delimiter can actually be anything, as the array will always have 1 item. Why? In the Get Items action above, we set the Maximum Get Count to 1. We will always get back a single item array.

image

The net effect of this step will be the removal of the array. I.e., from:

[ { “ID”: 48 } ]

to

{ “ID”:48 }

For step 3, under the “Join” action, add a Data Operations – Parse JSON action. This action will process the JSON file and each element found will be available to subsequent actions. The easiest way to understand this action is to just do it and see the effect. First, set the Content textbox to the output from the Join action.

image

Now we need to tell this action which elements that we are interested in. We already know that we only have 1 variable called ID because of the Select action we set up earlier that has stripped everything else out. So to tell this action we are expecting an ID, click the “use sample payload…” link and paste some sample JSON in our expected format…

{
    "ID": 48
}

If all goes to plan, a Schema has been generated based on that sample data that will now allow us to grab the actual ID value.

image  image
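For reference, the schema it generates from that sample payload will be something along these lines (the exact type it infers for ID may differ):

{
  "type": "object",
  "properties": {
    "ID": {
      "type": "number"
    }
  }
}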

Okay, so we are done with the Get Item ID scope, so collapse it as follows…

image

Finally, under the “Get Item ID” scope, add a SharePoint – Update Item action. Add the URL of your site collection and then specify the document library where the photos were uploaded to. If this has worked, the additional metadata columns should now be visible as shown in the red box below. Now set the specific item we want to update by setting the ID parameter to the ID variable from the Parse JSON step.

image

Now assign the metadata to the library columns. Set Latitude to the output variable from the Get Latitude step, Longitude to the output variable from the Get Longitude step and Temperature to the output variable from the Get Temperature step as shown below.

image

Now save your flow and cross all fingers and toes…

Testing and conclusion!

Return to PowerApps (in the browser or on your mobile device – not the PowerApps studio app). Enter an audit number and take some photos… Wohoo! Here they are in the library along with metadata. Looks like I need to put on a jacket when I step outside Smile

image   image

So taking a step back, we have managed to achieve quite a lot here.

  1. We have wired up a public web service to PowerApps
  2. We have used PowerApps built-in location data to load weather data each time a photo has been taken
  3. We have used Flow to push this data into SharePoint and included the location and weather data as parameters.

Now on reflection there are a couple of massive issues that really prevent this from living up to its Citizen Developer potential.

  1. I had to use a 3rd party service to generate my OpenAPI file and it was not 100%. Why can’t Microsoft provide this service?
  2. Flow’s poor support for common SharePoint scenarios meant I had to use (non) clever workarounds to achieve my objectives.

Both of these issues were resolvable, which is a good thing, but I think the solutions take this out of the realm of most citizen developers. I had to learn too much JSON, too much Swagger/OpenAPI and delve deep into Flow.

In saying all that, I think Flow is the most immature piece of the puzzle at this stage. The lack of decent SharePoint support is definitely one area where, if I were a program manager, I would have pushed up the release schedule. It currently feels like the developer of the SharePoint actions has never used SharePoint on a day to day basis, and dutifully implemented the web services without lived experience of the typical usage scenarios.

For other citizen developers, I’d certainly be interested in how you went with your own version and variations of this example, so please let me know how you go.

And for Microsoft PowerApps/Flow product team members who (I hope) are reading this, I think you are building something great here. I hope this material is useful to you as well.

 

Thanks for reading

 

Paul Culmsee

www.hereticsguidebooks.com


How’s the weather? Using a public API with PowerApps (part 1)

This entry is part 2 of 3 in the series OpenAPI

Introduction

Okay citizen developers, listen up! This post and its forthcoming second part will teach you how to get PowerApps to connect to an online web service to add an extra dimension of awesomeness to your app. It also covers what I think is the most off-putting aspect of PowerApps work in general – dealing with the horribleness of Swagger/OpenAPI. Hopefully by the end you will find my approach useful to make it a bit less horrible. If you have not come across OpenAPI, then you did not read a previous post of mine where I whined about it. That’s ok because that post is not a prerequisite for this article, BUT another post is required pre-reading.

In a previous post, I created a sample app that demonstrated an inspection scenario where a user would be taking photos, which were then sent to SharePoint via Flow, with nice file names. In this small series of posts we are going to build on this example, so I suggest starting there first and getting to the point where you can save photos to SharePoint via Flow as shown below:

image  image

In this post, I will focus on getting PowerApps talking to an external web service…

Now a common inspection scenario would be to capture current weather information each time a photo was taken. A good example would be a park ranger inspecting sensitive environmental sites, or a plant operator who needs to conduct and document a safety inspection of equipment before it is used. In both scenarios, weather such as rain or wind might have a material impact on the result of the inspection, so it makes sense to capture location and weather data along with the photo.

Now location data is easy, as PowerApps has the in-built ability to capture GPS data. All we need to do is pass that GPS data to a weather web service to get the local conditions like temperature, humidity and wind speed/direction. At first I thought I could do it all out of the box because Microsoft provides a built-in connection to MSN Weather API as shown below, but unfortunately you can only pass it location data in the form of a city name, not a latitude/longitude.

image

It did not take long for me to find an alternative. The nice folks at OpenWeatherMap offer an API that does accept geographic co-ordinates as input. Even better, they have a free option, provided you stay within certain limits. To use the service, you need to sign up and generate an API key so they can identify you, which is added to the API call. To access weather data for a location, the URL looks like the following:

http://api.openweathermap.org/data/2.5/weather?lat=[latitude]&lon=[longitude]&appid=[API Key]&units=metric

Of course, the “units” parameter at the end of the URL can be changed in case you live in one of the 3 remaining countries holding out against metric (looking at you USA! Smile)
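For example, substituting real values for my home town of Perth (and a placeholder for the key), the call looks like this:

http://api.openweathermap.org/data/2.5/weather?lat=-31.9529&lon=115.8573&appid=<API Key>&units=metric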

The data that is returned by this API is in JSON format. For example here is what comes back for my home town of Perth (lat -31.9529 & lon 115.8573)

image

I have pasted the data in a clearer format below. Note, it pays to get used to the JSON format, as it underpins a lot of modern web applications.

 

{"coord":{
   "lon":115.86,
   "lat":-31.95},
 "weather":[{
   "id":804,
   "main":"Clouds",
   "description":"overcast clouds",
   "icon":"04n"}],
 "base":"stations",
 "main":{
   "temp":5.37,
   "pressure":1023,
   "humidity":100,
   "temp_min":5,
   "temp_max":6},
 "visibility":7000,
 "wind":{
   "speed":1.5,
   "deg":50},
 "clouds":{
   "all":90},
 "dt":1499297700,
 "sys":{
   "type":1,
   "id":8189,
   "message":0.0089,
   "country":"AU",
   "sunrise":1499296640,
   "sunset":1499333125},
"id":2066756,
"name":"Maylands",
"cod":200}

As you can see above this is a bunch of name/value pairs. But now the challenge is to get this API/data into PowerApps so we can use it. Below is an image showing what it looks like when things are wired up into PowerApps. You might be thinking, where did this “OpenWeather.getWeather” function come from? How does it know to ask for latitude, longitude and units of measure?

image

The answer my friends is OpenAPI/Swagger. Strap yourselves in fellow citizen developers, as we need to go on a fun-filled ride.

Generating an initial OpenAPI file

Basically the OpenAPI specification (formerly known as Swagger) is a way to describe web services. Just as we create columns in document libraries so we have metadata to describe our documents, OpenAPI is basically “metadata for web services”. Unfortunately the comparison ends here because unlike documents in a SharePoint library, describing web services is an entirely different beast. I found the OpenAPI format so ugly and hard to get my head around, it took me an entire day just to use it previously. In short the learning curve is high, and although with persistence you will eventually get there, I found the whole thing to be poorly documented with not enough good examples.

Now it is not my intent to describe the format of OpenAPI. Luckily for us, there are websites that make the process of creating the metadata we need much easier. To this end, we are going to use a service from Apigee called openApiGen that takes 90% of the pain away.

So go ahead and sign up to OpenWeather and get yourself an API key. Confirm that it works in the browser as I showed earlier and then head on over to http://specgen.apistudio.io and follow these simple steps:

Step 1: Run your API

Paste your working API call from earlier into the text box and click the Send button. Check for a successful response and that the JSON is in the expected format, as shown below:

image

Step 2: Enter API Program Information

This is where you describe your API in general. Note that PowerApps uses this info, so keep the title short and sweet and rename it from the default. I called mine OpenWeather.

image

Step 3: Update API Call Information

In this step, you have the option to perform more fine-grained tuning of your API, such as the function name that will be called from PowerApps. Like step 2, PowerApps uses this info, so keep your title short and sweet. The main thing is to set the Operation Id, as this corresponds directly to PowerApps as I show below (I called mine getWeather)

image

Step 4: Update header info

In this step, you provide information about any API parameters passed in the HTTP headers. In this case the OpenWeather API does not use headers, so there is nothing to verify/edit here…

image

Step 5: Update Query Parameters

In this step, you need to provide additional metadata about the parameters passed in the query from step 1. As you can see below, the 4 parameters we used have been found. Go ahead and add descriptions for the lat, lon and units parameters, but ignore appid. We actually will not be using appid in the same way as the other three, so no edits need to be made (we will take care of appid later).

image

Step 6: API Path Info

This step is designed to handle APIs where the URL path needs to adjust (eg /api/{somedynamicvalue}/weather). In our case the URL is static, so there is nothing to do here.

image

Step 7: Save your goodness!

At this point, you have generated an OpenAPI file! Now that wasn’t too bad was it? Click the bright orange Download button to grab a copy of it.

image

Tweaking your initial OpenAPI file

At this point you might think that you are done, but not so… While that site does a great job of generating your file, it is not guaranteed to work 100% of the time, and in fact if we try this in PowerApps now, it will have an issue. For the sake of learning let’s go through the process anyway so you know how to deal with quirks…

So let’s turn our attention to PowerApps. Choose Connections from the left navigation and then click “Manage custom connectors”. On the following screen, click Create Custom Connector.

image  image

On the next screen, upload your newly minted OpenAPI file. Note the connector name, as it will be based on the API Program Title you specified in step 2 of the previous section.

image

Scroll down and optionally, add a custom icon. I grabbed the OpenWeather icon from their site and added it here. Click the Continue button at the bottom of the screen.

image

The next screen is really important to get right. PowerApps needs to know what authentication method should be used when talking to the OpenWeather API. As we learnt earlier, it is an API Key, which is one of the options available. On choosing this option, you will be asked to provide some additional detail. As it states, “users will be required to provide the API Key when creating a connection”. This is a good thing because it allows you to distribute your PowerApp to others without requiring you to hardcode your own API key.

image

The important thing to get right is Parameter name and Parameter location. You will note that I entered “appid” for Parameter name, which corresponds to the identically named parameter in the URL for the web service, i.e.

http://api.openweathermap.org/data/2.5/weather?lat=[latitude]&lon=[longitude]&appid=[API Key]&units=metric

The Parameter location simply tells PowerApps that the API key is in the URL query, and not in the HTTP headers.

The next screen is a busy one, principally because it is where you get to tweak your API definition from what is specified in the OpenAPI file. First up, on the top half of the screen, in the General section, verify that the Operation ID is correct as per step 3 of the previous section. Note that PowerApps will warn you if the function name does not start with an upper case letter, so make that change if you are really concerned about it.

image  image

The more important bit is further down the screen. Scroll down to the Request section and delete the appid parameter altogether. Since we have already told PowerApps to ask for it when you add a connection to OpenWeather, we should not be asking for it inside the PowerApps function call…

image  image

Finally, complete the operation by clicking Create connector.

image

PowerApps will chug away and eventually you will see an error. Unfortunately for us, whatever team designed this part of PowerApps didn’t think things through, because they leave limited space for the error message and the actual meaningful part of the error is truncated. To alleviate this piece of poor design, move your cursor over the error message to see the full text. Unfortunately the poor design does not stop there, as the hover-over text is fleeting and disappears, requiring you to hover off and back on again to see it.

For the record, it will say:

Your custom connector has been successfully created, but there was an issue converting it to WADL for PowerApps: An error occurred while converting OpenAPI file to WADL file. Error: ‘Required property “type” is not present or not a string at JSON path paths[‘/data/2.5/weather’].get.responses.200.schema.properties.weather.items’

image

So we are going to have to tweak our OpenAPI file to fix this. If you do not have a text editor that supports JSON, I suggest you download and install Notepad++ to make this easier…

Fixing the OpenAPI file…

Now delving into the guts of the OpenAPI specification is not my idea of fun, nor is teaching you to learn JSON. So let’s just assume you are not familiar with JSON and review the message to see if it helps us:

Error: ‘Required property “type” is not present or not a string at JSON path paths[‘/data/2.5/weather’].get.responses.200.schema.properties.weather.items’

Looking at the OpenAPI file, I see a pattern. First I can see how it corresponds with the OpenWeather web service API output. For example, the response from calling the API in the browser produces output that starts with a “coord” parameter and then a “weather” parameter.

image

When I look in the OpenAPI file, I can see those same parameters under the “properties” declaration. Check out the image below… given the property section is below the “Responses” and “200” sections, it is easy to see now that this section of the OpenAPI file is describing the metadata for a successful response from OpenWeather (i.e. HTTP code 200). Hmm… suddenly this ugly file is looking slightly less ugly…

image

The image above shows me the section I need to focus on according to the error message, and the image below is a more detailed view of that section with some annotations (and can you tell I just bought a laptop with a pen? Smile). I noticed that every property has a type declaration for it (highlighted), except for the property called “items” (marked in red). It is anomalous in that no “type” is designated for “items”.

image

So the next question is: if a type is missing, what type is it? Looking elsewhere in the OpenAPI file, it appeared the missing type was “object”, based on the fact that the items property itself was made up of sub properties. So I inserted “type”: “object” under the declaration of the items property and made the following change:

image
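To give you a sense of the end result (your generated file will differ in the detail), the weather property in the 200 response schema ends up looking roughly like this once the missing type is added to items:

"weather": {
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "id": { "type": "number" },
      "main": { "type": "string" },
      "description": { "type": "string" },
      "icon": { "type": "string" }
    }
  }
}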

I saved the file and returned to PowerApps. I deleted the problematic connector and then repeated the steps in PowerApps to create a custom connection. Aha! We have lift-off!!!

image

Okay, so at this point we have things wired up. I hope this gives you the confidence to wire up other web services to PowerApps.

In the next post, we will make use of this newly connected API in PowerApps!

 

 

Until then, thanks for reading

 

Paul Culmsee

www.hereticsguidebooks.com


A Clever-workaround for Saving Photos to SharePoint from PowerApps


At the time of writing, a common request for PowerApps is to be able to upload photos to SharePoint. It makes perfect sense, especially now that it’s really easy to make a PowerApp that is bound to a SharePoint list. Sadly, although Microsoft have long acknowledged the need in the PowerUsers forum, a solution has not been forthcoming.

<update>I now use a Flow-based method for photos that I documented in this video. I only use this method in complex scenarios</update>

I have looked at the various workarounds, such as using the OneDrive connector or a custom web API, but these for me were fiddly. Thanks to ideas from John Liu, I’ve come up with a method that is more flexible and less fiddly to implement, provided you are okay with a bit of PowerShell, and (hopefully) with PnP PowerShell. One advantage to the method is that it handles an entire gallery of photos in a single transaction, rather than just a single photo at a time.

Now in the old days I would have meticulously planned out a multi-part series of posts related to a topic like this, because I have to pull together quite a lot of conceptual threads into a single solution. But since the pace of change in the world of Office365 is so rapid, my solution may be out of date by the time I publish it. So instead I offer a single summary post of my solution and leave the rest to you to figure out.  Sorry followers, its just too hard to do epic multi-part articles these days – times have changed.

What you need

  1. An understanding of JSON and basic idea of web services
  2. An azure subscription
  3. Access to Azure functions
  4. The PnP Powershell cmdlets
  5. A Swagger file (don’t worry if this makes no sense now)
  6. To be signed up to PowerApps

 How we are going to solve this…

In a nutshell, we will create an Azure function that uses PowerShell to receive photos from PowerApps and upload them to a SharePoint library. Here is my conceptual diagram that I spent hours and hours drawing…

Snapshot

To do this we will need to do a few things.

  1. Customise PowerApps to store photo data in the way we need
  2. Create and configure our Azure function
  3. Write and test the PowerShell code to upload to SharePoint
  4. Create a Swagger file so that PowerApps can talk to our Azure function
  5. Create a custom PowerApps connection/datasource to use the Azure function
  6. Test successfully and bask in the glory of your awesomeness

Step 1: Customise PowerApps to store photo data in the way we need

Let’s set up a basic proof of concept PowerApp. We will add a camera control to take photos, a picture gallery to view the photos and a button to submit the photos to SharePoint. I’ll use the PowerApps desktop client rather than the web page for this task and create a blank app using the Phone Layout.

image

From the Insert menu, choose a Camera control from the Media dropdown to add it to the screen. Leave it up near the top…

image

From the Insert Menu, add a Gallery control. For my demo I will use the vertical gallery. Move it down below the camera control so it looks like the second image below.

image  image

From the Insert Menu, add two buttons below the gallery. Set the text property on one to “Submit” and the other to “Clear”. I realise the resulting layout will not win any design awards but just go with it. Use the picture below to guide you.

image    image

Now let’s wire up some magic. Firstly, we will set it up so clicking on the camera control will take a photo and save it to PowerApps storage. To do this we will use the Collect function. Assuming your Camera control is called “Camera1”, select it, and set the OnSelect property to:

Collect(PictureList,Camera1.Photo)

image

Now when a photo is taken, it is added to an in-memory PowerApps data-source called PictureList. To see this in action, preview the PowerApp and click the camera control a couple of times. Exit the preview and choose “Collections” from the left hand menu. You will now see the PictureList collection with the photos you just took, stored in a field called Url. (The reason it is called Url and not “Photo” will become clear later.)

image

Now let’s wire up the clear button to clear out this collection. Choose the button labelled “Clear” and set its OnSelect property to:

Clear(PictureList)

image

If you preview the app and click this button, you will see that the collection is now empty of pictures.

The next step is to wire up the Gallery to the PictureList collection so that you can see the photos being taken. To do this, select the gallery control and set the Items property to PictureList as shown below. Preview this and you should be able to take a set of photos, see them added to the gallery and be able to clear the gallery via the button.

image

image

Now we get to a task that will not necessarily make sense until later. We need to massage the PictureList collection to get it into the right format to send to SharePoint. For example, each photo needs a filename, and in a real-world scenario, we would likely further customise the gallery to capture additional information about each photo. For this post I will not do this, but I want to show you how you can manipulate data structures in PowerApps. To do this, we are going to now wire up some logic to the “Submit” button. First I will give you the code before I explain it.

Clear(SubmitData);
ForAll(PictureList,Collect(SubmitData, { filename: "a file.jpg", filebody: Url }))

image

In PowerApps, it is common to add multiple statements to controls, separated by a semicolon. Thus, the first line above initialises a new Collection I have called “SubmitData”. If this collection already had data in it, the Clear function will wipe it out. The second line uses two functions, ForAll and the previously introduced Collect. ForAll([collection],[formula]) will iterate through [collection] and perform tasks specified in [formula]. In our case we are adding records to the SubmitData collection. Each record consists of two fields and is in JSON format – hence the curly braces. The first field is called filename and the second is called filebody. In my example the filename is a fixed string, but filebody grabs the Url field from the current item in PictureList.

To see the effect, run the app, click submit and then re-examine the collections. Now you will see two collections listed – the original one that captures the photos from the camera (PictureList), and the one called SubmitData that now has a field for filename and a field called filebody with the photo. I realise that setting a static filename called “a file.jpg” is not particularly useful to anybody, and I will address this a little later; the point is we now have the data in the format we need.

image

By the way, behind the scenes, PowerApps stores the photo in the Data URI scheme. This is essentially a Base64 encoded version of the image with a descriptor at the start that is included in HTML. When you think about it, this makes sense in some situations because it reduces the number of HTTP round trips between browser and server. For example here is a small image encoded and embedded direct in HTML using the technique.

<img src="data:image/png;base64,iVBORw0KGgoAAA
ANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4
//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU
5ErkJggg==" alt="Red dot" />

The implication of this format is when PowerApps talks to our Azure function, it will send this sort of JSON…

[ {  "filename": "boo.jpg",
"filebody": "data:image/jpeg;base64,/9j/4AAQSkZJR [ snip heaps of Base64 ] T//Z" },
{  "filename": "boo3.jpg",
"filebody": "data:image/jpeg;base64,/9j/TwBAQMEBA [ snip heaps of Base64 ] m22I" } ]

In the next section we will set up an Azure Function and write the code to handle the above format, so save the app in its current state and give it a nice name. We are done with PowerApps for now…

2. Create and Configure Azure Function

Next we are going to create an Azure function. This is the bit that is likely new knowledge for many readers. You can read all about them on their Azure page, but my quick explanation is that they allow you to take a script or small piece of code and turn it into a fully fledged webservice. As you will soon see, this is very useful indeed (as well as very cost effective).

Now like many IT Pros, I am a PowerShell hacker and I have been using the PowerShell PnP libraries for all sorts of administrative purposes for quite some time. In fact if you are administering an Office365 tenant and you are not using PnP, then I can honestly say you are missing out on some amazing time-saving tools and you owe it to yourself to skill-up in this area. Of course, I realise that many readers will not be familiar with PowerShell, let alone PnP, and I expect some readers have not done much coding at all. Luckily the code we are going to use is just a few lines and I think I can sufficiently explain it.

But we are getting ahead of ourselves, let’s create the Azure function and then revisit PowerShell. Assuming you have an Azure subscription, visit functions.azure.com and log in. If not, sign up for the free account and then create a function app to host the function. I called mine MyFunctionsDemo but yours will have to be something different. This will take a minute or two to complete and you will be redirected to the Azure functions portal.

image   image

Once the web application to host your functions is created, Click the + next to the Functions button to create a new function. PowerShell is still in preview, so you have to click the option to create a custom function. On the next screen, in the Language dropdown, choose PowerShell.

image  image

Our function is going to be triggered from PowerApps when a user clicks the submit button. PowerApps will make a HTTP request so this is a HttpTrigger scenario. Click the HttpTrigger-PowerShell template, give it a name (I called mine PhotoSendSP) and click the Create button. If all goes to plan you will be presented with a screen with some basic PowerShell code… essentially a “Hello World” web service.

image   image

Let’s test this newly minted Azure function before we customise it. If you look to the right of the screen above, you will see a “Test” vertical label. Click it and you will be presented with a screen that allows you to craft some data to send to your shiny new function. You can see that the test is going to be a HTTP POST by default. As you can see below, there is a basic JSON entry with a single name/value pair “name”: “Azure”. Change the Azure string to something else and then click the Run button. The result will be displayed below the JSON as shown below.

image   image

Now let’s take a quick look at the PowerShell code provided to you by default. Only lines 2, 3 and 11 matter for our purposes. What lines 2 and 3 show is that all of the details that are posted to this webservice are stored in a variable called $req. Line 2 reads this and converts the JSON into an object stored in a variable called $requestbody. Line 3 then asks $requestbody for the value of “name”, which, if you look in the screenshots above, is what you set in the test. Line 11 then writes the output to a variable called $res, which is the response back to the caller of this webservice. In this case you can see it returns “Hello “ with $name appended to it.

image
Now that we have seen the PowerShell code, let’s now update it with code to receive data from PowerApps and send it to SharePoint.

3. Write and test the PowerShell code that uploads to SharePoint

If you recall with PowerApps, the data we are sending to SharePoint is one or more photos. The data will look like this…

[ {  "filename": "boo.jpg",
"filebody": "data:image/jpeg;base64,/9j/4AAQSkZJR [ snip heaps of Base64 ] T//Z" },
{  "filename": "boo3.jpg",
"filebody": "data:image/jpeg;base64,/9j/TwBDAQMEBAUEBQ  m22I" } ]

In addition, for the purposes of keeping things simple, I am going to hard code various things like the document library to save the files to and not worry about exception handling. Below is my sample code with annotations at the end…

1.  Import-Module "D:\home\site\wwwroot\modules\SharePointPnPPowerShellOnline\2.15.1705.0\SharePointPnPPowerShellOnline.psd1" -Global
2.  $requestBody = Get-Content $req -Raw | ConvertFrom-Json
3.  $username = "paul@tenant.onmicrosoft.com"
4.  $password = $env:PW;
5.  $siteUrl = "https://tenant.sharepoint.com"
6.  $secpasswd = ConvertTo-SecureString $password -AsPlainText -Force
7.  $creds = New-Object System.Management.Automation.PSCredential ($username, $secpasswd)
8.  Connect-PnPOnline -url $siteUrl -Credentials $creds
9.  $ctx = get-pnpcontext
10. $doclib = $ctx.Web.Lists.GetByTitle("Documents")
11. ForEach ($item in $requestbody)
12. {
13.    $filename = $item.filename
14.    $rawfiledata = $item.filebody
15.    $rawfiledata = $rawfiledata -replace 'data:image/jpeg;base64,', ''
16.    $bytes = [System.Convert]::FromBase64String($rawfiledata)
17.    # uses comma notation related to .net reflection http://piers7.blogspot.com.au/2010/03/3-powershell-array-gotchas.html
18.    $memoryStream = New-Object System.IO.MemoryStream (,$bytes)
19.    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
20.    $FileCreationInfo.Overwrite = $true
21.    $FileCreationInfo.ContentStream = $memoryStream
22.    $FileCreationInfo.URL = $filename
23.    $Upload = $doclib.RootFolder.Files.Add($FileCreationInfo)
24.    $ctx.Load($Upload)
25.    $ctx.ExecuteQuery()
26. }

 

  • Line 1 loads the PnP PowerShell module. Without this, commands like Connect-PnPOnline and Get-PnPContext will not work. I’ll show how this is done after explaining the rest of the code.
  • Lines 3-7 are all about connecting to my SharePoint Online tenant. Line 4 contains a variable called $env:PW. The idea here is to avoid storing passwords in the code in clear text; the password is instead pulled from an environment variable that I will show later.
  • Lines 8-10 connect to the site collection and then get a reference to the default document library within it.
  • Line 11 loops over the data sent from PowerApps, processing each filename/filebody pair in turn.
  • Lines 13-18 grab the file name and file data, strip the Data URI prefix, decode the Base64 and wrap the result in a memorystream, which is a way to represent a file in memory.
  • Lines 19-25 then upload the in-memory image to the document library in SharePoint, using the filename provided. (Note for any PnP gurus wondering why I did not use Add-PnPFile: that cmdlet did not handle the memorystream properly and the uploaded images were always broken.)

So now that we have seen the code, let’s sort out some final configuration to make this all work. Much of the next section I learnt from John Liu and from watching the excellent Office Patterns and Practices Special Interest Group webinar he recently did with my all-time SharePoint hero, Vesa Juvonen.

Installing PnP PowerShell Components

First up, none of this will work until the PnP PowerShell module has been deployed to the Azure Function App. The easiest way to do this is to install the PnP PowerShell cmdlets locally and then copy the entire installation up to the Azure function environment. John Liu explains this in the aforementioned webinar, but in summary, the easiest way is to use the Kudu tool that comes bolted onto Azure functions. You can find it by clicking the Azure function app name (“MyFunctionsDemo” in my case) and choosing the “Platform Features” menu. From here you will find Kudu hiding under the Development Tools section. When the Kudu tab loads, click the Debug console menu and create a CMD or PowerShell console (it doesn’t matter which).

image  image  image

We are going to use this console to copy up the PnP PowerShell components. You can ignore the debug console and focus on the top half of the screen. This shows the top-level folder structure for the Azure function application. Click on the site and then wwwroot folders. This is the folder where all of your functions are stored (you will see a folder matching the name of the function we created in step 2). We will install the PnP module at this level, so it can be used by any other functions you are sure to develop 🙂.

image  image

So click the + icon to create a folder here and call it Modules. From here, drag and drop your local PnP PowerShell install location into this folder. In my case that was C:\Program Files\WindowsPowerShell\Modules\SharePointPnPPowerShellOnline\2.15.1705.0. I copied the SharePointPnPPowerShellOnline\2.15.1705.0 folder and all of its content here, as I want to be able to maintain multiple versions of PnP as I develop more functions over time.

image  image
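If you do not already have the PnP cmdlets on your local machine, a couple of lines of PowerShell from an elevated prompt will get you that folder (the version you end up with will be newer than the 2.15.1705.0 shown in my screenshots, so adjust paths accordingly):

# Install the PnP cmdlets for SharePoint Online into C:\Program Files\WindowsPowerShell\Modules
Install-Module SharePointPnPPowerShellOnline -Scope AllUsers

# Confirm where the module landed and which version you have - this is the folder to drag into Kudu
Get-Module SharePointPnPPowerShellOnline -ListAvailable | Select-Object Version, ModuleBase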

Now that you have done this, the first line of the PowerShell script will make sense. Just make sure you update the version number in the Import-Module command to match the version of PnP you uploaded.

Import-Module "D:\home\site\wwwroot\modules\SharePointPnPPowerShellOnline\2.15.1705.0\SharePointPnPPowerShellOnline.psd1"

Handling passwords

The next thing we have to do is address the issue of passwords. This is where the $env:PW on line 4 of my code comes in. You see, when you set up an Azure functions application, you can create your own settings that can drive the behaviour of your functions. In this case, we have created an environment variable called PW in which we will store the password used to access the site collection. This keeps clear text passwords out of the code, but unfortunately it is still security by obscurity, since anyone with access to the Azure function can review the environment variable and retrieve the password. If I get time, I will revisit this via App Only Authentication and see how I go. I suspect, though, that this problem will be “properly” solved when Azure functions support the Azure Key Vault.

In any event, you will find this under the Application Settings link in the Platform Features tab. Scroll down until you find the “App settings” section and add your password as shown in the second image below.

image  image

Testing it out…

Right! At this point, we have all the plumbing done, so let’s test it and see how it goes. First we need to create a JSON file in the format I explained earlier (an array of filename and filebody pairs). I crafted these by hand in Notepad as they are pretty simple. As a reminder, the format was:

[ {  "filename": "boo.jpg",
"filebody": "data:image/jpeg;base64,/9j/4AAQSkZJR [ snip heaps of Base64 ] T//Z" },
{  "filename": "boo3.jpg",
"filebody": "data:image/jpeg;base64,/9j/TwBDAQMEB [ snip heaps of Base64 ] m22I" } ]

To generate the filebody elements, I used the covers of my two books (they are awesome – buy them!) and called them HG2BP and HG2M respectively. To create the base 64 encoded images in the Data URI scheme, I went to https://www.base64-image.de and generated the encoded versions. If you are lazy and want to use a pre-prepared file, just download the one I used for testing.

h2bph2m image
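If you would rather script the payload than use a website, a few lines of PowerShell will also do the job. This is just an illustrative sketch – the folder and file names below are made up, so substitute your own:

# Build the JSON test payload from a folder of jpg files (paths are illustrative)
$photos = Get-ChildItem "C:\temp\covers\*.jpg" | ForEach-Object {
    [pscustomobject]@{
        filename = $_.Name
        filebody = "data:image/jpeg;base64," + [System.Convert]::ToBase64String([System.IO.File]::ReadAllBytes($_.FullName))
    }
}
# Wrap in @() so that a single file still serialises as a JSON array
ConvertTo-Json -InputObject @($photos) | Out-File "C:\temp\testpayload.json" -Encoding ascii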

To test it, all we need to do is click the right-hand Test link and paste the JSON into it. Click the Run button and hope for the best! As you can see in my example below, the web service returned a 200 status, which means everything is hunky dory, and the logs show the script executing successfully.

image

Checking my document library and they are there… wohoo!!

image
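As an aside, if you would rather test from your own machine than from the portal’s Test panel, something along these lines should also work. The URL and key below are placeholders – the real values come from the “Get function URL” link on your function (the key itself is covered in step 6):

# Post the test payload straight to the function (URL and key are placeholders)
$uri = "https://myfunctionsdemo.azurewebsites.net/api/PhotoSendSP?code=<your function key>"
$body = Get-Content "C:\temp\testpayload.json" -Raw
Invoke-RestMethod -Uri $uri -Method Post -Body $body -ContentType "application/json"

If the call succeeds, you should see the same entries appear in the function logs as you do when testing from the portal.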

So now we have most of the bits in place. We have proven that our Azure function can take a JSON payload with encoded images, process it and save the images to a SharePoint document library. You might be thinking that all we need to do now is wire up PowerApps to this function. Yeah, I thought so too, but little did I realise the pain I was about to endure…

4. Create a Swagger file so that PowerApps can talk to our Azure function

Now we come to the most painful part of this whole saga. We need to describe our Azure function using a standard called Swagger (or OpenAPI). This provides important metadata so that PowerApps can make the function easy for users to consume. This will make sense soon enough, but first we have to create it, which is a royal pain in the ass. I found the online documentation for Swagger to be lacking and it took me a while to understand enough of the format to get it working.

So first up, let’s reduce some of the complexity. Our Azure function is a simple HTTP POST. We have not defined any other type of request, so let’s make this formal, as it will generate a much less ugly Swagger definition. Expand your function and choose the “Integrate” option. On the next screen, under Triggers, you will find a dropdown labelled “Allowed HTTP methods”. Change the default value to “Selected methods” and then untick every HTTP method except POST. Click Save.

image  image  image

Now click back to your function app, and choose “API Definition” from the top menu. This will take you to the screen where you create/define your Swagger file.

image

On the initial screen, set your API definition source to function if asked, and you should see a screen that looks somewhat like this…

image

Click the Generate API definition template button as suggested by the comment in the code box in the middle. This will generate a Swagger file, and on the right side of the screen that file is used to generate a summary of your API. You can see the URL of your Azure app, some information about an API key (which we will deal with later) and, below that, the PhotoSendSP function exposed as a webservice (/api/PhotoSendSP).

image

Now at this point you are probably thinking “okay, so it’s unfamiliar, but this is pretty easy”, and you would be right. Where things got nightmarish for me was working out how to understand and customise the Swagger file, as the generated template is incomplete. All it has done is define our function (note the paths section in the above screenshot – can you see all those empty square brackets? That is what we now need to fill in).

For the sake of brevity, I am not going to describe the ins and outs of this format (and I don’t fully know it yet anyway!). What I can tell you is that getting this right is a painful and time-consuming combination of trial and error, reading the swagger spec and testing in PowerApps. Let’s hope my hints here save you some frustration.

The first step is to create a definition for the format of data that our Azure function accepts as input. If you look closely above, you will see a section called definitions. Paste the following into that section so it looks like the screenshot below. Note: if you see any symbol apart from a benign warning message next to the “Photos:” line, then you do not have it right!

definitions:
   Photos:
      type: object
      required:
         - filename
         - filebody
      properties:
         filename:
            type: string
            example: image.jpg
         filebody:
            type: string

image

So what we have defined here is an object called Photos which consists of two required properties, filename and filebody. Both are strings, and filename also has an example to illustrate what is expected. Depending on how this Swagger file is consumed by another application that supports Swagger, one can imagine that example showing up in online help or intellisense when our function is being called.

Now let’s make use of this definition. Paste this into the parameters and description sections. Note the “schema:” section. Here we have told the Swagger file that our function is expecting an array of objects based on the Photos definition that we created earlier.

parameters:
  - name: photocollection
    in: body
    description: "The encoded files"
    required: true
    schema:
       type: array
       items:
          $ref: '#/definitions/Photos'
description: "A collection of photos and filenames"

image

Finally, let’s finish off by defining that our Azure function consumes and produces its data in JSON format. Although our sample code does not send anything back to PowerApps, you can imagine situations where we might, such as returning a JSON array of the SharePoint URLs of the uploaded photos.

produces:
- application/json
consumes:
- application/json

image

Okay, so we are all set. Now to be clear, there is a lot more to Swagger, especially if you wanted to call our function from Flow, but for now this is enough. Click the Save button, and then click the “Export to PowerApps + Flow” button. You will be presented with a new panel that explains the process we are about to follow. Feel free to read it, but the key step is to download the Swagger file we just created.

image

Okay, so if you have made it this far, you have a swagger JSON file and you are in the home stretch. Let’s now head back to PowerApps!

5. Create a custom PowerApps connection/datasource to use the Azure function

Back in PowerApps, we need to make a new custom connection to our Azure function. Click the Connections menu item and you will be redirected to web.powerapps.com. Click “Manage Custom Connections” and then click “Create a custom connector”. This will take you to a wizard.

image   image

image

The first step is to upload the Swagger file you created in step 4. Before you do anything else, rename your connector to something short and sweet, as this will make it easier to find in the PowerApps list of connections.

image

Scroll to the bottom of the page and click the “Continue” button. Now you are presented with some security options for your connector. For context, the default for the PowerShell Azure function template we chose in step 2 is to use an API key, so you can leave all of the defaults here, although I like to change the parameter label to the more meaningful “API Key” (this will make sense soon). Click Continue.

image

PowerApps has now processed the swagger file and found the PhotoSendSP POST action we defined. It has also pulled some of the data from the swagger file to prepopulate some fields. Note this screen has some UI problems – you need to hover your cursor over the forms in the middle of the screen to see the scrollbar, so there is more to edit than what you initially see…

image

For now, do not enter anything into the summary field and scroll down to look at some of the other settings. The Visibility setting does not really matter for PowerApps, but remember that other online services can call our function. This visibility stuff relates more to Microsoft Flow, so you can ignore it for now. Scroll further down and you will see how our swagger schema has been processed by PowerApps. You can explore this but I suggest leaving it well alone. Below I have used all my MSPaint skills to make a montage to show how this relates to your Swagger file…

image

Finally, click Create Connector to wire it up. If all has gone to plan, you will see something resembling the following in the list of custom connectors in PowerApps. If you get this far, congratulations! You are almost there!

image

6. Test successfully and bask in the glory of your awesomeness

Now that you have a custom connection, let’s try it out. Open your PowerApp that you created in step 1. From the Content menu, click Data sources, click Add Datasource and then click New Connection. Scroll through the list of connections until you find the one you created in step 5. Click on it and you will be prompted for an API key (now you see why I added that friendly label during step 5).

image  image  image

Where to find this key? Well, it turns out that it was automagically generated for you when you first created your function! Go back to the Azure functions portal and find your function. From the function sub-menu, click Manage and you will be presented with a “Function Keys” section with a default key listed. Copy this to the clipboard, paste it into the PowerApps API Key text box and click the Create button. Your datasource that connects to your Azure function is now configured in PowerApps!! (yay!)

image

image  image

Now way back in step 1 (gosh it seems like such a long time ago), we created a button labelled Submit, with the following formula.

Clear(SubmitData);
ForAll(PictureList,Collect(SubmitData, { filename: "a file.jpg", filebody: Url }))

The problem we are going to have is that all files submitted to the webservice will be called “a file.jpg”, as I hardcoded the filename parameter for simplicity. Now if I fully developed this app, I would add a textbox to the gallery so that each photo has to be named prior to being submitted to SharePoint. I am not going to do that here as this post is already too long, so instead I will use a trick I saw here to build a filename from today’s date plus a couple of random characters. I know it is not truly unique, but it will suffice for our demo.

Here is the formula in all its ugliness.

Clear(SubmitData);
ForAll(PictureList, Collect(SubmitData, { filename: Concatenate(Text( Now(), DateTimeFormat.LongDate ), Mid("0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ", 1 + RoundDown(Rand() * 36, 0), 1), Mid("0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ", 1 + RoundDown(Rand() * 36, 0), 1), ".jpg"), filebody: Url } ) )

Yeah I know… let me break it down for you…

  • Text( Now(), DateTimeFormat.LongDate ) produces a string containing today’s date
  • Mid("0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ", 1 + RoundDown(Rand() * 36, 0), 1) selects a random character from that string of digits and letters
  • Concatenate takes the above date, the two random characters and adds “.jpg” to the end.

PowerApps does allow for line breaks in code, so it looks less ugly…

image

Let’s quickly test this before the final step of sending it off to SharePoint. Preview the app, take some photos and then examine the collections. As you can see, the SubmitData collection now has unique filenames assigned (By the way, yes I took these in a car and no, I was not driving at the time! 🙂)

image

We are now ready for the final step. We need to call the Azure function! 🙂

Go back to your submit button and add the following code to the end of the existing code, taking into account the name you gave to your connection/datasource. You should find that PowerApps uses intellisense to help you add the line because it has a lot of metadata from the swagger file.

'myfunctionsdemo.azurewebsites.net'.apiPhotoSendSPpost(SubmitData)

image

Important! Before we go any further, I strongly suggest you use the PowerApps web-based authoring environment and not the desktop application. I have seen a problem where the desktop application does not encode images using the Data URI format, whereas the web-based tool (and the PowerApps clients on Android and iOS) work fine.

So log into PowerApps on the web, open your app, preview it and cross your fingers! (If you want to be clever, go back to your function in Azure and keep an eye on the logs 🙂)

image

As the above screen shows, things are looking good. Let’s check the SharePoint document library that the script uploads to… YEAH BABY!! We have our photos!!!

image

Conclusion

Now you finally get to bask in your awesomeness. If you survived all this plumbing and got it working, then congratulations! You are well on your way to becoming a PowerApps, PowerShell, PnP (and Flow) guru. This means you can now enhance your apps in all sorts of ways. For the non-developers who got this far, you can truly call yourself a citizen developer and design all sorts of innovative solutions with these techniques.

Even though a lot of this stuff is fiddly (especially that god-awful Swagger crap), once you have gone through it a couple of times and understand the intent of the components we have used, this is actually quite an easy solution to put together. I also found the Azure function side of things in particular really easy to debug and see what was going on.

In terms of where we could take this, there are several avenues that I can immediately think of, but the possibilities are endless.

  • We could update the PowerShell code so it takes the destination document library as a parameter. We would need to update the Swagger definition and then update our datasource in PowerApps, but that is not too hard a task once you have the basics working.
  • Using the same method, we could design a much more sophisticated form and capture lots of useful metadata with the pictures, and get that into SharePoint as metadata on the pictures.
  • We could add a lot more error handling to the script and return much better detail to PowerApps, such as detailed failure information.
  • Similarly, we could return detailed information to PowerApps to make the app richer. For example, we could generate unique filenames in our PowerShell function rather than in PowerApps and return those names (and the URL to each image) in the reply to the HTTP POST, which would enable PowerApps to display the images inline.
  • We could also take advantage of recent PowerApps enhancements and use local caching, i.e. when an internet connection is available, call the API, but if not, save to local storage and call the API once a connection becomes available.
  • We could not only upload images, but also update lists and use any other combination of SharePoint functions supported by PnP.

I hope that this post has helped you to better understand how these components hang together, and I look forward to your feedback and to hearing how you have adopted, adapted and enhanced the ideas presented here.

 

Thanks for reading

 

Paul Culmsee

www.hereticsguidebooks.com

 
