How’s the weather? Using a public API with PowerApps (part 2)

This entry is part 3 of 3 in the series OpenAPI

Introduction

Hi again

This is the second half of a post that uses the OpenWeatherMap API in PowerApps. The business scenario is around performing inspections. In the first post I gave the example of a park ranger or plant operator, both conducting inspections where weather conditions can impact the level of danger or the result of the inspection. In such a scenario it makes sense to capture weather conditions when a photo is taken.

PowerApps has the ability to capture location information such as latitude and longitude, and public weather APIs generally allow you to get weather conditions for a given location. So the purpose of these posts is to show how you can not only capture this data in PowerApps, but also send it to SharePoint in the form of metadata via Flow.

In Part 1, we got the painful stuff out of the way – that is, getting PowerApps to talk to the OpenWeather web service via a custom connector. Hopefully if you got through that part, you now have a much better understanding of the purpose of the OpenAPI specification and can see how it could be used to get PowerApps to consume many other web services. Now we are going to actually build an app that takes photos and captures weather data.

App prerequisites…

Now to get through this post, we are going to leverage a proof-of-concept app I built in a separate post. That app was also an inspection scenario, allowing a user to take a bunch of photos, which were then sent to SharePoint via Flow, with correctly named files. If you have not read that post, I suggest you do so now, because I am assuming you have that particular app set up and ready to go.

Go on… it’s cool, I will wait for you… Smile

Seriously now, don’t come back until you can do what I show below. On the left is PowerApps, taking a couple of photos, and on the right are the photos now in a SharePoint document library.

[screenshot]  [screenshot]

Now if you have performed the tasks in the aforementioned article, not only do you have a PowerApp that can take photos, you also have a connection to Flow, ready to go (yes, the pun was intended).

First up, let’s recap two key parts of the original app.

1. Photo and file name…

When the camera was clicked, a photo was taken and a file name was generated for it. The code is below:

Collect(PictureList, {
    Photo: Camera1.Photo,
    ID: Concatenate(AuditNumber.Text, "-", Text(Today(), "[$-en-US]dd:mm:yy"), "-", Text(CountRows(PictureList)+1), ".jpg")
})

This code created an in-memory table (a collection named PictureList) that, when a few photos are taken, looks like this:

[screenshot]
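As a sanity check, the file-name logic can be sketched in Python (the audit number, photo count and date below are made up; a fixed date replaces Today() so the result is deterministic):

```python
from datetime import date

audit_number = "AUDIT-007"   # stands in for AuditNumber.Text
photo_count = 2              # CountRows(PictureList) before this photo is added

# Mirrors Concatenate(AuditNumber.Text, "-", Text(Today(), ...), "-", ...)
file_name = f"{audit_number}-{date(2018, 3, 1).strftime('%d:%m:%y')}-{photo_count + 1}.jpg"
# → "AUDIT-007-01:03:18-3.jpg"
```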

2. Saving to Flow

The other part of the original app was saving the contents of the above collection to Flow. The Submit button has the following code…

UpdateContext( { FinalData : Concat(PictureList, ID & "|" & Photo & "#") } );
UploadPhotostoAuditLib.Run(FinalData)

The first line takes the PictureList collection above and munges it into a giant string called FinalData. Each row is delimited by a hash "#" and each column by a pipe "|". The second line then calls the Flow and passes this data to it.
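A minimal Python sketch of this munging, using placeholder photo data:

```python
# A two-photo collection with made-up values standing in for real photo data
picture_list = [
    {"ID": "a.jpg", "Photo": "data:image/jpeg;base64,AAA"},
    {"ID": "b.jpg", "Photo": "data:image/jpeg;base64,BBB"},
]

# Mirrors Concat(PictureList, ID & "|" & Photo & "#")
final_data = "".join(f'{row["ID"]}|{row["Photo"]}#' for row in picture_list)
```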

Both of these functions are about to change…

Getting the weather…

The first task is to get the weather. In part 1 we already created the custom connector to the service. Now it is time to use it in our app by adding it as a data source. From the View menu, choose Data Sources. You should see your existing data source that connects to Flow.

[screenshot]  [screenshot]

Click Add data source and then New connection. If you got everything right in part 1, you should see a data source called OpenWeather. Click on it and you will be asked to enter an API key. You should have this key from part 1 (and you should now understand exactly why you are asked for it), so go ahead, add it here and click the Create button. If all goes to plan, you will now see OpenWeather added as a data source.

[screenshot]  [screenshot]  [screenshot]  [screenshot]

Now that we are connected to the API, let’s test it by modifying the code that takes a photo. Instead of just capturing the photo and generating a file name, let’s also grab the latitude and longitude from PowerApps, call the API and collect the current temperature.

First here is the code and then I will break it down…

[screenshot]

UpdateContext( { Weather: OpenAPI.GetWeather(Location.Latitude, Location.Longitude, "metric") } );
Collect(PictureList, {
    Photo: Camera1.Photo,
    ID: Concatenate(AuditNumber.Text, "-", Text(Today(), "[$-en-US]dd:mm:yy"), "-", Text(CountRows(PictureList)+1), ".jpg"),
    Latitude: Location.Latitude,
    Longitude: Location.Longitude,
    Temp: Weather.main.temp
})

The first line is where the weather API is called: OpenAPI.GetWeather(Location.Latitude, Location.Longitude, "metric"). The parameters Location.Latitude and Location.Longitude come straight from PowerApps. I want my temperature in Celsius, so I pass the string "metric" as the third parameter.

My API call is then wrapped into an UpdateContext() function, which enables us to save the result of the API call into a variable I have called Weather.

Now if you test the app by taking photos, you will notice a couple of things. First up, under variables, you will now see Weather listed. Drill down into it and you will see that a complex data structure is returned by the API. In the example below I drilled down to Weather->Main to find a table that includes temperature.

[screenshot]

[screenshot]  [screenshot]

The second line of code (actually I broke it across multiple lines for readability) is the Collect function which, as its name suggests, creates collections. A collection is essentially an in-memory data table, and the first parameter of Collect() is simply the name of the collection. In our example it is called PictureList. The second parameter is a record to insert into the collection. A record is a comma-delimited set of fields and values inside curly braces, e.g. { Title: "Hi", Description: "Greetings" }. In our example, we are building a table consisting of:

  • Photo
  • File name for Photo
  • Latitude
  • Longitude
  • Temperature

The last field is the most interesting, because we are getting the temperature from the Weather variable. As this variable is a complex data type, we have to be quite specific about the value we want, i.e. Weather.main.temp.

Here is what the PictureList collection looks like now. If you have understood the above code, you should be able to extend it to grab other interesting weather details like wind speed and direction.

[screenshot]

Getting ready for Flow…

Okay, so now let’s look at the code behind the Submit button. The change made here is to include the additional columns from PictureList in my variable called FinalData. If this is not clear then I suggest you read this post, or even Mikael Svenson’s work where I got the idea…

[screenshot]

UpdateContext( { FinalData : Concat(PictureList, ID & "|" & Photo & "|" & Latitude & "|" & Longitude & "|" & Temp & "#") } );
UploadPhotosToAuditLib.Run(FinalData)

So in case it is not clear, the first line munges each row and column from PictureList into a giant string called FinalData. Each row is delimited by a hash "#" and each column delimited by a pipe "|". The second line then calls the Flow and passes it FinalData.
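To preview how the Flow side will pick these columns apart, here is one munged row split in Python (the values are made up); the indices match the @split(item(),'|')[n] expressions used on the Flow side:

```python
# One munged row with made-up values, in the new five-column layout
row = "AUDIT-007-01:03:18-1.jpg|data:image/jpeg;base64,AAA|-31.95|115.86|17.3"
fields = row.split("|")
# fields[0] = file name, fields[1] = photo data URI,
# fields[2] = latitude, fields[3] = longitude, fields[4] = temperature
```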

At this point, save your changes to PowerApps and publish, as you are done here. Let’s now make some changes to the SharePoint document library where the photos are currently being uploaded. We will add columns for Temperature, Latitude and Longitude. I am going to assume you know enough about SharePoint to do this, so I will just paste a screenshot of the end in mind…

[screenshot]  [screenshot]

Right! Now it is time to turn our attention to Flow. The intent here is to not only upload the photos to the document library, but update metadata with the location and temperature data. Sounds easy enough right? Well remember how I said that we got rid of most of the painful stuff in part 1?

I lied…

Going with the Flow…

Now with Flow, it is easy to die from screenshot hell, so I am going to use some brevity in this section. If you played along with my prerequisite post, you already have a flow that more or less looks like this:

  1. A PowerApps trigger
  2. A Compose action that splits the photo data on hash: @split(triggerBody()['ProcessPhotos_Inputs'],'#')
  3. An Apply to each statement with a condition inside: @not(empty(item()))
  4. A Compose action that grabs the file name: @split(item(),'|')[0]
  5. A Compose action that grabs the file contents and converts them to binary: @dataUriToBinary(split(item(),'|')[1])
  6. A SharePoint Create File action that uploads the file to a document library

The image below illustrates the basic layout.

[screenshot]
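Those Compose steps amount to string splitting plus a data-URI decode; here is a rough Python equivalent using a made-up two-photo payload:

```python
import base64

# Made-up payload in the "name|dataUri#" format sent from PowerApps
payload = ("a.jpg|data:image/jpeg;base64,aGVsbG8=#"
           "b.jpg|data:image/jpeg;base64,d29ybGQ=#")

files = {}
for item in payload.split("#"):          # the first Compose: split on "#"
    if not item:                         # the condition: @not(empty(item()))
        continue
    name = item.split("|")[0]            # "Get File Name"
    data_uri = item.split("|")[1]        # "Get File Body"
    # dataUriToBinary: decode the base64 part after "base64,"
    files[name] = base64.b64decode(data_uri.split("base64,")[1])
```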

Our task is to now modify this workflow to:

  1. Handle the additional data sent from PowerApps (temperature, latitude and longitude)
  2. Update SharePoint metadata columns on the uploaded photos with this new data.

As I write these lines, Flow has very poor support for doing this. It has been acknowledged on UserVoice and I know the team are working on improvements. So the method I am going to use here is essentially a hack and I actually feel a bit dirty even suggesting it. But I do so for a couple of reasons. Firstly, it helps you understand some of the deeper capabilities of Flow and secondly, I hope this real-world scenario is reviewed by the Flow team to help them understand the implications of their design decisions and priorities.

So what is the issue? Basically the flow actions in SharePoint have some severe limitations, namely:

  • The Create File action provides no way to update library metadata when uploading a file
  • The Create Item action provides access to metadata but only works on lists
  • The Update Item action works on document libraries, but requires the item ID of the document to do so. Since Create File does not provide it, we have no reference to the newly created file
  • The Get Items function allows you to search a list/library for content, but cannot match on File Name (actually it can! I have documented a much better method here!)

So my temporary “clever” method is to:

  1. Use Create File action to upload a file
  2. Use the Get Items action to bring me back the metadata for the most recently created file in the library
  3. Grab the ID from step 2
  4. Use the Update Item action to set the metadata on the recently uploaded image.

Ugh! This method is crude, and I fear what happens if a lot of flow runs or file activity were happening in this library. I really hope the next section is soon redundant…

Okay so let’s get started. First up let’s make use of some new Flow functionality and use Scopes to make this easier. Expand the condition block and find the Compose action that extracts the file name. If you dutifully followed my pre-req article it will be called “Get File Name”. Above this, add a Scope and rename it to “Get File and Metadata”. Drag the “Get File Name” and “Get File Body” actions to it as shown below.

[screenshot]  [screenshot]  [screenshot]

Now let’s sort out the location and temperature data. Under “Get File Body”, add a new Compose action and rename it to “Get Latitude”. In the compose function add the following:

Under “Get Latitude”, add a new Compose action and rename it to “Get Longitude”. In the compose function add the following:

Under “Get Longitude”, add a new Compose action and rename it to “Get Temperature”. In the compose function add the following:

  • @split(item(),’|’)[4]

This will result in the following:

[screenshot]  [screenshot]

Now click on the Get File and Metadata scope and collapse it to hide the detail of metadata extraction (now you know what a scope is for Smile)

[screenshot]

So now we have our metadata, we need to apply it to SharePoint. Under the “Create File” action, add a new scope and call it “Get Item ID”. This scope is where the crazy starts…

Inside the scope, add a SharePoint – Get Items action. Enter the URL of your site collection and the name (not URL) of your document library. In the Order By field, type in Created desc and set the Maximum Get Count to 1. Basically this action calls a SharePoint list web service, and "Created desc" says "order the results by Created Date in descending order (newest first)".

Actually what you do is set Filter Query to FileLeafRef eq '[FileName]' as described in this later post!

Now note the plural in the action name: "Get Items". By design, it assumes more than one result will be returned. The implication is that the data comes back as an array. In JSON land this looks like the following:

[ { "Name": "Value" }, { "Name": "Value2" }, { "Name": "Value3" } ]

and so on…

Also note that there is no option in this action to choose which fields to bring back, so it will return a big, ugly JSON array from SharePoint containing lots of information.

Both of these caveats mean we now have to do some data manipulation. For a start, we have to get rid of the array, as many Flow actions cannot accept arrays as input. Also, we are only interested in the item ID of the newly uploaded photo; all of the other stuff is just noise. So we will add 3 more flow actions that:

  1. clear out all data apart from the ID
  2. turn it from an array back to a regular JSON element
  3. extract the ID from the JSON.

For step 1, under the "Get items" action just added, add a new Data Operations – Select action. We are going to use this to select just the ID field and discard the rest. In the From textbox, choose the Value variable returned by the Get Items action. In the Map field, enter a key called "ID" and set the value to be the ID variable from the "Get Items" action.

[screenshot]

For step 2, under the "Select" action, add a Data Operations – Join action. This action allows you to munge an array into a string using a delimiter – much like what we did in PowerApps to send data to Flow. Set the From text box to be the output of the Select action. The "Join with" delimiter can actually be anything, as the array will always have one item. Why? In the Get Items action above, we set the Maximum Get Count to 1, so we will always get back a single-item array.

[screenshot]

The net effect of this step will be the removal of the array. I.e., from:

[ { "ID": 48 } ]

to

{ "ID": 48 }

For step 3, under the "Join" action, add a Data Operations – Parse JSON action. This action processes the JSON and makes each element found available to subsequent actions. The easiest way to understand this action is to just do it and see the effect. First, set the Content textbox to the output from the Join action.

[screenshot]

Now we need to tell this action which elements we are interested in. We already know that we only have one variable called ID, because the Select action we set up earlier stripped everything else out. So to tell this action we are expecting an ID, click the "Use sample payload…" link and paste some sample JSON in our expected format…

{
    "ID": 48
}

If all goes to plan, a schema will have been generated from that sample data, which now allows us to grab the actual ID value.

[screenshot]  [screenshot]

Okay, so we are done with the Get Item ID scope, so collapse it as follows…

[screenshot]
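Functionally, the three Data Operations actions in this scope boil down to the following Python sketch (field names other than ID are invented):

```python
import json

# Hypothetical "Get Items" result: a one-element array of verbose records
value = [{"ID": 48, "Title": "AUDIT-007-01:03:18-1.jpg", "Editor": "Paul"}]

selected = [{"ID": r["ID"]} for r in value]         # Select: keep only the ID
joined = "".join(json.dumps(r) for r in selected)   # Join: one item, so the delimiter never appears
item_id = json.loads(joined)["ID"]                  # Parse JSON: read the ID back out
```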

Finally, under the "Get Item ID" scope, add a SharePoint – Update Item action. Add the URL of your site collection and then specify the document library the photos were uploaded to. If this has worked, the additional metadata columns should now be visible, as shown in the red box below. Now set the specific item to update by setting the ID parameter to the ID variable from the Parse JSON step.

[screenshot]

Now assign the metadata to the library columns. Set Latitude to the output variable from the Get Latitude step, Longitude to the output variable from the Get Longitude step and Temperature to the output variable from the Get Temperature step as shown below.

[screenshot]

Now save your flow and cross all fingers and toes…

Testing and conclusion!

Return to PowerApps (in the browser or on your mobile device – not the PowerApps studio app). Enter an audit number and take some photos… Wohoo! Here they are in the library along with metadata. Looks like I need to put on a jacket when I step outside Smile

[screenshot]  [screenshot]

So taking a step back, we have managed to achieve quite a lot here.

  1. We have wired up a public web service to PowerApps
  2. We have used PowerApps built-in location data to load weather data each time a photo has been taken
  3. We have used Flow to push this data into SharePoint and included the location and weather data as parameters.

Now on reflection there are a couple of massive issues that really prevent this from living up to its Citizen Developer potential.

  1. I had to use a 3rd party service to generate my OpenAPI file and it was not 100%. Why can’t Microsoft provide this service?
  2. Flow’s poor support for common SharePoint scenarios meant I had to use (non) clever workarounds to achieve my objectives.

Both of these issues were resolvable, which is a good thing, but I think the solutions take this out of the realm of most citizen developers. I had to learn too much JSON, too much Swagger/OpenAPI and delve deep into Flow.

In saying all that, I think Flow is the most immature piece of the puzzle at this stage. The lack of decent SharePoint support is definitely something that, if I were a program manager, I would have pushed up the release schedule to address. It currently feels like the developer of the SharePoint actions has never used SharePoint on a day-to-day basis, and dutifully implemented the web services without lived experience of the typical usage scenarios.

For other citizen developers, I’d certainly be interested in how you went with your own version and variations of this example, so please let me know how you go.

And for Microsoft PowerApps/Flow product team members who (I hope) are reading this, I think you are building something great here. I hope this material is useful to you as well.

 

Thanks for reading

 

Paul Culmsee

www.hereticsguidebooks.com


A lesser known way to fine-tune SharePoint search precision…


Hi all

While I’d like to claim credit for the wisdom in this post, alas I cannot. One of Seven Sigma’s consultants (Daniel Wale) worked this one out and I thought it was blog-worthy. Before I get into the issue and Daniel’s resolution, let me give you a bit of search engine theory 101, with a concept that I find useful for understanding search optimisation.

Precision vs. recall

Each time a person searches for information, there is an underlying goal or intended outcome. While there has been considerable study of information seeking behaviours in academia and beyond, they boil down to three archetype scenarios.

  1. "I know exactly what I am looking for" – The user has a particular place in mind, either because they visited it in the past or because they assume it exists. This is known as known-item seeking, but is also referred to as navigational seeking or refinding.
  2. "I’m not sure what I am looking for but I’ll know it when I find it" – This is known as exploratory seeking, and the purpose is to find information assumed to be available. This is characterised by:
    • Looking for more than one answer
    • No expectation of a "right" answer
    • Open-endedness
    • Not necessarily knowing much about what is being looked for
    • Not being able to articulate what is being looked for
  3. "Gimme gimme gimme!" – A detailed research-type search known as exhaustive seeking, leaving no stone unturned in topic exploration. This is characterised by:
    • Performing multiple searches
    • Expressing what is being looked for in many ways

Now among other things, each of these scenarios would require different search results to meet the information seeking need. For example: If you know what you are looking for, then you would likely prefer a small, highly accurate set of search results that has the desired result at the top of the list. Conversely if you are performing an exploratory or exhaustive search, you would likely prefer a greater number of results since any of them are potentially relevant to you.

In information retrieval, the terms precision and recall are used to measure search efficiency. Google’s Tim Bray put it well when he said "recall measures how well a search system finds what you want and precision measures how well it weeds out what you do not want". Sometimes recall is just what the doctor ordered, whereas other times precision is preferred.
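To make the distinction concrete, here is a toy calculation with made-up numbers: a query returns 8 documents, 6 of them relevant, while the corpus holds 12 relevant documents in total.

```python
retrieved = 8            # documents the search returned
relevant_retrieved = 6   # returned documents that were actually wanted
relevant_in_corpus = 12  # all relevant documents that exist

precision = relevant_retrieved / retrieved           # weeding out what you don't want
recall = relevant_retrieved / relevant_in_corpus     # finding what you do want
# precision = 0.75, recall = 0.5
```

A known-item search wants the precision number high; exploratory and exhaustive searches care more about recall.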

The scenario and the issue…

Recently, Seven Sigma worked on a knowledgebase project for a large customer contact centre. The vast majority of the users of the system are contact centre operators who deal directly with all customer enquiries and have worked there for a long time. Thus most of the search behaviours are in the known-item seeking category, as they know the content pretty well – it is just that there is a lot of it. Additionally, picture yourself as one of those operators and imagine the frustration of a failed or time-consuming search, with an equally frustrated customer on the end of the phone and a growing queue of frustrated callers waiting their turn. In this scenario, search results need to be as precise as possible.

Thus, we invested a lot of time in the search and navigation experience on this project, and that investment paid off: the users were very happy with the new system and particularly happy with the search experience. Additionally, we created a mega menu solution for the navigation that dynamically builds links from knowledgebase article metadata and a managed metadata term set. This was done via the Data View web part, XSLT, JavaScript and Marc’s brilliant SPServices. We were very happy with it because there was no server-side code at all, yet it was very easy to administer.

So what was the search-related issue? In a nutshell, we forgot that the search crawler doesn’t differentiate between your page’s content and items in your custom navigation. As a result, we had an issue where searches did not have adequate precision.

To explain the problem, and the resolution, I’ll take a step back and let Daniel continue the story… Take it away Dan…

The knowledgebase that Paul described above contained thousands of articles, and when the search crawler accessed each article page, it also saw the titles of many other articles in the dynamic menu code embedded in the page. As a result, this content also got indexed. When you think about it, the search crawler can’t tell real content from a dynamic menu that drops down or slides out when you hover over the menu entry point. The result was that when users searched for any term that appeared in the mega menu, they would get back thousands of results (a match for every page), even when the actual content of the page contained no reference to the searched term.

There is a simple solution, however, for controlling what the SharePoint search crawler indexes and what it ignores. SharePoint knows to exclude content that exists inside <div> HTML tags that have the class noindex added to them, e.g.:

<div class="menu noindex"> 
  <ul> 
    <li>Article 1</li> 
    <li>Article 2</li> 
  </ul> 
</div>

There is one really important thing to note however. If your <div class="noindex"> contains a nested <div> tag that doesn’t contain the noindex class, everything inside of this inner <div> tag will be included by the crawler. For example:

<div class="menu noindex"> 
  <ul> 
    <li>Article 1</li> 

      <div class="submenu">
        <ul>
          <li>Article 1.1</li>
          <li>Article 1.2</li>
        </ul>
      </div>

    <li>Article 2</li> 
  </ul> 
</div>

In the code above, the nested <div> surrounding the submenu items does not contain the noindex class. So the text "Article 1.1" and "Article 1.2" will be crawled, while the "Article 1" and "Article 2" text in the parent <div> will still be excluded.
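To illustrate the rule, here is a toy Python simulation of the indexing behaviour just described (my own sketch, not SharePoint’s actual crawler logic; the HTML is a condensed version of the example above):

```python
from html.parser import HTMLParser

class CrawlerSim(HTMLParser):
    """Toy model of the described rule: text inside a <div> whose class list
    contains "noindex" is skipped, but a nested <div> without that class
    re-enables indexing for its own contents."""
    def __init__(self):
        super().__init__()
        self.div_stack = [False]    # False = indexable
        self.indexed_text = []

    def handle_starttag(self, tag, attrs):
        if tag == "div":
            classes = (dict(attrs).get("class") or "").split()
            self.div_stack.append("noindex" in classes)

    def handle_endtag(self, tag):
        if tag == "div" and len(self.div_stack) > 1:
            self.div_stack.pop()

    def handle_data(self, data):
        # Text is indexed only if the innermost <div> is not marked noindex
        if not self.div_stack[-1] and data.strip():
            self.indexed_text.append(data.strip())

sim = CrawlerSim()
sim.feed('<div class="menu noindex"><ul><li>Article 1</li>'
         '<div class="submenu"><ul><li>Article 1.1</li></ul></div>'
         '<li>Article 2</li></ul></div>')
# Only "Article 1.1" leaks through to the index
```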

Obviously the example above is greatly simplified and, like our solution, your menu is possibly making use of a DataViewWebPart with an XSL transform building it out. It is inside your XSL that you will need to include the <div> with the noindex class, because the web part will generate its own <div> tags that encapsulate your menu. (Use the browser developer tools and inspect the generated code if you aren’t familiar with it; you’ll find at least one <div> element nested inside any <div class="noindex"> you put around your web part thinking you were going to stop the custom menu being crawled.)

When I initially looked around for why our search results were littered with so many seemingly irrelevant results, I found this method of excluding the custom menu rather easily. I also found a lot of forum posts from people with the same issue, reporting that their use of <div> tags with the noindex class was not working. In some of these posts people had included snippets of their code; each time they had nested <div> tags and were baffled as to why their code wasn’t working. I figure most people hit this problem because they don’t read the detail in the solutions about nesting, or don’t realise that the web part generates its own HTML into the page and quite likely inserts a <div> that surrounds the content they want to hide. As any SharePoint developer quickly finds out, a lot of SharePoint knowledge doesn’t come from a well set out documentation library with lots of code examples, like developers get used to in other environments; you need to read blogs (like this one), read forums, talk to colleagues and build up your own experience until these kinds of gotchas are just known to you. Even the best SharePoint developer can overlook simple things like this, and by figuring them out they get that little bit better each time.

Being a SharePoint developer is really about being a master of self-learning: a master of using a search engine to find the knowledge you need and, most importantly, of knowing which information is actually going to be helpful and which will lead you down the garden path. The MSDN blog post by Mark Arend (http://blogs.msdn.com/b/markarend/archive/2010/06/07/control-search-indexing-crawling-within-a-page-with-noindex.aspx) gives a clear description of the problem and the solution. He also states that it is by design that nested <div> tags are re-evaluated for the noindex class, and mentions that the product team was considering changing this… Did that create the confusion, or did people read the first part of the solution and miss the note about nested <div> tags? In any case, it is a vital bit of the solution that a lot of people still seem to overlook.

In case you are wondering, the built-in SharePoint navigation menus already have the correct <div> tags with the noindex class surrounding them, so they aren’t any concern. This problem only exists if you have inserted your own dynamic menu system.

Other Search Provider Considerations

It is more common than you think for sites to use more than just SharePoint search. The <div class="noindex"> is a SharePoint-specific filter for excluding content within a page, but what if you have a Google Search Appliance crawling your site as well? (Yep… we did on this project.)

You’re in luck: Google documents how to exclude content within a page from their search appliance. There are a few different options, but the equivalent blanket ignore of the contents between the <div class="noindex"> tags is to encapsulate the section between the following two comments:

<!--googleoff: all-->

and

<!--googleon: all-->

If you want to know more about the GSA googleoff/googleon tags and the various options you have here is the documentation: http://code.google.com/apis/searchappliance/documentation/46/admin_crawl/Preparing.html#pagepart

Conclusion

(… and Paul returns to the conversation).

I think Dan has highlighted an easy-to-overlook implication of custom designing not only navigational content, but really any type of dynamically generated content on a page. While such content can make a page more intuitive and relevant, consider the implication for the search experience. Since the contextual content will be crawled along with the actual content, you might inadvertently sacrifice precision of search results without realising it.

Hope this helps and thanks for reading (and thanks Dan for writing this up)

 

Paul Culmsee

www.sevensigma.com.au



Free MOSS Web Part – Hide Controls via JavaScript


Note: version 0.2 posted with minor bugfix 15th March 08!

Note 2: Only works with MOSS 2007, sorry, as you WSS guys do not have audience targeting 🙁

This is my small contribution to the SharePoint world. It is a web part that, once added to a web part page, allows you to customise the display by adding JavaScript to selectively hide controls on the page. Ever needed to hide a field from display/edit for a certain audience? Well, here is a way to do it without requiring SharePoint Designer and without having to break a page from its site definition (unghosting).

Before and after shots below (look ma – no top button!)

[screenshot]  [screenshot]

To fully understand what is being done here, I suggest you read my series of articles on the use of JavaScript in SharePoint. Part 3 in particular will show you how to safely add this web part to pages with editing disabled (NewForm.aspx, EditForm.aspx and DispForm.aspx)

The full series can be found here: Part 1, Part 2, Part 3, Part 4, Part 5 and Part 6.

Kudos to Jeremy Thake for feedback and some code contribution. Despite being seriously metrosexual, he is otherwise very cool :-P.

Now two important warnings:

Warning 1: This is an alpha quality release and I may never touch it again 🙂 So you very likely *will* break it. If there is enough interest, I am happy to pop it on codeplex

Warning 2: This web part should NOT be considered as a security measure and thus used in any security sensitive scenario (such as an extranet or WCM site). JavaScript by its very nature can be trivially interfered with and thus other methods (server side) should be employed in these scenarios to prevent interference at the browser.

You can download by reading the disclaimer and clicking the button below..

THIS CODE IS PROVIDED UNDER THIS LICENSE ON AN “AS IS” BASIS, WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, WITHOUT LIMITATION, WARRANTIES THAT THE COVERED CODE IS FREE OF DEFECTS, MERCHANTABLE, FIT FOR A PARTICULAR PURPOSE OR NON-INFRINGING. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE COVERED CODE IS WITH YOU. SHOULD ANY COVERED CODE PROVE DEFECTIVE IN ANY RESPECT, YOU (NOT THE INITIAL DEVELOPER OR ANY OTHER CONTRIBUTOR) ASSUME THE COST OF ANY NECESSARY SERVICING, REPAIR OR CORRECTION. THIS DISCLAIMER OF WARRANTY CONSTITUTES AN ESSENTIAL PART OF THIS LICENSE. NO USE OF ANY COVERED CODE IS AUTHORIZED HEREUNDER EXCEPT UNDER THIS DISCLAIMER

Use at your own risk!

To install perform the following commands

  1. stsadm.exe -o addsolution -filename CleverWorkAroundsHideFields.wsp
  2. stsadm.exe -o execadmsvcjobs
  3. stsadm.exe -o deploysolution -name CleverWorkAroundsHideFields.wsp -immediate -allowgacdeployment -allcontenturls
  4. stsadm.exe -o execadmsvcjobs

To remove/reinstall, run the following commands:

  1. stsadm.exe -o retractsolution -name CleverWorkAroundsHideFields.wsp -immediate -allcontenturls
  2. stsadm.exe -o execadmsvcjobs
  3. stsadm.exe -o deletesolution -name CleverWorkAroundsHideFields.wsp
  4. stsadm.exe -o execadmsvcjobs
  5. stsadm.exe -o addsolution -filename CleverWorkAroundsHideFields.wsp
  6. stsadm.exe -o execadmsvcjobs
  7. stsadm.exe -o deploysolution -name CleverWorkAroundsHideFields.wsp -immediate -allowgacdeployment -allcontenturls
  8. stsadm.exe -o execadmsvcjobs

More SharePoint Branding – Customisation using JavaScript – Part 6

God help me, I’m up to part 6 of a series about a technology I dislike, and still going. For those of you who have just joined us, you might want to go back to the very beginning of this series, where I used JavaScript to improve the SharePoint user experience. Since then, I’ve been trying to pick a path through the thorny maze of what you could term ‘sustainable customisation’.

By that, I mean something that hopefully will not cause you grief and heartache the next time a service pack is applied!

So no mood for jokes this time – I want to get this over with so let’s get straight to it and finish this thing!

So where are we at?

  • Part 1 looked at how we can use JavaScript to deal with the issue of hiding form elements from the user in lists and document libraries.
  • Part 2 examined some of the issues with the part 1 JavaScript hacks and wrapped it into a web part using the content editor web part.
  • Part 3 then examined the various issues of adding this new web part to certain SharePoint pages (NewForm.aspx, EditForm.aspx and DispForm.aspx). I also covered using SharePoint Audience targeting to make the hiding/unhiding of form elements personalised to particular groups of users.
  • Part 4 started to address a couple of remaining usability issues, and introduced ‘proper’ web-part development using Visual Studio and STSDEV. I created a project to perform the same functionality as part 3, but without requiring the user to have any JavaScript knowledge or experience.
  • Part 5 then used STSDEV to create a solution package that allowed easy debugging, deployment and updating of the web part developed in part 4.

So what could we possibly have left to cover? Basically this article will revisit the web part code and make some functionality improvements and then I will cover off some remaining quirks/issues that you should be aware of.

[Quick Navigation: Part 1, Part 2, Part 3, Part 4, Part 5 and Part 6]

Continue reading


More SharePoint Branding – Customisation using Javascript – Part 5

Hello and welcome to part 5 of another epic CleverWorkArounds blog post.

If you think I write a lot on my blog, you should see my documentation and training material! I seem to be rare insofar as I actually like to write documentation and can churn out reasonable quality pretty fast. So if you need your scary SharePoint farm/infrastructure audited and fully documented, you know who to call! 🙂

Anyhow, here is the current state of play.

  • Part 1 of this series looked at how we can use JavaScript to deal with the common request of hiding form elements from the user in lists and document libraries. We looked at a Microsoft documented method, then a better, more flexible method.
  • Part 2 wrapped this JavaScript code into a web part which has been loaded into the SharePoint web part gallery.
  • Part 3 then examined the trials and tribulations of getting this new web part added to certain SharePoint pages (NewForm.aspx, EditForm.aspx and DispForm.aspx), and then showed how, with a few simple edits, this web part can be used to hide form fields as desired. Finally, I demonstrated the power of combining this with SharePoint Audience targeting functionality to make the hiding/unhiding of form elements personalised to particular groups of users.
  • Part 4 introduced Visual Studio and STSDEV. I created a project to perform the same functionality in part 3, but not requiring any JavaScript knowledge or experience. By the end of part 4 I had a STSDEV project that compiled with no errors.

And now we are onto Part 5 where we turn our attention to the packaging and deployment of our web part. As you are about to see, STSDEV makes this a very quick and painless experience. If you aren’t convinced of the merits of STSDEV and the SharePoint solution framework by the time you finish this article, then I don’t know what will convince you.

[Quick Navigation: Part 1, Part 2, Part 3, Part 4 and Part 6]

Continue reading


More SharePoint Branding – Customisation using JavaScript – Part 4

Hi there. As I write this post, the media are telling me that the stock market is stuffed, the US economy is going to the dogs and banks are writing down billions from sub-prime excess. I dare not check my online broker, road traffic this morning was abysmal, I was late, brought in the wrong laptop and left an important DVD at home.

Could it get any worse? Who knows, but it sounds like the sort of day to re-visit JavaScript and get frustrated with writing a web part for the first time.

So to recap on our journey thus far…

  • Part 1 of this series looked at how we can use JavaScript to deal with the common request of hiding form elements from the user in lists and document libraries. It looked at a Microsoft documented method, then a better, more flexible method.
  • Part 2 wrapped this JavaScript code into a web part which has been loaded into the SharePoint web part gallery.
  • Part 3 then examined the trials and tribulations of getting this new web part added to certain SharePoint pages (NewForm.aspx, EditForm.aspx and DispForm.aspx), and then showed how, with a few simple edits, this web part can be used to hide form fields as desired. Finally, I demonstrated the power of combining this with SharePoint Audience targeting functionality to make the hiding/unhiding of form elements personalised to particular groups of users.

All in all a pretty clever workaround so far if I say so myself. 🙂

My original goals for this JavaScript were to find an effective, easily repeatable way to customise SharePoint form pages by hiding fields or form elements when we need to. Specifically:

  • Allow hidden fields based on identity/audience
  • Avoid use of SharePoint Designer
  • Avoid customisations to the form pages that unghosted the pages from the site definition

We achieved these goals in part three, but was I satisfied? No. The quest for more clever workarounds always goes on!

[Quick Navigation: Part 1, Part 2, Part 3, Part 5 and Part 6]

Continue reading


More SharePoint Branding – Customisation using JavaScript Part 3

Hey there. Welcome to part 3 of my series on SharePoint customisation using JavaScript and web parts.

So here is the lowdown so far. We are trying to find an effective, repeatable way to easily customise SharePoint form pages, so that we can hide fields or form elements when we need to. The goals were to:

  • Allow hidden fields based on identity
  • Avoid use of SharePoint Designer
  • Avoid customisations to the form pages that unghosted the pages from the site definition

So how have we progressed thus far?

  • Part 1 of this series looked at how we can use JavaScript to deal with the common request of hiding form elements from the user in lists and document libraries.
  • Part 2 wrapped this JavaScript code into a web part which has been loaded into the SharePoint web part gallery.

So let’s knock the rest of this over and pick up right from where we left off…

CleverWorkArounds Coffee requirement of this post depends on how much you hate JavaScript.

Metrosexual web developer: 1 coffee

Socially inept technical guy: 3 coffees

[Quick Navigation: Part 1, Part 2, Part 4, Part 5 and Part 6]

Continue reading


More SharePoint Branding – Customisation using JavaScript Part 2

Hi again.

JavaScript sucks! There I said it. Despite me hating it as a programming language, I can’t deny that in SharePoint, it does have its uses.

CleverWorkArounds Coffee requirement of this post depends on how much you hate JavaScript.

Metrosexual web developer: 1 coffee
Socially inept technical guy: 3 coffees
Luddite IT manager: 9 coffees (sorry … why are you here anyway?)

[Quick Navigation: Part 1, Part 3, Part 4, Part 5 and Part 6]

To quickly recap the first post of this series, we looked at how we can use JavaScript to deal with the common request of hiding form elements from the user in lists and document libraries. The technique demonstrated can be used for columns, buttons and whatever else you want. The method, once debugged, is fairly easy to implement with SharePoint Designer and some cut and paste.
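To give a feel for what that cut-and-paste code does, here is a minimal sketch of the hiding technique. To be clear, this is an illustration rather than the exact code from part 1: findRowsToHide() is a hypothetical helper name, and it assumes the WSS v3 form layout where each field occupies one table row whose label is the field’s display name.

```javascript
// Pure matching logic: given the display names found in the form table's
// rows and the names of the fields we want hidden, return the indexes of
// the rows to hide. findRowsToHide() is a hypothetical name, not a
// SharePoint function.
function findRowsToHide(rowTitles, fieldsToHide) {
  var indexes = [];
  for (var i = 0; i < rowTitles.length; i++) {
    // Plain nested loop rather than Array.indexOf: this is IE6/7-era script.
    for (var j = 0; j < fieldsToHide.length; j++) {
      if (rowTitles[i] === fieldsToHide[j]) {
        indexes.push(i);
        break;
      }
    }
  }
  return indexes;
}

// In the browser you would gather the row titles from the form table and
// collapse each matching row, along these lines:
//
//   var rows = document.getElementsByTagName("TR");
//   // ...build rowTitles from each row's field label, then for each idx:
//   // rows[idx].style.display = "none";
```

Keeping the matching logic in one small function is what makes the snippet easy to paste and tweak; only the final style.display = "none" touches the page, which is also exactly why this is a usability aid rather than a security control.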

But there are several problems with the method that prevent it from getting a better CleverWorkaround rating than “Meh”. They include:

  • One size fits all: fields are hidden for all visitors irrespective of need
  • You need to modify the page in SharePoint Designer via cut and paste of JavaScript code
  • You need to modify auto-generated pages
  • You need to modify a page from its site definition
  • Insecure: relying on the client side to hide content/controls is not a secure solution

Continue reading


More SharePoint Branding – Customisation using JavaScript – Part 1

Hi all

I thought that after my last post, which involved XSL/XSLT, I’d escape from horrid programming languages and write about more interesting topics, but it wasn’t meant to be. This time round I had to delve back into the world of JavaScript – something I swore I would never do again after a painful encounter back in 2000. (Yep, it’s taken me 8 years to face it again!)

But like everything else with SharePoint, by being a ‘specialist‘, you seem to have to use more technologies and IT disciplines than you would think possible.

As I progressed writing this article, I realised that I was delving back into branding again and toyed with the idea of making this part 8 of the branding series. But the governance topic in part 7 for me rounded off that series of posts nicely, so I will deal with this separately for now and perhaps refresh that series in the future.

Like a vast majority of my posts, this will also be a mini series.

CleverWorkArounds Coffee requirement rating (for Metrosexual web developers): 1 coffee

CleverWorkArounds Coffee requirement rating (for the rest of us): 3 coffees

[Quick Navigation: Part 2, Part 3, Part 4, Part 5 and Part 6]

Continue reading
