Client-side People Picker control for SharePoint Apps

When creating my "SharePoint-hosted" app for Office 365 (15) I had to use the client-side People Picker control in an app part. There doesn't appear to be much in the way of documentation for this control, other than the "How to" MSDN article here:

How to: Use the client-side People Picker control in apps for SharePoint

I wanted to default the control to the current user. From the document I could tell that there was a parameter in the init function that should contain an array of initial PickerEntity objects, but sadly there was no example of this. The sample code simply passed null where this array should be.

SPClientPeoplePicker_InitStandaloneControlWrapper(peoplePickerElementId, null, schema);

I like to use jQuery Deferred objects with SharePoint's asynchronous requests so that I have some control over my code executing in a sequential manner.

Here is a snippet of code that demonstrates how to default the People Picker control to the current user:

var context;
var web;

$(document).ready(function () {
    SP.SOD.executeFunc('sp.js', 'SP.ClientContext', sharePointReady);
});

function sharePointReady() {
    context = SP.ClientContext.get_current();
    web = context.get_web();
    getUser().done(function (user) {
        // Standard People Picker schema settings
        var schema = {};
        schema['PrincipalAccountType'] = 'User,DL,SecGroup,SPGroup';
        schema['SearchPrincipalSource'] = 15;
        schema['ResolvePrincipalSource'] = 15;
        schema['AllowMultipleValues'] = false;
        schema['MaximumEntitySuggestions'] = 50;
        schema['Width'] = '280px';

        // Build a single, already-resolved PickerEntity for the current user
        var defaultUser = {};
        defaultUser.AutoFillDisplayText = user.get_title();
        defaultUser.AutoFillKey = user.get_loginName();
        defaultUser.Description = user.get_email();
        defaultUser.DisplayText = user.get_title();
        defaultUser.EntityType = 'User';
        defaultUser.IsResolved = true;
        defaultUser.Key = user.get_loginName();
        defaultUser.Resolved = true;

        // Pass the entity array as the second argument instead of null
        SPClientPeoplePicker_InitStandaloneControlWrapper('peoplePickerDiv', [defaultUser], schema);
    });
}

function getUser() {
    var dfd = $.Deferred(function () {
        var user = web.get_currentUser();
        context.load(user);
        context.executeQueryAsync(
            function () {
                dfd.resolve(user);
            },
            function (sender, args) {
                dfd.reject(args.get_message());
            });
    });
    return dfd.promise();
}
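
If you later need to read the selection back out of the control, the initialized picker registers itself in a client-side dictionary keyed on the element id. Here's a minimal sketch, assuming the 'peoplePickerDiv' id from the init call above:

// Read the selected users back out of the picker (sketch).
// The control registers itself in SPClientPeoplePicker.SPClientPeoplePickerDict
// under '<elementId>_TopSpan'.
function getPickedUsers() {
    var picker = SPClientPeoplePicker.SPClientPeoplePickerDict.peoplePickerDiv_TopSpan;
    var userInfo = picker.GetAllUserInfo();                     // array of resolved entities
    return $.map(userInfo, function (u) { return u.Key; });     // e.g. account names
}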

Localization & jQuery datePicker

I've started using jsFiddle.net to quickly try out jQuery scripts, and I've also found an online Bootstrap distribution so I can make things look pretty too.

This time I wanted to make the jQuery UI datepicker use the browser's language setting to display dates in the correct format. It turns out that jQuery UI already handles this and more; in fact it translates all of the text too.
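
The script in the fiddle boils down to something like this minimal sketch (assuming jQuery UI plus its combined i18n bundle, jquery-ui-i18n.js, are loaded, and an input with the made-up id 'datepicker'):

// Pick the datepicker locale from the browser's language setting.
var language = navigator.language || navigator.userLanguage;   // e.g. 'de-DE', 'en-GB'

// $.datepicker.regional is keyed by tags like 'de' or 'en-GB', so fall back from
// the full tag to the primary subtag, and finally to the default (US English).
var regional = $.datepicker.regional[language] ||
               $.datepicker.regional[language.split('-')[0]] ||
               $.datepicker.regional[''];

$('#datepicker').datepicker(regional);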

You'll have to change your browser's language settings to test it properly, but if you wanted to quickly test it with a different language you can edit the fiddle and replace the first line of javascript with this:

language = 'de-DE';

Here's the fiddle:

Microsoft Azure

This was published back in April 2012. Watch out for me in the clips showing developers working on laptops, which were filmed in the MTC at Microsoft.

We were attending a three-day proof of concept to lift and shift the Retail Manager solution into Azure. The video was created by Steve Plank, a developer evangelist for Azure.

SQL Date Dimension table data generator in PowerShell

It's common to have a date dimension table in databases used by applications that need to store information related to the financial/fiscal/holiday year of a business. Every business will likely have a different start date for the year and want to query data by week number, which will differ from the standard calendar week number. Additionally, they might want to query against quarters and periods, which differ from year to year depending on how they choose to break the year down into weeks.

Generating this data within T-SQL is challenging to say the least, although it is getting easier in SQL Server 2012 with the new T-SQL functions. A much more flexible approach is to use PowerShell, which is what I have done in the script below.

You'll need to set the $weeks variable to match the required breakdown of weeks in the year; the values should add up to 52 (the 4-4-5 pattern used below gives (4 + 4 + 5) x 4 = 52).

Set the $startDate to the 1st day of the year.

Set the $connectionString to your database and the DestinationTableName to the table that contains the following columns:

  • ID
  • GIVEN_DATE (smalldatetime)
  • WEEK_NO (int)
  • CAL_WEEK_NO (int)
  • DAY_ID (int)
  • NON_WORK (bit)
  • YEAR_NO (int)
  • HOLIDAY_WEEK_No (int)
  • PERIOD (int)
  • QUARTER (int)

You can of course change the columns/data that you generate and store.

[void] [Reflection.Assembly]::LoadWithPartialName("Microsoft.VisualBasic")

# Weeks in each of the 12 periods (4-4-5 pattern) - must total 52
$weeks = @(4,4,5,4,4,5,4,4,5,4,4,5)
# First day of the year to generate (dd/MM/yyyy)
$startDate = Get-Date "30/01/2011"
$yearNo = "1"
$connectionString = "Data Source=server;Initial Catalog=database;user=sa;password=password"
$bulkCopy = new-object ("Data.SqlClient.SqlBulkCopy") $connectionString
$bulkCopy.DestinationTableName = "date_dimension"

# Generate the rows and bulk copy them into the table
# (the functions used here are defined further down - load them first
# if you run the script top to bottom)
$dataTable = DateDimension | Out-DataTable
$bulkCopy.WriteToServer($dataTable)

# Converts a stream of PSObjects into a System.Data.DataTable,
# creating a column for each property of the first object
Function Out-DataTable {
  $dt = new-object Data.DataTable
  $first = $true

  foreach ($item in $input) {
    $dr = $dt.NewRow()
    $item.PsObject.get_properties() | foreach {
      if ($first) {
        $col = new-object Data.DataColumn
        $col.ColumnName = $_.Name.ToString()
        $dt.Columns.Add($col)
      }
      if ($_.Value -eq $null) {
        $dr.Item($_.Name) = "[empty]"
      }
      elseif ($_.IsArray) {
        # Join array values into a single delimited string
        $dr.Item($_.Name) = [string]::Join(";", $_.Value)
      }
      else {
        $dr.Item($_.Name) = $_.Value
      }
    }
    $dt.Rows.Add($dr)
    $first = $false
  }
  # Wrap in an array so the DataTable isn't unrolled into rows by the pipeline
  return @(,($dt))
}

# Returns $true when the date falls on a weekend (used for the NON_WORK flag)
function Get-WeekdayFlag($target) {
 if($target.DayOfWeek -eq 'saturday' -or $target.DayOfWeek -eq 'sunday') {$true}
 else {$false}
}

function Get-TimeKey ($target) { $target.ToString("yyyy-MM-dd HH:mm:ss") }

function DatePart($interval, $target) {
 [Microsoft.VisualBasic.DateAndTime]::DatePart(
  $interval,
  $target,
  [Microsoft.VisualBasic.FirstDayOfWeek]"System",
  [Microsoft.VisualBasic.FirstWeekOfYear]"System")
}

function DateDimension
{
    $weekNo = 1
    $dayNo = 0
    $count = 1
    $quarter = 1
    # Walk through each period, emitting one PSObject per day
    foreach($elem in $weeks)
    {
        $max = $elem * 7
        $dayNo = 1
        while($dayNo -le $max)
        {
            New-Object PSObject |
                Add-Member -pass NoteProperty ID("1") |
                Add-Member -pass NoteProperty GIVEN_DATE($startDate) |
                Add-Member -pass NoteProperty WEEK_NO($weekNo) |
                Add-Member -pass NoteProperty CAL_WEEK_NO(DatePart "ww" $startDate) |
                Add-Member -pass NoteProperty DAY_ID([int]$startDate.DayOfWeek) |
                Add-Member -pass NoteProperty NON_WORK(Get-WeekdayFlag $startDate) |
                Add-Member -pass NoteProperty YEAR_NO($yearNo) |
                Add-Member -pass NoteProperty HOLIDAY_WEEK_No(0) |
                Add-Member -pass NoteProperty PERIOD($count) |
                Add-Member -pass NoteProperty QUARTER($quarter)
            # Move to the next financial week after every 7 days
            if($dayNo % 7 -eq 0) {$weekNo++}
            $startDate = $startDate.AddDays(1)
            $dayNo++
        }
        # A quarter is made up of three periods (4 + 4 + 5 weeks)
        if($count % 3 -eq 0){$quarter++}
        $count++
    }
}

Packaging InfoPath Forms into Site Features

Ever had an InfoPath Form that you wanted to wrap up into a re-deployable feature?

The problem is that any data connections will still point to their original locations, and the Form itself will have an incorrect PublishURL.

An InfoPath Form is really a cabinet (.cab) file. Unfortunately it isn't an Open XML (zip) package, so you can't use the System.IO.Packaging class in the WindowsBase dll to extract it; that only works with zip files. I didn't want to sacrifice my principles and start using unmanaged code or shelling out calls to cabarc.exe, so I scoured the internet for an alternative solution. I found that the WiX open source installer project has some assemblies that can extract and package cabinet files:

Microsoft.Deployment.Compression.Cab.dll
Microsoft.Deployment.Compression.dll

Just download and install WiX and reference the above 2 files from the WiX SDK folder.

InfoPath has the ability to use external data connection files, and SharePoint has a Data Connections Library template to store them in. This solution depends on that feature, as we'll need to update any data connections that the form uses in our feature activated event receiver. I use SharePoint 2010 Foundation, which runs on my Windows 7 laptop; the Data Connections Library template isn't available out of the box in the Foundation edition, but it is part of Search Server 2010 Express. So I downloaded and installed Search Server 2010 Express, but I didn't configure it as I'm not really interested in actually using it.

As a scenario to ascertain the feasibility of creating a re-deployable InfoPath feature, I created a Form that had two data connection files. One was a receive data connection connected to the site user list, and the other a submit connection to a Form Library.

Now that we have all the pre-requisites needed, it's time to fire up Visual Studio and create a new empty SharePoint 2010 project.

Add a new List Instance; I called mine Data Connections. Here is the Elements.xml file; note that the TemplateType is set to 130, which is a Data Connection Library.

I added a new module called Connection Files and copied in the two data connection files; here's what the Elements.xml file looks like.

Now that we have a Data Connections Library with some connections in it, I'll add a new Content Type item called DCTest for the InfoPath Form. Here's the Elements.xml:

I added another Module called Form Template to the project with the xsn file in it.

Finally, just before we get into coding the feature activated event receiver, I added another List Instance item to the project called Form Library.

Now that we have everything, it's time to write the feature activated event receiver. The first job is to get a reference to the Web where the feature is being activated so that we know the new URL for the InfoPath Form and Data Connections. I download a copy of the XSN file to the system temp folder, where I extract it using the WiX classes. The manifest.xsf file can then be updated before packing it back into the .xsn file and uploading it back to SharePoint. The data connection files are easier to update as they are just XML.

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    XNamespace ns;
    XDocument doc;
    string tempPath = Path.GetTempPath();
    string contentTypeName = "DCTest";
    using (SPWeb web = (SPWeb)properties.Feature.Parent)
    {
        // Download the form template (.xsn) for the content type to the temp folder
        SPContentType contentType = web.ContentTypes[contentTypeName];
        WebClient webClient = new WebClient();
        webClient.Credentials = CredentialCache.DefaultCredentials;
        webClient.DownloadFile(web.Site.Url + contentType.DocumentTemplateUrl, tempPath + contentType.DocumentTemplate);

        // Extract the .xsn (a cabinet file) using the WiX compression classes
        CabInfo cab = new CabInfo(tempPath + @"\" + contentType.DocumentTemplate);
        Directory.CreateDirectory(tempPath + contentTypeName);
        cab.Unpack(tempPath + contentTypeName);

        // Fix up the publish URL and data connection references in manifest.xsf
        doc = XDocument.Load(tempPath + @"\" + contentTypeName + @"\manifest.xsf");
        ns = "http://schemas.microsoft.com/office/infopath/2006/solutionDefinition/extensions";
        doc.Root.Attributes("publishUrl").First().Value = web.Url + "/FormLibrary/Forms/DCTest/DCTest.xsn";
        foreach (XElement elem in doc.Descendants(ns + "connectoid"))
        {
            elem.Attribute("siteCollection").Value = web.Site.Url;
            elem.Attribute("source").Value = "/forms/DataConnections/" + elem.Attribute("source").Value.Substring(elem.Attribute("source").Value.LastIndexOf(@"/") + 1);
        }
        doc.Save(tempPath + @"\" + contentTypeName + @"\manifest.xsf");

        // Repackage the .xsn and upload it back to the content type's resource folder
        cab.Pack(tempPath + @"\" + contentTypeName);
        webClient.Headers[HttpRequestHeader.ContentType] = "application/octet-stream";
        webClient.UploadFile(web.Site.Url + @"/forms/_cts/DCTest/DCTest.xsn", "PUT", tempPath + @"\" + contentType.DocumentTemplate);

        // Remove the default content type and add the site content type
        SPList library = web.Lists["Form Library"];
        library.ContentTypes[0].Delete();
        library.ContentTypes.Add(web.ContentTypes["DCTest"]);

        // Update the data connections
        SPList list = web.Lists["Data Connections"];
        foreach (SPFile file in list.RootFolder.Files)
        {
            using (StreamReader sr = new StreamReader(file.OpenBinaryStream()))
            {
                doc = XDocument.Parse(sr.ReadToEnd());
            }
            ns = "http://schemas.microsoft.com/office/infopath/2006/udc";
            if (file.Name == "DC001-Receive.udcx")
            {
                doc.Descendants(ns + "WebUrl").First().Value = web.Url;
                doc.Descendants(ns + "ListId").First().Value = "{" + web.SiteUserInfoList.ID.ToString() + "}";
            }
            else if (file.Name == "DC001-Submit.udcx")
            {
                doc.Descendants(ns + "FolderName").First().Value = web.Url + "/FormLibrary";
            }
            byte[] byteArray = Encoding.ASCII.GetBytes(doc.Root.ToString());
            MemoryStream stream = new MemoryStream(byteArray);
            file.SaveBinary(stream);
        }
    }
}

Tweet from PowerShell using OAuth

I wanted to be able to post tweets from a PowerShell script. There are a lot of old blog posts and examples of composing tweets from before Twitter introduced OAuth. I've seen a few examples of PowerShell scripts that can post tweets using external dlls to handle the OAuth authentication. I wanted to keep my script self-contained and only use .NET Framework assemblies where necessary, rather than rely on 3rd party dlls.

The first step is to create a Twitter application by logging on to http://dev.twitter.com. Complete the Application Details, making sure to set the Application Type's access to at least Read and Write. Make a note of the following:

  • Consumer key
  • Consumer secret
  • Access token
  • Access token secret

You'll need to update the associated PowerShell script variables with your real values.

[Reflection.Assembly]::LoadWithPartialName("System.Security")
[Reflection.Assembly]::LoadWithPartialName("System.Net")

$status = [System.Uri]::EscapeDataString("My first tweet from Powershell");
$oauth_consumer_key = "<consumer key>";
$oauth_consumer_secret = "<consumer secret>";
$oauth_token = "<auth token>";
$oauth_token_secret = "<auth token secret>";
# Generate a unique nonce and a timestamp (seconds since the Unix epoch) for this request
$oauth_nonce = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes([System.DateTime]::Now.Ticks.ToString()));
$ts = [System.DateTime]::UtcNow - [System.DateTime]::ParseExact("01/01/1970", "dd/MM/yyyy", $null).ToUniversalTime();
$oauth_timestamp = [System.Convert]::ToInt64($ts.TotalSeconds).ToString();

$signature = "POST&";
$signature += [System.Uri]::EscapeDataString("http://api.twitter.com/1/statuses/update.json") + "&";
$signature += [System.Uri]::EscapeDataString("oauth_consumer_key=" + $oauth_consumer_key + "&");
$signature += [System.Uri]::EscapeDataString("oauth_nonce=" + $oauth_nonce + "&"); 
$signature += [System.Uri]::EscapeDataString("oauth_signature_method=HMAC-SHA1&");
$signature += [System.Uri]::EscapeDataString("oauth_timestamp=" + $oauth_timestamp + "&");
$signature += [System.Uri]::EscapeDataString("oauth_token=" + $oauth_token + "&");
$signature += [System.Uri]::EscapeDataString("oauth_version=1.0&");
$signature += [System.Uri]::EscapeDataString("status=" + $status);

# The signing key is the consumer secret and the token secret joined with '&'
$signature_key = [System.Uri]::EscapeDataString($oauth_consumer_secret) + "&" + [System.Uri]::EscapeDataString($oauth_token_secret);

# Sign the base string with HMAC-SHA1; the Base64 digest becomes oauth_signature
$hmacsha1 = new-object System.Security.Cryptography.HMACSHA1;
$hmacsha1.Key = [System.Text.Encoding]::ASCII.GetBytes($signature_key);
$oauth_signature = [System.Convert]::ToBase64String($hmacsha1.ComputeHash([System.Text.Encoding]::ASCII.GetBytes($signature)));

# Compose the OAuth Authorization header from the signed values
$oauth_authorization = 'OAuth ';
$oauth_authorization += 'oauth_consumer_key="' + [System.Uri]::EscapeDataString($oauth_consumer_key) + '",';
$oauth_authorization += 'oauth_nonce="' + [System.Uri]::EscapeDataString($oauth_nonce) + '",';
$oauth_authorization += 'oauth_signature="' + [System.Uri]::EscapeDataString($oauth_signature) + '",';
$oauth_authorization += 'oauth_signature_method="HMAC-SHA1",'
$oauth_authorization += 'oauth_timestamp="' + [System.Uri]::EscapeDataString($oauth_timestamp) + '",'
$oauth_authorization += 'oauth_token="' + [System.Uri]::EscapeDataString($oauth_token) + '",';
$oauth_authorization += 'oauth_version="1.0"';

# Send the tweet: POST the status with the OAuth Authorization header attached
$post_body = [System.Text.Encoding]::ASCII.GetBytes("status=" + $status);
[System.Net.HttpWebRequest] $request = [System.Net.WebRequest]::Create("http://api.twitter.com/1/statuses/update.json");
$request.Method = "POST";
$request.Headers.Add("Authorization", $oauth_authorization);
$request.ContentType = "application/x-www-form-urlencoded";
$body = $request.GetRequestStream();
$body.write($post_body, 0, $post_body.length);
$body.flush();
$body.close();
$response = $request.GetResponse();

Office365 Weather WebPart

A while ago I wrote a SharePoint web part that displayed a 5-day weather forecast for a user-configurable location, using Met Office data from the Azure Datamarket. I wanted to try porting this web part to Office365, to see if I could overcome the challenges of working predominantly client side in JavaScript.

With the help of jQuery to make ajax requests to the Datamarket, and a cryptographic library to encrypt the username and password that are included in the ajax request headers, I've managed to accomplish this goal.

One of the major challenges was overcoming the cross-site scripting restrictions, since I was making requests to the Datamarket from the Office365 domain. The Datamarket supports the OData standard well from a .NET Framework perspective, but it requires user interaction to allow the request from a purely web client or JavaScript approach. Internet Explorer has an object called XDomainRequest which is specifically designed for cross-site data requests; the only problem is that you can't include any headers, which means that authentication was ruled out. Apparently IE 10 will work the same way that Chrome does now and properly support jQuery CORS. So I had to compromise and change Internet Explorer's security settings to allow cross-site data requests.
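
For illustration, the request ends up looking something like the sketch below. This isn't the web part's actual code: the feed URL is a placeholder, and it assumes the Datamarket feed is secured with Basic authentication using the account key, with the browser configured to allow the cross-origin call as described above.

// Sketch of a cross-origin Datamarket request from the browser.
// The feed URL and account key are placeholders, not the real web part values.
var accountKey = '<datamarket account key>';

$.ajax({
    url: 'https://api.datamarket.azure.com/<feed path>?$format=json',
    type: 'GET',
    // A Basic authentication header built from the account key - setting custom
    // headers like this is exactly what XDomainRequest does not allow.
    // btoa is used here for brevity; older IE needs a base64 helper instead.
    headers: {
        'Authorization': 'Basic ' + btoa(accountKey + ':' + accountKey)
    },
    success: function (data) {
        // render the forecast from the returned JSON here
    },
    error: function (xhr) {
        console.log('Datamarket request failed: ' + xhr.status);
    }
});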

I've just published the code in a CodePlex project here: http://o365Weather.codeplex.com