Dashboards and business intelligence

Last week I took some time out of my schedule to experiment with Power BI. While Power BI can access many data sources, including a JSON/REST web service such as the ServiceAPI, it is simplest to access data sources for which a connector exists. This led me to experiment with Power Query custom connectors.

The connector I show in the video below is not production-ready, but it is far enough along to demonstrate its viability. Before proceeding I would like some feedback from anyone who might use this, answering questions like:

  • Is Power BI of interest?

  • What data would you want to extract from CM in Power BI?

  • What sort of Dashboards / Reports would you produce?

One nice thing about a custom connector is that it could quite easily be backwards compatible, so it would work with existing CM implementations.

If you have any interest in this space and want to see this experiment go further, send me a private message on the CM Forum.

Deleting in the ServiceAPI

Strangely enough, there are multiple paths via which you might choose to delete a Record in the ServiceAPI. Looking back on how this choice was made, I tend to agree with Elizabeth Bennet that a good memory is unpardonable. But is it a problem that there are multiple ways to delete a Record?

Delete via Record post

The first method is to use a service action, that is, to post JSON similar to that below to the Record endpoint. Not only does this attempt to delete the Record, it also allows you to update properties on (and save) the Record before the delete is called.

{
    "Uri": 9000000544,
    "DeleteRecord": {
        "DeleteRecordDeleteContents": false
    }
}

Delete via delete service

The other way to delete is simply to post to the delete service using a URL like this:

http://localhost/ServiceAPI/Record/rec_364/Delete

And the winner is…

Avoid the first option and use the second. Why? The first option, in addition to deleting the Record, does all of the processing required when updating properties and fields. This is unnecessary when you are deleting and is likely to cause errors. The second option avoids all of that unnecessary pipeline and simply deletes the Record, making it much more robust.
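For illustration only (this is not from the original post), calling the delete service from C# might look something like the sketch below. The host name and the rec_364 identifier are the placeholders from the URL above, and UseDefaultCredentials assumes the ServiceAPI is using Windows authentication.

using System;
using System.Net.Http;
using System.Threading.Tasks;

class DeleteRecordSketch
{
    // Posts to the delete service for a single Record, as recommended above.
    // The host name and record identifier are placeholders; UseDefaultCredentials
    // assumes the ServiceAPI is configured for Windows authentication.
    static async Task Main()
    {
        var handler = new HttpClientHandler { UseDefaultCredentials = true };

        using (var client = new HttpClient(handler))
        {
            HttpResponseMessage response = await client.PostAsync(
                "http://localhost/ServiceAPI/Record/rec_364/Delete", null);

            Console.WriteLine(response.StatusCode);
        }
    }
}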

Loading the .NET SDK

If you relied on us loading HP.HPTRIM.SDK.dll into the GAC you may have noticed that the 9.3 installer no longer does this, which has eliminated a number of messy deployment issues for us. For more information see the Microsoft guidelines regarding GAC installation.

The good news

A key benefit of GAC installation was the ability to write version-agnostic SDK applications; the good news is that you can still do this. All you need to do is intercept the assembly loading pipeline and specify that HP.HPTRIM.SDK.dll is loaded from the location specified in the registry.
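As a rough sketch of what that interception might look like (the registry key and value names below are placeholders, not the actual ones the installer writes):

using System;
using System.IO;
using System.Reflection;
using Microsoft.Win32;

static class SdkResolver
{
    // Hooks the assembly loading pipeline so that HP.HPTRIM.SDK.dll is loaded
    // from the install location recorded in the registry, regardless of the
    // version the application was compiled against.
    public static void Register()
    {
        AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
        {
            if (!new AssemblyName(args.Name).Name.Equals("HP.HPTRIM.SDK", StringComparison.OrdinalIgnoreCase))
            {
                return null; // not the SDK, let the default loader handle it
            }

            // Placeholder key/value names - substitute whatever your CM install actually writes.
            using (var key = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\YourVendor\ContentManager"))
            {
                var installPath = key?.GetValue("InstallPath") as string;
                if (string.IsNullOrEmpty(installPath))
                {
                    return null;
                }

                string dllPath = Path.Combine(installPath, "HP.HPTRIM.SDK.dll");
                return File.Exists(dllPath) ? Assembly.LoadFrom(dllPath) : null;
            }
        };
    }
}

Register() needs to be called before any type from the SDK assembly is touched, for example at the top of Main.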

DataPort Custom Formatter

By default DataPort imports a tab-delimited file. What is less well known is that you can write your own formatter to import an arbitrary file format. In this sample I import a very simple XML file. Even though the XML file itself is simple, it is a little more complex to import than a tab-delimited file because it does not follow the essentially columnar layout of a tab-delimited file.
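To give a flavour of what flattening a non-columnar file involves (this is purely illustrative, not the sample's actual code or XML layout), here is a small sketch that reads a hypothetical XML file and turns each element into the kind of row a tab-delimited file would give you for free:

using System;
using System.Linq;
using System.Xml.Linq;

class XmlFlattenSketch
{
    // Assumes a hypothetical file of the form:
    // <records>
    //   <record><title>...</title><author>...</author></record>
    // </records>
    // Each <record> element is flattened into an array of column values.
    static void Main()
    {
        XDocument doc = XDocument.Load("records.xml");

        var rows = doc.Descendants("record")
                      .Select(r => new[]
                      {
                          (string)r.Element("title") ?? string.Empty,
                          (string)r.Element("author") ?? string.Empty
                      });

        foreach (var row in rows)
        {
            Console.WriteLine(string.Join("\t", row));
        }
    }
}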

We also have the code for our standard tab delimited formatter in the GitHub repo.

BTW, you may hear in the background the happy sounds of my 2-year-old son playing in the backyard. Enjoy!

Microsoft Office Web Add-ons

Recently I posted about some research we have done into Google Docs and Gmail integration. This post shows the results of similar research into building add-ons for Microsoft Office Online.

The online versions of the Office applications differ from the native versions in some not-so-subtle ways. For example, in Office Online your document is saved very frequently and collaborative editing is encouraged, both of which impact the design of a Content Manager add-on.

Currently we have our SharePoint integration and the Web Client Office Online integration, so any potential Office Web add-on needs to be designed to solve a meaningful business problem. It is even conceivable that Email Link might provide a sufficient server-side solution.

If you have an interest in Office add-ons get in touch via the usual channels.

Make it faster!

Recently I spent several hours chasing 280 milliseconds that I was convinced should not have been there.  The ServiceAPI URL I was calling looked like this:

http://localhost/ServiceAPI/Record?q=container:9000002536&format=json&excludecount=false&pagesize=100&properties=recordtitle,recordowner,recordcontainer,recorddateregistered,recordassignee,recorddatecreated,recordcreator,recordnumber

After hours of tuning the ServiceAPI code I managed to eke out a 20 millisecond improvement, but I was still sure there was more. I looked once more at the URL and made one small change, to this:

http://localhost/ServiceAPI/Record?q=container:9000002536&format=json&excludecount=false&pagesize=100&properties=recordtitle,recordownerlocation,recordcontainer,recorddateregistered,recordassignee,recorddatecreated,recordcreator,recordnumber

Suddenly I had eliminated my extra 260 milliseconds!

Can you spot the difference? Previously I had an incorrect property name: RecordOwner should have been RecordOwnerLocation. Why is this a problem? If the ServiceAPI does not recognise a property it checks to see if it is a valid 'additional field' for this Record. Multiply the time for that check by 100 (the page size) and there is the 260 milliseconds.
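If you want to check for this in your own environment, a rough way to time the request is sketched below (this is not from the original post, and authentication is omitted for brevity). Run it with and without the misspelt property name and compare the numbers:

using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

class PropertyTimingSketch
{
    // Times a single ServiceAPI search so that the cost of an invalid property
    // name (and the per-row 'additional field' lookup it triggers) is visible.
    static async Task Main()
    {
        string url = "http://localhost/ServiceAPI/Record?q=container:9000002536" +
                     "&format=json&excludecount=false&pagesize=100" +
                     "&properties=recordtitle,recordownerlocation,recordcontainer," +
                     "recorddateregistered,recordassignee,recorddatecreated," +
                     "recordcreator,recordnumber";

        using (var client = new HttpClient())
        {
            var timer = Stopwatch.StartNew();
            await client.GetStringAsync(url);
            timer.Stop();

            Console.WriteLine($"Request took {timer.ElapsedMilliseconds} ms");
        }
    }
}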

The moral

260 milliseconds is not a lot, but in a web service environment it adds up, so be careful not to include invalid property names in the properties parameter.

Now I am off to see if the 'additional field' code can be optimised.

Webdrawer - playing audio files

Webdrawer does not support the playing of audio files; instead you will get an error message when you preview a Record with an audio file attached. This can be remedied by adding some HTML to the preview template. To do this, in your Webdrawer install folder:

Open the file '\Views\WDRecordPreview.cshtml' in a text editor

Add the following HTML:

if (new string[] { "MP3", "M4A", "WAV", "OGG" }.Any(ext => ext.Equals(record.Extension.Value, StringComparison.OrdinalIgnoreCase)))
{
    <audio preload="auto" autobuffer controls>
        <source src="~/Record/@record.Uri/file/document?inline=true" />
        <p>Audio playback not supported in this browser.</p>
    </audio>
}
else

Your HTML should end up looking like this:

[Screenshot: audio.PNG]

Google Docs add-on

Content Manager currently has no integration with Google Docs. This sample explores what is possible within the constraints of the Google Docs add-on ecosystem. It turns out quite a lot is possible, much more than with Gmail.

The Sample

The sample below demonstrates registering and updating a document in Content Manager from Google Docs. The code for this sample is available on the community repo.

The future

If you have any thoughts on the importance of Google Docs support or the details of what a Google Docs integration should do please contact me.

Gmail add-on

Today, if you want to file email from Gmail to Content Manager, you need to use the Email Link module. The limitation of this is that there is no user interface, so you are unable to fill in the Record data entry form.

Gmail add-ons

Gmail supports add-ons, making it theoretically possible to create a client-side solution that allows the user to complete the data entry form at the time they file the email. These add-ons are constrained in that they have to function both in the Gmail web UI and in the mobile Gmail app.

Exploring the possibilities

The sample in this video demonstrates the possibilities and the limitations of a Gmail add-on.

The future

Given the current limitations of the UI components it is difficult to see how we could build a generic add-on. Any customer or partner who wants to build a custom add-on is welcome to start with this sample. As always, feel free to contact me if you either have suggestions for an 'off the shelf' Content Manager add-on or want to discuss building a bespoke solution.

ServiceAPI Bulk Loader

The ServiceAPI exposes a bulk loading capability for which a C# sample is provided in the help. This blog post shows the actual JSON that is sent over the wire.

The first step is to create an Origin object in the client; once this is done, a batch of Records can be posted to the bulk loader. The sample below:

  • uses the Origin with Uri 9000000006,
  • does not use background processing (processing is completed within the HTTP post request),
  • creates one Location and one Record,
  • uses the Location created for the Record assignee,
  • assumes that the file "test doc.docx" has already been uploaded (using the UploadFile service), and
  • should be posted to http://MyServer/ServiceAPI/BulkLoader.
{
    "Origin": {
        "Uri":9000000006
    },
    "AutoCommitLocations": true,
    "SQLCheckConstraints": false,
    "SQLServerTableLock": true,
    "UseBulkLoaderRecordNumbering": true,
    "ProcessInBackground":true,
    "LocationsToImport":[{
        "LocationSurname":"Jones", 
        "LocationGivenNames":"Arthur",
        "LocationTypeOfLocation":"Person"
    }],
    "RecordsToImport": [
        {
            "RecordTitle": "My title from BL 5.1 AAAAA",
            "RecordAssignee":{
                "LocationSurname":"Jones", 
                "LocationGivenNames":"Arthur"
            },
            "RecordFilePath":"test doc.docx"
        }
    ]
}

Background processing

If 'ProcessInBackground' is set to true then the ServiceAPI will hand the bulk loading request off to the TRIMServiceAPIBulkLoader service (installed as a Windows service). In this case the bulk loading request will immediately return an OriginHistory object. The Uri from this object can be used to poll the OriginHistory object to find out when the bulk loading operation has completed. The URL to fetch the OriginHistory will look like this:

http://localhost/ServiceAPI/OriginHistory?q=uri:9000000019&properties=BulkLoaderIsRunning,OriginHistoryRecordsCreated
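As a rough illustration (this is not from the original post), polling that URL from C# might look something like the sketch below. Checking the raw JSON for the BulkLoaderIsRunning flag is a simplification; a real client would deserialise the response.

using System;
using System.Net.Http;
using System.Threading.Tasks;

class OriginHistoryPoller
{
    // Polls the OriginHistory object until the bulk loader reports it has finished.
    // The host name, the example Uri and the string-based check are placeholders;
    // a real client would deserialise the JSON and read BulkLoaderIsRunning properly.
    static async Task WaitForCompletion(long originHistoryUri)
    {
        string url = "http://localhost/ServiceAPI/OriginHistory?q=uri:" + originHistoryUri +
                     "&properties=BulkLoaderIsRunning,OriginHistoryRecordsCreated&format=json";

        using (var client = new HttpClient())
        {
            while (true)
            {
                string json = await client.GetStringAsync(url);

                if (json.Contains("\"BulkLoaderIsRunning\":false"))
                {
                    break;
                }

                await Task.Delay(TimeSpan.FromSeconds(5));
            }
        }
    }
}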

Demo

In the video below I demonstrate bulk loading one Record using background processing.

Generate Outlook linked folders

Background

Prior to CM 9.0 the Outlook integration had a feature to export/import linked folders. At that time linked folders were stored in the Windows registry, so the import/export was required any time a user received a new machine. This feature was also of use to those who wished to share their linked folders with their friends.

In 9.0 the architecture of the Outlook integration changed so that linked folders are stored in Checkin Style objects (in Content Manager) rather than in the registry, which seemed to make the import/export unnecessary, except for that sharing-with-friends usage.

A partial solution

A partial solution to sharing linked folders is to create Checkin Styles that have a group as the owner; this means that every member of that group will see that Checkin Style. The gap is that a linked folder will not be auto-created for that Checkin Style.

Some sample code

I wrote a sample application that could be the basis for a utility allowing users to create linked folders from Checkin Styles that have been created for them. This might be useful where the user has multiple Checkin Styles and does not wish to go through them one by one creating a new linked folder for each. Below is a screenshot from this sample application.

[Screenshot: linkkedfolders.PNG]
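For a sense of where such a utility might start, the sketch below uses the .NET SDK to enumerate Checkin Styles. The dataset id and workgroup server are placeholders, and creating the linked folder itself is done by the Outlook integration rather than the SDK, so this only lists the styles a generator would work from.

using System;
using HP.HPTRIM.SDK;

class CheckinStyleLister
{
    // Lists the Checkin Styles visible to the connected user, which is the raw
    // material a linked-folder generator would work from. The connection details
    // below are placeholders.
    static void Main()
    {
        using (Database db = new Database())
        {
            db.Id = "CM";                      // placeholder dataset id
            db.WorkgroupServerName = "local";  // placeholder workgroup server
            db.Connect();

            var search = new TrimMainObjectSearch(db, BaseObjectTypes.CheckinStyle);
            search.SetSearchString("all");

            foreach (CheckinStyle style in search)
            {
                Console.WriteLine("{0} - {1}", style.Uri, style);
            }
        }
    }
}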

The future

Clearly, if the sharing of linked folders is a popular activity then it makes sense to bring it back in some form. If you want to have input on how that should be implemented, lodge a support request or send me a private message on the forums.

Azure AD for Native Client

For those who wish to use Azure AD to authenticate with the native Content Manager client, here are the steps.

In Azure AD create a native application; the Redirect URI must be urn:ietf:wg:oauth:2.0:oob

[Screenshot: create native.PNG]

In App Registrations select Endpoints and take note of the following two endpoints for later:

  • OAuth 2.0 Token Endpoint, and
  • OAuth 2.0 Authorization Endpoint

[Screenshot: endpoints.PNG]

In CM Enterprise Studio select your database and from the context menu choose Authentication, then go to the ADFS / Azure tab. In this tab set:

  • Authorize Endpoint URL to OAuth 2.0 Authorization Endpoint
  • Token Endpoint URL to OAuth 2.0 Token Endpoint
  • Client Id to the Application ID (in the Azure AD application you created)
  • Relying Party Trust to the Application ID as well

If you press Test Authenticate you should be able to authenticate as one of the users in Azure AD.