Loading the .Net SDK

If you relied on us loading HP.HPTRIM.SDK.dll into the GAC you may have noticed that the 9.3 installer no longer does this, which has eliminated a number of messy deployment issues for us. For more information see the Microsoft guidelines on GAC installation.

The good news

A key benefit of GAC installation was the ability to write version-agnostic SDK applications. The good news is that you can still do this: all you need to do is intercept the assembly loading pipeline and specify that HP.HPTRIM.SDK.dll gets loaded from the location specified in the registry.
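Here is a minimal sketch of that interception using AppDomain.AssemblyResolve (assuming using System, System.IO and System.Reflection). The registry key, value name and the DoTrimWork helper are placeholders, not the real key names; substitute whatever location your Content Manager installation writes its binaries path to.

static void Main(string[] args)
{
    // Hook the resolver before any method that references HP.HPTRIM.SDK types is JIT compiled.
    AppDomain.CurrentDomain.AssemblyResolve += (sender, resolveArgs) =>
    {
        if (!resolveArgs.Name.StartsWith("HP.HPTRIM.SDK", StringComparison.OrdinalIgnoreCase))
        {
            return null; // let the default loader handle everything else
        }

        // Placeholder registry location - read the install path your CM version records.
        string binPath = (string)Microsoft.Win32.Registry.GetValue(
            @"HKEY_LOCAL_MACHINE\SOFTWARE\YourCMKey", "InstallPath", null);

        if (string.IsNullOrEmpty(binPath))
        {
            return null;
        }

        return Assembly.LoadFrom(Path.Combine(binPath, "HP.HPTRIM.SDK.dll"));
    };

    // Keep SDK calls out of Main itself so the resolver is in place before they are needed.
    DoTrimWork();
}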

DataPort Custom Formatter

By default DataPort imports a tab delimited file. What is less well known is that you can write your own formatter to import an arbitrary file format. In this sample I import a very simple XML file. Even though the XML file itself is simple, it is a little more complex to import than a tab delimited file because it does not follow the essentially columnar layout of a tab delimited file.

We also have the code for our standard tab delimited formatter in the GitHub repo.

BTW, you may hear in the background the happy sounds of my 2-year-old son playing in the backyard. Enjoy!

Use Powershell to import a folder of files

There are a variety of ways to import files to Content Manager.  If you want granular control you may choose to write some code.  This Powershell script uses a CheckinStyle to import all EML files from a folder.

Add-Type -Path "c:\[CM Binary Path]\HP.HPTRIM.SDK.dll"
$database = New-Object HP.HPTRIM.SDK.Database
$database.Id = "L1"
$database.WorkgroupServerName = "local"
$database.Connect()

$checkinStyle = New-Object HP.HPTRIM.SDK.CheckinStyle($database, "test sec");

$files = [System.IO.Directory]::GetFiles("c:\junk\testimport", "*.eml", [System.IO.SearchOption]::TopDirectoryOnly)

foreach ($file in $files)
{
    Try
    {
        $inputDoc = New-Object HP.HPTRIM.SDK.InputDocument($file)
        $rec = $checkinStyle.SetupNewRecord($inputDoc)
        $rec.Save()

        [System.IO.File]::Move($file, [System.IO.Path]::Combine("c:\junk\imported\", [System.IO.Path]::GetFileName($file)))

        Write-Host "Imported: $($rec.Title) / $($rec.Uri)"
    }
    Catch
    {
        $ErrorMessage = $_.Exception.Message
        Write-Host "Error: $ErrorMessage / $file"
    }
}

$database.Dispose()

To use this script:

  • set the location of your Content Manager binaries in the Add-Type path on the first line,
  • set your database Id on the $database.Id line,
  • if your Workgroup Server is not on the local machine set the WorkgroupServerName,
  • replace the name in the CheckinStyle constructor with the name of your Checkin Style,
  • if you want to import files other than EML change the "*.eml" to something else,
  • set the source folder name, and also the destination folder in the Move method (ensure this destination folder exists), then
  • run the script from Powershell.

Stream a document in the .Net SDK

The standard methods of getting a document in the .Net SDK (e.g. Record.GetDocument()) fetch the entire document from the document store before giving you access to it.  DownloadNotifier allows you to:

  • write to a stream rather than a file on the file system,
  • fetch the document in chunks rather than waiting for the entire document, and
  • start downloading part way through a file.

I just updated the SDK docs to include some information on using it; there is also a video if you want to watch me use it.

Database connection caching

Previously, if you wanted to avoid the overhead of Database.Connect() in a web service application you had to write some sort of connection pool. In 9.2 this is no longer required as the SDK itself caches connections.  The requirements are that:

  • the previous connection must be Disposed, and
  • the subsequent connection must have the same database Id, workgroup server name and trusted user.

Sample Code

private static readonly System.Diagnostics.Stopwatch _watch = new System.Diagnostics.Stopwatch();

private static Database getDatabase(string user = null)
{
    Database database = new Database();
    database.WorkgroupServerName = "local";
    database.Id = "L1";
    if (user != null)
    {
        database.TrustedUser = user;
    }
    _watch.Reset();
    _watch.Start();
    database.Connect();
    _watch.Stop();

    Console.WriteLine(_watch.ElapsedMilliseconds);

    return database;
}


private static void connectDatabase()
{
    string trustedUser = "itu_tadmin";
    using (var db = getDatabase(trustedUser))
    {
    }

    using (var db2 = getDatabase(trustedUser))
    {
    }

    using (var db2 = getDatabase(trustedUser))
    {
    }
}



static void Main(string[] args)
{
    TrimApplication.TrimBinariesLoadPath = @"Your bin folder";
    TrimApplication.Initialize();
    TrimApplication.SetAsWebService("c:\\junk");

    connectDatabase();

}


String Search parsing and filtering

Sometimes I overlook some pretty important things.  I recently realised that I had missed an interesting behaviour of TrimMainObjectSearch string searching.

A valid search with filtering

The search below will search for all records where the Additional Field 'Alcohol Level' is greater than zero and filter on Record Type == "Infringement".

TrimMainObjectSearch recordSearch = new TrimMainObjectSearch(db, BaseObjectTypes.Record);
recordSearch.SetSearchString("AlcoholLevel>0");
recordSearch.SetFilterString("recType:Infringement");

foreach (Record record in recordSearch)
{
    Console.WriteLine(record.Title);
}

An invalid search without filtering

This search is invalid because 'Alcohol Level' is a number field and 'abc' is not a number, so no results are returned; everything is good so far.

TrimMainObjectSearch recordSearch = new TrimMainObjectSearch(db, BaseObjectTypes.Record);
recordSearch.SetSearchString("AlcoholLevel>abc");

foreach (Record record in recordSearch)
{
    Console.WriteLine(record.Title);
}

An invalid search with filtering

You might expect that the search below would behave just like the search above and return no results, given the invalid search string. This is not the case: the invalid search string is discarded and the filter is applied, so the result is all Records where Record Type == 'Infringement'.

TrimMainObjectSearch recordSearch = new TrimMainObjectSearch(db, BaseObjectTypes.Record);
recordSearch.SetSearchString("AlcoholLevel>abc");
recordSearch.SetFilterString("recType:Infringement");

foreach (Record record in recordSearch)
{
    Console.WriteLine(record.Title);
}

The solution

To avoid this problem always check the return value of SetSearchString().

TrimMainObjectSearch recordSearch = new TrimMainObjectSearch(db, BaseObjectTypes.Record);
TrimParserException parserException = recordSearch.SetSearchString("AlcoholLevel>abc");

if (parserException.Bad)
{
    Console.WriteLine(parserException.Message);
}
else
{
    recordSearch.SetFilterString("recType:Infringement");

    foreach (Record record in recordSearch)
    {
        Console.WriteLine(record.Title);
    }
}

Be careful with search grammar items in string searches

I was caught out the other day using hard-coded string searches in a ServiceAPI application.  It can be convenient (or necessary) to store canned searches (e.g. 'extension:docx OR extension:doc') at times.  The thing to remember is that search strings can be localised and your end user may not be using English.

First, the clauses

Typically the English versions of the search clauses should work irrespective of the user's language, so even if the end user has selected French, 'extension' should still be OK.  Due to my over-cautious nature I still use the internal name (e.g. 'recExtension'), which can be found in the list of search clauses in the ServiceAPI.

Then the grammar items

Search grammar items (such as 'or' and 'and') are not language neutral, so 'recExtension:doc or recExtension:docx' will not produce any results if your end user has selected Dutch as their language; what you will need is 'recExtension:doc of recExtension:docx'.

SDK

Getting the caption for a search grammar item in the SDK is simple:

string caption = new EnumItem(AllEnumerations.SearchGrammarItem, (int)SearchGrammarItem.And).Caption;
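As a minimal sketch (assuming SearchGrammarItem also has an Or member alongside And and Not), you could use that caption to build the canned search from above without hard-coding the English 'or':

// Build a language-safe canned search by looking up the localised caption
// for the 'Or' grammar item rather than hard-coding the English word.
string orWord = new EnumItem(AllEnumerations.SearchGrammarItem, (int)SearchGrammarItem.Or).Caption;
string cannedSearch = string.Format("recExtension:doc {0} recExtension:docx", orWord);

TrimMainObjectSearch search = new TrimMainObjectSearch(db, BaseObjectTypes.Record);
search.SetSearchString(cannedSearch);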

ServiceAPI - .Net

It is nearly as simple to get the grammar items using the ServiceAPI .Net client.

TrimClient client = new TrimClient("http://localhost/ServiceAPI");
client.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;

var response = client.Get<EnumItemDetailsResponse>(new EnumItemDetails() { Enums = new AllEnumerations[] { AllEnumerations.SearchGrammarItem } });
Console.WriteLine(response.EnumItems[AllEnumerations.SearchGrammarItem].Where(ei => ei.Name == "Not").First().Caption);

ServiceAPI

Or query the ServiceAPI directly:

http://localhost/ServiceAPI/EnumItem?Enums=SearchGrammarItem&format=json

In short

Never hard-code string searches using things like 'not', 'or', 'and', 'me' or anything else in the SearchGrammarItems enum.  If you do and someone switches languages then everything is broken.

Warning when setting the ACL via the SDK

Overview

In the native (or web) client, if I attempt to set an invalid ACL on a Record I will get a warning telling me so.

So, I asked myself, why was I not getting the same warning from the SDK?

A riddle

When will the appropriate warning appear in rec.ErrorMessage below...

Record rec = new Record(db, "REC_2");

TrimAccessControlList acl = rec.AccessControlList;
acl.SetAllInherited();
Console.WriteLine("A {0}", rec.ErrorMessage);
rec.AccessControlList = acl;
Console.WriteLine("B {0}", rec.ErrorMessage);
rec.Verify(true);
Console.WriteLine("C {0}", rec.ErrorMessage);
rec.Save();
Console.WriteLine("D {0}", rec.ErrorMessage);

The only point at which a warning message will be present in rec.ErrorMessage is at 'B' above.  Due to the vagaries of the SDK property handling (as described here), calling Verify() (or Save()) clears out the Error and ErrorMessage properties on the Record object.

The moral of the story

If you are interested in getting the warning from setting the ACL then check for it as soon as you set the AccessControlList property.
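A minimal sketch of that check; testing ErrorMessage for an empty string is my own convention here, the key point is simply that the check happens immediately after the assignment and before Verify() or Save().

Record rec = new Record(db, "REC_2");

TrimAccessControlList acl = rec.AccessControlList;
acl.SetAllInherited();
rec.AccessControlList = acl;

// Check now - a later Verify() or Save() will clear Error and ErrorMessage.
if (!string.IsNullOrEmpty(rec.ErrorMessage))
{
    Console.WriteLine(rec.ErrorMessage);
}

rec.Save();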

DataPort's Custom DataFormatters - Sample

Many people know that, out-of-the-box, DataPort supports importing data from tab delimited text files.  Many people don't know that users can create custom formatters to allow data from any source to be imported.  Maybe 3 people know that our tab delimited formatter is actually a custom data formatter baked into DataPort.

Below I have posted the code for a sample custom data formatter.  It is loosely based on our tab delimited formatter but will hopefully provide some good insight into how to develop one for an alternate type of data source.

References

A custom data formatter requires that the following references are made:

  1. HP.HPTRIM.DataPort.Common.dll
  2. HP.HPTRIM.SDK.dll
  3. System.Windows.Forms.dll
  4. System.Drawing.dll

The Code

using System;
using System.Collections.Generic;
using System.IO;
using System.Windows.Forms;
using HP.HPTRIM.DataPort;
using HP.HPTRIM.DataPort.Framework.DataFormatters;

namespace DataPortFormatter
{
    public class SampleFormatter : IImportDataFormatter
    {
        private string          m_fileName  = string.Empty;
        private StreamReader    m_reader    = null;
        private long            m_itemRow   = 0;

        /// <summary>
        /// The caption that will be displayed above the KwikSelect control with 
        /// which the user selects the source of the data to be imported.
        /// </summary>
        public string KwikSelectCaption
        {
            get
            {
                return "Path to the sample file";
            }
        }

        /// <summary>
        /// The type of Origin to be created in HP Content Manager for the import.
        /// Most likely one of either TextFile, WindowsFolder, XMLFile or Custom<n>.
        /// The others are used by various in house integrations.
        /// </summary>
        public HP.HPTRIM.SDK.OriginType OriginType
        {
            get
            {
                return HP.HPTRIM.SDK.OriginType.TextFile;
            }
        }

        /// <summary>
        /// This event is called when a user clicks on the data source KwikSelect's button.
        /// </summary>
        /// <param name="parentForm">The main TRIMDataPortConfig.exe form</param>
        /// <param name="searchPrefix">The value that is in the text portion of the KwikSelect</param>
        /// <param name="suggestedBrowseUILocation">The point at which we advise any dialogues should be placed.</param>
        /// <param name="additionalData">Reserved for passing additional data through.  As of December 2016 only the DBid is provided.</param>
        /// <returns>A string that this dataformatter will use to resolve the data source during an import</returns>
        public string Browse(System.Windows.Forms.Form parentForm, string searchPrefix, System.Drawing.Point suggestedBrowseUILocation, Dictionary<AdditionalDataKeys, DescriptiveData> additionalData)
        {
            string retVal = "";
            FileDialog fileDialog = new OpenFileDialog();
            fileDialog.Filter = "Text Files|*.txt|All Files|*.*";
            fileDialog.InitialDirectory = searchPrefix;

            if (fileDialog.ShowDialog(parentForm) == DialogResult.OK)
            {
                retVal = fileDialog.FileName;
            }
            return retVal;
        }

        /// <summary>
        /// DataPort has finished importing the data and no longer requires the data source.
        /// </summary>
        /// <param name="additionalData">Reserved for passing additional data through.  As of December 2016 only the DBid is provided.</param>
        public void CloseConnection(Dictionary<AdditionalDataKeys, DescriptiveData> additionalData)
        {
            if (m_reader != null)
            {
                m_reader.Close();
                m_reader.Dispose();
                m_reader = null;
            }
        }

        /// <summary>
        /// This function is called by DataPort when it needs to display the fields contained in the data source
        /// </summary>
        /// <param name="validatedSource">The already validated connection string for the data source</param>
        /// <param name="additionalData">Reserved for passing additional data through.  As of December 2016 only the DBid is provided.</param>
        /// <returns>A list of field names available in the data source</returns>
        public List<string> GetFieldNames(string validatedSource, Dictionary<AdditionalDataKeys, DescriptiveData> additionalData)
        {
            List<string> retVal = new List<string>();

            using (StreamReader reader = new StreamReader(validatedSource))
            {
                string line = reader.ReadLine();
                if (!string.IsNullOrWhiteSpace(line))
                {
                    retVal = split(line, '\t');
                }

                reader.Close();
            }

            return retVal;
        }

        /// <summary>
        /// This function is called when DataPort is running an import to identify the data source 
        /// in a manner that is useful to humans.
        /// </summary>
        /// <param name="validatedSource">The already validated connection string for the data source</param>
        /// <param name="additionalData">Reserved for passing additional data through.  As of December 2016 only the DBid is provided.</param>
        /// <returns>A human readable identifier for this formatter</returns>
        public string GetFormatterInfo(string validatedSource, Dictionary<AdditionalDataKeys, DescriptiveData> additionalData)
        {
            string retVal = "";

            using (StreamReader reader = new StreamReader(validatedSource))
            {
                reader.ReadLine();
                retVal = string.Format("{0}: {1}", validatedSource, reader.CurrentEncoding.EncodingName);
                reader.Close();
            }

            return retVal;
        }

        /// <summary>
        /// Called during an import to obtain the items contained in the data source
        /// </summary>
        /// <param name="additionalData">Reserved for passing additional data through.  As of December 2016 only the DBid is provided.</param>
        /// <returns></returns>
        public ImportItem GetNextItem(Dictionary<AdditionalDataKeys, DescriptiveData> additionalData)
        {
            if (m_reader == null)
            {
                MessageBox.Show("The reader is null.  Something terrible has occurred.", "Formatter Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
                return null;
            }

            List<string> values = new List<string>();

            if (m_reader.Peek() >= 0) // Peek() returns -1 at the end of the stream
            {
                values = split(m_reader.ReadLine(), '\t');
            }

            m_itemRow++;

            // Must return null when we're done.  A tab delim file is finished when it contains no more values...
            ImportItem retVal = null;

            if ( values.Count > 0 )
            {
                retVal = new ImportItem(m_itemRow.ToString(), values);
            }

            return retVal;
        }

        /// <summary>
        /// This function is called when DataPort has completed the import.
        /// </summary>
        /// <param name="stats">The statistics of what occurred in the import including things like the number created, updated or the errors that occurred.</param>
        /// <param name="additionalData">Reserved for passing additional data through.  As of December 2016 only the DBid is provided.</param>
        public void ImportCompleted(ProcessStatistics stats, Dictionary<AdditionalDataKeys, DescriptiveData> additionalData)
        {
            // You may want to write the stats to your own file or send an email or anything else that pleases you...
        }

        /// <summary>
        /// This function is called before DataPort uses a data source
        /// </summary>
        /// <param name="validatedSource">The already validated connection string for the data source</param>
        /// <param name="additionalData">Reserved for passing additional data through.  As of December 2016 only the DBid is provided.</param>
        public void Initialize(string validatedSource, Dictionary<AdditionalDataKeys, DescriptiveData> additionalData)
        {
            m_fileName = validatedSource;

            m_reader = new StreamReader(validatedSource);

            m_reader.ReadLine(); // read the header row so GetNextItem returns the first line of Data
        }

        /// <summary>
        /// This function is called after every item that is imported.  This is a synchronous call so anything 
        /// in this function will be processed before the next item can be imported.
        /// </summary>
        /// <param name="validatedSource">The already validated connection string for the data source</param>
        /// <param name="additionalData">Reserved for passing additional data through.  As of December 2016 only the DBid is provided.</param>
        public void ItemProcessed(ImportItem processedItem, Dictionary<AdditionalDataKeys, DescriptiveData> additionalData)
        {
            // You may use your imagination with this call...
        }

        /// <summary>
        /// This function is called by DataPort to allow the formatter to determine if the connection string
        /// the user has selected is valid for the purpose.
        /// </summary>
        /// <param name="parentForm">The main DataPort Config form</param>
        /// <param name="connectionStringToValidate">The connection string that the user has entered</param>
        /// <param name="additionalData">Reserved for passing additional data through.  As of December 2016 only the DBid is provided.</param>
        /// <returns>The connection string if valid or an empty string if not</returns>
        public string Validate(System.Windows.Forms.Form parentForm, string connectionStringToValidate, Dictionary<AdditionalDataKeys, DescriptiveData> additionalData)
        {
            string retVal = string.Empty;

            if (!string.IsNullOrWhiteSpace(connectionStringToValidate) 
                && File.Exists(connectionStringToValidate))
            {
                // For the sample we're just checking that the file exists.  It would be 
                // prudent to open the file and check that it is in the correct format...
                retVal = connectionStringToValidate;
            }

            return retVal;
        }

        #region Helper functions

        /// <summary>
        /// Splits a string on a specific character as long as it is not escaped by a backslash '\'
        /// </summary>
        /// <param name="whichString">The string to split</param>
        /// <param name="onWhichUnescapedChar">The char on which to split</param>
        /// <returns></returns>
        public static List<string> split(string whichString, Char onWhichUnescapedChar)
        {
            bool escaped = false;
            List<string> retVal = new List<string>();
            List<Char> currentEntry = new List<Char>();

            foreach (Char c in whichString)
            {
                if (!escaped
                    && c == onWhichUnescapedChar)
                {
                    retVal.Add(string.Join(string.Empty, currentEntry.ToArray()));
                    currentEntry.Clear();
                }
                else
                {
                    currentEntry.Add(c);
                    escaped = !escaped && c.Equals('\\');
                }
            }

            retVal.Add(string.Join(string.Empty, currentEntry.ToArray()));

            return retVal;
        }

        #endregion

        #region IDisposable Support

        private bool disposedValue = false; // To detect redundant calls

        protected virtual void Dispose(bool disposing)
        {
            if (!disposedValue)
            {
                if (disposing)
                {
                    if (m_reader != null)
                    {
                        this.CloseConnection(null);
                    }
                }

                // TODO: free unmanaged resources (unmanaged objects) and override a finalizer below.
                // TODO: set large fields to null.

                disposedValue = true;
            }
        }

        // TODO: override a finalizer only if Dispose(bool disposing) above has code to free unmanaged resources.
        // ~SampleFormatter() {
        //   // Do not change this code. Put cleanup code in Dispose(bool disposing) above.
        //   Dispose(false);
        // }

        // This code added to correctly implement the disposable pattern.
        public void Dispose( )
        {
            // Do not change this code. Put cleanup code in Dispose(bool disposing) above.
            Dispose(true);
            // TODO: uncomment the following line if the finalizer is overridden above.
            // GC.SuppressFinalize(this);
        }
        #endregion
    }
}

Register the DataFormatter

Once your new formatter has been built you need to register it with DataPort.  To do this you need to edit the ImportDataFormatters file in %appdata%\Hewlett-Packard\HP TRIM\DataPort\Preferences.  The formatter needs to be registered on any machine that will create DataPort projects with DataPort Config Manager or run the import with the DataPort engine.

<?xml version="1.0" encoding="utf-8"?>
<ArrayOfDataFormatterDefinition xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <DataFormatterDefinition>
    <DisplayName>Tab Delimited</DisplayName>
    <AssemblyName>HP.HPTRIM.DataPort.Common.dll</AssemblyName>
    <ClassName>HP.HPTRIM.DataPort.Framework.DataFormatters.ImportDataFormatterTab</ClassName>
  </DataFormatterDefinition>
</ArrayOfDataFormatterDefinition>

Now you need to add a new DataFormatterDefinition for your formatter inside the ArrayOfDataFormatterDefinition element:

<DataFormatterDefinition>
  <DisplayName>What name would you like to display?</DisplayName>
  <AssemblyName>Path and name of the binary containing the formatter</AssemblyName>
  <ClassName>DataPortFormatter.SampleFormatter</ClassName>
</DataFormatterDefinition>
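Putting that together, the registered file would look something like the following; the display name and assembly path are illustrative, so substitute the path to your compiled formatter DLL.

<?xml version="1.0" encoding="utf-8"?>
<ArrayOfDataFormatterDefinition xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <DataFormatterDefinition>
    <DisplayName>Tab Delimited</DisplayName>
    <AssemblyName>HP.HPTRIM.DataPort.Common.dll</AssemblyName>
    <ClassName>HP.HPTRIM.DataPort.Framework.DataFormatters.ImportDataFormatterTab</ClassName>
  </DataFormatterDefinition>
  <DataFormatterDefinition>
    <DisplayName>Sample Formatter</DisplayName>
    <AssemblyName>C:\MyFormatters\DataPortFormatter.dll</AssemblyName>
    <ClassName>DataPortFormatter.SampleFormatter</ClassName>
  </DataFormatterDefinition>
</ArrayOfDataFormatterDefinition>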


Why does LastUpdatedOn get updated when I simply view a document?

There are a couple of things that can potentially be triggered on document view that will cause LastUpdatedOn to be set. First you may be setting the 'last action date' on view.  

Also, in your Record Type, you may have chosen to log 'Document Viewed'.

To avoid setting LastUpdatedOn disable both of the above and then access the document like this:

Record rec = new Record(database, "REC_1");

if (!rec.IsDocumentInClientCache)
{
    rec.LoadDocumentIntoClientCache();
}

Console.WriteLine(rec.DocumentPathInClientCache);

BTW...

Yes, I agree, it does seem odd to update LastUpdatedOn simply because we log 'Document Viewed'.  I have put in a request to see if it is possible to change this behaviour in a future release.

Use 'local' when running on the Workgroup Server

One tip that is not as well known as it should be is the use of the reserved word 'local' to refer to a local Workgroup Server.  Any time a .Net SDK application is talking to a Workgroup Server on the same machine you should use 'local' for the Workgroup Server name, rather than using the machine name.  Why is this?  Because 'local' will use a named pipes connection to the Workgroup Server rather than going through the network layer.

Web Client / ServiceAPI Users read this!

Of course the Web Client, ServiceAPI and WebDrawer are .Net SDK applications, so if you are connecting to a Workgroup Server on the same machine use 'local'.  Note that the MSI will probably not suggest this to you, so you will have to edit the hptrim.config manually.  Make it look something like this:

  <workgroupServer port="1137"
                     workPath="C:\HP Records Manager\ServiceAPIWorkpath"
                     name="local"
                     alternateName="MY_ALTERNATE_WGS"
                     alternatePort="1137"
  />


.Net SDK Developers

If you are writing an application to run on the same machine as the Workgroup Server do something like this:

using (Database database = new Database())
{
    database.Id = "J1";
    database.WorkgroupServerName = "local";
    database.Connect();

    //do something


}

Web Service performance tips

Whether you are using the ServiceAPI or your own web service built on the .Net SDK there are a couple of things you can do to improve performance.

Object Cache

The object cache allows you to cache Records (and other objects) in memory, which removes the requirement to connect to the database the next time an object is requested.  This is, of course, most useful when certain Records are requested more than once.  You can increase the size of this cache to up to 99,999 for server applications (see below).

Workgroup Server Document Cache

When downloading a document it must first be fetched from the document store.  The streaming technique used means this should be fast enough, but if not you can choose to cache documents on the Workgroup Server itself.  If you run a Workgroup Server on the same machine that hosts your web service, the documents will then be available locally and the web service will not need to request them from the document store.  Configure document caching in Enterprise Studio (see below).

Making this happen in your web service

To use the server object cache make sure you have called SetAsWebService, like this:

TrimApplication.SetAsWebService("c:\\MY_WORK_PATH");

To fetch documents from the workgroup server cache:

  1. configure the cache in Enterprise Studio,
  2. call SetAsWebService, then
  3. get the file location using the Record property DocumentPathInWGSCache (see the sketch below).
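Here is a minimal sketch of step 3; the database Id, record number and the empty-string check on DocumentPathInWGSCache are my own assumptions, so adapt them to your environment.

TrimApplication.SetAsWebService(@"c:\MY_WORK_PATH");

using (Database database = new Database())
{
    database.Id = "L1";                      // your database Id
    database.WorkgroupServerName = "local";  // same-machine Workgroup Server
    database.Connect();

    Record record = new Record(database, "REC_1");

    // With the Workgroup Server document cache configured this path points at a
    // local copy, so the document is not pulled from the document store again.
    string cachedPath = record.DocumentPathInWGSCache;
    if (!string.IsNullOrEmpty(cachedPath))
    {
        Console.WriteLine(cachedPath);
    }
}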

Powershell to extract renditions

There may be a way to extract all PDF renditions for a selection of Records via the native client but I do not know how to do it.  When I have Powershell I don't worry about the client; I just write a script, like this one...

Add-Type -Path "c:\trunk\AnyCPU\Debug\HP.HPTRIM.SDK.dll"
$database = New-Object HP.HPTRIM.SDK.Database
$database.Id = "I1"
$database.WorkgroupServerName = "local"
$database.Connect()

$recordSearch = New-Object HP.HPTRIM.SDK.TrimMainObjectSearch($database, [HP.HPTRIM.SDK.BaseObjectTypes]::Record)
$recordSearch.SetSearchString("all")

foreach ($record in $recordSearch) {
    foreach ($rendition in $record.ChildRenditions) {
        if ($rendition.TypeOfRendition -eq [HP.HPTRIM.SDK.RenditionType]::Longevity) {
            $extracted = $rendition.Extract("c:\MyTest\" + $record.Uri + "-" +$rendition.Uri + "." +$rendition.Extension, $false)
            Write-Host "Extracted: " $extracted
        }
    }
}

$database.Dispose()

Notes

  • I name the PDF file using both the Record and Rendition URI to both associate the file with the Record and also to allow for the fact that a Record may have multiple PDF renditions.
  • I would have used the record number for the file name except that record numbers may contain slashes which would break in a file name.
  • Your SDK path will be different to mine, most probably under Program Files instead of my Debug directory.

Record.EditDocument

I spend as little time with the COM SDK as possible but apparently it had the method Record.EditDocument which would check a document out to Offline Records and open it for editing.  This no longer exists in the .Net SDK but you can still achieve the same end. Here is a simple sample.

using (Database database = new Database())
{
    database.WorkgroupServerName = "local";
    database.Id = "I1";
    database.Connect();

    Record record = new Record(database, 9000000001);

    OfflineRecord offline = new OfflineRecord(record, true);
    System.Diagnostics.Process.Start(offline.FullFileName);
}

Auto Checkin

If you want to auto check the document in once you have finished editing then call the SetAutoCheckin() method, for example:

using (Database database = new Database())
{
    database.WorkgroupServerName = "local";
    database.Id = "I1";
    database.Connect();

    Record record = new Record(database, "REC_15");
    
    OfflineRecord offline = new OfflineRecord(record, true);
    record.SetAutoCheckin(offline.FileName);

    System.Diagnostics.Process.Start(offline.FullFileName);
}

SDK UI

Overview

Some time ago there was a rationalisation of the UI components offered in the RM .Net SDK, the result of which is that some options that were available in the COM SDK are no longer to be found.  For example, the ability to view an electronic document via Record.ViewUI no longer exists.

What can you do?

As of RM 8.1 the following UI classes exist in the .Net SDK:

  • ObjectSelector,
  • PropertyEditor, and
  • DesktopHelper

Some Code

The following extracts come from this very simple sample project.

Edit Record Properties

Record record = new Record(_database, 9000000000);
if (HP.HPTRIM.SDK.PropertyEditor.EditModal(GetDesktopWindow(), record))
{
    record.Save();
}

Select Records

TrimMainObjectSearch search = new TrimMainObjectSearch(_database, BaseObjectTypes.Record);
search.SelectAll();

TrimMainObjectSearch selectedSearch = ObjectSelector.SelectMany(GetDesktopWindow(), search);

Select Database

DesktopHelper helper = new DesktopHelper();
Database newDatabase = helper.SelectDatabase(GetDesktopWindow(), false, false);

if (newDatabase != null)
{
    _database.Dispose();
    _database = newDatabase;
}