
openEHR Conformance Guide

Issuer: openEHR Specification Program

Release: CNF development

Status: DEVELOPMENT

Revision: [latest_issue]

Date: [latest_issue_date]

Keywords: conformance, guide

© 2021 - 2024 The openEHR Foundation

The openEHR Foundation is an independent, non-profit foundation, facilitating the sharing of health records by consumers and clinicians via open specifications, clinical models and open platform implementations.

Licence

Creative Commons Attribution-NoDerivs 3.0 Unported. https://creativecommons.org/licenses/by-nd/3.0/

Support

Issues: Problem Reports
Web: specifications.openEHR.org

Acknowledgements

This specification was developed and is maintained by the openEHR Specifications Editorial Committee (SEC).

Principal Authors

  • Thomas Beale, Ars Semantica, UK; openEHR International Board

  • Pablo Pazos Gutierrez, CaboLabs, Uruguay

Trademarks

  • 'openEHR' is a trademark of openEHR Foundation

1. Preface

1.1. Purpose

This guide describes the openEHR Conformance Testing artefacts and methodology. The audience of this document includes:

  • Software development organisations developing HIT systems;

  • Customer organisations.

Useful references for reading this document include:

1.3. Status

This specification is in the DEVELOPMENT state. The development version of this document can be found at https://specifications.openehr.org/releases/CNF/development/guide.html.

Known omissions or questions are indicated in the text with a 'to be determined' paragraph, as follows:

TBD: (example To Be Determined paragraph)

1.4. Feedback

Feedback may be provided on the openEHR Conformance forum.

Issues may be raised on the specifications Problem Report tracker.

To see changes made due to previously reported issues, see the CNF component Change Request tracker.

2. Glossary of Terms and Acronyms

The following terms and acronyms are used in this document.

Term | Meaning

API | Application Programmer Interface.

CDS | Clinical Decision Support.

REST | Representational state transfer, a type of web service. REST-compliant Web services allow requesting systems to access and manipulate textual representations of Web resources using a uniform and predefined set of stateless operations.

SUT | System under test.

3. Overview

3.1. Goals

Conformance testing is used as the basis of product and system certification, and has the following goals:

  • Tendering: enable tendering authorities to state formal criteria for compliance in tenders

  • Protection for solution developers: enables bona fide vendors and other developers to prove the quality of their solutions compared to other offerings claiming conformance

  • Protection for procurement: provides a way of ensuring that purchased solutions can be contractually guaranteed to perform in certain ways

3.2. Stakeholders

These needs imply four kinds of stakeholder and associated interests:

  • Platform Specifier(s) - openEHR International and others (e.g. SNOMED International), who publish the open specifications used by solution builders, as well as conformance criteria and a testing framework for assessment purposes;

  • Procuring Organisations - organisations procuring solutions based on the platform specifications;

  • Solution Builders - including product vendors, in-house development teams, etc., whose solutions are claimed to be based on the specifications;

  • Conformance Assessment Agency/ies - independent assessors of conformance of concrete systems or products to the conformance criteria published by the platform specifier(s).

Figure 1. Stakeholders and Use Cases

The confidence of procuring organisations in the overall platform solutions market relies on the availability of formal conformance criteria, as well as guides and materials for concretely performing conformance assessment. Assessment may be performed in-house on both the procurement and vendor sides, but in a more mature market will also be performed independently by dedicated assessment organisations.

3.3. Product Scope

There are assumed to be three categories of testable artefact for the purposes of Conformance testing:

  • API-exposing (Platform) Components - components exposing a service API, including:

    • information-related platform components, such as Clinical Data Repository, Demographic repository;

    • reference data / knowledge-related components, e.g. Terminology Service, clinical model repository;

    • other services, e.g. Clinical Decision Support;

  • API-using Components (Platform clients) - components using a service API, including any application, tool or system that accesses APIs of a particular platform component;

  • Tools - standalone applications usually designed to perform a specific task, e.g. building of a certain type of model.

Conformance for these categories is assessed as follows.

  • Platform Component

    • API conformance: conformance of the implemented APIs to the published APIs, in a concrete API technology. Methodology: regression of a test client running API call-in test cases against reference results.

    • Data Validation conformance: conformance of the platform's validation of data against semantic models (archetypes etc). Methodology: regression of a test client committing variable data sets against reference validity.

  • Platform client

    • API compatibility: conformance of the client's ability to make calls into a published API, in a concrete API technology. Methodology: simulated service testing; functional testing against a reference platform implementation.

  • Tool

    • Artefact representation: conformance of artefacts created / modified by the tool to the artefact specifications. Methodology: functional round-trip testing.

This guide primarily addresses the first category (platform implementations), but can be adapted to conformance assessment of platform clients (i.e. applications).

Non-functional conformance (performance, etc) is not addressed by this guide.

3.3.1. What can be Concretely Tested?

Although conformance criteria and tests can be specified in abstract ways, only real systems and applications can be tested; the conformance that can be concretely assessed is between such deployed artefacts and the technology-specific specifications on which they are based. Abstract semantics, including call logic and data types, are only testable because they are expressed within the relevant technology-specific specifications, e.g. the REST API for the openEHR EHR service.

A specific deployed system to be tested is known as the system under test (SUT).

3.3.2. Platform Implementations

For the first product category, the SUT has a platform architecture, consisting of one or more components, each exposing an API in one or more specific API technologies. The semantics of each component and its API are described by the combination of the openEHR Platform Abstract Service Model and the relevant technology specification (e.g. REST API).

Any given API call exposed in a deployed component thus implements two types of semantics:

  • formal, transactional semantics defined by the service model;

  • API-specific semantics, which follow the rules of each API communication protocol, e.g. how to marshal arguments, handle errors, etc.

A given component implementation (say, EHR Service) might expose more than one concrete API, e.g. REST and Apache Kafka, each of which represents a particular communication protocol for accessing the component. Such protocols include text-based ones, such as SOAP/WSDL and REST, as well as various binary protocols, including Google Protocol Buffers, Apache Thrift, Avro, Kafka, ZeroC Ice, and the Advanced Message Queuing Protocol (AMQP).

The transactional semantics remain the same regardless of API protocol, while the API-specific semantics vary; the result of a call should be the same via any protocol. This is often achieved by implementing the protocol layers over a common native API in e.g. Java or C#, but need not be. The general component model is shown below.

Figure 2. Component Model
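To make this separation concrete, the following sketch (Python, purely illustrative) captures the transactional semantics as an abstract interface, with a REST adapter supplying the API-specific semantics. The interface and class names are hypothetical, not defined by the openEHR specifications; the POST {baseUrl}/v1/ehr mapping is taken from the openEHR REST API discussion below, while the PUT variant and the Location-header handling are assumptions for illustration.

    # Illustrative sketch only: IEhrService and RestEhrAdapter are
    # hypothetical names, not part of the openEHR specifications.
    from abc import ABC, abstractmethod
    from typing import Optional
    import requests

    class IEhrService(ABC):
        # Transactional semantics, per the abstract Service Model:
        # create an EHR and return its identifier.
        @abstractmethod
        def create_ehr(self, ehr_id: Optional[str] = None) -> str: ...

    class RestEhrAdapter(IEhrService):
        # API-specific semantics: how the call is marshalled onto HTTP.
        def __init__(self, base_url: str):
            self.base_url = base_url

        def create_ehr(self, ehr_id: Optional[str] = None) -> str:
            if ehr_id is None:
                # Server-assigned id: POST {baseUrl}/v1/ehr
                resp = requests.post(f"{self.base_url}/v1/ehr")
            else:
                # Client-supplied id: assumed PUT variant of the same resource
                resp = requests.put(f"{self.base_url}/v1/ehr/{ehr_id}")
            resp.raise_for_status()  # protocol-level error handling
            # Assumes the new EHR's URI is carried in the Location header
            return resp.headers["Location"].rstrip("/").rsplit("/", 1)[-1]

A Kafka or other binary-protocol adapter implementing the same interface would differ only in the marshalling; the transactional outcome of the call must be identical.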

To establish a common basis for naming and describing semantics of the openEHR Platform, the openEHR Platform Abstract Service Model is used. This defines a standard set of openEHR component names, along with definitions of the transactional semantics of each component, i.e. the 'semantic' test target referred to above. (A product’s own component names need not correspond to the openEHR component names, of course, but for the purposes of testing, the logical mapping between the two should be provided by the developer.)

Each of the concrete protocol interfaces is defined by its own specification, for example the openEHR REST API specification, and may be regarded as a product component, for which conformance testing may be conducted. No assumption is made that any given product supports any particular protocol(s), although a minimal set of REST APIs on components such as the System Log is likely to be useful for testing purposes.

The following figure illustrates a notional openEHR platform product, consisting of components and various API interfaces as described above.

Figure 3. Platform implementation with multiple API protocols

3.3.3. Platform Clients

TBD

3.4. What Conformance Claims are Possible?

Conformance of a specific (i.e. individual) deployed system or application, which may be a custom build or an installed vendor product, can be directly determined by executing appropriate test resources (e.g. executable test runners) on the deployment.

Conformance of a product (platform, application) provided by a vendor to any particular specification is inferred from testing a deployment of the product in a configuration representative of any deployment.

4. Conformance Framework

The purpose of the openEHR Conformance Specifications is to provide a framework and methodology for performing conformance testing of openEHR-based systems and products to the intended specifications. The approach described here is not openEHR-specific, and may be applied to other components deployed as part of an overall platform solution.

4.1. Specifications

The openEHR Conformance Specifications are designed around the same separation of technology-independent and technology-specific levels of expression as the primary openEHR specifications (i.e. into abstract and ITS). Here, 'technology' refers to the combination of technologies relevant to platform service implementation, i.e. APIs (REST, Apache Kafka, etc) and data representation (JSON, XSD, etc).

The primary platform conformance specifications are based on the openEHR Platform Service Model, i.e. the abstract semantic definition of API functions for various components (CDR, etc). Technology-specific artefacts are derived from these for particular combinations of implementation technology such as REST API + JSON data representation. These are ideally directly executable, via the use of modern test frameworks such as Cucumber, Robot and Spock.
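As a sketch of what such derivation can start from, a technology-independent test case can be captured as plain data, from which runners for REST + JSON or another protocol are then generated or hand-written. The structure below is an assumption for illustration, not a published openEHR artefact:

    # Hypothetical shape of a technology-independent test case;
    # all field names are assumptions for illustration.
    from dataclasses import dataclass
    from typing import Any, Dict, List

    @dataclass
    class AbstractTestCase:
        component: str             # Service Model interface, e.g. "I_EHR"
        operation: str             # abstract call, e.g. "create_ehr"
        preconditions: List[str]   # logical state required of the SUT
        arguments: Dict[str, Any]  # technology-neutral call arguments
        expected_outcome: str      # logical result, e.g. "ehr_created"

    create_ehr_case = AbstractTestCase(
        component="I_EHR",
        operation="create_ehr",
        preconditions=["no EHR exists for the test subject"],
        arguments={},
        expected_outcome="ehr_created",
    )

A Cucumber or Robot runner for the REST + JSON technology would then realise expected_outcome as a concrete status code and response body check.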

The following diagram illustrates the openEHR Conformance Specifications.

Figure 4. Conformance-related specifications

The Conformance specifications are shown in the top box, and consist of the following:

  • Guides:

    • Conformance Guide - this document;

    • RFI / RFP / RFQ Guides - guides on creating 'Request for X' documents based on conformance criteria, for contracting purposes (future).

  • Conformance specifications for a specific platform (technology-independent):

    • Platform X Conformance Test Framework - a technology-independent test framework for a specific platform;

    • Platform Conformance Test Schedule - test cases including test run logic based on platform Service Model.

  • Test execution artefacts for a specific platform (technology-dependent):

    • Executable test case runners (automated test framework scripts) for an implementation of Platform X in technology Y (e.g. REST + JSON);

    • Other artefacts for establishing an executable testing environment.

  • Result-related artefacts:

    • Test Execution Report (system xxx / REST APIs) - the results of a test run of a Test Schedule on some SUT, for a specific API + data representation;

    • Conformance Statement (system xxx / REST APIs) - statement of conformance of a product or system to specifications;

    • Conformance Certificate (system xxx) - certified statement of conformance issued by some recognised testing authority, based on test execution.

The orange (dashed) arrows indicate generation of results from test execution activities.

4.2. From Specifications to Runnable Tests

In the diagram above, the blue arrows indicate derivation from more general artefacts to more specific ones. These relationships indicate how usable (ideally, directly executable) test runner definitions can be derived from the specifications. The logic of the 'square' formed by these relationships is shown in the following diagram, for the abstract API call I_EHR.create_ehr() defined in the openEHR Platform Service Model.

Figure 5. Test items

The numeric order indicates the following logical progression:

  • (1) abstract API call I_EHR.create_ehr() - defines semantics of the call, independent of particular implementation details e.g. related to HTTP or web-sockets;

  • (2) representation of the call in (1) in an API technology (here: REST) - I_EHR.create_ehr() becomes POST {baseUrl}/v1/ehr in the HTTP protocol used for REST;

  • (3) abstract test cases for testing I_EHR.create_ehr() - again, independent of specific technology;

  • (4) concrete expression of the test cases from (3) as executable scripts, for use against an implementation of (2).

Elements (1) and (3) may be understood as the specification of semantics of a call, and how to logically test it, respectively. Elements (2) and (4) may be understood as concretisation of both the definition and the test specification into a particular technology.
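For example, element (4) for I_EHR.create_ehr() in the REST + JSON technology might look like the following pytest-style sketch. The POST {baseUrl}/v1/ehr rendering is element (2) as given above; the 201 Created status and Location header reflect common openEHR REST API conventions, the base URL is a placeholder for an SUT, and the exact expected headers should be taken from the REST specification itself.

    # Sketch of an executable test runner (element 4); assumes the
    # openEHR REST API convention of 201 Created plus a Location header.
    import requests

    BASE_URL = "https://sut.example.org/openehr/rest"  # placeholder SUT address

    def test_create_ehr_returns_201_and_location():
        # Element (2): the REST rendering of I_EHR.create_ehr()
        resp = requests.post(f"{BASE_URL}/v1/ehr",
                             headers={"Accept": "application/json"})
        # Element (3): the logical outcome is that a new EHR now exists;
        # in REST terms this is encoded as 201 plus the new resource's URI.
        assert resp.status_code == 201
        assert "/v1/ehr/" in resp.headers.get("Location", "")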

This approach involves more work than simple provision of API and test specifications in a fixed technology, but has a number of advantages:

  • the logic (i.e. semantics) of APIs is defined once, and clearly separated from the details of each implementation technology, such as the common questions of REST, including which HTTP verb to use, how to represent calls as URIs, and which return codes to use to represent logical errors and success;

  • the majority of the logic of testing an API only has to be represented once, in the technology-independent form, enabling quick derivation of specific test runners for the API in a target technology;

  • adding new API + data representation technologies is incremental work, and does not require any re-determination of the basic logic or testing logic of the API;

  • without the abstract level of expression of API definitions and tests, only the concrete specifications are available, from which it is often difficult to determine the intended logic of a call, since the semantics have already been translated into the particulars of e.g. HTTP or another protocol.

5. Conformance Assessment

5.1. Test Environment

An operational test environment requires, at a minimum, a test application with the appropriate protocol client(s) in order to exercise the SUT. A system log viewer and a data viewer may be provided as part of the product in order to facilitate human interaction with the system, but these are not obligatory. A typical conformance test environment has the following form:

Figure 6. System Under Test - REST
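Before a schedule is executed, the test application would typically verify that the SUT is reachable over the chosen protocol. The following is a minimal sketch of such a probe; the endpoint used, the base URL and the choice of an OPTIONS request are assumptions for illustration.

    # Minimal reachability probe for an SUT; endpoint and method choices
    # are assumptions, not mandated by the openEHR specifications.
    import sys
    import requests

    def sut_reachable(base_url: str, timeout: float = 5.0) -> bool:
        try:
            # Non-mutating request against the EHR resource root
            resp = requests.options(f"{base_url}/v1/ehr", timeout=timeout)
            return resp.status_code < 500
        except requests.RequestException:
            return False

    if __name__ == "__main__":
        sys.exit(0 if sut_reachable("https://sut.example.org/openehr/rest") else 1)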

5.2. Tooling

TBD

5.2.1. Test Execution Report

TBD

5.2.2. Conformance Statement

TBD

5.2.3. Conformance Certification

TBD

6. Amendment Record

Issue | Details | Raiser | Completed

CNF Release 1.0.0 (unreleased)

0.6.0 | SPECCNF-6: Initial Writing | P Pazos, T Beale | 08 Jan 2022