Test Documents for OOXML and ODF standards

October 23, 2008

At the DII Workshop on OOXML today, one of the issues under discussion is creating a test suite of documents that implementers can test their implementations against. My recommendation is that the OOXML (and ODF) camps create something like the ACID test. The ACID test lets browser makers test specific features and functionality and lets users see where they stand. This is not about implementing hundreds or thousands of pages of a spec; it's about meeting specific requirements.

There should be a test suite of documents that you can open, edit, save, and create, along with a validator to measure how a specific implementation opens and saves those documents. The documents can be tied to versions of the specifications. What is important is that I, the user, can take a document, run it against the validator, and get a visible score myself. It needs to be transparent.
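To make the idea concrete, here is a minimal sketch of that kind of transparent scoring; the check names and logic are invented for illustration, not taken from any existing validator.

```python
# Hypothetical sketch of a transparent document-scoring tool.
# The checks here are toy placeholders, not real conformance rules.

def score_document(doc, checks):
    """Run each (name, check_fn) pair against doc; return (score, failed names)."""
    failures = [name for name, check in checks if not check(doc)]
    score = 100 * (len(checks) - len(failures)) // len(checks)
    return score, failures

# Toy checks over a dict standing in for a parsed document.
checks = [
    ("has mimetype", lambda d: "mimetype" in d),
    ("has content", lambda d: "content" in d),
    ("version declared", lambda d: "version" in d),
]

score, failed = score_document({"mimetype": "odt", "content": "<doc/>"}, checks)
print(score, failed)  # 66 ['version declared']
```

The point is less the arithmetic than the transparency: any user can see exactly which checks a document passed and which it failed.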

I know there is the ODF Validator, but it does not work very well. There should be one place where I upload a file, OOXML or ODF, and it scores the compatibility.

Now, the problem is: who leads this effort?

3 Responses to “Test Documents for OOXML and ODF standards”

  1. Rob Weir says:

    Hi John,

    We have a new technical committee at OASIS called the ODF Interoperability and Conformance TC. We started the work of creating this TC four months ago, working our way through the OASIS process of socializing the TC, recruiting members, drafting a charter, issuing a Call for Participation, etc. But the prep work is finally done and the first meeting was officially held yesterday, with attendance by IBM, Sun, Google, Novell, Red Hat, Oracle, the Belgian federal ICT agency, Fedict, as well as individual participants.

    One of the goals of this TC -- in fact the primary goal -- is to produce a comprehensive test suite for ODF.

    More info on the TC, including its charter, and how to submit feedback to the TC, can be found here: { Link }

    Since the TC is just starting its work, it is a great time for new members to join and get involved in the early discussions and decisions.

    I know you were critical last time about IBM not attending Microsoft's DII Workshop. But I hope you see that we've been working for many months now, along with several other ODF vendors, to create a lasting, open, and transparent effort to support ODF interoperability. It isn't enough to merely talk interoperability. We need to do something, and that requires having solid institutional support for vendors to work together in a standards-setting environment.

    And yes, Microsoft was invited, several times, to participate in this effort, but to no avail.



  2. John Head says:

    Rob -

    Thanks for the post. Couple things:

    1. I have provided a smackdown to MS about this TC and will blog about it.

    2. Dennis Hamilton is in the room here and is part of my discussion.

    3. The Charter of the TC on ODF Interop (http://www.oasis-open.org/committees/oic/charter.php) says this:

    "The following activities are explicitly not within the Scope of the OIC TC:

    1. Acting as a rating or certifying authority or agency for conformance of particular ODF implementations;

    2. Authoring or distributing software that tests the conformance or interoperability of ODF implementations."

    So this reads to me like you are not going to produce a test suite ... you're not authoring or distributing software. What I want is a third-party validation tool, like ACID, that tests ODF AND OOXML files. Something where any user can upload a file and get a report back on the compatibility level. This would create trust in the implementation of ODF and/or OOXML in that specific solution. And yes, it needs to be both ODF and OOXML. Like it or not, both are here to stay for a while.

    As for getting involved in the TC, I am in the process of deciding what to do and where to get involved.

  3. Rob Weir says:


    Actually, we will be producing a test suite, or at least the specific documents that define the test suite, an assessment guide that defines the expected output or rendering of each test file, etc. The prohibition you cite essentially means that we won't be creating any extra test harness code/software that could be used to automate the testing, nor will we (as an OASIS TC) be executing the test cases against specific applications and publicly reporting the results.

    A bit oversimplified, but we hypothetically might have a test document that defines a circle with a blue edge. We might call that test case odf-graphics-circle-color-01.odt and then have an assessment document that says: Step #1232, load document odf-graphics-circle-color-01.odt and verify that a circle with a blue edge is displayed, as indicated in the accompanying image odf-graphics-circle-color-01.jpg.

    I'd call that a test suite.

    What we could not do is write the C++ code needed to load that test case automatically into an application, nor could we write up a report that says how specific applications score when the test cases are run against them.

    Both prohibitions are necessary to do this kind of work within OASIS, since OASIS is a standards consortium, and its rules handle quite well the creation of specifications (including test specifications) but have no provisions, IP-wise or infrastructure-wise, for the creation of software, nor for the liabilities involved in rating and representing the conformance of third-party applications.

    Of course, once the TC has completed its work, the assets we publish are free for anyone to implement in the form of an automated test suite, as an online validator or otherwise. I could imagine several possible implementations. For example, OpenOffice might want to have a test harness that automatically loads each test document in one window, the expected results in another window, and a UI for the tester to indicate pass/fail in another window. But what about KOffice, or MS Office, or Google Docs?

    The fact is that no single implementation of a conformance test suite is good for all implementations. This is another good reason to separate the creation of conformance assessment methodology documents from the glue layer that might sit above it to automate the tests.

    Of course, we can help make it easier to automate. I'm hoping we will do things like have good XML metadata associated with each test case, maybe even use some formal test assertion notation to describe the tests. These steps can make it easier for implementers to create the test harness code.
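    As a purely illustrative sketch -- the element and attribute names below are invented, since the TC has not published any such schema -- the metadata for the earlier circle example might look something like:

```xml
<!-- Hypothetical test-case metadata; names are invented for illustration,
     not taken from any OIC TC publication. -->
<test-case id="odf-graphics-circle-color-01">
  <spec-ref standard="ODF" version="1.1"/>
  <input-document>odf-graphics-circle-color-01.odt</input-document>
  <expected-rendering>odf-graphics-circle-color-01.jpg</expected-rendering>
  <assertion>A circle with a blue edge is displayed.</assertion>
</test-case>
```

    Machine-readable metadata like this is what would let each implementer generate their own harness around the same shared test cases.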

    One last point -- in ODF (and in OOXML) conformance is defined for a document as well as a document processor. These are two different things. An online validator can really only verify the document conformance. It doesn't really help with application conformance. If the document says circle, but the application draws a square, then we have a problem, right? So we also need the large set of test cases which can be loaded into an application to verify that it does the right thing.

    In practice I think that gets to the real-world source of interoperability problems. I have yet to see someone report an interop problem with an ODF document, and then find out that the problem was that the XML itself was non-conformant. The problem is almost always that of the application interpreting the XML incorrectly, or often not supporting the particular feature.
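    The document-conformance side Rob describes -- checking the file itself rather than the application -- can be sketched like this; the checks are a tiny, illustrative subset of what a real ODF validator would test.

```python
# Minimal sketch of document-level conformance checks on an ODF package.
# These checks are illustrative; a real validator tests far more of the spec.
import zipfile
import xml.etree.ElementTree as ET

ODF_TEXT_MIMETYPE = "application/vnd.oasis.opendocument.text"

def basic_odf_checks(source):
    """Return a list of problems in the ODF package (a path or file object)."""
    problems = []
    with zipfile.ZipFile(source) as pkg:
        names = pkg.namelist()
        # An ODF package declares its type in a "mimetype" entry.
        if "mimetype" not in names:
            problems.append("missing mimetype entry")
        elif pkg.read("mimetype").decode("ascii", "replace") != ODF_TEXT_MIMETYPE:
            problems.append("unexpected mimetype")
        # content.xml holds the document body and must be well-formed XML.
        if "content.xml" not in names:
            problems.append("missing content.xml")
        else:
            try:
                ET.fromstring(pkg.read("content.xml"))
            except ET.ParseError:
                problems.append("content.xml is not well-formed XML")
    return problems
```

    As Rob notes, passing checks like these says nothing about whether an application then renders the circle as a circle; that is exactly why the test suite's human-verified assessment steps are still needed.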


