From 64895f6b51a8c65888f8309fc8fc62fdf994c594 Mon Sep 17 00:00:00 2001
From: michael
Date: Sat, 10 Mar 2007 13:34:59 +0000
Subject: [PATCH] * Added template and README from Sergei Gorelkin

git-svn-id: trunk@6769 -
---
 .gitattributes                      |   2 +
 packages/fcl-xml/tests/README       |  35 ++
 packages/fcl-xml/tests/template.xml | 485 ++++++++++++++++++++++++++++
 3 files changed, 522 insertions(+)
 create mode 100644 packages/fcl-xml/tests/README
 create mode 100644 packages/fcl-xml/tests/template.xml

diff --git a/.gitattributes b/.gitattributes
index 3a7e5fb151..cb5b5d3658 100644
--- a/.gitattributes
+++ b/.gitattributes
@@ -4289,6 +4289,8 @@ packages/fcl-xml/src/xmlstreaming.pp svneol=native#text/plain
 packages/fcl-xml/src/xmlutils.pp svneol=native#text/plain
 packages/fcl-xml/src/xmlwrite.pp svneol=native#text/plain
 packages/fcl-xml/src/xpath.pp svneol=native#text/plain
+packages/fcl-xml/tests/README svneol=native#text/plain
+packages/fcl-xml/tests/template.xml svneol=native#text/plain
 packages/fcl-xml/tests/xmlts.pp svneol=native#text/plain
 packages/fpmake.pp svneol=native#text/plain
 rtl/COPYING -text
diff --git a/packages/fcl-xml/tests/README b/packages/fcl-xml/tests/README
new file mode 100644
index 0000000000..0e5906d96d
--- /dev/null
+++ b/packages/fcl-xml/tests/README
@@ -0,0 +1,35 @@
+Test runner for the w3.org XML conformance suite
+------------------------------------------------
+
+The xmlts program is intended to run the XML conformance test suite from
+w3.org. The suite includes 2500+ tests. It may be downloaded from
+http://www.w3.org/XML/Test/xmlts20031210.zip (approx. 1.7 MB).
+After compiling xmlts.pp, run it with the following command line:
+
+xmlts [-t template.xml] [-v] <database> <report>
+
+The two required command-line parameters are the path to the test database
+file and the report filename. Optionally, you may enable validating mode
+with the -v switch and specify the report template filename with -t (by
+default, 'template.xml' is used).
+The test suite includes several test databases (all named 'xmlconf.xml').
+There is a master database located in the root directory, and several
+individual databases in different subdirectories.
+
+For example, to run all the tests included in the suite in non-validating
+mode, use:
+
+xmlts xmlconf/xmlconf.xml myreport.html
+
+The report is produced in XHTML format; use your favourite browser to view it.
+
+As of 10.03.2007, the xml package does not support namespaces yet, so you
+might wish to exclude the namespace tests. To do this, edit the
+xmlconf/xmlconf.xml file and comment out the two lines at the bottom which
+reference the 'eduni-ns10' and 'eduni-ns11' test suites.
+
+(The last lines should then look like:
+
+&eduni-xml11;
+<!--
+&eduni-ns10;
+&eduni-ns11;
+-->
+)
diff --git a/packages/fcl-xml/tests/template.xml b/packages/fcl-xml/tests/template.xml
new file mode 100644
index 0000000000..643e9d73fb
--- /dev/null
+++ b/packages/fcl-xml/tests/template.xml
@@ -0,0 +1,485 @@
+
+
+
+
+
+
+ <?run-id name?> XML <?run-id type?> Processor
+
+
+

XML Processor Conformance Report:
+

+ +

This document is the output of an +XML test harness. +It reports on the conformance of the following +XML 1.0 processor configuration:

+ +
+ + + + + + + + + + + + + + + + + + + + +
XML Processor
Parser Class
Processing Mode
General Entities
Parameter Entities
+ +

The results were as reported through the parser's API to +this particular test harness and execution environment:

+ +
+ + + + + + + + + + + + + + + + + + + + +
Test Run Date
Harness and Version
Runtime Environment
Host OS Info
Suite of Testcases
+ +

A summary of test results follows. To determine the actual test status,
+someone must examine the result of each passed negative test
+(and informative test) to make sure it failed for the right reason.
+That examination may cause the counts of failed tests to increase
+(and passed tests to decrease), changing a provisional "conforms" status
+to a "does not conform".

+ +
+ + + + + + + + + + + + + + + + + + + + +
Status
Total Passed Tests (provisional)
Passed Negative Tests (provisional)
Failed Tests (provisional)
Tests Skipped
+ +

Sections of this report are: +Explanation of Tables; +Positive Tests, cases where this processor should +report no errors; +Negative Tests, documents for which this processor +must report the known errors; and +Informative Tests, documents with errors which +processors are not required to report.

+ +

NOTE: The OASIS/NIST test suite is currently in draft state, +and can't actually be used without modifications to the configuration file, +which is used both to generate the test documentation published at the +OASIS/NIST site and to operate this test harness. In some cases, test +cases may need to be reclassified; this would affect results attributed +to parsers. Accordingly, treat these results as preliminary.

+ +

+Explanation of Tables +

+ +

Sections presenting test results are composed largely of tables, with
+explanations focussing on exactly what those tables indicate. Diagnostics
+for failed tests are presented in italics, with a cherry-colored background,
+to highlight the result. Diagnostics for successful tests should as a rule
+only exist for negative tests. Initial parenthesized comments typically
+come from the test harness.

+ +

Some such comments indicate the reporting category defined in the XML +specification. Some low-fidelity processor APIs don't expose recoverable +errors, which can make validation work awkward.

+ +
+
(fatal)
+
The diagnostic was reported as a fatal error. Such errors are
+ primarily well-formedness errors, such as the violation of XML 1.0
+ syntax rules or of well-formedness constraints.
+ In some underfeatured parser APIs, this may be the
+ only kind of error that gets reported.
+
(error)
+
The diagnostic was reported as a recoverable error. Such
+ errors are primarily used to report validation errors, which are all
+ violations of validity constraints, but some other errors are also
+ classed as nonfatal.
+
(warning)
+
The diagnostic was reported as a warning; warnings are purely + informative and may be emitted in a number of cases identified by + the XML 1.0 specification (as well as in other cases).
+
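+
+Since this text describes error reporting in terms of the SAX API, here is
+a minimal sketch in Java (SAX2 assumed; the class name is invented for
+illustration and is not the harness's own code) of an ErrorHandler that
+labels diagnostics with the three levels above:
+
+  import org.xml.sax.ErrorHandler;
+  import org.xml.sax.SAXException;
+  import org.xml.sax.SAXParseException;
+
+  // Labels each diagnostic with the level at which the parser
+  // reported it, in the style used by this report.
+  public class LevelRecordingHandler implements ErrorHandler {
+      public void warning(SAXParseException e) {
+          System.out.println("(warning) " + e.getMessage()); // purely informative
+      }
+      public void error(SAXParseException e) {
+          System.out.println("(error) " + e.getMessage());   // recoverable, e.g. validity errors
+      }
+      public void fatalError(SAXParseException e) throws SAXException {
+          System.out.println("(fatal) " + e.getMessage());   // well-formedness violations
+          throw e; // a fatal error need not be continuable
+      }
+  }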
+ +

Other such comments may indicate other categories of conformance issue. +For example, some errors relate to problematic implementation of SAX; +and in exceptional cases, the harness can be forced to report a failure +on some test.

+ +
+
(thrown classname) ... abnormal
+
The named exception was directly thrown. If the exception
+ is a SAXException (or a subclass thereof), this suggests an error in
+ the parser (violating the SAX API specification), since it should
+ normally have used the SAX ErrorHandler instead.
+
(odd classname) ... abnormal
+
After the identified exception was reported through the + ErrorHandler, an exception of the named class was thrown directly. + This suggests an error in the parser, since the parser + either failed to continue after an error (or warning) which is + required to be continuable, or else it did not pass the exception + thrown by the application back to the application.
+
(EXCEPTION - DIRECTED FAILURE) ... abnormal
+
This test case was explicitly failed by the test operator;
+ the test was not run. This may be done in the case of parsers with
+ severe bugs which completely prevent handling of the test case,
+ typically because the parser seems to "hang" by entering an
+ infinite loop.
+
+ +

In all cases, negative tests that appear to pass (diagnostics presented
+with a white background) must be individually examined in the report below.
+The diagnostic provided by the processor must correspond to the description
+of the test provided; if the processor does not report the matching error,
+the seeming "pass" is in fact an error of a type the test harness could
+not detect or report. That error is either a conformance bug or an error
+in the diagnostic being produced; or, rarely, both.

+ + +

Nonvalidating processors may skip some tests if the tests require +processing a class of external entities (general, parameter, or both) +which that processor is known not to handle. If processor handling of +entities is not known, all such tests are skipped, in order to prevent +misreporting.

+ + +

+Positive Tests +

+ +

All conformant XML 1.0 processors must accept "valid" input documents +without reporting any errors, and moreover must report the correct output +data to the application when processing those documents. Nonvalidating +processors + +(such as this one) + +must also accept "invalid" input documents without reporting any errors. +These are called "Positive Tests" because they ensure that the processor +just "does the right thing" without reporting any problems.

+ +

In the interest of brevity, the only tests listed here are those which +produce diagnostics of some kind, such as test failures. In some cases, +warnings may be reported when processing these documents, but these do not +indicate failures.

+ +

No interpretation of these results is necessary; every "error" or +"fatal" message presented here is an XML conformance failure. Maintainers +of an XML processor will generally want to fix their software so that it +conforms fully to the XML specification.

+ + + +

Valid Documents

+ +

All XML processors must accept all valid documents. This group +of tests must accordingly produce no test failures.

+ + + + + + + + + +
Section and [Rules] | Test ID | Description | Diagnostic
+ +

Output Tests

+ +

The XML specification places requirements on the data which is reported +by XML processors to applications. + +This data flows through the processor API ... or it is not available, +so the processor is in those respects nonconformant. +For example, SAX1 did not report external entities which were not +included; but SAX2 does. +These output tests verify conformance with the specification by +recording that data and comparing it with what is required for conformance +with the XML 1.0 specification.
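+
+As an illustration of the recording step (a simplified sketch only, Java
+and SAX2 assumed, class name invented; real canonical-form output also
+sorts attributes and escapes special characters), the reported data can be
+reduced to a canonical string and compared with a reference:
+
+  import org.xml.sax.Attributes;
+  import org.xml.sax.helpers.DefaultHandler;
+
+  // Records reported elements, attributes and character data in a
+  // canonical text form that can be compared with reference output.
+  public class CanonicalHandler extends DefaultHandler {
+      private final StringBuilder out = new StringBuilder();
+
+      @Override public void startElement(String uri, String local,
+                                         String qName, Attributes atts) {
+          out.append('<').append(qName);
+          for (int i = 0; i < atts.getLength(); i++)
+              out.append(' ').append(atts.getQName(i))
+                 .append("=\"").append(atts.getValue(i)).append('"');
+          out.append('>');
+      }
+      @Override public void endElement(String uri, String local, String qName) {
+          out.append("</").append(qName).append('>');
+      }
+      @Override public void characters(char[] ch, int start, int length) {
+          out.append(ch, start, length);
+      }
+      public String canonicalForm() { return out.toString(); }
+  }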

+ +

At this writing, the OASIS output tests have several categories of +known omissions (or weak output test coverage). These include:

    + +
  • No output tests address the additional requirements which validating
+processors must satisfy. That is, reporting which whitespace is ignorable,
+and reporting declarations of unparsed entities.
  • Only a few output tests have been provided which address the
+requirement to report NOTATION declarations, and some of those
+appear to be missing.
  • No tests address the requirement to report external entities
+which were not included.
+ +

Note that output tests automatically fail in cases where the processor +failed to parse the (valid) input document used to generate the +output data.

+ +

In some test harnesses, the output tests are unreliable because
+they can't directly compare the parser output against reference data.
+Such issues should be noted in the documentation for that harness.

+ +

Also (and this is not a bug), in some cases these diagnostics may seem to
+say that two equivalent results are not equal. The issue is that some
+differences, often those in reported whitespace, aren't easily visible
+in this report. HTML hides many such differences (because it normalizes
+whitespace before displaying it), and the method used to display the
+differing results may also mask some issues.

+ + + + + + + +
Test ID | Diagnostic
+ + +

Invalid Documents

+ +

As noted above, nonvalidating processors must accept all documents
+which are well formed, but invalid. The same behavior can be obtained
+from a validating processor, if the application chooses to continue
+processing after each reported validity error and to ignore those
+reports. (These tests are run as "negative" tests for validating
+processors, since in those cases it is important that the correct validity
+errors be reported and that they be reported at the correct level.)

+ + + + + + + + + +
Section and [Rules] | Test ID | Description | Diagnostic
+ + +

+Negative Tests +

+ +

All conformant XML 1.0 processors must reject documents which are not +well-formed. In addition, validating processors + +(such as this one) + +must report the validity errors for invalid documents. +These are called Negative Tests because the test is intended +to establish that errors are reported when they should be. +

+ +

Moreover, the processor must both fail for the appropriate reason (given
+by the parser diagnostic) and must report an error at the right level ("error"
+or "fatal"). If both criteria were not considered, a processor which failed
+frequently (such as by failing to parse any document at all) would appear to
+pass a large number of conformance tests. Unfortunately, the test driver can
+only tell whether the error was reported at the right level. It can't
+determine whether the processor failed for the right reason.
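+
+A sketch of that distinction (hypothetical names, not the actual driver
+code): the level check is mechanical, while the "right reason" check is
+deferred to a human reviewer:
+
+  // Severity of the strongest diagnostic reported for a test.
+  enum Severity { NONE, WARNING, ERROR, FATAL }
+
+  final class NegativeTestCheck {
+      static String classify(Severity expected, Severity reported) {
+          if (reported == Severity.NONE) return "failed: no error reported";
+          if (reported != expected)      return "failed: error at wrong level";
+          // Right level; whether it failed for the right reason
+          // still requires reading the diagnostic text.
+          return "passed provisionally: diagnostic needs human review";
+      }
+  }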

+ +

That is why having a person interpret these test results is critical. Such
+a person analyses the diagnostics, reported here, for negative tests not
+already known to be failures (for not reporting an error, or reporting one
+at the wrong level). If the diagnostic reported for such tests doesn't match
+the failure from the test description, there is an error in the diagnostic or
+in the processor's XML conformance (or sometimes in both).

+ +

For this processor, diagnostics must be +examined to get an accurate evaluation of its negative test status.

+ + +

Invalid Documents

+ +

Validating processors must correctly report "error" diagnostics +for all documents which are well formed but invalid. Such errors must +also be, "at user option", recoverable so that the validating parser +may be used in a nonvalidating mode by ignoring all validity errors. +Some parser APIs do not support recoverability. +Such issues should be noted in the documentation for the API, and +for its test harness. +
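+
+As a sketch of that "at user option" recovery (Java/JAXP assumed, class
+name invented; not part of this harness), a validating parser can be
+driven in an effectively nonvalidating mode like this:
+
+  import javax.xml.parsers.SAXParserFactory;
+  import org.xml.sax.SAXParseException;
+  import org.xml.sax.XMLReader;
+  import org.xml.sax.helpers.DefaultHandler;
+
+  public class IgnoreValidityErrors {
+      public static void main(String[] args) throws Exception {
+          SAXParserFactory factory = SAXParserFactory.newInstance();
+          factory.setValidating(true);  // request a validating parse
+          XMLReader reader = factory.newSAXParser().getXMLReader();
+          reader.setErrorHandler(new DefaultHandler() {
+              @Override public void error(SAXParseException e) {
+                  // validity errors are recoverable "at user option";
+                  // ignoring them yields nonvalidating behaviour
+              }
+          });
+          reader.parse(args[0]);  // system id (URI) of the document
+      }
+  }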

+ + + + + + + + + +
Section and [Rules] | Test ID | Description | Diagnostic
+ + +

Documents which are not Well-Formed

+ +

All XML processors must correctly reject (with a "fatal" +error) all XML documents which are not well-formed. + +(Nonvalidating processors may skip some of these tests, if +they require handling of a type of external entity which the +processor ignores. Such skipped tests are not reported.) + +

+ + + + + + + + + +
Section and [Rules] | Test ID | Description | Diagnostic
+ +

+Informative Tests +

+ +

Certain XML documents are specified to be errors, but the handling +of those documents is not fully determined by the XML 1.0 specification. +As a rule, these errors may be reported in any manner whatsoever, or +completely ignored, without consequence in terms of conformance to the +XML 1.0 specification. And some of these documents don't have errors; +documents in encodings other than UTF-8 and UTF-16 are legal, but not +all processors are required to parse them.

+ +

Such "optional" errors are listed here for informational purposes, since +processors which ignore such errors may cause document creators to create +documents which are not accepted by all conformant XML 1.0 processors. +(And of course, processors which produce incorrect diagnostics for such +cases should be avoided.)

+ + + + + + + + + +
Section and [Rules] | Test ID | Description | Diagnostic
+ + +

This report was produced by Free Software from +http://xmlconf.sourceforge.net, +and you should be able to reproduce these results yourself.

+ +