Your test can't work. Bit order doesn't work that way. Unless a partial byte is
involved, bit order does nothing: the values of whole bytes don't change.
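A minimal sketch of why this is so (hypothetical helper names, not Daffodil code): bit order only determines which end of a byte a *partial* field is taken from, so a whole 8-bit field yields the same value either way.

```python
def bits_msb_first(byte_val, length):
    # Take `length` bits starting from the most significant bit of the byte.
    return byte_val >> (8 - length)

def bits_lsb_first(byte_val, length):
    # Take `length` bits starting from the least significant bit of the byte.
    return byte_val & ((1 << length) - 1)

byte = 0b00000001

# A whole byte: identical under both bit orders.
assert bits_msb_first(byte, 8) == bits_lsb_first(byte, 8) == 1

# A partial (4-bit) field: the two bit orders disagree.
print(bits_msb_first(byte, 4))  # 0 (top nibble)
print(bits_lsb_first(byte, 4))  # 1 (bottom nibble)
```

Since your schema uses only 8-bit, byte-aligned fields, the parse result is the same regardless of bitOrder.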

There is a bit order tutorial in the daffodil tutorials module.

Suggest you start there. If you view the bitOrder tdml.xml file with Firefox, it
should format it into HTML via an XSLT stylesheet so it is readable.

We need to get tutorials like this up as part of the daffodil site materials.

________________________________
From: Sloane, Brandon <bslo...@tresys.com>
Sent: Monday, April 22, 2019 4:27:04 PM
To: dev@daffodil.apache.org
Subject: Understanding bitOrder=leastSignificantFirst

I am trying to work through a bug surrounding changing the bitOrder mid-schema,
and am having trouble figuring out how Daffodil handles bitOrder in general.


I see that in InputSourceData we have code to mask off the correct bits and
shift out the gaps, but I do not see where we actually invert the bit order
within a byte.


In an attempt to test this, I made the following schema:


  <tdml:defineSchema name="s6">
    <xs:include schemaLocation="org/apache/daffodil/xsd/DFDLGeneralFormat.dfdl.xsd"/>
    <dfdl:format ref="ex:GeneralFormat" representation="binary"
      lengthUnits="bits" lengthKind='explicit'
      alignmentUnits='bits' alignment='1' binaryNumberRep='binary'
      bitOrder="leastSignificantBitFirst" byteOrder="littleEndian"
      />

    <xs:element name="changeOnSequence" dfdl:lengthKind='implicit'>
      <xs:complexType>
        <xs:sequence>
          <xs:sequence>
            <xs:element name="A" type="xs:unsignedInt" dfdl:length="8"/>
            <xs:element name="B" type="xs:unsignedInt" dfdl:length="8"/>
            <xs:element name="C" type="xs:unsignedInt" dfdl:length="8"/>
            <xs:element name="D" type="xs:unsignedInt" dfdl:length="8"/>
          </xs:sequence>
        </xs:sequence>
      </xs:complexType>
    </xs:element>

  </tdml:defineSchema>

Combined with the following testcase:


 <tdml:parserTestCase name="bitOrderChangeOnSequence"
    root="changeOnSequence" model="s6"
    description="Tests changing bitOrder when on a byte boundary.">
    <document xmlns="http://www.ibm.com/xmlns/dfdl/testData">
      <documentPart type="bits">
        0000 0001
        0000 0001
        1000 0000
        1000 0000
      </documentPart>
    </document>
    <tdml:infoset>
      <tdml:dfdlInfoset xmlns:xs="http://www.w3.org/2001/XMLSchema"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xmlns="http://example.com">
        <changeOnSequence>
          <A>128</A>
          <B>128</B>
          <C>1</C>
          <D>1</D>
        </changeOnSequence>
      </tdml:dfdlInfoset>
    </tdml:infoset>
  </tdml:parserTestCase>

However, the actual output was:

        <changeOnSequence>
          <A>1</A>
          <B>1</B>
          <C>128</C>
          <D>128</D>
        </changeOnSequence>

As if I had never set the bitOrder at all.
Given that we have made extensive use of bitOrder in the past, I find it hard
to believe that it is fundamentally unimplemented, as my current understanding
of the code would suggest. Any ideas on what I am missing?



Brandon T. Sloane

Associate, Services

bslo...@tresys.com | tresys.com
