Hi Akhil,

> -----Original Message-----
> From: Akhil Goyal <akhil.go...@nxp.com>
> Sent: Thursday, April 16, 2020 3:38 PM
> To: Trahe, Fiona <fiona.tr...@intel.com>; Dybkowski, AdamX 
> <adamx.dybkow...@intel.com>;
> dev@dpdk.org
> Cc: Shally Verma <shal...@marvell.com>
> Subject: RE: [PATCH v2 2/2] test/compress: im buffer too small - add unit 
> tests
> 
> > > >
> > > > Hi Akhil,
> > > >
> > > > >
> > > > > Hi Fiona/Adam,
> > > > >
> > > > > > This patch adds a new test suite for verification of the "internal
> > > > > > QAT IM buffer too small" case handling. These unit tests are
> > > > > > specific to the QAT PMD only - that's why they are contained in
> > > > > > a separate test suite.
> > > > > >
> > > > > > Signed-off-by: Adam Dybkowski <adamx.dybkow...@intel.com>
> > > > > > ---
> > > > >
> > > > > Why do we need to have a separate test suite for QAT?
> > > > > Can't we have a single one and, based on the capability of the
> > > > > driver, determine which tests need to be skipped in case they
> > > > > are not supported?
> > > > > This would create a mess in the longer run, just like cryptodev.
> > > > >
> > > > > Please fix this; we cannot take this patch as is.
> > > >
> > > > [Fiona] Yes, I understand your concern and we considered
> > > > including them in the main suite. However, these tests are not
> > > > based on something that can be checked in capabilities. They
> > > > are tests to home in on a specific corner case based on a QAT
> > > > limitation in its intermediate buffer size. So some of the
> > > > tests validate that the recent changes we made in the PMD
> > > > correctly work around that limitation, but other tests are
> > > > negative and expected to fail because they provoke a corner
> > > > case that still exists. Other devices would probably not fail
> > > > the same tests.
> > >
> > > Does that mean that all PMDs will pass with the newly added test
> > > case, which is for a corner case in QAT? If that is the case, what
> > > is the issue in adding it to the main test suite? It will pass on
> > > all PMDs, won't it? Am I missing something?
> > >
> > > I believe we should not have PMD-specific test suites; rather it
> > > should be based on capabilities to identify the cases which should
> > > be run for that particular PMD.
> > [Fiona] Yes, several of the cases should pass on all PMDs.
> > So we could move those into the main suite.
> > But what to do about the negative tests?
> > Example: if a very large data buffer is passed to QAT to compress
> > with dynamic compression, it will get split in the PMD into many
> > smaller requests to the hardware. However, if the number of requests
> > is bigger than can fit on the qp then this will never succeed. The
> > test validates that the PMD behaves appropriately in this expected
> > error case. That same case would probably not produce an error on
> > another device. Maybe we should just leave out such negative tests,
> > but I find them useful as they validate the known behaviour.
> > The buffer size used in the test is based on the known size QAT can
> > handle and the corner case in which QAT will return an error.
> >
> > I see 4 options to handle this:
> > 1. Leave out those tests.
> > 2. Use a QAT-specific test suite only for negative cases which are
> > constructed based on specific QAT internal meta-data.
> > 3. Include the negative tests in the main suite, but only run them
> > on QAT (by checking driver type).
> > 4. Include the negative tests in the main suite, run them on all,
> > expecting a FAIL from QAT and a PASS from other devices.
> >
> > My preference is for 2.
> > But up to you.
> >
> I would say 4 is better. And why do you say negative cases will fail
> on QAT and pass on all others?
> The test cases are to test the library APIs, which are the same for
> all the PMDs, and the behavior should be the same.
[Fiona] I've explained above why QAT fails, sorry if it isn't clear.
Any device can have errors - it's not an API or capability issue, it's a
device limitation hit in a very unlikely corner case.
So is 4 ok? i.e. is it acceptable for the UT to contain conditional code
expecting a different result depending on the PMD type?
If not, we'll revert to 1 and leave out those tests.
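
To make that concrete, here is a rough sketch of the kind of conditional
I have in mind (a minimal illustration only - the "compress_qat" driver
name string, the run_huge_buffer_case() helper and the test function
name are placeholders for this mail, not the actual patch code):

#include <string.h>

#include <rte_compressdev.h>

#include "test.h"   /* TEST_SUCCESS / TEST_FAILED */

/* Hypothetical helper (body elided): compresses one oversized buffer
 * with dynamic compression on dev_id, returns 0 on success or a
 * negative value when the PMD reports an error.
 */
int run_huge_buffer_case(uint8_t dev_id);

static int
test_huge_buffer_dyn_compression(uint8_t dev_id)
{
        struct rte_compressdev_info dev_info;
        int is_qat;
        int ret;

        rte_compressdev_info_get(dev_id, &dev_info);
        /* driver name of the QAT compress PMD, assumed for illustration */
        is_qat = (strcmp(dev_info.driver_name, "compress_qat") == 0);

        ret = run_huge_buffer_case(dev_id);

        if (is_qat) {
                /* QAT is expected to hit the im buffer / qp limit here */
                return ret < 0 ? TEST_SUCCESS : TEST_FAILED;
        }

        /* other PMDs are expected to handle the same buffer fine */
        return ret == 0 ? TEST_SUCCESS : TEST_FAILED;
}

The same driver check could equally serve option 3, i.e. simply skip the
case on non-QAT devices instead of expecting a pass.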
