For tests like this, I'd like to see something in Jasmine that is akin to
the "Expected Failure" result in JUnit / Python unittest.

It means that we still run all of the tests, but a failure on a device that
doesn't support the feature doesn't cause the whole test suite to turn red.

On the other hand, if a test which is expected to fail actually succeeds,
that is reported as "unexpected success" in the test output. We can then go
and look at what has changed -- either the test is broken, or the issue was
actually resolved.

I don't think it's available as an idiom in Jasmine, but it's just
JavaScript; it shouldn't be too hard to implement.
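
Something like this rough sketch might do it (the xfail name and the
messages are my own, and note it only catches errors thrown by the test
body -- e.g. calling a method that doesn't exist on this platform -- not
failures recorded by expect()):

    // xfail: register a spec that is expected to fail.  Throwing in the
    // spec body counts as the expected failure; finishing cleanly is
    // reported as an unexpected success.
    function xfail(description, testFn) {
      it(description + ' [expected failure]', function () {
        var threw = false;
        try {
          testFn();
        } catch (e) {
          threw = true;  // the expected failure happened; swallow it
        }
        if (!threw) {
          // Surface unexpected successes so someone goes and looks.
          throw new Error('unexpected success: ' + description);
        }
      });
    }

    // Usage sketch for camera.cleanup: run the real spec where the
    // feature exists, register an expected failure elsewhere.  (The
    // exact device.platform string varies by Cordova version.)
    var spec = function () {
      navigator.camera.cleanup(function () {}, function () {});
    };
    if (device.platform == 'iOS') {
      it('camera.cleanup runs', spec);
    } else {
      xfail('camera.cleanup runs', spec);
    }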

Ian

> On 13-06-20 9:06 AM, "Andrew Grieve" <agri...@chromium.org> wrote:

> >
> > Definitely torn on this one. On one hand, if there are features
> > implemented on some platforms that should be implemented on others,
> > then having them fail is a constant reminder that your platform needs
> > to implement the missing functionality. OTOH, things like camera
> > clean-up are meant to be platform specific, so it's nothing but an
> > annoyance if that fails on other platforms.
> >
> > So, I think my take on it is:
> >
> > 1. Have them shared and failing if the API should eventually be
> > implemented on all platforms
> > 2. Wrap tests in if (platform.name == 'ios') {} if they are meant to
> > only work on one platform.
> >
> > On Thu, Jun 20, 2013 at 8:44 AM, Lisa Seacat DeLuca
> > <ldel...@us.ibm.com> wrote:
> >
> >> One issue I ran into with respect to the mobile spec is that some
> >> tests are only applicable to certain device types.  We have a couple
> >> of options when it comes to those types of tests:
> >>
> >> 1. Not include them in the automated tests
> >> 2. Include them knowing that they *might* cause failures with certain
> >> device types (see example)
> >> 3. Add JavaScript logic to check for device type before performing
> >> the tests
> >> 4. Or we could create platform-specific automated tests that should
> >> be run in addition to the base automated tests per device, e.g.
> >> automatedAndroid, automatedIOS, etc.
> >>
> >> An example is:
> >> https://issues.apache.org/jira/browse/CB-3484
> >> camera.cleanup is only supported on iOS.
> >>
> >> I added a test case to verify that the function existed.  But it
> >> doesn't actually run camera.cleanup, so there are no failures on other
> >> platforms.  So really there shouldn't be any harm in keeping the test.
> >>
> >>
> >> What are everyone's opinions on a good approach to handle this type of
> >> situation?
> >>
> >> Lisa Seacat DeLuca
>
