Qualcomm Innovation Center, Inc. is a member of Code Aurora Forum, a
Linux Foundation Collaborative Project

> -----Original Message-----
> From: lldb-dev [mailto:lldb-dev-boun...@lists.llvm.org] On Behalf Of Vedant
> Kumar via lldb-dev
> Sent: Tuesday, March 13, 2018 7:48 PM
> To: Davide Italiano
> Cc: LLDB
> Subject: Re: [lldb-dev] increase timeout for tests
+1 on deleting the lldb-mi tests and increasing the timeout.
On Wed, 14 Mar 2018 at 02:27, Jim Ingham wrote:

It is unfortunate that we have to set really long test timeouts because we are
timing the total Test class run, not the individual tests. It is really
convenient to be able to group similar tests in one class, so they can reuse
setup and common check methods etc. But if we're going to have to …
As a first step, I think there's consensus on increasing the test timeout to
~3x the length of the slowest test we know of. That test appears to be
TestDataFormatterObjC, which takes 388 seconds on Davide's machine. So I
propose 20 minutes as the timeout value.
Separately, regarding x-failed pe…
On Tue, Mar 13, 2018 at 11:26 AM, Jim Ingham wrote:

It sounds like we're timing out based on the whole test class, not the
individual tests? If you're worried about test failures not hanging up the
test suite then you really want to do the latter.

These are all tests that contain 5 or more independent tests. That's probably
why they are taking so long.
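Timing individual tests rather than the whole class is easy to sketch in Python. This is not how the lldb test runner enforces its timeout (that happens at the process level); it is just an illustrative per-method guard using SIGALRM (Unix only), with hypothetical names throughout:

```python
import functools
import signal

class TestTimeoutError(Exception):
    pass

def per_test_timeout(seconds):
    """Bound one test method, so a single hang fails fast instead of
    eating the whole class's time budget."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            def on_alarm(signum, frame):
                raise TestTimeoutError(f"{fn.__name__} exceeded {seconds}s")
            old_handler = signal.signal(signal.SIGALRM, on_alarm)
            signal.alarm(seconds)          # schedule SIGALRM
            try:
                return fn(*args, **kwargs)
            finally:
                signal.alarm(0)            # cancel any pending alarm
                signal.signal(signal.SIGALRM, old_handler)
        return wrapper
    return decorate

@per_test_timeout(2)
def quick_test():
    return "ok"

print(quick_test())  # ok
```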
A few examples:

360 out of 617 test suites processed - TestObjCMethods.py
TestObjCMethods.py: 363.726592
381 out of 617 test suites processed - TestHiddenIvars.py
TestHiddenIvars.py: 363.887766
387 out of 617 …
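Pulling those per-suite timings out of the runner's output and ranking them takes one regex; a sketch over the two lines shown above (the log format is assumed from the excerpt, not taken from the runner's source):

```python
import re

# Sample runner output, copied from the excerpt above.
log = """\
360 out of 617 test suites processed - TestObjCMethods.py
TestObjCMethods.py: 363.726592
381 out of 617 test suites processed - TestHiddenIvars.py
TestHiddenIvars.py: 363.887766
"""

# Lines of the form "<TestFile>.py: <seconds>" carry the timings.
timings = {
    name: float(secs)
    for name, secs in re.findall(r"^(\S+\.py): ([\d.]+)$", log, re.MULTILINE)
}

# Rank slowest-first to find candidates for splitting, or a timeout baseline.
for name, secs in sorted(timings.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {secs:.1f}s")
```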
I'll re-run the test and send you a list.
Thank you!
--
Davide
On Tue, Mar 13, 2018 at 9:02 AM, Pavel Labath wrote:

Aha, in that case, it definitely sounds like increasing the timeout is in
order. I would still go for something less than 30 minutes, but I don't
care about that much. I may just export LLDB_TEST_TIMEOUT locally to lower
it if tests taking long to time out start bugging me.
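For reference, lowering the timeout locally is just an environment variable. The accepted value syntax (plain seconds vs. a duration suffix such as `5m`) depended on the test runner at the time, so treat the value below as a hypothetical example and check your checkout's runner:

```shell
# Hypothetical value -- confirm the accepted format in your lldb checkout.
export LLDB_TEST_TIMEOUT=5m
echo "LLDB_TEST_TIMEOUT=${LLDB_TEST_TIMEOUT}"
```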
BTW, do you know which …
On Tue, Mar 13, 2018 at 3:30 AM, Pavel Labath wrote:

I think we should get some data before pulling numbers out of our sleeves.
If we can get some numbers on the slowest machine that we have around, then
we should be able to specify the timeout as some multiple of the slowest
test. For example, for me the slowest test takes around 110 seconds. The
ti…
On Mon, Mar 12, 2018 at 7:01 PM, Jim Ingham wrote:

The problem with having no timeouts is that you have to then be fairly careful
how you write tests. You can't do:
while (1) {
print("Set a breakpoint here and hit it a few times then stop the test");
}
because if the breakpoint setting fails the test can run forever. And we wait
forever to …
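If timeouts ever did go away, the test-side fix for the pattern above would be to bound the wait yourself. A minimal sketch, where the `poll` callback (hypothetical, not an lldb API) stands in for whatever checks that the breakpoint was actually hit:

```python
def wait_for_hits(poll, max_attempts=100):
    """Poll until the breakpoint reports a hit, but give up after
    max_attempts so a failed breakpoint set becomes a fast failure
    instead of an infinite loop."""
    for attempt in range(1, max_attempts + 1):
        if poll():
            return attempt
    raise RuntimeError("breakpoint never hit; failing fast instead of hanging")

# Simulate a breakpoint that resolves on the third check.
hits = iter([False, False, True])
print(wait_for_hits(lambda: next(hits)))  # 3
```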
On Fri, Mar 9, 2018 at 3:45 AM, Pavel Labath wrote:

On Thu, 8 Mar 2018 at 18:40, Davide Italiano wrote:

On Thu, Mar 8, 2018 at 10:29 AM, Greg Clayton wrote:

It would be great to look into these and see what is taking so long. Some tests
are doing way too much work and should be split up. But it would be great to
see if we have any tests that are not good tests and are just taking too much
time for no reason (like the watchpoint tests were that Pavel …
Hi,
I've noticed some of the tests we have trigger timeouts when running in debug.
TIMEOUT: test_NSError_p_dwarf (lang/objc/foundation/TestObjCMethods2.py)
TIMEOUT: test_expression_lookups_objc_dwarf
(lang/objc/foundation/TestObjCMethods.py)
TIMEOUT: test_expressions_in_literals_dsym
(lang/objc/ob…)