Re: flakey windows CI build? Or real issue?

2021-04-29 Thread Beckerle, Mike
Wow, they even coined a term for this macro stuff... "check-enabled-idiom".

It says Apache License 2.0 in the License file.

From: Steve Lawrence 
Sent: Thursday, April 29, 2021 3:51 PM
To: dev@daffodil.apache.org 
Subject: Re: flakey windows CI build? Or real issue?

Good point. It looks like scala-logging has that macro stuff, and is a
wrapper for SLF4J so I would assume could be easily used by Java
applications:

https://github.com/lightbend/scala-logging

I haven't looked at license/dependency of that, but something like that
might work.

On 4/29/21 3:31 PM, Beckerle, Mike wrote:
> I have no problem using someone else's logging infrastructure.
>
> The only sort-of-requirement is I've always hated the overhead of logging 
> because to create a good log message you end up doing a bunch of work and 
> then you pass that to the logger which says "not at the log level where that 
> is needed", and throws it all away.
>
> The reason for the logging macro is to lower the overhead so that logging like
>
> log(SomeLevel, formatStringExpr, arg1Expr, arg2Expr, ...)
>
> imagine those "...Expr" things are in fact expressions, perhaps with some 
> cost to lookup the offending things etc. They may access lazy vals that have 
> to be computed, for example.
>
> You really want this to behave as if this was what was written:
>
> if (SomeLevel >= LoggingLevel)
>   log(formatStringExpr, arg1Expr, arg2Expr, ...)
>
> So that none of the cost of computing the arg expressions is encountered 
> unless you are at a log level where they are needed.
>
> That's what the macro does. Just hoists the if test above the evaluation of 
> all those expressions.
>
> We can certainly still do that even if the underlying logger is one of the 
> conventional ones popular in the java world.
>
>
> 
> From: Steve Lawrence 
> Sent: Wednesday, April 28, 2021 8:22 AM
> To: dev@daffodil.apache.org 
> Subject: Re: flakey windows CI build? Or real issue?
>
> Maybe we should consider dropping our own logging implementation and use
> some existing logging library. Other people have put a lot more time and
> thought into logging than we have. And I don't think Daffodil has any
> special logging requirements that other loggers don't already have.
>
> Thoughts?
>
>
> On 4/27/21 7:28 PM, Beckerle, Mike wrote:
>> Logging is highly suspicious for race conditions to me.
>>
>> This whole design is completely non-thread safe, and just doesn't make 
>> sense. I think "with Logging" was just copied as a pattern from place to 
>> place.
>>
>> I just created https://issues.apache.org/jira/browse/DAFFODIL-2510 for this 
>> issue.
>> 
>> From: Beckerle, Mike 
>> Sent: Tuesday, April 27, 2021 3:28 PM
>> To: dev@daffodil.apache.org 
>> Subject: Re: flakey windows CI build? Or real issue?
>>
>> This one line:
>>
>> [error] Test org.apache.daffodil.example.TestScalaAPI.testScalaAPI2 failed: 
>> expected:<0> but was:<1>, took 0.307 sec
>>
>> For that test to fail an assertEquals, but only on one platform and not 
>> reproducibly, is very disconcerting.
>>
>> The test has exactly 3 assertEquals that compare against 0.
>>
>>   @Test
>>   def testScalaAPI2(): Unit = {
>>     val lw = new LogWriterForSAPITest()
>>
>>     Daffodil.setLogWriter(lw)
>>     Daffodil.setLoggingLevel(LogLevel.Info)
>>
>>     ...
>>
>>     val res = dp.parse(input, outputter)
>>
>>     ...
>>     assertEquals(0, lw.errors.size)
>>     assertEquals(0, lw.warnings.size)
>>     assertEquals(0, lw.others.size)
>>
>>     // reset the global logging state
>>     Daffodil.setLogWriter(new ConsoleLogWriter())
>>     Daffodil.setLoggingLevel(LogLevel.Info)
>>   }
>>
>> So this test is failing sporadically because of something being written to 
>> the logWriter (lw) that wasn't before.
>>
>> 
>> From: Interrante, John A (GE Research, US) 
>> Sent: Tuesday, April 27, 2021 2:47 PM
>> To: dev@daffodil.apache.org 
>> Subject: flakey windows CI build? Or real issue?
>>
>> Once you drill down into and expand the "Run Unit Tests" log, GitHub lets 
>> you search that log with a magnifying lens icon and input search text box 
>> above the log.  Searching for "failed:" makes it easier to find the specific 
>> failures.  I found one failure and three warnings:
>>
>> [error] Test org.apache.daffodil.example.TestScalaAPI.testScalaAPI2 failed: 
>> expected:<0> but was:<1>, took 0.307 sec
>>
>> [warn] Test assumption in test 
>> org.apache.daffodil.usertests.TestSepTests.test_sep_ssp_never_1 failed: 
>> org.junit.AssumptionViolatedException: (Implementation: daffodil) Test 
>> 'test_sep_ssp_never_1' not compatible with implementation., took 0.033 sec
>> [warn] Test assumption in test 
>> org.apache.daffodil.usertests.TestSepTests.test_sep_ssp_never_3 failed: 
>> org.junit.AssumptionViolatedException: (Implementation: daffodil) Test 
>> 'test_sep_ssp_never_3' not compatible with implementation., took 0.005 sec

Re: need 2nd reviewer on PR

2021-04-29 Thread Adams, Joshua
Sorry for the delay, I'll take a look at this tonight when I get home if it 
still needs a second review.

Josh

On Apr 29, 2021 12:42 PM, "Beckerle, Mike"  
wrote:
https://github.com/apache/daffodil/pull/539

Needs a second reviewer.

I added some "Highlights" comments to the files diffs to help you surf the 
deltas more effectively.

This fixes an issue a user was having doing streaming reads of messages.

Mike Beckerle | Principal Engineer


mbecke...@owlcyberdefense.com

P +1-781-330-0412



Re: flakey windows CI build? Or real issue?

2021-04-29 Thread Steve Lawrence
Good point. It looks like scala-logging has that macro stuff, and is a
wrapper for SLF4J so I would assume could be easily used by Java
applications:

https://github.com/lightbend/scala-logging

I haven't looked at license/dependency of that, but something like that
might work.

On 4/29/21 3:31 PM, Beckerle, Mike wrote:
> I have no problem using someone else's logging infrastructure.
> 
> The only sort-of-requirement is I've always hated the overhead of logging 
> because to create a good log message you end up doing a bunch of work and 
> then you pass that to the logger which says "not at the log level where that 
> is needed", and throws it all away.
> 
> The reason for the logging macro is to lower the overhead so that logging like
> 
> log(SomeLevel, formatStringExpr, arg1Expr, arg2Expr, ...)
> 
> imagine those "...Expr" things are in fact expressions, perhaps with some 
> cost to lookup the offending things etc. They may access lazy vals that have 
> to be computed, for example.
> 
> You really want this to behave as if this was what was written:
> 
> if (SomeLevel >= LoggingLevel)
>   log(formatStringExpr, arg1Expr, arg2Expr, ...)
> 
> So that none of the cost of computing the arg expressions is encountered 
> unless you are at a log level where they are needed.
> 
> That's what the macro does. Just hoists the if test above the evaluation of 
> all those expressions.
> 
> We can certainly still do that even if the underlying logger is one of the 
> conventional ones popular in the java world.
> 
> 
> 
> From: Steve Lawrence 
> Sent: Wednesday, April 28, 2021 8:22 AM
> To: dev@daffodil.apache.org 
> Subject: Re: flakey windows CI build? Or real issue?
> 
> Maybe we should consider dropping our own logging implementation and use
> some existing logging library. Other people have put a lot more time and
> thought into logging than we have. And I don't think Daffodil has any
> special logging requirements that other loggers don't already have.
> 
> Thoughts?
> 
> 
> On 4/27/21 7:28 PM, Beckerle, Mike wrote:
>> Logging is highly suspicious for race conditions to me.
>>
>> This whole design is completely non-thread safe, and just doesn't make 
>> sense. I think "with Logging" was just copied as a pattern from place to 
>> place.
>>
>> I just created https://issues.apache.org/jira/browse/DAFFODIL-2510 for this 
>> issue.
>> 
>> From: Beckerle, Mike 
>> Sent: Tuesday, April 27, 2021 3:28 PM
>> To: dev@daffodil.apache.org 
>> Subject: Re: flakey windows CI build? Or real issue?
>>
>> This one line:
>>
>> [error] Test org.apache.daffodil.example.TestScalaAPI.testScalaAPI2 failed: 
>> expected:<0> but was:<1>, took 0.307 sec
>>
>> For that test to fail an assertEquals, but only on one platform and not 
>> reproducibly, is very disconcerting.
>>
>> The test has exactly 3 assertEquals that compare against 0.
>>
>>   @Test
>>   def testScalaAPI2(): Unit = {
>>     val lw = new LogWriterForSAPITest()
>>
>>     Daffodil.setLogWriter(lw)
>>     Daffodil.setLoggingLevel(LogLevel.Info)
>>
>>     ...
>>
>>     val res = dp.parse(input, outputter)
>>
>>     ...
>>     assertEquals(0, lw.errors.size)
>>     assertEquals(0, lw.warnings.size)
>>     assertEquals(0, lw.others.size)
>>
>>     // reset the global logging state
>>     Daffodil.setLogWriter(new ConsoleLogWriter())
>>     Daffodil.setLoggingLevel(LogLevel.Info)
>>   }
>>
>> So this test is failing sporadically because of something being written to 
>> the logWriter (lw) that wasn't before.
>>
>> 
>> From: Interrante, John A (GE Research, US) 
>> Sent: Tuesday, April 27, 2021 2:47 PM
>> To: dev@daffodil.apache.org 
>> Subject: flakey windows CI build? Or real issue?
>>
>> Once you drill down into and expand the "Run Unit Tests" log, GitHub lets 
>> you search that log with a magnifying lens icon and input search text box 
>> above the log.  Searching for "failed:" makes it easier to find the specific 
>> failures.  I found one failure and three warnings:
>>
>> [error] Test org.apache.daffodil.example.TestScalaAPI.testScalaAPI2 failed: 
>> expected:<0> but was:<1>, took 0.307 sec
>>
>> [warn] Test assumption in test 
>> org.apache.daffodil.usertests.TestSepTests.test_sep_ssp_never_1 failed: 
>> org.junit.AssumptionViolatedException: (Implementation: daffodil) Test 
>> 'test_sep_ssp_never_1' not compatible with implementation., took 0.033 sec
>> [warn] Test assumption in test 
>> org.apache.daffodil.usertests.TestSepTests.test_sep_ssp_never_3 failed: 
>> org.junit.AssumptionViolatedException: (Implementation: daffodil) Test 
>> 'test_sep_ssp_never_3' not compatible with implementation., took 0.005 sec
>> [warn] Test assumption in test 
>> org.apache.daffodil.usertests.TestSepTests.test_sep_ssp_never_4 failed: 
>> org.junit.AssumptionViolatedException: (Implementation: daffodil) Test 
>> 'test_sep_ssp_never_4' not compatible with implementation., took 0.003 sec

Re: flakey windows CI build? Or real issue?

2021-04-29 Thread Beckerle, Mike
I have no problem using someone else's logging infrastructure.

The only sort-of-requirement is I've always hated the overhead of logging 
because to create a good log message you end up doing a bunch of work and then 
you pass that to the logger which says "not at the log level where that is 
needed", and throws it all away.

The reason for the logging macro is to lower the overhead so that logging like

log(SomeLevel, formatStringExpr, arg1Expr, arg2Expr, ...)

imagine those "...Expr" things are in fact expressions, perhaps with some cost 
to lookup the offending things etc. They may access lazy vals that have to be 
computed, for example.

You really want this to behave as if this was what was written:

if (SomeLevel >= LoggingLevel)
  log(formatStringExpr, arg1Expr, arg2Expr, ...)

So that none of the cost of computing the arg expressions is encountered unless 
you are at a log level where they are needed.

That's what the macro does. Just hoists the if test above the evaluation of all 
those expressions.

We can certainly still do that even if the underlying logger is one of the 
conventional ones popular in the java world.
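For illustration, the "hoist the level check above argument evaluation" behavior described above can be sketched in plain Scala with a by-name parameter. This is only a sketch: scala-logging achieves the same effect at compile time with macros, and the `LazyLogger`/`threshold` names and numeric levels here are hypothetical, not Daffodil or scala-logging APIs.

```scala
// Sketch (assumed names): a logger whose message argument is only
// evaluated when the level check passes, via a by-name parameter.
object LazyLogger {
  var threshold: Int = 1 // hypothetical levels: 0 = debug, 1 = info, 2 = error

  // `msg` is by-name (=> String): the caller's expression is not
  // evaluated unless the level check succeeds.
  def log(level: Int, msg: => String): Unit =
    if (level >= threshold) println(msg)
}

object LazyLoggerDemo extends App {
  var evaluations = 0
  def expensiveArg(): String = { evaluations += 1; "expensive value" }

  LazyLogger.threshold = 2 // only error-level messages enabled

  LazyLogger.log(1, s"info: ${expensiveArg()}")  // below threshold: arg never computed
  LazyLogger.log(2, s"error: ${expensiveArg()}") // at threshold: arg computed once

  assert(evaluations == 1) // the info-level argument was never evaluated
  println(s"evaluations = $evaluations")
}
```

A macro version (as in scala-logging) goes further by inlining the `if` at the call site, so even the by-name thunk allocation disappears, but the observable behavior is the same as this runtime sketch.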



From: Steve Lawrence 
Sent: Wednesday, April 28, 2021 8:22 AM
To: dev@daffodil.apache.org 
Subject: Re: flakey windows CI build? Or real issue?

Maybe we should consider dropping our own logging implementation and use
some existing logging library. Other people have put a lot more time and
thought into logging than we have. And I don't think Daffodil has any
special logging requirements that other loggers don't already have.

Thoughts?


On 4/27/21 7:28 PM, Beckerle, Mike wrote:
> Logging is highly suspicious for race conditions to me.
>
> This whole design is completely non-thread safe, and just doesn't make sense. 
> I think "with Logging" was just copied as a pattern from place to place.
>
> I just created https://issues.apache.org/jira/browse/DAFFODIL-2510 for this 
> issue.
> 
> From: Beckerle, Mike 
> Sent: Tuesday, April 27, 2021 3:28 PM
> To: dev@daffodil.apache.org 
> Subject: Re: flakey windows CI build? Or real issue?
>
> This one line:
>
> [error] Test org.apache.daffodil.example.TestScalaAPI.testScalaAPI2 failed: 
> expected:<0> but was:<1>, took 0.307 sec
>
> For that test to fail an assertEquals, but only on one platform and not 
> reproducibly, is very disconcerting.
>
> The test has exactly 3 assertEquals that compare against 0.
>
>   @Test
>   def testScalaAPI2(): Unit = {
>     val lw = new LogWriterForSAPITest()
>
>     Daffodil.setLogWriter(lw)
>     Daffodil.setLoggingLevel(LogLevel.Info)
>
>     ...
>
>     val res = dp.parse(input, outputter)
>
>     ...
>     assertEquals(0, lw.errors.size)
>     assertEquals(0, lw.warnings.size)
>     assertEquals(0, lw.others.size)
>
>     // reset the global logging state
>     Daffodil.setLogWriter(new ConsoleLogWriter())
>     Daffodil.setLoggingLevel(LogLevel.Info)
>   }
>
> So this test is failing sporadically because of something being written to 
> the logWriter (lw) that wasn't before.
>
> 
> From: Interrante, John A (GE Research, US) 
> Sent: Tuesday, April 27, 2021 2:47 PM
> To: dev@daffodil.apache.org 
> Subject: flakey windows CI build? Or real issue?
>
> Once you drill down into and expand the "Run Unit Tests" log, GitHub lets you 
> search that log with a magnifying lens icon and input search text box above 
> the log.  Searching for "failed:" makes it easier to find the specific 
> failures.  I found one failure and three warnings:
>
> [error] Test org.apache.daffodil.example.TestScalaAPI.testScalaAPI2 failed: 
> expected:<0> but was:<1>, took 0.307 sec
>
> [warn] Test assumption in test 
> org.apache.daffodil.usertests.TestSepTests.test_sep_ssp_never_1 failed: 
> org.junit.AssumptionViolatedException: (Implementation: daffodil) Test 
> 'test_sep_ssp_never_1' not compatible with implementation., took 0.033 sec
> [warn] Test assumption in test 
> org.apache.daffodil.usertests.TestSepTests.test_sep_ssp_never_3 failed: 
> org.junit.AssumptionViolatedException: (Implementation: daffodil) Test 
> 'test_sep_ssp_never_3' not compatible with implementation., took 0.005 sec
> [warn] Test assumption in test 
> org.apache.daffodil.usertests.TestSepTests.test_sep_ssp_never_4 failed: 
> org.junit.AssumptionViolatedException: (Implementation: daffodil) Test 
> 'test_sep_ssp_never_4' not compatible with implementation., took 0.003 sec
>
> Your previous run failed in the Windows Java 11 build's Compile step with a 
> http 504 error when sbt was trying to fetch artifacts:
>
> [error] 
> lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: 
> Error fetching artifacts:
> [error] 
> https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/com.typesafe.sbt/sbt-native-packager/scala_2.12/sbt_1.0/1.8.1/jars/sbt-native-packager.jar:
>  download error: Caught java.io.IOException: Server 

need 2nd reviewer on PR

2021-04-29 Thread Beckerle, Mike
https://github.com/apache/daffodil/pull/539

Needs a second reviewer.

I added some "Highlights" comments to the files diffs to help you surf the 
deltas more effectively.

This fixes an issue a user was having doing streaming reads of messages.

Mike Beckerle | Principal Engineer


mbecke...@owlcyberdefense.com

P +1-781-330-0412