Also, don’t you have a typo in your pattern? You are interpolating 
`$accountId`, while the variable is declared as `account_id`. (Maybe I’m 
misunderstanding it, as I don’t know Scala very well.)
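
For what it’s worth, here is a minimal sketch of your snippet with the names aligned (values copied from your example). Run on its own, outside the test harness, plain Scala interpolation is deterministic:

```scala
// Minimal repro sketch: with the variable name matching the placeholder,
// string interpolation puts each value exactly where it appears.
val accountId = "account0"  // was `account_id` in the original snippet
val partitionDate = "202002"
val fileName = "2020-02-26_11-09-46.parquet"

val path = s"account_id=$accountId/partition_date=$partitionDate/$fileName"
println(path) // account_id=account0/partition_date=202002/2020-02-26_11-09-46.parquet
```

If the placeholder name does not resolve at all, the compiler rejects it with "not found: value accountId", so a silent swap would be surprising from the interpolator itself.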

Piotrek

> On 28 Feb 2020, at 11:45, Piotr Nowojski <pnowoj...@apache.org> wrote:
> 
> Hey,
> 
> What Java versions are you using? 
> 
> Also, could you check that you are not mixing Scala versions somewhere? There 
> are two different Flink binaries, one for Scala 2.11 and one for Scala 2.12. 
> I guess if you mix them, or if you use a Scala runtime that does not match 
> the supported version of the binaries you have downloaded, bad things could 
> happen.
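> A quick sketch of what I mean (assuming an sbt build; the artifact names and 
> versions below are just examples): using `%%` makes sbt append the Scala 
> binary suffix (`_2.11` or `_2.12`), so every Flink artifact matches your 
> `scalaVersion` instead of pulling in a mixed pair.
> 
> ```scala
> // build.sbt (illustrative, not your actual build)
> scalaVersion := "2.12.10"
> libraryDependencies ++= Seq(
>   "org.apache.flink" %% "flink-streaming-scala" % "1.10.0",
>   "org.apache.flink" %% "flink-streaming-java"  % "1.10.0" % Test classifier "tests"
> )
> ```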
> 
> Piotrek
> 
>> On 26 Feb 2020, at 12:56, David Magalhães <speeddra...@gmail.com 
>> <mailto:speeddra...@gmail.com>> wrote:
>> 
>> I'm testing a custom sink that uses TwoPhaseCommit with the test harness 
>> provided by flink-streaming-java.
>> 
>> "org.apache.flink" %% "flink-streaming-java" % flinkVersion % "test" 
>> classifier "tests"
>> 
>> Using this, in some tests where I use Scala string interpolation, the string 
>> output has a strange behaviour: the interpolated values change places.
>> 
>> Example:
>> 
>> val account_id = "account0"
>> val partitionDate = "202002"
>> val fileName = "2020-02-26_11-09-46.parquet"
>> 
>> s"account_id=$accountId/partition_date=$partitionDate/$fileName"
>> 
>> Should be: 
>> account_id=account0/partition_date=202002/2020-02-26_11-09-46.parquet
>> Actual result: 
>> account_id=account0/partition_date=2020-02-26_11-09-46.parquet/202002
>> 
>> The variable values change places after the string interpolation.
>> 
>> Concat behaviour is not affected: 
>> 
>> "account_id=".concat(accountId).concat("/partition_date=").concat(partitionDate).concat("/").concat(fileName)
>> 
>> If I remove the flink-streaming-java dependency, it works as expected. 
>> 
>> Any thoughts on why it is behaving this way?
> 
