RE: Unable to change job manager port when launching session cluster on Docker

2019-10-21 Thread Papadopoulos, Konstantinos
observations. Regards, Pritam. On Mon, 21 Oct 2019 at 13:41, Papadopoulos, Konstantinos <konstantinos.papadopou...@iriworldwide.com> wrote: Hi Aleksey, I tried using "8081:5000" as the port binding configuration with no success. I also tried different port numbers (i.e.,

Unable to change job manager port when launching session cluster on Docker

2019-10-18 Thread Papadopoulos, Konstantinos
Hello all, I am trying to launch an Apache Flink session cluster on Docker using Docker Compose and following the respective tutorial: https://ci.apache.org/projects/flink/flink-docs-stable/ops/deployment/docker.html#flink-with-docker-compose The default job manager port (i.e., 8081) is in use
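
In Docker Compose, a ports entry is ordered HOST:CONTAINER, so when the default host port 8081 is already taken, the host side is the one to change while Flink keeps listening on 8081 inside the container. A minimal sketch (service name and image tag are assumptions, loosely following the tutorial linked above):

```yaml
# docker-compose.yml sketch: expose the job manager on host port 8088
# while Flink still listens on its default 8081 inside the container.
services:
  jobmanager:
    image: flink:latest
    command: jobmanager
    ports:
      - "8088:8081"   # HOST:CONTAINER
```

A mapping like "8081:5000" would only work if Flink itself were reconfigured (e.g. via `rest.port`) to listen on port 5000 inside the container.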

Disk full problem faced due to the Flink tmp directory contents

2019-07-10 Thread Papadopoulos, Konstantinos
Hi all, We are developing several batch processing applications using the DataSet API of Apache Flink. For the time being, we are facing an issue with one of our production environments since its disk usage increased enormously. After a quick investigation, we concluded that the
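
For context, batch (DataSet) jobs spill intermediate results to Flink's temporary directories, which default to the system temp directory. A hedged flink-conf.yaml sketch redirecting them to a larger volume (the path is a placeholder):

```yaml
# flink-conf.yaml sketch: put spill/temp files on a volume with enough
# space instead of the default system temp directory.
io.tmp.dirs: /data/flink-tmp
```

Spill files are normally removed on a clean TaskManager shutdown; files left behind after crashes typically have to be cleaned up externally.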

RE: Flink SQL: Execute DELETE queries

2019-05-28 Thread Papadopoulos, Konstantinos
The case I have in mind was to have an external JDBC table sink and try to delete some or all rows of the target DB table. Is it possible using Flink SQL? From: Vasyl Bervetskyi Sent: Tuesday, May 28, 2019 5:36 PM To: Papadopoulos, Konstantinos Cc: user@flink.apache.org Subject: RE

Flink SQL: Execute DELETE queries

2019-05-28 Thread Papadopoulos, Konstantinos
Hi all, I am experimenting with the Flink Table API & SQL and I have the following question: is there any way to execute DELETE queries using Flink SQL? Thanks in advance, Konstantinos

RE: writeAsFormattedText sets only Unix/Linux line endings

2019-05-09 Thread Papadopoulos, Konstantinos
Kind reminder From: Papadopoulos, Konstantinos Sent: Monday, May 06, 2019 5:31 PM To: user@flink.apache.org Subject: writeAsFormattedText sets only Unix/Linux line endings Hi all, We are developing an application using Flink DataSet API focusing on generating a CSV file from a dataset

writeAsFormattedText sets only Unix/Linux line endings

2019-05-06 Thread Papadopoulos, Konstantinos
Hi all, We are developing an application using the Flink DataSet API focusing on generating a CSV file from a dataset of POJOs using writeAsFormattedText and a custom TextFormatter. During the testing of our application, we observed that the generated files use Unix line endings (i.e.,
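
The line separator is appended by Flink's text output format itself, not by the TextFormatter, so the formatter cannot switch the file to Windows endings. One hedged workaround is to post-process the written text; a pure-Java sketch (assumes the input really uses plain \n, as observed in the thread):

```java
public class LineEndings {
    // Convert Unix (\n) line endings to Windows (\r\n). Assumes the
    // input uses plain \n; applying this to text that already contains
    // \r\n would double the carriage returns.
    static String toWindowsEndings(String unixText) {
        return unixText.replace("\n", "\r\n");
    }

    public static void main(String[] args) {
        String csv = "id,acv\n1,2.5\n";
        System.out.println(toWindowsEndings(csv).contains("\r\n"));  // true
    }
}
```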

RE: Flink JDBC: Disable auto-commit mode

2019-04-15 Thread Papadopoulos, Konstantinos
Hi Fabian, I opened the following issue to track the improvement proposed: https://issues.apache.org/jira/browse/FLINK-12198 Best, Konstantinos From: Papadopoulos, Konstantinos Sent: Monday, 15 April 2019 12:30 PM To: Fabian Hueske Cc: Rong Rong ; user Subject: RE: Flink JDBC: Disable

RE: Flink JDBC: Disable auto-commit mode

2019-04-15 Thread Papadopoulos, Konstantinos
Hi Fabian, Glad to hear that you agree with such an improvement. Of course, I can handle it. Best, Konstantinos From: Fabian Hueske Sent: Monday, 15 April 2019 11:56 AM To: Papadopoulos, Konstantinos Cc: Rong Rong ; user Subject: Re: Flink JDBC: Disable auto-commit mode Hi Konstantinos

RE: Flink JDBC: Disable auto-commit mode

2019-04-15 Thread Papadopoulos, Konstantinos
To: Papadopoulos, Konstantinos Cc: user Subject: Re: Flink JDBC: Disable auto-commit mode Hi Konstantinos, Seems like setting auto-commit is not directly possible in the current JDBCInputFormatBuilder. However, there's a way to specify the fetch size [1] for your DB round-trip, doesn't

Flink JDBC: Disable auto-commit mode

2019-04-12 Thread Papadopoulos, Konstantinos
Hi all, We are facing an issue when trying to integrate PostgreSQL with Flink JDBC. When you establish a connection to the PostgreSQL database, it is in auto-commit mode. It means that each SQL statement is treated as a transaction and is automatically committed, but this functionality results
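
For context, the PostgreSQL JDBC driver only streams results with a cursor (honoring the fetch size) when auto-commit is disabled; with auto-commit on, each statement is its own transaction and the whole result set is materialized at once. The toy class below (not real JDBC; all names hypothetical) models just the commit semantics being discussed:

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of JDBC auto-commit semantics (not real JDBC): with
// auto-commit on, each statement is applied immediately as its own
// transaction; with it off, statements wait for an explicit commit().
class ToyConnection {
    private final List<String> committed = new ArrayList<>();
    private final List<String> pending = new ArrayList<>();
    private boolean autoCommit = true;

    void setAutoCommit(boolean on) { this.autoCommit = on; }
    void execute(String stmt) { (autoCommit ? committed : pending).add(stmt); }
    void commit() { committed.addAll(pending); pending.clear(); }
    int committedCount() { return committed.size(); }

    public static void main(String[] args) {
        ToyConnection conn = new ToyConnection();
        conn.execute("INSERT 1");                   // applied immediately
        conn.setAutoCommit(false);
        conn.execute("INSERT 2");                   // buffered, not yet visible
        System.out.println(conn.committedCount());  // 1
        conn.commit();
        System.out.println(conn.committedCount());  // 2
    }
}
```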

RE: InvalidProgramException when trying to sort a group within a dataset

2019-04-08 Thread Papadopoulos, Konstantinos
Thanks, Fabian. Problem solved after implementing the Serializable interface in all the services of the stack, or making transient the ones not needed. Best, Konstantinos From: Fabian Hueske Sent: Monday, 8 April 2019 11:37 AM To: Papadopoulos, Konstantinos Cc: Chesnay Schepler ; user

FlinkException: The assigned slot was removed

2019-04-08 Thread Papadopoulos, Konstantinos
Hi all, When I execute my Flink job using IntelliJ IDEA stand-alone mode, the job is executed successfully, but when I try to attach it to a stand-alone Flink cluster, my job fails with a Flink exception that "the assigned slot was removed". Does anyone have any idea why I am facing this

RE: InvalidProgramException when trying to sort a group within a dataset

2019-04-08 Thread Papadopoulos, Konstantinos
Hi Fabian, Thanks for your support. I updated my POJO to implement the Serializable interface with no success. I got the same NotSerializableException. Best, Konstantinos From: Fabian Hueske Sent: Saturday, 6 April 2019 2:26 AM To: Papadopoulos, Konstantinos Cc: Chesnay Schepler ; user

RE: InvalidProgramException when trying to sort a group within a dataset

2019-04-03 Thread Papadopoulos, Konstantinos
ING) .reduceGroup(new ThresholdAcvFactCountGroupReducer()); } Regards, Konstantinos From: Chesnay Schepler Sent: Wednesday, 3 April 2019 12:59 PM To: Papadopoulos, Konstantinos ; user@flink.apache.org Subject: Re: InvalidProgramException when trying to sort a group within a

InvalidProgramException when trying to sort a group within a dataset

2019-04-02 Thread Papadopoulos, Konstantinos
Hi all, I am trying to sort a group within a dataset using KeySelector as follows: in .groupBy("productId", "timePeriodId", "geographyId") .sortGroup(new KeySelector() { @Override public Double getKey(ThresholdAcvFact thresholdAcvFact) throws Exception { return
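
As the rest of the thread works out, the InvalidProgramException here stems from Java serialization: Flink ships user functions such as KeySelector to the cluster, so everything a function captures must be Serializable. A self-contained sketch of the failure mode and the fix (class names are hypothetical stand-ins for the "services" mentioned in the thread):

```java
import java.io.ByteArrayOutputStream;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class ClosureSerializationDemo {
    // A non-serializable dependency captured by a user function.
    static class PriceService { double factor = 2.0; }

    // The fix from the thread: make the dependency Serializable
    // (or mark fields that are not needed as transient).
    static class SerializableService implements Serializable { double factor = 2.0; }

    public static void main(String[] args) throws Exception {
        PriceService svc = new PriceService();
        // A serializable function that captures a non-serializable
        // object cannot actually be serialized -- the root cause behind
        // Flink's InvalidProgramException / NotSerializableException.
        Function<Double, Double> bad =
            (Function<Double, Double> & Serializable) x -> x * svc.factor;
        boolean failed = false;
        try {
            new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(bad);
        } catch (NotSerializableException e) {
            failed = true;  // captured PriceService is not Serializable
        }
        System.out.println(failed);  // true

        SerializableService okSvc = new SerializableService();
        Function<Double, Double> good =
            (Function<Double, Double> & Serializable) x -> x * okSvc.factor;
        new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(good);
        System.out.println("ok");    // serializes without error
    }
}
```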

IllegalArgumentException when trying to execute job

2019-03-28 Thread Papadopoulos, Konstantinos
Hi all, I am trying to execute a batch job that gets a list of IDs and performs a loop with a number of steps during each iteration, including reading from an MS SQL Server DB. A sample pseudo-code of our implementation is the following: List ids = ... ids.foreach( id ->

RE: ProgramInvocationException when trying to submit a job by running a jar using Monitoring REST API

2019-03-15 Thread Papadopoulos, Konstantinos
Yes, we are submitting more than one job and we choose which one is going to be executed depending on the first program argument (i.e., the ‘job’ argument). From: Chesnay Schepler Sent: Friday, 15 March 2019 12:53 PM To: Papadopoulos, Konstantinos ; user@flink.apache.org Subject: Re

RE: ProgramInvocationException when trying to submit a job by running a jar using Monitoring REST API

2019-03-15 Thread Papadopoulos, Konstantinos
$Worker.run(Unknown Source) [?:1.8.0_201] at java.lang.Thread.run(Unknown Source) [?:1.8.0_201] From: Chesnay Schepler Sent: Friday, 15 March 2019 10:56 AM To: Papadopoulos, Konstantinos ; user@flink.apache.org Subject: Re: ProgramInvocationException when trying to submit a job

RE: ProgramInvocationException when trying to submit a job by running a jar using Monitoring REST API

2019-03-15 Thread Papadopoulos, Konstantinos
To: Papadopoulos, Konstantinos ; user@flink.apache.org Subject: Re: ProgramInvocationException when trying to submit a job by running a jar using Monitoring REST API Please provide the logged exception, I cannot help you otherwise. On 14.03.2019 14:20, Papadopoulos, Konstantinos wrote: It seems

RE: ProgramInvocationException when trying to submit a job by running a jar using Monitoring REST API

2019-03-14 Thread Papadopoulos, Konstantinos
: "--job=mediaSpent,--initialScopeId=b494c35d-4c37-4338-8d23-0fc947bef690,--integratedScopeId=91769bd8-df4d-436c-b8d0-2e23ce862859,--projectId=333,--log.path=../log"} Content-Type: application/json From: Chesnay Schepler Sent: Thursday, 14 March 2019 2:24 PM To: Papadopoulos,

ProgramInvocationException when trying to submit a job by running a jar using Monitoring REST API

2019-03-14 Thread Papadopoulos, Konstantinos
Hi all, As part of our projects, I experiment with the Flink Monitoring REST API and, especially, its capabilities for uploading and running jar files. When I am trying to submit one of our jobs by running a jar previously uploaded via '/jars/upload', I am getting a 500 Internal Server Error
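
For reference, submitting a job through the Monitoring REST API is a two-step flow; a sketch of the endpoints involved (field names as in the Flink REST API documentation, argument values are placeholders):

```
POST /jars/upload          multipart/form-data, file field "jarfile"
POST /jars/<jar-id>/run    JSON body, e.g. {"programArgs": "--job=..."}
```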

RE: Add header to a file produced using the writeAsFormattedText method

2019-02-04 Thread Papadopoulos, Konstantinos
Hi Fabian, Do you know if there is any plan for the Flink core framework to support such functionality? Best, Konstantinos From: Fabian Hueske Sent: Monday, 4 February 2019 3:49 PM To: Papadopoulos, Konstantinos Cc: user@flink.apache.org Subject: Re: Add header to a file produced using

Add header to a file produced using the writeAsFormattedText method

2019-02-01 Thread Papadopoulos, Konstantinos
Hi all, I am trying to produce a file from a dataset using the writeAsFormattedText method (e.g., data.writeAsFormattedText(filename, writeMode, formatter)). Is there any easy way to add a header to the file produced? Thanks in advance, Konstantinos
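
Since the DataSet text sinks have no built-in header support, one hedged workaround is to post-process the written file and prepend the header line. A pure-Java sketch (assumes parallelism 1, i.e. a single output file; file contents and header are placeholders):

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class PrependHeader {
    public static void main(String[] args) throws Exception {
        // Simulate a single-file output as produced by writeAsFormattedText.
        Path out = Files.createTempFile("dataset", ".csv");
        Files.write(out, "1,foo\n2,bar\n".getBytes(StandardCharsets.UTF_8));

        // Post-processing workaround: rewrite the file with a header first.
        String body = new String(Files.readAllBytes(out), StandardCharsets.UTF_8);
        Files.write(out, ("id,name\n" + body).getBytes(StandardCharsets.UTF_8));

        String result = new String(Files.readAllBytes(out), StandardCharsets.UTF_8);
        System.out.println(result.startsWith("id,name\n"));  // true
    }
}
```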