your observations.
Regards,
Pritam.
On Mon, 21 Oct 2019 at 13:41, Papadopoulos, Konstantinos
<konstantinos.papadopou...@iriworldwide.com> wrote:
Hi Aleksey,
I tried using "8081:5000" as the port binding configuration with no success. I also
tried different port numbers (i.e.,
the docker-compose approach? Or alternatively, should I pull
the image, create the container and modify the respective configuration after
connecting to it?
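If I understand the HOST:CONTAINER order correctly, the first number is the host port,
so the docker-compose.yml excerpt I should be aiming for is roughly the following
(8082 is just an example of a free host port, and the jobmanager service follows the
tutorial's stock Flink image, where the web UI/REST endpoint listens on container
port 8081):

    jobmanager:
      image: flink:latest
      command: jobmanager
      ports:
        - "8082:8081"   # host port 8082 -> container port 8081 (Flink web UI / REST)

With such a mapping the dashboard should be reachable at http://localhost:8082 without
changing anything inside the container.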
Regards,
Konstantinos
From: Aleksey Pak
Sent: Friday, October 18, 2019 10:46 PM
To: Papadopoulos, Konstantinos
Cc: user@flink.apache.org
Subject:
Hello all,
I am trying to launch an Apache Flink session cluster on Docker using Docker
Compose and following the respective tutorial:
https://ci.apache.org/projects/flink/flink-docs-stable/ops/deployment/docker.html#flink-with-docker-compose
The default job manager port (i.e., 8081) is in use on
Hi all,
We are developing several batch processing applications using the DataSet API
of the Apache Flink.
For the time being, we are facing an issue with one of our production
environments since its disk usage has increased enormously. After a quick
investigation, we concluded that the /tmp/flink-i
The case I have in mind was to have an external JDBC table sink and try to
delete some or all rows of the target DB table. Is it possible using
Flink SQL?
From: Vasyl Bervetskyi
Sent: Tuesday, May 28, 2019 5:36 PM
To: Papadopoulos, Konstantinos
Cc: user@flink.apache.org
Subject: RE
Hi all,
I am experimenting with the Flink Table API & SQL and I have the following question: is
there any way to execute DELETE queries using Flink SQL?
Thanks in advance,
Konstantinos
Kind reminder
From: Papadopoulos, Konstantinos
Sent: Monday, May 06, 2019 5:31 PM
To: user@flink.apache.org
Subject: writeAsFormattedText sets only Unix/Linux line endings
Hi all,
We are developing an application using the Flink DataSet API, focusing on generating
a CSV file from a dataset of
Hi all,
We are developing an application using the Flink DataSet API, focusing on generating
a CSV file from a dataset of POJOs using writeAsFormattedText and a custom
TextFormatter.
During the testing of our application, we observed that the generated files
contain only Unix line endings (i.e., '\n')
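For reference, a minimal sketch of what we are doing (the POJO and the formatter below
are simplified stand-ins for our actual classes); as far as I can tell, the trailing
'\n' after each record comes from the underlying TextOutputFormat rather than from
the formatter itself:

    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.io.TextOutputFormat.TextFormatter;
    import org.apache.flink.core.fs.FileSystem.WriteMode;

    public class CsvExportJob {

        // Illustrative POJO; public fields and a no-arg constructor keep it a valid Flink POJO.
        public static class Product {
            public String name;
            public double price;
            public Product() {}
            public Product(String name, double price) { this.name = name; this.price = price; }
        }

        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
            DataSet<Product> data = env.fromElements(new Product("a", 1.0), new Product("b", 2.0));

            // The formatter turns each POJO into one CSV line; the record separator
            // is appended by the output format, not by the formatter.
            data.writeAsFormattedText("/tmp/products.csv", WriteMode.OVERWRITE,
                    (TextFormatter<Product>) p -> p.name + "," + p.price);

            env.execute("csv export");
        }
    }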
Hi Fabian,
I opened the following issue to track the improvement proposed:
https://issues.apache.org/jira/browse/FLINK-12198
Best,
Konstantinos
From: Papadopoulos, Konstantinos
Sent: Monday, April 15, 2019 12:30 PM
To: Fabian Hueske
Cc: Rong Rong; user
Subject: RE: Flink JDBC: Disable
Hi Fabian,
Glad to hear that you agree with such an improvement. Of course, I can handle it.
Best,
Konstantinos
From: Fabian Hueske
Sent: Monday, April 15, 2019 11:56 AM
To: Papadopoulos, Konstantinos
Cc: Rong Rong; user
Subject: Re: Flink JDBC: Disable auto-commit mode
Hi Konstantinos,
To: Papadopoulos, Konstantinos
Cc: user
Subject: Re: Flink JDBC: Disable auto-commit mode
Hi Konstantinos,
It seems that setting auto-commit is not directly possible in the current
JDBCInputFormatBuilder.
However, there is a way to specify the fetch size [1] for your DB round-trip,
do
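Roughly, the fetch size can be set on the input format builder like the following
sketch (assuming the setFetchSize option on JDBCInputFormatBuilder; the connection
details, query, and row type are placeholders):

    import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.io.jdbc.JDBCInputFormat;
    import org.apache.flink.api.java.typeutils.RowTypeInfo;
    import org.apache.flink.types.Row;

    public class JdbcReadSketch {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            JDBCInputFormat inputFormat = JDBCInputFormat.buildJDBCInputFormat()
                    .setDrivername("org.postgresql.Driver")
                    .setDBUrl("jdbc:postgresql://localhost:5432/mydb")   // placeholder connection
                    .setUsername("user")
                    .setPassword("secret")
                    .setQuery("SELECT id, amount FROM facts")            // placeholder query
                    .setRowTypeInfo(new RowTypeInfo(
                            BasicTypeInfo.LONG_TYPE_INFO,
                            BasicTypeInfo.DOUBLE_TYPE_INFO))
                    .setFetchSize(1000)   // ask the driver to fetch results in batches of 1000 rows
                    .finish();

            DataSet<Row> rows = env.createInput(inputFormat);
            rows.first(10).print();
        }
    }

Note that, at least with the PostgreSQL driver, the fetch size only takes effect when
auto-commit is disabled, which is what FLINK-12198 (mentioned earlier in this digest)
is about.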
Hi all,
We are facing an issue when trying to integrate PostgreSQL with Flink JDBC.
When you establish a connection to the PostgreSQL database, it is in
auto-commit mode. This means that each SQL statement is treated as a transaction
and is automatically committed, but this functionality results
Thanks, Fabian.
The problem was solved after implementing the Serializable interface in all the
services of the stack, or marking as transient the ones that are not needed.
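For anyone hitting the same NotSerializableException, a minimal sketch of the transient
approach we ended up with (the service and the function below are simplified stand-ins):
non-serializable collaborators are kept in transient fields and re-created in open(),
so they are never shipped with the function.

    import org.apache.flink.api.common.functions.RichMapFunction;
    import org.apache.flink.configuration.Configuration;

    public class PriceEnrichment {

        // Illustrative non-serializable service.
        public static class PricingService {
            public double priceOf(String productId) { return 42.0; }
        }

        public static class PriceLookup extends RichMapFunction<String, Double> {
            // transient: not serialized with the job graph, re-created per task in open()
            private transient PricingService pricingService;

            @Override
            public void open(Configuration parameters) {
                pricingService = new PricingService();
            }

            @Override
            public Double map(String productId) {
                return pricingService.priceOf(productId);
            }
        }
    }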
Best,
Konstantinos
From: Fabian Hueske
Sent: Monday, April 8, 2019 11:37 AM
To: Papadopoulos, Konstantinos
Cc: Chesnay Schepler; user
Hi all,
When I execute my Flink job from IntelliJ IDEA in stand-alone mode, the job is
executed successfully, but when I try to submit it to a stand-alone Flink
cluster, my job fails with a Flink exception saying that "the assigned slot was
removed".
Does anyone have any idea why I am facing this issue
Hi Fabian,
Thanks for your support. I updated my POJO to implement the Serializable
interface with no success.
I got the same NotSerializableException.
Best,
Konstantinos
From: Fabian Hueske
Sent: Saturday, April 6, 2019 2:26 AM
To: Papadopoulos, Konstantinos
Cc: Chesnay Schepler; user
Order.ASCENDING)
    .reduceGroup(new ThresholdAcvFactCountGroupReducer());
}
Regards,
Konstantinos
From: Chesnay Schepler
Sent: Wednesday, April 3, 2019 12:59 PM
To: Papadopoulos, Konstantinos; user@flink.apache.org
Subject: Re: InvalidProgramException when trying to sort a group
Hi all,
I am trying to sort a group within a dataset using a KeySelector as follows:
in
    .groupBy("productId", "timePeriodId", "geographyId")
    .sortGroup(new KeySelector<ThresholdAcvFact, Double>() {
        @Override
        public Double getKey(ThresholdAcvFact thresholdAcvFact) throws Exception {
            return
                Optional.ofNul
Hi all,
I am trying to execute a batch job that gets a list of IDs and performs a loop
with a number of steps during each iteration, including reading from an MS SQL
Server DB.
A sample pseudo-code of our implementation is the following:
List ids = ...
ids.forEach(id -> executeIteration());
Yes, we are submitting more than one job, and we choose which one is going to be
executed depending on the first program argument (i.e., the 'job' argument).
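Roughly, our main method dispatches like the following sketch (simplified; the job
entry points are hypothetical and ParameterTool here just stands in for our actual
argument parsing):

    import org.apache.flink.api.java.utils.ParameterTool;

    public class JobDispatcher {
        public static void main(String[] args) throws Exception {
            ParameterTool params = ParameterTool.fromArgs(args);
            String job = params.getRequired("job");

            switch (job) {
                case "mediaSpent":
                    // runMediaSpentJob(params);      // hypothetical entry point
                    break;
                case "thresholdAcv":
                    // runThresholdAcvJob(params);    // hypothetical entry point
                    break;
                default:
                    throw new IllegalArgumentException("Unknown job: " + job);
            }
        }
    }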
From: Chesnay Schepler
Sent: Friday, March 15, 2019 12:53 PM
To: Papadopoulos, Konstantinos; user@flink.apache.org
Subject: Re
$Worker.run(Unknown Source) [?:1.8.0_201]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_201]
From: Chesnay Schepler
Sent: Friday, March 15, 2019 10:56 AM
To: Papadopoulos, Konstantinos; user@flink.apache.org
Subject: Re: ProgramInvocationException when trying to submit a job
9 10:20 AM
To: Papadopoulos, Konstantinos; user@flink.apache.org
Subject: Re: ProgramInvocationException when trying to submit a job by running
a jar using Monitoring REST API
Please provide the logged exception; I cannot help you otherwise.
On 14.03.2019 14:20, Papadopoulos, Konstantinos wrote:
"--job=mediaSpent,--initialScopeId=b494c35d-4c37-4338-8d23-0fc947bef690,--integratedScopeId=91769bd8-df4d-436c-b8d0-2e23ce862859,--projectId=333,--log.path=../log"}
Content-Type: application/json
From: Chesnay Schepler
Sent: Thursday, March 14, 2019 2:24 PM
To: Papadopoulos,
Hi all,
As part of our projects, I am experimenting with the Flink Monitoring REST API and,
especially, its capabilities for uploading and running jar files.
When I try to submit one of our jobs by running a jar previously uploaded
via '/jars/upload', I get a 500 Internal Server Error response
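For reference, a rough sketch of the call we are making (the host, jar id, and argument
values below are placeholders, and I am assuming the programArgs field of the
/jars/:jarid/run request body):

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class RunUploadedJar {
        public static void main(String[] args) throws Exception {
            // Placeholder jar id; the real one is returned by POST /jars/upload.
            String jarId = "00000000-0000-0000-0000-000000000000_my-job.jar";
            URL url = new URL("http://localhost:8081/jars/" + jarId + "/run");

            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setDoOutput(true);

            // Program arguments passed as a single string; placeholder values.
            String body = "{\"programArgs\": \"--job=mediaSpent --projectId=333\"}";
            try (OutputStream out = conn.getOutputStream()) {
                out.write(body.getBytes(StandardCharsets.UTF_8));
            }

            System.out.println("HTTP " + conn.getResponseCode());
        }
    }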
Hi Fabian,
Do you know if there is any plan for the Flink core framework to support such
functionality?
Best,
Konstantinos
From: Fabian Hueske
Sent: Monday, February 4, 2019 3:49 PM
To: Papadopoulos, Konstantinos
Cc: user@flink.apache.org
Subject: Re: Add header to a file produced using the
Hi all,
I am trying to produce a file from a dataset using the writeAsFormattedText
method (e.g., data.writeAsFormattedText(filename, writeMode, formatter)).
Is there any easy way to add a header to the file produced?
Thanks in advance,
Konstantinos