Re: Specifying Schema dynamically

2017-02-13 Thread Luqman Ghani
Hi,

My case is very similar to the one described in this section of the Spark
documentation:
http://spark.apache.org/docs/latest/sql-programming-guide.html#programmatically-specifying-the-schema

I hope this clarifies it.

Thanks,
Luqman

On Mon, Feb 13, 2017 at 12:04 PM, Tzu-Li (Gordon) Tai <tzuli...@apache.org>
wrote:

> Hi Luqman,
>
> From your description, it seems that you want to infer the type (case
> class, tuple, etc.) of a stream dynamically at runtime.
> AFAIK, I don’t think this is supported in Flink. You’re required to have
> defined types for your DataStreams.
>
> Could you also provide an example code of what the functionality you have
> in mind looks like?
> That would help clarify if I have misunderstood and there’s actually a way
> to do it.
>
> - Gordon
>
> On February 12, 2017 at 4:30:56 PM, Luqman Ghani (lgsa...@gmail.com)
> wrote:
>
> Like if a file has a header: id, first_name, last_name, last_login
> and we infer schema as: Int, String, String, Long
>
>


Specifying Schema dynamically

2017-02-12 Thread Luqman Ghani
Hi,

I hope everyone is doing well.

I have a use case where we infer the schema from file headers and other
information. In Flink, we can specify the schema of a stream with case
classes and tuples. With tuples we cannot give names to fields, and with
case classes we would have to generate them on the fly. Is there any way
of specifying the schema to Flink as a Map[String, Any], so that it can
infer the schema from this map?

Like if a file has a header: id, first_name, last_name, last_login
and we infer schema as: Int, String, String, Long

Can we specify it as Map[String, Any]("id" -> Int, "first_name" -> String,
"last_name" -> String, "last_login" -> Long)?

We want to use keyBy with field names instead of their indices. I hope
there is a way :)
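For illustration, here is a minimal sketch (plain Scala, no Flink dependency) of carrying rows as a Map[String, Any] and extracting a key by field name. Flink has no built-in support for Map-based schemas, so the Row and keyOf names below are purely illustrative, not a Flink API; in Flink the equivalent of keyBy("id") on such a row would be a KeySelector-style function like keyOf:

```scala
// Hypothetical sketch: a generic row backed by a Map, plus key extraction
// by field name (what keyBy with a field name would need under the hood).
case class Row(fields: Map[String, Any]) {
  def apply(name: String): Any = fields(name)
}

// For case classes Flink's keyBy accepts field names, for tuples it takes
// indices; for a Map-based row one would pass a function like this instead:
def keyOf(name: String): Row => Any = row => row(name)

val row = Row(Map(
  "id"         -> 42,
  "first_name" -> "Ada",
  "last_name"  -> "Lovelace",
  "last_login" -> 1486900000L))

println(keyOf("id")(row))
```

Note that this sidesteps Flink's type extraction entirely: without a concrete case class or tuple type, Flink cannot derive efficient serializers for the fields, which is part of why defined types are required.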

I was looking into dynamically creating case classes in Scala using
scala-reflect, but I'm facing problems getting that class and forwarding it
to the Flink program.

Thanks,
Luqman


Re: Submitting an app via API

2017-02-06 Thread Luqman Ghani
Hi,

Thanks a lot.

On Mon, Feb 6, 2017 at 9:06 PM, Ufuk Celebi <u...@apache.org> wrote:

> You can use RemoteStreamEnvironment or the REST APIs
> (https://ci.apache.org/projects/flink/flink-docs-release-1.3/monitoring/rest_api.html#submitting-programs).
>
> On Sun, Feb 5, 2017 at 4:43 PM, Luqman Ghani <lgsa...@gmail.com> wrote:
> > Hi,
> >
> > On the quickstart page of the Flink docs, it suggests starting a Flink
> > app with the "bin/flink" command on the command line. Is there any other
> > way of submitting to a Flink cluster, that is, through an API call within
> > a program, or through a server request?
> >
> > Thanks,
> > Luqman
>
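For reference, the REST-based submission mentioned above boils down to two endpoints: upload the program jar, then trigger a run of it. A minimal sketch of the URL construction only, assuming the JobManager web frontend on its default port 8081 (the helper names uploadUrl and runUrl are illustrative; the actual HTTP calls, a multipart POST of the jar and a POST to run it, are left to whatever HTTP client you use):

```scala
// Sketch only: URLs for Flink's REST job submission.
// POST the jar as multipart/form-data to /jars/upload, then POST to
// /jars/:jarid/run to start it. Helper names are hypothetical.
def uploadUrl(host: String): String =
  s"http://$host/jars/upload"

def runUrl(host: String, jarId: String): String =
  s"http://$host/jars/$jarId/run"

println(uploadUrl("localhost:8081"))
println(runUrl("localhost:8081", "myjob.jar"))
```

Alternatively, RemoteStreamEnvironment submits a program from within the program itself, given the JobManager host, port, and the jar files to ship.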


Submitting an app via API

2017-02-05 Thread Luqman Ghani
Hi,

On the quickstart page of the Flink docs, it suggests starting a Flink app
with the "bin/flink" command on the command line. Is there any other way of
submitting to a Flink cluster, that is, through an API call within a
program, or through a server request?

Thanks,
Luqman