static table in flink
Hello, I have a question about joining a stream table with a static table in Flink. I'm looking at temporal tables, but a time-versioned table would keep growing over time. Any suggestions? The stream table checks for membership in the static table (which is updated once every day with a new set of data). I'm trying to approach this with views. Thanks for any suggestions. Regards 嘉琪
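One option that avoids ever-growing state is a processing-time temporal (lookup) join, which always reads the latest version of the static table instead of retaining historical versions. A sketch in Flink SQL (syntax from recent Flink versions; the `Orders`/`Reference` table and column names are made up for illustration):

```sql
-- Orders is the stream table; Reference is the daily-refreshed static table.
-- FOR SYSTEM_TIME AS OF with processing time joins each stream row against
-- the *current* contents of Reference, so no historical versions pile up.
SELECT
  o.order_id,
  o.item_id,
  r.is_allowed
FROM Orders AS o
JOIN Reference FOR SYSTEM_TIME AS OF o.proc_time AS r
  ON o.item_id = r.item_id;
```

Since the static table is only refreshed once a day, this keeps state bounded to one copy of the reference data rather than its full history.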
Re: Flink SQL dynamic configuration
Thanks for your help, Till. I appreciate it.

On Fri, Nov 8, 2019 at 9:02 PM Till Rohrmann wrote:

> Hi Jaqie,
>
> I'm not sure whether this is easily possible with Flink's SQL API, but if
> you used the DataStream API directly you could create a connected stream
> with two inputs. One input could be the normal message stream and the
> other could be the configuration stream. Whenever there is a configuration
> change, you would stream it into your application (e.g. by writing it to
> Kafka), and the connected stream operators could then apply the
> configuration changes.
>
> Cheers,
> Till
>
> On Thu, Nov 7, 2019 at 4:16 AM Jaqie Chan wrote:
>
>> Hello,
>>
>> I use the Flink SQL API to process a data stream from Kafka. To process
>> these data, I use some configuration loaded from an HTTP endpoint once at
>> initialization.
>>
>> The configuration is loaded only once at job initialization, so this
>> works well with a static configuration but does not handle dynamic ones.
>>
>> How can I handle dynamic configuration without reloading the
>> configuration at each message?
>>
>> Thanks
>> 嘉琪
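The pattern Till describes can be illustrated without any Flink dependencies: the operator keeps the latest configuration in shared state that the config input updates and the message input reads. A minimal plain-Java sketch of the idea (class and method names are made up; a real job would use a connected stream's two process methods instead):

```java
import java.util.concurrent.atomic.AtomicReference;

public class DynamicConfigSketch {
    // Latest configuration, updated by the "configuration stream" side.
    static final AtomicReference<String> config = new AtomicReference<>("v1");

    // What a connected operator would do for a config-stream element:
    // replace the current configuration.
    static void onConfig(String newConfig) {
        config.set(newConfig);
    }

    // What it would do for a message-stream element:
    // read whatever configuration is current right now.
    static String onMessage(String msg) {
        return msg + " processed with " + config.get();
    }

    public static void main(String[] args) {
        System.out.println(onMessage("m1")); // processed with v1
        onConfig("v2");                      // a config change arrives
        System.out.println(onMessage("m2")); // processed with v2
    }
}
```

In a real Flink job the two methods would be the two callbacks of a `CoProcessFunction` (or a broadcast-state operator), so the configuration update flows through the same streaming machinery as the data and no per-message HTTP reload is needed.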
Re: flink on yarn-cluster kerberos authentication for hbase
Hello,

Does this help you? https://stackoverflow.com/questions/34596165/how-to-do-kerberos-authentication-on-a-flink-standalone-installation

Regards
嘉琪

On Fri, Nov 8, 2019 at 4:00 PM venn wrote:

> Hi guys,
>
> Can anyone share an example of Kerberos authentication for HBase with
> Flink on a yarn-cluster?
>
> I authenticate the same way I do in a plain Java program, but it looks
> like only ZooKeeper gets authenticated; HBase still cannot be accessed.
>
> Thanks
Flink SQL dynamic configuration
Hello, I use the Flink SQL API to process a data stream from Kafka. To process these data, I use some configuration loaded from an HTTP endpoint once at initialization. The configuration is loaded only once at job initialization, so this works well with a static configuration but does not handle dynamic ones. How can I handle dynamic configuration without reloading the configuration at each message? Thanks 嘉琪
RocksDB and local file system
Hello, I am using the Flink RocksDB state backend. The documentation seems to imply I can use a regular file system path such as file:///data/flink/checkpoints, but the Javadoc only mentions the HDFS and S3 options. I am wondering if it's possible to use the local file system with the Flink RocksDB backend. Thanks 嘉琪
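For what it's worth: the RocksDB backend keeps its working state on local disk regardless, and writes checkpoints to any filesystem Flink supports, which includes plain `file://` URIs. That is fine on a single machine; in a cluster the path must be reachable from all TaskManagers (e.g. an NFS mount), otherwise recovery on another node fails. A sketch of the relevant flink-conf.yaml entries (the path is just an example):

```yaml
# flink-conf.yaml — sketch; adjust the path to your setup
state.backend: rocksdb
state.checkpoints.dir: file:///data/flink/checkpoints
```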