Dear all,
We have deployed Flume 1.7.0 on our production boxes, and we now have at least 60,000 active connections to the Flume HTTP Source running on 3 boxes.
With such "high" QPS, the HTTP Source instances report a "java.io.IOException: Too many open files" error.
Besides scaling out to more boxes, does anyone have any advice to improve this?
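"Too many open files" usually means the agent process has hit its file-descriptor limit; every open HTTP connection holds one descriptor. A minimal check-and-raise sketch (the `flume` user name and the 65536 value are assumptions, not from this thread):

```shell
# Show the soft open-file limit of the current shell; a flume agent
# inherits whatever limit is in effect when it is started.
ulimit -n

# One commonly used fix: raise the limit for the flume user in
# /etc/security/limits.conf and restart the agent.
#   flume  soft  nofile  65536
#   flume  hard  nofile  65536
```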
Dear all,
For the Flume Avro events generated by the HTTP Source, is it a good idea to consume the events directly with a Kafka consumer?
As we are using Confluent Kafka, is it possible to register the schema in the Schema Registry and then use kafka-avro-console-consumer to consume the messages?
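If the messages land in Kafka in the Confluent wire format, the Avro console consumer can decode them against the Schema Registry. A usage sketch with assumed host names and topic (none of these values are from the thread):

```shell
# Broker address, registry URL, and topic name are all illustrative.
kafka-avro-console-consumer \
  --bootstrap-server localhost:9092 \
  --topic flume-events \
  --from-beginning \
  --property schema.registry.url=http://localhost:8081
```

Note that this tool only decodes messages written in the Confluent wire format (magic byte + schema id prefix); raw Avro bytes written by a Flume serializer would instead need a plain consumer plus a separate Avro decoding step.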
Hi,
Thanks for the suggestion. I will take a look at the BLOB handler.
On Thu, Aug 31, 2017 at 1:28 PM, Archana Ravindran <srarchan...@gmail.com> wrote:
> You can try using the blob handler in the http source. Though blob is for
> binary, we were able to push relatively large events.
You can try using the BLOB handler in the HTTP source. Though BLOB is meant for binary data, we were able to push relatively large events.
As for a size limit, in our case we were not able to find default support in Flume, so we customized the code with a small function that checks the size.
You can try something similar.
Thanks
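A rough way to probe for an implicit limit from the client side (the port and the ~2 MB size here are assumptions, not values from the thread): build an oversized body in the JSONHandler's array-of-events shape and watch what status the source returns.

```shell
# Build a ~2 MB body of 'x' characters, wrapped in the
# [{"headers":{...},"body":"..."}] shape the default JSONHandler expects.
head -c 2000000 /dev/zero | tr '\0' 'x' > /tmp/payload.txt
printf '[{"headers":{},"body":"%s"}]' "$(cat /tmp/payload.txt)" > /tmp/big.json
wc -c < /tmp/big.json   # confirm the payload size before sending

# Then POST it (assuming the source listens on localhost:9001):
#   curl -s -o /dev/null -w '%{http_code}\n' -X POST \
#        -H 'Content-Type: application/json' --data @/tmp/big.json \
#        http://localhost:9001
```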
Hello,
I am using an HTTP source for my Flume agent and would like to know if there is a size limit (explicit or implicit) on the POST request body/data (content length) for Flume's HTTP Source.
I have a handler which is throwing "MalformedJsonException: Unterminated string at line 1 column 9"
Hi Hari,
Thanks. I sometimes get a 501 error; anyhow, I'll check, and I hope this is the way we can ensure whether the data has been posted to the source.
On Wed, Mar 2, 2016 at 11:43 PM, Hari Shreedharan <hshreedha...@cloudera.com> wrote:
> The http source returns an HTTP status.
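That status can be captured directly from a shell client; 200 means the events were committed to the channel, while an error status (e.g. 400 for a bad request or 503 when the channel is full) means the POST should be retried. A sketch with an assumed endpoint and payload:

```shell
# Capture only the HTTP status code of the POST (host, port, and
# payload are assumptions for illustration).
status=$(curl -s -o /dev/null -w '%{http_code}' \
  -X POST -H 'Content-Type: application/json' \
  -d '[{"headers":{},"body":"ping"}]' \
  http://localhost:9001)
echo "$status"
```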
Hi Team,
I have a scenario in which my node posts data to Flume; it works, no problem.
But in the exception case, is there any way we can configure some kind of acknowledgement for that POST request?
Thanks
Nithesh
Thanks Gonzalo, the problem is on my side. At my workplace, the developer team has made some modifications to the http-source lib.
And can I ask one more question? I had a problem again when using the failover method, which says "ERROR lifecycle.LifecycleSupervisor: Unable to
Hi all,
I made a Flume topology which collects data from nginx and writes to HDFS.
First: top agent ---> mid agent ---> hdfs agent
top agent = HTTP source which collects data from nginx
mid agent = forwards the data to the hdfs agent
hdfs agent = writes the data to HDFS
Hi All,
<http://stackoverflow.com/questions/34118713/flume-curl-post-is-not-sending-data-to-flume-using-http-source#>
I have written a Flume script for receiving data from an HTTP source, but Flume is not receiving data from the curl POST command. Here is my script:
## NEW
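One common cause of "curl POST not received" is the payload shape: the default JSONHandler only accepts a JSON array of events, each with a headers map and a string body. A minimal sketch (port 9001 and the event contents are assumptions):

```shell
# A payload in the shape the default JSONHandler accepts: an array of
# events, each with a "headers" object and a string "body".
payload='[{"headers": {"host": "web01"}, "body": "hello flume"}]'
echo "$payload" | python3 -m json.tool   # sanity-check it is well-formed JSON

# POST it to the source (assuming it listens on localhost:9001):
#   curl -X POST -H 'Content-Type: application/json' \
#        -d "$payload" http://localhost:9001
```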
Hi All,
These are the follow-up observations & issues on the benchmarking.
The configuration is the same as HTTP source -> File Channel -> Kafka Sink: when larger messages were sent from the HTTP clients, the observed EPS was around 140. Each single large message is a batch of 100 individual log messages.
I found one day that Flume's HTTP source implementation is somewhat outdated, and it's not really optimized for performance.
Our requirement includes processing more than 10k requests within a single node, but as Hemanth said, Flume's HTTP source processed a few hundred per second.
Regards,
Gonzalo
On Nov 14, 2015 7:40 AM, "Hemanth Abbina" <heman...@eiqnetworks.com> wrote:
> Hi,
> We have been trying to validate & benchmark the Flume performance for our
> production use.
Hi Hari,
Thanks for the response.
I haven't tried with a different source; I will try that.
We are sending through multiple HTTP clients (around 40 clients) and using a single event per batch.
First, we would like to validate & see the max supported HTTP source EPS for a single Flume server.
Subject: Re: Flume benchmarking with HTTP source & File channel
If that is just with a single server, 600 messages per second doesn't sound bad to me.
Depending on the size of each message, the network could be the limiting factor.
I would try with the null sink and ...
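The null-sink experiment suggested above can be sketched as a small config change (the `svcagent` and `file-channel` names follow the configuration quoted elsewhere in the thread and are otherwise assumptions): Flume ships a `null` sink type that discards every event it takes, which removes the Kafka write path from the measurement.

```
# Measure source + channel throughput only: replace the Kafka sink
# with a null sink that drops everything it drains from the channel.
svcagent.sinks = null-sink
svcagent.sinks.null-sink.type = null
svcagent.sinks.null-sink.channel = file-channel
```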
... is that an intermediary process is required to POST from the actual source of data (e.g. a web API) to the Flume HTTP Source web service hosted on the master node of a Hadoop cluster.
From: Hari Shreedharan [mailto:hshreedha...@cloudera.com]
Sent: 12 November 2015 23:24
To: user@flume.apache.org
Subject: Re: Flume
Hi,
We have been trying to validate & benchmark the Flume performance for our production use.
We have configured Flume to have an HTTP source, File channel & Kafka sink.
Hardware: 8 cores, 32 GB RAM, CentOS 6.5, Disk: 500 GB HDD.
Flume configuration:
svcagent.sources = http-source
Hi Flumers,
I'm new to Flume so please go easy on me, but this is a placeholder for those that follow.
I'm seeing the following exception when configuring an HTTP source for Flume v1.6.0, pointing at an externally hosted web API I know I can connect to, and GET/POST to otherwise:
15/11/12 21:23
Thanks Hari
Re: "HTTP Source does not pull anything from an address, instead it binds to the address specified in the config"
Can you clarify this a bit more? It seems to contradict itself ☺: an address is specified in the config.
Kind Regards,
Timothy Garza
Database and BI Developer
Collinson
HTTP Source does not pull anything from an address, instead it binds to the
address specified in the config. It can listen on that port and then accept
data from applications.
Thanks,
Hari Shreedharan
> On Nov 12, 2015, at 1:59 PM, Timothy Garza <timothy.ga...@collinsongroup.com>
"A source which accepts Flume Events by HTTP POST and GET. GET should be
used for experimentation only." --
https://flume.apache.org/FlumeUserGuide.html#http-source
This is a source that *accepts* data, it will not *poll* an api. You would
need your API to push events to flume, or
I don't think all is in order. HTTP Source is trying to listen on a port on which another process is already listening.
Thanks,
Hari
On Thu, Nov 12, 2015 at 3:01 PM, Timothy Garza <timothy.ga...@collinsongroup.com> wrote:
> So why the exception if all is in order?
HTTP Source is a listener. It doesn’t actively pull from a remote endpoint.
The address in the config is an address on the flume server to bind to.
From: Timothy Garza [mailto:timothy.ga...@collinsongroup.com]
Sent: Thursday, November 12, 2015 2:09 PM
To: user@flume.apache.org
Subject: RE: Flume
... he is passing JSON data via a Python script in dict (key, value) pairs, and we need both header and body contents/data in HDFS.
Please advise!
From: Hari Shreedharan [mailto:hshreedha...@cloudera.com]
Sent: Friday, September 04, 2015 12:43 PM
To: user@flume.apache.org
Subject: Re: N
Dear Community,
We are trying to send HTTP/JSON messages with no errors in Flume, yet all the files in HDFS are NULL (no data seen). We are passing events as JSON strings, but when we look at the files in HDFS, we see no data.
Is there an HDFS "sink" parameter to show JSON data in HDFS?
We are testing this
The JSONHandler requires the data to be in a specific format:
https://flume.apache.org/releases/content/1.5.0/apidocs/org/apache/flume/source/http/JSONHandler.html
Thanks,
Hari
On Fri, Sep 4, 2015 at 10:38 AM, Sutanu Das wrote:
> Dear Community,
> We are trying to send
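The format that handler expects is a JSON array of events whose body is a string; a JSON payload therefore has to be serialized into that string rather than nested as a raw object, otherwise the handler typically rejects the request as malformed. A small sketch (field values are illustrative):

```shell
# Correct shape: "body" is a string containing the serialized JSON,
# not a nested JSON object.
printf '%s' '[{"headers":{"topic":"t1"},"body":"{\"user\":\"alice\"}"}]' \
  | python3 -m json.tool
```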
Hello,
I have configured a Flume agent to listen on an HTTP source with my own handler. I am writing to an HBase sink.
I have 3 servers (Flume agents) in the cluster. What is the best way to send messages to my Flume agents?
What I am doing now is to send a message to a single server, for example
I am new to Flume and I am trying to stream tweets from Gnip using Flume.
Please suggest which Flume source should be used to stream tweets from Gnip.
Is the HTTP Source suitable for it?
Please suggest an example of streaming realtime data to a Flume agent.
Regards,
Rafeeq S
*("What you do is what matters, not what you think or say or plan.")*
Hi, I am trying to use the Flume HTTP handler, but I am getting an error while starting the agent. I am using CDH 4.3:
Command failed to run because this role has invalid configuration.
Review and correct its configuration. First error: Component tier1:
Configuration of component failed. (#httpsrc)
This
Hi There,
I am a Hadoop noob and wanted to write data from Windows event logs to HDFS, set up on a Linux box. I want to use Flume to do this, and I would like to understand how I can set up a source on a Windows machine with the sink on a Linux box.
I was exploring the usage of the HTTP source to push data through
Hi all, I am new to Flume and all that logging stuff, and probably I make many mistakes. I want to point out that I use Flume 1.3.1 compiled by myself, if this is important to mention. I have tested with the official binary from the Flume web site too, but the results are the same.
Here is what I do to test the HTTP source:
my config file (httppost.conf):
agent1.sources = r1
agent1.channels = ch1
agent1.sinks = k1
agent1.sources.r1.type = org.apache.flume.source.http.HTTPSource
agent1.sources.r1.port = 9001
agent1.sources.r1.channels = ch1
#agent1.sources.r1.handler = org.example.rest.RestHandler
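A hedged completion of the httppost.conf sketch above: the channel and sink definitions were cut off in the archive, so the memory channel and logger sink below are assumptions chosen only to make the agent start end-to-end.

```
agent1.sources = r1
agent1.channels = ch1
agent1.sinks = k1

agent1.sources.r1.type = org.apache.flume.source.http.HTTPSource
agent1.sources.r1.port = 9001
agent1.sources.r1.channels = ch1

# Assumed: a memory channel and a logger sink to complete the pipeline.
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 10000

agent1.sinks.k1.type = logger
agent1.sinks.k1.channel = ch1
```

An agent using this file would be started with something like `flume-ng agent --conf conf --conf-file httppost.conf --name agent1`.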
It does not look like you are using the HTTP source at all. Your source type needs to be HTTP.
Cheers,
Hari
On Thursday, June 20, 2013 at 8:57 AM, Nickolay Kolev wrote:
Hi all,
I am new to flume and all that logging stuff and probably many things are unclear to me despite I read the docs
There is actually an HTTP Source committed to the 1.3 branch:
https://git-wip-us.apache.org/repos/asf?p=flume.git;a=commit;h=0bb1b21e3fa80cbb0f15c6397214d2a040fd1a5c
Otherwise I would shade or jarjar your HTTP Source jars so they don't conflict.
Brock
Is there a target date for 1.3?
On Mon, Nov 5, 2012 at 10:34 AM, Brock Noland br...@cloudera.com wrote:
There is actually an HTTP Source committed to the 1.3 branch:
https://git-wip-us.apache.org/repos/asf?p=flume.git;a=commit;h=0bb1b21e3fa80cbb0f15c6397214d2a040fd1a5c
Just curious, why doesn't the HTTP Source use Netty?
On Mon, Nov 5, 2012 at 1:09 PM, Nathaniel Auvil nathaniel.au...@gmail.com wrote:
One thing I do not see with this HTTPSource is any way to customize the response.
On Mon, Nov 5, 2012 at 10:58 AM, Brock Noland br...@cloudera.com wrote