Thank you,
but now I get this error:
java.lang.ClassCastException: java.lang.Integer cannot be cast to
java.lang.Long
My offsets are actually small enough to fit in an Integer; if I pass bigger
values, the exception does not occur.
This looks like a bug to me.
Any ideas for a workaround?
Thanks!
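A possible workaround, assuming the mismatch comes from Py4J's Python 2 type mapping (a small Python int is sent to the JVM as java.lang.Integer, while a Python long is sent as java.lang.Long): force each offset through long() before building fromOffsets. The helper name as_jlong_offset is hypothetical, not part of the PySpark API, and this is only a sketch of the idea:

```python
import sys

def as_jlong_offset(offset):
    # Hypothetical helper. Under Python 2, Py4J sends a plain small int to
    # the JVM as java.lang.Integer, but a Python `long` as java.lang.Long.
    # Coercing every offset to `long` should therefore avoid the
    # Integer-cannot-be-cast-to-Long ClassCastException for small offsets.
    if sys.version_info[0] == 2:
        return long(offset)  # noqa: F821 -- `long` exists only in Python 2
    # Python 3 ints are unbounded and map to Long on the JVM side.
    return offset

# Usage sketch: coerce offsets when building the fromOffsets dict, e.g.
#   fromOffsets = {tp: as_jlong_offset(o) for tp, o in raw_offsets.items()}
small = as_jlong_offset(123)
```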
On 05/02/2016 06:57 PM, Cody Koeninger wrote:
If you're confused about the type of an argument, you're probably
better off looking at documentation that includes static types:
http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.streaming.kafka.KafkaUtils$
createDirectStream's fromOffsets parameter takes a map from
TopicAndPartition to Long.
There is documentation for the Python constructor of TopicAndPartition:
http://spark.apache.org/docs/latest/api/python/_modules/pyspark/streaming/kafka.html#TopicAndPartition
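To illustrate the shape Cody describes (a map from TopicAndPartition to Long, not topic name to {partition: offset}), here is a minimal sketch. The namedtuple below is only a stand-in so the snippet is self-contained; in a real job you would import TopicAndPartition from pyspark.streaming.kafka, and the topic name and offsets here are placeholders:

```python
from collections import namedtuple

# Stand-in for pyspark.streaming.kafka.TopicAndPartition, used here only
# to show the structure of the fromOffsets argument.
TopicAndPartition = namedtuple("TopicAndPartition", ["topic", "partition"])

# fromOffsets keys are TopicAndPartition objects, one per partition;
# values are the (inclusive) starting offsets.
fromOffsets = {
    TopicAndPartition("mytopic", 0): 123,
    TopicAndPartition("mytopic", 1): 234,
}
```

With the real class, this dict would then be passed as the fromOffsets keyword argument of KafkaUtils.createDirectStream alongside the topic list and the Kafka parameters.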
On Mon, May 2, 2016 at 5:54 AM, Tigran Avanesov
<tigran.avane...@olamobile.com> wrote:
Hi,
I'm trying to start consuming messages from a kafka topic (via direct
stream) from a given offset.
The documentation of createDirectStream says:
:param fromOffsets: Per-topic/partition Kafka offsets defining the
(inclusive) starting
point of the stream.
However, it apparently expects a dictionary keyed by topic objects (not topic
names): I tried feeding it something like { 'topic' : {0: 123, 1:234}} and,
of course, got an exception.
How should I build this fromOffsets parameter?
The documentation does not say anything about it.
(In general, I think it would be better if the function accepted topic
names.)
Thank you!
Regards,
Tigran
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
--
Tigran Avanesov | IT Architect
phone: +352 261911 3562
email: tigran.avane...@olamobile.com
skype: tigran.avanesov.corporate
post: Olamobile S.à.r.l.
2-4 rue Eugène Ruppert
Bâtiment Vertigo-Polaris
L-2453 Luxembourg
Luxembourg
web: www.olamobile.com