Hi,
Structured Streaming works great with a Kafka source, but I need to persist
the data after processing to a database such as Cassandra, or at least
Postgres.
Any suggestions or help, please.
Thanks
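In Spark 2.0.x the built-in sinks do not include Cassandra or JDBC, but the `foreach` sink accepts a custom `ForeachWriter`. Below is a minimal sketch, not tested against a real cluster; the record type, keyspace, and table are assumptions, and the connection handling is deliberately simplified:

```scala
import org.apache.spark.sql.ForeachWriter

// Sketch of a custom streaming sink. Spark creates writer instances per
// partition: open() acquires a connection, process() writes one record,
// close() releases the connection.
class CassandraPersonWriter extends ForeachWriter[(String, Int)] {

  def open(partitionId: Long, version: Long): Boolean = {
    // Acquire a session here, e.g. via the spark-cassandra-connector's
    // CassandraConnector, or a JDBC connection for Postgres.
    true
  }

  def process(record: (String, Int)): Unit = {
    // e.g. session.execute(
    //   s"INSERT INTO ks.people (name, age) VALUES ('${record._1}', ${record._2})")
  }

  def close(errorOrNull: Throwable): Unit = {
    // Release the session/connection here.
  }
}

// Attach the writer to the streaming query:
// ds.writeStream.foreach(new CassandraPersonWriter).start()
```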
> On Sun, Nov 20, 2016 at 1:24 PM, pandees waran wrote:
>
>> Have you tried using the "." access method?
>>
>> e.g:
>> ds1.select("name",
The following is my dataframe schema:
root
 |-- name: string (nullable = true)
 |-- addresses: array (nullable = true)
 |    |-- element: struct (containsNull = true)
 |    |    |-- street: string (nullable = true)
 |    |    |-- city: string (nullable = true)
I want to output
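The "I want to output" part is cut off above; assuming the goal is one flat row per address, a sketch using `explode` on the nested array (column names follow the schema above):

```scala
import org.apache.spark.sql.functions.explode

// One output row per (name, address) pair; nested struct fields are
// reached with dot notation after the explode.
val flat = df
  .select(df("name"), explode(df("addresses")).as("addr"))
  .select("name", "addr.street", "addr.city")
```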
You can define your own class which is supported by the SQL Encoder, and
> convert the generated class to the new class in `parseLine`.
>
> On Wed, Nov 16, 2016 at 4:22 PM, shyla deshpande wrote:
>
>> Ryan,
>>
>> I just wanted to provide more info. Here is my .proto
.format("console")
.start()
query.awaitTermination()
}
On Thu, Nov 17, 2016 at 11:30 AM, shyla deshpande
wrote:
> val spark = SparkSession.builder.
> master("local")
> .appName("spark session example")
> .getOrCreate()
>
> im
val spark = SparkSession.builder.
master("local")
.appName("spark session example")
.getOrCreate()
import spark.implicits._
val dframe1 = spark.readStream.format("kafka").
option("kafka.bootstrap.servers","localhost:9092").
option("subscribe","student").load()
How do I deserialize the
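Assuming the truncated question is about deserializing the Kafka `value` column: a sketch that parses the protobuf bytes inside a `map` and returns an encoder-friendly tuple. `Person.parseFrom` and the getters are the usual accessors generated by protobuf-java; treat the exact names here as assumptions:

```scala
import spark.implicits._

// The Kafka source exposes `value` as binary; parse it per record.
val people = dframe1
  .selectExpr("CAST(value AS BINARY) AS value")
  .as[Array[Byte]]
  .map { bytes =>
    val p = Person.parseFrom(bytes)  // generated protobuf class
    (p.getName, p.getAge)            // plain tuple, supported by Encoders
  }
```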
optional string city = 2;
}
message Person {
optional string name = 1;
optional int32 age = 2;
optional Gender gender = 3;
repeated string tags = 4;
repeated Address addresses = 5;
}
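Following Ryan's suggestion above, one way is to mirror the .proto messages with plain case classes, which Spark's built-in Encoders do support, and convert each parsed protobuf object into them. The field names follow the .proto; the class names below are illustrative:

```scala
// Encoder-friendly mirrors of the .proto messages above. Optional proto
// fields map to Option, repeated fields to Seq.
case class AddressCC(street: Option[String], city: Option[String])
case class PersonCC(
  name: Option[String],
  age: Option[Int],
  gender: Option[String],
  tags: Seq[String],
  addresses: Seq[AddressCC])
```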
On Wed, Nov 16, 2016 at 3:04 PM, shyla deshpande
wrote:
> Thanks for the response. Following is the
final val NAME_FIELD_NUMBER = 1
final val AGE_FIELD_NUMBER = 2
final val GENDER_FIELD_NUMBER = 3
final val TAGS_FIELD_NUMBER = 4
final val ADDRESSES_FIELD_NUMBER = 5
}
On Wed, Nov 16, 2016 at 1:28 PM, Shixiong(Ryan) Zhu wrote:
> Could you provide the Person class?
>
> On Wed, Nov 16, 201
.org/overviews/reflection/thread-safety.html) AFAIK, the only way to fix it
> is upgrading to Scala 2.11.
>
> On Wed, Nov 16, 2016 at 11:16 AM, shyla deshpande <
> deshpandesh...@gmail.com> wrote:
>
I am using protobuf to encode. This may not be related to the new release
issue
Exception in thread "main" scala.ScalaReflectionException: is not a
term
at scala.reflect.api.Symbols$SymbolApi$class.asTerm(Symbols.scala:199)
at
scala.reflect.internal.Symbols$SymbolContextApiImpl.asTerm(Symbols
Is it OK to use ProtoBuf for sending messages to Kafka? I do not see
anyone using it.
Please direct me to some code samples of how to use it in Spark Structured
Streaming.
Thanks again.
On Sat, Nov 12, 2016 at 11:44 PM, shyla deshpande
wrote:
> Thanks everyone. Very good discuss
.com/jaceklaskowski
>
>
> On Sat, Nov 12, 2016 at 4:07 PM, Luciano Resende
> wrote:
>
> If you are interested in Akka streaming, it is being maintained in Apache
> Bahir. For Akka there isn't a Structured Streaming version yet, but we
> would be interested in collaborating.
Using ProtoBuf for Kafka messages with Spark Streaming because ProtoBuf is
already being used in the system.
Some sample code and reading material for using ProtoBuf for Kafka messages
with Spark Streaming will be helpful.
Thanks
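For the producer side, the pattern is straightforward: serialize the protobuf message with `toByteArray` and send the bytes as the Kafka record value. A sketch (topic name and broker address are assumptions):

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

// Byte-array value serializer, since the payload is raw protobuf bytes.
val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")
props.put("key.serializer",
  "org.apache.kafka.common.serialization.StringSerializer")
props.put("value.serializer",
  "org.apache.kafka.common.serialization.ByteArraySerializer")

val producer = new KafkaProducer[String, Array[Byte]](props)
// val person: Person = ...  // built via the generated protobuf builder
// producer.send(new ProducerRecord("student", person.toByteArray))
producer.close()
```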
I am using Spark 2.0.1. I wanted to build a data pipeline using Kafka,
Spark Streaming, and Cassandra using Structured Streaming. But the Kafka
source support for Structured Streaming is not yet available. So now I am
trying to use Akka Streams as the source to Spark Streaming.
Want to make sure I a
I am using spark-cassandra-connector_2.11.
On Mon, Nov 7, 2016 at 3:33 PM, shyla deshpande
wrote:
Hi,
I am trying to do structured streaming with the wonderful SparkSession, but
cannot save the streaming data to Cassandra.
If anyone has got this working, please help
Thanks
Hi Jaya!
Thanks for the reply. Structured Streaming works fine for me with a socket
text stream. I think Structured Streaming with a Kafka source is not yet
supported.
If anyone has got it working with a Kafka source, please provide me with
some sample code or direction.
Thanks
On Sun, Nov 6, 2016 a
I am trying to do Structured Streaming with Kafka Source. Please let me
know where I can find some sample code for this. Thanks
would only be provided you run your application with spark-submit or
> otherwise have Spark's JARs on your class path. How are you launching your
> application?
>
> On Fri, Nov 4, 2016 at 2:00 PM, shyla deshpande
> wrote:
>
object App {
  import org.apache.spark.sql.functions._
  import org.apache.spark.sql.SparkSession

  def main(args: Array[String]) {
    println("Hello World!")
    val sparkSession = SparkSession.builder.
      master("local")
      .appName("spark session example")
      .getOrCreate()
  }
}
ation for setting up IDE - https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IDESetup
>
> I hope this is helpful.
>
>
> 2016-11-04 9:10 GMT+09:00 shyla deshpande :
Hello Everyone,
I just installed Spark 2.0.1, spark shell works fine.
Was able to run some simple programs from the Spark shell, but I find it
hard to make the same program work when using IntelliJ.
I am getting the following error.
Exception in thread "main" java.lang.NoSuchMethodError:
scala.Pr
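A `NoSuchMethodError` on a `scala.*` class at startup usually means the Scala version the code was compiled against differs from the one on the runtime classpath; Spark 2.0.1 artifacts are built for Scala 2.11. A build.sbt sketch (the exact patch versions are assumptions):

```scala
// build.sbt: keep scalaVersion aligned with the Spark build (2.11 here),
// so sbt resolves the matching spark-sql_2.11 artifact via %%.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "2.0.1"
)
```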
>> Look at the resolved subtasks attached to that ticket you linked.
> >> Some of them are unresolved, but basic functionality is there.
> >>
> >> On Tue, Nov 1, 2016 at 7:37 PM, shyla deshpande
> >> wrote:
> >> > Hi Michael,
> >> >
I'm not aware of any open issues against the kafka source for structured
> streaming.
>
> On Tue, Nov 1, 2016 at 4:45 PM, shyla deshpande
> wrote:
>
I am building a data pipeline using Kafka, Spark Streaming, and Cassandra.
Wondering if the issues with the Kafka source are fixed in Spark 2.0.1. If
not, please give me an update on when they may be fixed.
Thanks
-Shyla