Hi Nirmal,
I came across the following article, which you should check:
https://stackoverflow.com/questions/47497003/why-is-hive-creating-tables-in-the-local-file-system
(and an updated reference link:
https://cwiki.apache.org/confluence/display/Hive/AdminManual+Metastore+Administration)
Hi Xiao,
Just reported this as JIRA SPARK-28103:
https://issues.apache.org/jira/browse/SPARK-28103
Thanks and Regards,
William
On Wed, 19 Jun 2019 at 1:35 AM, Xiao Li wrote:
> Hi, William,
>
> Thanks for reporting it. Could you open a JIRA?
>
> Cheers,
>
> Xiao
>
> William Wong
>
> Hi,
>
> I am using the Livy REST API to submit Spark jobs, with s3a in place of HDFS.
> I have to write fs.s3a.access.key and fs.s3a.secret.key directly in
> core-site.xml, as there are no such config parameters in the Livy API.
> How can I encrypt my AK and SK?
>
> Yours.
> Jane
>
And BTW, the same connection string works fine when used in SQL Developer.
On Tuesday, June 18, 2019, 03:49:24 PM PDT, Richard Xin wrote:
Hi, I need help with a TCPS Oracle connection from Spark (version:
spark-2.4.0-bin-hadoop2.7).

Properties prop = new Properties();
prop.putAll(sparkOracle); // username/password
prop.put("javax.net.ssl.trustStore", "path to root.jks");
prop.put("javax.net.ssl.trustStorePassword", "password_here");
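For reference, a minimal sketch of building a TCPS-style JDBC descriptor and connection properties. The host, port, service name, and credentials below are placeholders, not values from the thread:

```java
import java.util.Properties;

public class TcpsJdbcUrl {
    // Build a thin-driver URL with a TCPS protocol descriptor.
    // Host, port, and service name are illustrative placeholders.
    public static String buildUrl(String host, int port, String service) {
        return "jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCPS)(HOST="
                + host + ")(PORT=" + port + "))(CONNECT_DATA=(SERVICE_NAME="
                + service + ")))";
    }

    public static void main(String[] args) {
        Properties prop = new Properties();
        prop.put("user", "scott");                 // placeholder credentials
        prop.put("password", "secret");
        prop.put("javax.net.ssl.trustStore", "/path/to/root.jks");
        prop.put("javax.net.ssl.trustStorePassword", "password_here");
        System.out.println(buildUrl("db.example.com", 2484, "ORCL"));
    }
}
```

The URL string would then be passed to `spark.read().jdbc(url, table, prop)` in the usual way.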
Hi,
I have prices coming through Kafka in the following format:
key,{JSON data}
The key is needed as part of the data posted to a NoSQL database like Aerospike.
The following is a record from the Kafka topic:
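Since the sample record is not shown above, here is a hedged sketch of splitting such a `key,{JSON}` record at the first comma; the record content is hypothetical:

```java
public class KafkaRecordSplit {
    // Split a "key,{JSON}" record at the FIRST comma only, so a JSON
    // payload that itself contains commas stays intact.
    // Assumes every record has at least one comma, per the stated format.
    public static String[] split(String record) {
        int i = record.indexOf(',');
        return new String[] { record.substring(0, i), record.substring(i + 1) };
    }

    public static void main(String[] args) {
        String[] parts = split("AAPL,{\"price\":198.5,\"ccy\":\"USD\"}");
        System.out.println(parts[0]); // the key, to post alongside the JSON
        System.out.println(parts[1]); // the JSON payload
    }
}
```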
I'm trying to do a brute-force fuzzy join where I compare N records against
N other records, for N^2 total comparisons.
The table is medium-sized and fits in memory, so I collect it and put it
into a broadcast variable.
The other copy of the table is in an RDD. I am basically calling the RDD
map
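The per-record work that such a map function would do, comparing one RDD record against every record in the broadcast table, can be sketched in plain Java. The record type and similarity test here are illustrative assumptions, not the poster's actual code:

```java
import java.util.ArrayList;
import java.util.List;

public class FuzzyJoinSketch {
    // Toy matcher: one string is a case-insensitive prefix of the other.
    // A real fuzzy join would use something like Levenshtein distance here.
    static boolean fuzzyMatch(String a, String b) {
        String la = a.toLowerCase(), lb = b.toLowerCase();
        return la.startsWith(lb) || lb.startsWith(la);
    }

    // Body of the map function: for one record from the RDD side, scan the
    // whole broadcast (collected) table -- N comparisons per record, N^2 total.
    public static List<String> matches(String record, List<String> broadcastTable) {
        List<String> out = new ArrayList<>();
        for (String candidate : broadcastTable) {
            if (fuzzyMatch(record, candidate)) {
                out.add(candidate);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> table = List.of("app", "banana", "APPLE");
        System.out.println(matches("apple", table));
    }
}
```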
Hi, William,
Thanks for reporting it. Could you open a JIRA?
Cheers,
Xiao
William Wong wrote on Tue, Jun 18, 2019 at 8:57 AM:
> BTW, I noticed a workaround is to create a custom rule that removes the
> 'empty local relation' from a union. However, I am not 100% sure it is
> the right approach.
>
> On Tue,
Just an update on the thread: the cluster is kerberized.
I'm trying to execute the query with a different user, xyz, not hive.
It seems to be a permission issue: user xyz is trying to create a directory
under /home/hive.
Do I need some impersonation setting?
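If HiveServer2 impersonation is what's missing, the usual knobs are sketched below. This is an assumption about the setup; exact values depend on the HDP configuration:

```xml
<!-- hive-site.xml: run queries as the connected end user (xyz)
     rather than as the hive service user -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>

<!-- core-site.xml: allow the hive service user to impersonate others
     ("*" is permissive; tighten hosts/groups in production) -->
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
```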
Thanks,
Nirmal
BTW, I noticed a workaround is to create a custom rule that removes the
'empty local relation' from a union. However, I am not 100% sure it is
the right approach.
On Tue, Jun 18, 2019 at 11:53 PM William Wong wrote:
> Dear all,
>
> I am not sure if it is something expected or not, and should I
Thanks for the response, Oliver.
I am facing this issue intermittently. Once in a while I don't see the service
being created for the respective Spark driver (I don't see the service for
that driver on the Kubernetes dashboard, nor via kubectl, but in the driver
logs I do see the service endpoint) and by
Hi Raymond,
Permission on hdfs is 777
drwxrwxrwx - impadmin hdfs 0 2019-06-13 16:09
/home/hive/spark-warehouse
But it’s pointing to a local file system:
Exception in thread "main" java.lang.IllegalStateException: Cannot create
staging directory
Hi,
Can you check the permissions of the user running Spark
on the HDFS folder where it tries to create the table?
On Tue, Jun 18, 2019, 15:05 Nirmal Kumar wrote:
> Hi List,
>
> I tried running the following sample Java code using Spark2 version 2.0.0
> on YARN (HDP-2.5.0.0)
>
> public class
Hi List,
I tried running the following sample Java code using Spark2 version 2.0.0 on
YARN (HDP-2.5.0.0)
public class SparkSQLTest {
    public static void main(String[] args) {
        SparkSession sparkSession = SparkSession.builder().master("yarn")
            .config("spark.sql.warehouse.dir",
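The snippet is cut off above, but the symptom discussed in this thread (Hive tables landing on the local file system) typically comes down to where the warehouse directory points. A hedged hive-site.xml fragment; the namenode address and path are examples, not values from the thread:

```xml
<!-- hive-site.xml: make sure the warehouse is an HDFS URI, not file:// -->
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>hdfs://namenode:8020/apps/hive/warehouse</value>
</property>
```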
Hi,
I am using the Livy REST API to submit Spark jobs, with s3a in place of HDFS.
I have to write fs.s3a.access.key and fs.s3a.secret.key directly in
core-site.xml, as there are no such config parameters in the Livy API.
How can I encrypt my AK and SK?
Yours,
Jane
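One common approach (an assumption about the setup, not Livy-specific) is the Hadoop credential provider, which keeps s3a keys in an encrypted JCEKS keystore instead of plain-text core-site.xml. The keystore path below is an example:

```xml
<!-- core-site.xml: point s3a at an encrypted JCEKS keystore.
     Create the keystore first with the hadoop credential CLI:
       hadoop credential create fs.s3a.access.key \
           -provider jceks://hdfs@namenode/user/jane/s3.jceks
       hadoop credential create fs.s3a.secret.key \
           -provider jceks://hdfs@namenode/user/jane/s3.jceks -->
<property>
  <name>hadoop.security.credential.provider.path</name>
  <value>jceks://hdfs@namenode/user/jane/s3.jceks</value>
</property>
```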