Hi,
I need to upgrade my Spark cluster from version 1.1.0 to 1.2.1. Is there a
convenient way to do this, something like ./start-dfs.sh -upgrade in Hadoop?
Best Wishes,
Thanks
--
qiaou
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
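For the record, Spark's standalone scripts have no equivalent of Hadoop's
start-dfs.sh -upgrade; the usual approach is to stop the cluster, switch to
the new release, and restart. A rough sketch, assuming a standalone cluster
managed from a symlinked install directory (all paths and the symlink layout
are illustrative, not from the thread):

```shell
# Stop the running 1.1.0 master and workers (standalone mode)
$SPARK_HOME/sbin/stop-all.sh

# Unpack the new release next to the old one and repoint a symlink,
# so rolling back is just switching the link back
tar -xzf spark-1.2.1-bin-hadoop2.4.tgz -C /opt
ln -sfn /opt/spark-1.2.1-bin-hadoop2.4 /opt/spark

# Carry over site configuration from the old install
cp /opt/spark-1.1.0-bin-hadoop2.4/conf/spark-env.sh /opt/spark/conf/

# Start the 1.2.1 cluster
$SPARK_HOME/sbin/start-all.sh
```

This assumes $SPARK_HOME points at the /opt/spark symlink on every node;
repeat the unpack/relink step on each worker (or distribute the new tree)
before restarting.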
').collect.toList :::
hbaseQuery('bb').collect.toList) it returns the right value.
Obviously I have an action after my transformation, but why does it not work?
fyi
--
qiaou
}
}
}
return generateRdd
}
--
qiaou
On Wednesday, November 12, 2014 at 2:50 PM, Shixiong Zhu wrote:
Could you provide the code of hbaseQuery? It may not support parallel
execution.
Best Regards,
Shixiong Zhu
2014-11-12 14:32
This works!
But can you explain why it should be used like this?
--
qiaou
On Wednesday, November 12, 2014 at 3:18 PM, Shixiong Zhu wrote:
You need to create a new configuration for each RDD. Therefore, val
hbaseConf = HBaseConfigUtil.getHBaseConfiguration should
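Shixiong's point can be sketched as follows. The shape of hbaseQuery below is
a guess (only HBaseConfigUtil and the table names appear in the thread, and a
plain HBaseConfiguration.create() stands in for that helper): RDDs are lazy,
so an RDD built with newAPIHadoopRDD only reads its configuration when an
action such as collect runs. If one shared Configuration object is mutated
between the two hbaseQuery calls, both RDDs end up scanning whichever table
was set last; creating a fresh configuration inside each call avoids that.

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Hypothetical reconstruction of hbaseQuery: the essential fix is that
// each call builds its own Configuration instead of sharing one.
def hbaseQuery(sc: SparkContext, table: String): RDD[Result] = {
  // A new configuration per RDD; newAPIHadoopRDD captures this conf,
  // but it is only consumed when an action (collect, count, ...) runs.
  val hbaseConf = HBaseConfiguration.create()
  hbaseConf.set(TableInputFormat.INPUT_TABLE, table)
  sc.newAPIHadoopRDD(hbaseConf,
    classOf[TableInputFormat],
    classOf[ImmutableBytesWritable],
    classOf[Result]).map(_._2)
}

// With a single shared conf, hbaseQuery(sc, "aa") and hbaseQuery(sc, "bb")
// would both see INPUT_TABLE = "bb" by the time collect() finally runs.
```

This is why an action after the transformation is not enough on its own: the
action triggers evaluation, but both RDDs still evaluate against the same
(last-written) configuration unless each gets its own copy.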