I suggest you try date_dim.d_year in the query.
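
Something along these lines (a minimal sketch against a Spark 1.x SQLContext; the temp table names and the aggregation are my own illustration — note the backticks, which are needed because the Hive-backed columns have the table name embedded in the column name itself):

```scala
// Register both DataFrames so they can be joined in one SQL statement,
// even though they come from different data sources.
hiveStoreSalesDF.registerTempTable("store_sales")
dateDimDF.registerTempTable("date_dim")

// Qualify d_year through date_dim, and backtick-quote the store_sales
// columns because their names literally contain a dot
// (e.g. "store_sales.ss_sold_date_sk").
val result = sqlContext.sql("""
  SELECT date_dim.d_year,
         SUM(ss.`store_sales.ss_net_profit`) AS total_profit
  FROM store_sales ss
  JOIN date_dim
    ON ss.`store_sales.ss_sold_date_sk` = date_dim.d_date_sk
  GROUP BY date_dim.d_year
""")
```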

On Tue, May 5, 2015 at 10:47 PM, Ishwardeep Singh <
ishwardeep.si...@impetus.co.in> wrote:

>  Hi Ankit,
>
>
>
> printSchema() works fine for all the tables.
>
>
>
> hiveStoreSalesDF.printSchema()
>
> root
> |-- store_sales.ss_sold_date_sk: integer (nullable = true)
> |-- store_sales.ss_sold_time_sk: integer (nullable = true)
> |-- store_sales.ss_item_sk: integer (nullable = true)
> |-- store_sales.ss_customer_sk: integer (nullable = true)
> |-- store_sales.ss_cdemo_sk: integer (nullable = true)
> |-- store_sales.ss_hdemo_sk: integer (nullable = true)
> |-- store_sales.ss_addr_sk: integer (nullable = true)
> |-- store_sales.ss_store_sk: integer (nullable = true)
> |-- store_sales.ss_promo_sk: integer (nullable = true)
> |-- store_sales.ss_ticket_number: integer (nullable = true)
> |-- store_sales.ss_quantity: integer (nullable = true)
> |-- store_sales.ss_wholesale_cost: double (nullable = true)
> |-- store_sales.ss_list_price: double (nullable = true)
> |-- store_sales.ss_sales_price: double (nullable = true)
> |-- store_sales.ss_ext_discount_amt: double (nullable = true)
> |-- store_sales.ss_ext_sales_price: double (nullable = true)
> |-- store_sales.ss_ext_wholesale_cost: double (nullable = true)
> |-- store_sales.ss_ext_list_price: double (nullable = true)
> |-- store_sales.ss_ext_tax: double (nullable = true)
> |-- store_sales.ss_coupon_amt: double (nullable = true)
> |-- store_sales.ss_net_paid: double (nullable = true)
> |-- store_sales.ss_net_paid_inc_tax: double (nullable = true)
> |-- store_sales.ss_net_profit: double (nullable = true)
>
>
>
> dateDimDF.printSchema()
>
> root
> |-- d_date_sk: integer (nullable = false)
> |-- d_date_id: string (nullable = false)
> |-- d_date: date (nullable = true)
> |-- d_month_seq: integer (nullable = true)
> |-- d_week_seq: integer (nullable = true)
> |-- d_quarter_seq: integer (nullable = true)
> |-- d_year: integer (nullable = true)
> |-- d_dow: integer (nullable = true)
> |-- d_moy: integer (nullable = true)
> |-- d_dom: integer (nullable = true)
> |-- d_qoy: integer (nullable = true)
> |-- d_fy_year: integer (nullable = true)
> |-- d_fy_quarter_seq: integer (nullable = true)
> |-- d_fy_week_seq: integer (nullable = true)
> |-- d_day_name: string (nullable = true)
> |-- d_quarter_name: string (nullable = true)
> |-- d_holiday: string (nullable = true)
> |-- d_weekend: string (nullable = true)
> |-- d_following_holiday: string (nullable = true)
> |-- d_first_dom: integer (nullable = true)
> |-- d_last_dom: integer (nullable = true)
> |-- d_same_day_ly: integer (nullable = true)
> |-- d_same_day_lq: integer (nullable = true)
> |-- d_current_day: string (nullable = true)
> |-- d_current_week: string (nullable = true)
> |-- d_current_month: string (nullable = true)
> |-- d_current_quarter: string (nullable = true)
> |-- d_current_year: string (nullable = true)
>
>
>
> itemDF.printSchema()
>
> root
> |-- i_item_sk: integer (nullable = false)
> |-- i_item_id: string (nullable = false)
> |-- i_rec_start_date: date (nullable = true)
> |-- i_rec_end_date: date (nullable = true)
> |-- i_item_desc: string (nullable = true)
> |-- i_current_price: decimal (nullable = true)
> |-- i_wholesale_cost: decimal (nullable = true)
> |-- i_brand_id: integer (nullable = true)
> |-- i_brand: string (nullable = true)
> |-- i_class_id: integer (nullable = true)
> |-- i_class: string (nullable = true)
> |-- i_category_id: integer (nullable = true)
> |-- i_category: string (nullable = true)
> |-- i_manufact_id: integer (nullable = true)
> |-- i_manufact: string (nullable = true)
> |-- i_size: string (nullable = true)
> |-- i_formulation: string (nullable = true)
> |-- i_color: string (nullable = true)
> |-- i_units: string (nullable = true)
> |-- i_container: string (nullable = true)
> |-- i_manager_id: integer (nullable = true)
> |-- i_product_name: string (nullable = true)
>
>
>
> Regards,
>
> Ishwardeep
>
>
>
> *From:* ankitjindal [via Apache Spark User List]
> *Sent:* Tuesday, May 5, 2015 5:00 PM
> *To:* Ishwardeep Singh
> *Subject:* RE: Unable to join table across data sources using sparkSQL
>
>
>
> Just check the Schema of both the tables using frame.printSchema();



-- 
Best Regards,
Ayan Guha
