OK, are you using PySpark for this, with PyCharm or whatever? Then convert that Spark DF into a Pandas DF and do the plot. Check Google for the needed packages. Something along these lines:
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

# fullyQualifiedTableName, start_date, end_date, regionname and config
# are all set earlier in my script
summary_df = spark.sql(f"""SELECT datetaken, salesvolume AS volumeOfSales
                           FROM {fullyQualifiedTableName}
                           WHERE datetaken BETWEEN '{start_date}' AND '{end_date}'
                           AND lower(regionname) = lower('{regionname}')
                           ORDER BY datetaken""")
p_df = summary_df.toPandas()   # create a Pandas DF from the Spark DF
print(p_df)
p_df.plot(kind='scatter', x='datetaken', y='volumeOfSales', colormap='jet')
plt.xlabel("year", fontdict=config['plot_fonts']['font'])
plt.ylabel("Volume of Monthly Sales", fontdict=config['plot_fonts']['font'])
plt.title(f"Stats from {regionname} for the past 10 years",
          fontdict=config['plot_fonts']['font'])
plt.text(0.35, 0.85, "2016 stamp duty change impact [Ref 1]",
         transform=plt.gca().transAxes, color="darkgreen", fontsize=10)
plt.show()
plt.close()

[image: image.png]

For keeping the chart updating live from the DStream, see the sketch at the
end of this mail.

HTH

view my Linkedin profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>

*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.


On Fri, 9 Apr 2021 at 14:30, Muhammed Favas <
favas.muham...@expeedsoftware.com> wrote:

> Hi,
>
> No, I am using normal Spark Streaming with the DStream API.
>
> *Regards,*
>
> *Favas*
>
> *From:* Mich Talebzadeh <mich.talebza...@gmail.com>
> *Sent:* Friday, April 9, 2021 18:18 PM
> *To:* Muhammed Favas <favas.muham...@expeedsoftware.com>
> *Cc:* user@spark.apache.org
> *Subject:* Re: How to use spark steaming data to plot live line chart
>
> Hi,
>
> Within the event-driven architecture, are you using Spark
> Structured Streaming with foreachBatch to pick up the streaming data?
>
> HTH
>
> view my Linkedin profile
> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
> On Fri, 9 Apr 2021 at 13:34, Muhammed Favas <
> favas.muham...@expeedsoftware.com> wrote:
>
> Hi,
>
> I have an application that collects streaming data and transforms it into
> a dataframe. Now I want to plot a live line chart using this data each
> time a new set of data arrives in the Spark RDD.
>
> Please suggest the best solution to implement this.
>
> *Regards,*
>
> *Favas*
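PS: since you said you are on the DStream API, here is a rough sketch of the
"live" part: redraw the chart inside foreachRDD, converting each micro-batch
to a Pandas DF in the same way as above. This is only an outline, not tested
against your job. "parsedStream" and the two column names (datetaken,
volumeOfSales) are placeholders for whatever your stream actually carries,
and it assumes the driver has a display for the matplotlib window.

import matplotlib.pyplot as plt
import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("liveChart").getOrCreate()

plt.ion()                      # interactive mode so the figure can be redrawn
fig, ax = plt.subplots()
history = []                   # Pandas DFs accumulated from each micro-batch

def update_chart(time, rdd):
    if rdd.isEmpty():
        return
    # micro-batch -> Spark DF -> Pandas DF, same idea as the batch example
    p_df = spark.createDataFrame(rdd, ["datetaken", "volumeOfSales"]).toPandas()
    history.append(p_df)
    all_df = pd.concat(history)
    ax.clear()                 # redraw the full line each batch (simple but robust)
    ax.plot(all_df["datetaken"], all_df["volumeOfSales"], marker='.')
    ax.set_xlabel("datetaken")
    ax.set_ylabel("volumeOfSales")
    fig.canvas.draw()
    plt.pause(0.001)           # let the GUI event loop refresh the window

# parsedStream stands for your existing DStream of (datetaken, volumeOfSales) tuples
parsedStream.foreachRDD(update_chart)

Note the foreachRDD callback runs on the driver, so the collect implied by
toPandas and the plotting all happen there; keep the micro-batches small
enough for that to be sensible.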