If you are plotting that many data points, you might want to look at 'hexbin' as a way of aggregating the values into a different presentation. It is especially useful when you are doing a scatter plot with a lot of data points and trying to make sense of them.
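A minimal sketch of the idea, with made-up data (the xbins value is just an illustrative choice):

    # Aggregate a large scatter plot into hexagonal bins instead of
    # drawing every point individually.
    library(hexbin)                  # install.packages("hexbin") if needed
    x <- rnorm(1e6)
    y <- x + rnorm(1e6)
    hb <- hexbin(x, y, xbins = 60)   # bin the points into a hexagonal grid
    plot(hb)                         # cell counts shown as a shading ramp

Drawing a few thousand hexagons is much faster than drawing a million points, and the count shading often shows structure that overplotted points hide.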
On Wed, Apr 27, 2011 at 5:16 AM, Jonathan Gabris <jonat...@k-m-p.nl> wrote:
> Hello,
>
> I am working on a project analysing the performance of motor vehicles
> through messages logged over a CAN bus.
>
> I am using R 2.12 on Windows XP and 7.
>
> I am currently plotting the data in R, overlaying 5 or more plots of data
> logged at 1 kHz (using plot.ts() and par(new = TRUE)).
> The aim is to be able to pan, zoom in and out, and read values from the
> plotted graph using a custom Qt interface that is used as a front end to
> R.exe (all of this works).
> The plot is drawn by R directly to the Windows graphics device.
>
> The data is imported from a .csv file (typically around 100 MB) into a
> matrix (timestamp, message ID, byte0, byte1, ..., byte7).
> I then separate this matrix into several by message ID (dimensions are on
> the order of 8 columns by 10^6 rows).
>
> The panning is done by redrawing the plots, shifted by a small amount, so
> as to view a window of data from a second to a minute long that can
> travel the length of the logged data.
>
> My problem is that the redrawing of the plots whilst panning is too slow
> when dealing with this much data,
> i.e. I can see the last graphs being drawn to the screen in the
> half-second following the view change.
> I need a fluid change from one view to the next.
>
> My question is this:
> Are there ways to speed up the plotting on the MS Windows display?
> By reducing plotted point densities to *sensible* values?
> By using something other than plot.ts() - is the lattice package faster?
> I don't need publication-quality plots; they can be rougher...
>
> I have tried:
> - Using matrices instead of data frames (works for calculations but not
>   enough for plots).
> - Increasing the maximum usable memory (max-mem-size) - no change.
> - Increasing the size of the pointer protection stack (max-ppsize) - no
>   change.
> - Deleting the unnecessary leftover matrices - no change.
> - I can't use lines() instead of plot() because of the very different
>   scales (rpm: 10000, flags: -1 to 3).
>
> I am going to do some resampling of the logged data to reduce the vector
> sizes (removal of *less* important data and use of window.ts()).
>
> But I am currently running out of ideas...
> So if somebody could point out something, I would be grateful.
>
> Thanks,
>
> Jonathan Gabris

--
Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
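On the scale problem quoted above: rescaling each channel to a common range lets a single plot() call set up the axes once, with the remaining channels added via lines(), instead of five or more overlaid plot() calls. Combined with simple thinning to reduce point density, this can cut redraw time considerably. A minimal sketch with made-up signals (the rescale() helper and the signal names are illustrative, not from the original post):

    # Rescale each series to [0, 1] so channels with very different
    # ranges (e.g. rpm vs. flags) can share one set of axes.
    rescale <- function(v) (v - min(v)) / (max(v) - min(v))

    tm    <- seq(0, 60, by = 0.001)        # 1 kHz timestamps, 60 s
    rpm   <- 5000 + 3000 * sin(tm)         # example channel, large range
    flags <- sample(-1:3, length(tm), replace = TRUE)  # small range

    # Thin to a *sensible* point density before drawing: on a screen a
    # few hundred pixels wide, every 20th sample of a 1 kHz log is plenty.
    idx <- seq(1, length(tm), by = 20)

    plot(tm[idx], rescale(rpm)[idx], type = "l", ylab = "scaled value")
    lines(tm[idx], rescale(flags)[idx], col = "red")

Rescaling over the full series (rather than over the thinned or panned subset) keeps the vertical placement of each trace stable as the view moves.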