Re: [R] Problem with converting grib file to excel

2024-09-30 Thread Roy Mendelssohn - NOAA Federal via R-help
I have corresponded with Javad off-line; I am posting this as a follow-up to close 
the issue.  There are two separate questions here.  The first is why the posted 
code below failed.  The second is whether there is an easy way to read in the 
values, given the oddities of his file (more on that below).  And yes, it turns 
out "terra" performs much better than "raster".

First, a little about GRIB files in general, and this file in particular.  GRIB 
files were designed to store grids in files with very small footprints, usually 
from model output.  GRIB files have the property that if I "cat" together a bunch 
of GRIB files, I still have a valid GRIB file.  So a given GRIB file often 
contains multiple parameters, and each of those through time.  To translate 
into the terms used by R spatial packages, each variable's grid at each time 
period will be seen as a "band" or a "layer" in the dataset.  Thus if I have 3 
parameters, say temperature, dew point and cloud cover, at 8670 time periods, 
R spatial packages will see 3*8670 layers or bands.

However, this GRIB file is clearly not a raw GRIB file made by a data provider, 
but rather an extract made with some tool.  This file contains 3 underlying 
time series: temperature, dew point and cloud cover, each of dimension (1, 1, 
8670), because each is defined at a single spatial point only.
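As a quick check of this structure, terra can report the dimensions and layer names directly (a minimal sketch, assuming the file is named "Met.grib" and sits in the working directory):

```r
library(terra)

r <- terra::rast("Met.grib")
dim(r)         # rows, columns, layers; for this file it should show 1, 1, 26010 (= 3*8670)
names(r)[1:5]  # the first few layer names, to see how the variables are labeled
```

Looking at the actual layer names here is the key step: the subsetting code below assumed the names contain "t2m", "d2m" and "tcc", and the names reported by this check show whether that assumption holds.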

So why did the code below fail?  raster::stack() read the file just fine, and 
the layer_names are correct, all 3*8670 of them.  The next steps:

>> # Extract layers based on layer names - adjust as necessary
>> t2m <- raster_data[[grep("t2m", layer_names)]]
>> d2m <- raster_data[[grep("d2m", layer_names)]]
>> tcc <- raster_data[[grep("tcc", layer_names)]]
>> valid_time <- raster_data[[grep("valid_time", layer_names)]]
>> t2m

which are aimed at subsetting the raster stack by variable name, all fail 
because "t2m", "d2m" and "tcc" are not in the layer_names.  So then of 
course everything else fails.

So the next question is how to read the file and get a data frame.  A first 
pass would be:

raster_stack <- raster::stack("Met.grib")
raster_data <- raster::getValues(raster_stack)

but you do not want to do that.  For whatever reason, the raster_stack created 
above is enormous, and raster::getValues() takes forever; I aborted after 
about an hour.  However, if you use "terra" instead:

raster_stack <- terra::rast("Met.grib")
raster_data <- terra::values(raster_stack)

it finishes in a flash.  raster_data is now one long array that needs to be 
reshaped:

raster_data <- array(raster_data, dim = c(3, 8670))

Now raster_data[1, ] contains the temperature series, raster_data[2, ] 
contains the dew point data, and raster_data[3, ] contains the cloud cover 
data.
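Putting the pieces together, one way to go from the GRIB file to a data frame ready for export is sketched below.  It rests on the same assumptions as above: the file is "Met.grib", it holds exactly 3 variables at 8670 time steps, and the layers interleave in the order temperature, dew point, cloud cover; verify those assumptions against your own file before trusting the result.

```r
library(terra)

raster_stack <- terra::rast("Met.grib")
raster_data  <- terra::values(raster_stack)

# reshape the long vector of cell values: 3 variables x 8670 time steps
raster_data <- array(raster_data, dim = c(3, 8670))

df <- data.frame(
  t2m = raster_data[1, ] - 273.15,  # Kelvin to Celsius
  d2m = raster_data[2, ] - 273.15,
  tcc = raster_data[3, ]
)

# terra::time() can often recover per-layer timestamps; whether it does
# depends on the file's metadata, so inspect the result before relying on it
# df$valid_time <- unique(terra::time(raster_stack))

head(df)
```

From here the original post's remaining steps (relative humidity, date columns, write.xlsx) apply unchanged.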

HTH,

-Roy

PS - GRIB files can be a bear.  The best GRIB readers in R are just front ends 
for either ecCodes or wgrib2, and while they work very well, installing 
ecCodes and wgrib2 can be non-trivial, so I didn't want to use them as a 
solution.  Also, for quick viewing of GRIB files there is NASA's Panoply 
(https://www.giss.nasa.gov/tools/panoply/), but that requires a Java 
installation.

> On Sep 23, 2024, at 11:31 PM, javad bayat  wrote:
> 
> Dear R users;
> I have downloaded a grib file format (Met.grib) and I want to export its
> data to excel file. Also I want to do some mathematic on some columns. But
> I got error. I would be more than happy if anyone can help me to do this. I
> have provided the codes and the Met.grib file in this email.
> Sincerely yours
> 
> # Load the necessary libraries
>> library(raster)  # For reading GRIB files
>> library(dplyr)   # For data manipulation
>> library(lubridate)   # For date manipulation
>> library(openxlsx)# For writing Excel files
> 
> # Specify the file paths
>> grib_file_path <- "C:/Users/Omrab_Lab/Downloads/Met.grib"
>> excel_file_path <- "C:/Users/Omrab_Lab/Downloads/Met_updated.xlsx"
> 
> # Open the GRIB file
>> raster_data <- stack(grib_file_path)
> 
> # Check the names of the layers to identify which ones to extract
>> layer_names <- names(raster_data)
>> print(layer_names)  # Prints
> 
> 
>> # Extract layers based on layer names - adjust as necessary
>> t2m <- raster_data[[grep("t2m", layer_names)]]
>> d2m <- raster_data[[grep("d2m", layer_names)]]
>> tcc <- raster_data[[grep("tcc", layer_names)]]
>> valid_time <- raster_data[[grep("valid_time", layer_names)]]
>> t2m
> class  : RasterStack
> nlayers: 0
> 
>> # Check if the raster layers are loaded correctly
>> if (is.null(t2m) || is.null(d2m) || is.null(tcc) || is.null(valid_time))
> {
> + stop("One or more raster layers could not be loaded. Please check the
> layer names.")
> + }
> 
>> # Convert raster values to vectors
>> t2m_values <- values(t2m)
> Error in dimnames(x) <- dn :
>  length of 'dimnames' [2] not equal to array extent
>> d2m_values <- values(d2m)
> Error in dimnames(x) <- dn :
>  length of 'dimnames' [2] not equal to array extent
>> tcc_values <-

Re: [R] Problem with converting grib file to excel

2024-09-26 Thread Roy Mendelssohn - NOAA Federal via R-help
The easiest would be to send the link to the site from which you downloaded the 
file.

Thanks,

-Roy

> On Sep 26, 2024, at 1:26 PM, CALUM POLWART  wrote:
> 
> Attachments CAN NOT be sent to group
> 
> 
> On Thu, 26 Sep 2024, 21:22 javad bayat,  wrote:
> Dear Roy,
> Sorry for my mistake, I thought I have uploaded the grib file.
> I really apologise for that. I will send it   on Saturday.
> Thank you very much.
> 
> On Thu, 26 Sept 2024, 17:40 Roy Mendelssohn - NOAA Federal, <
> roy.mendelss...@noaa.gov> wrote:
> 
> > Hi Javad:
> >
> > I know a lot about reading GRIB files,  I work with them all the time.
> > But if you don’t make the file available,  or point me to where I can
> > download it,  there is not much I can do.
> >
> > Thanks,
> >
> > -Roy
> >
> > On Sep 25, 2024, at 9:41 PM, javad bayat  wrote:
> >
> > Dear all;
> > Many thanks for your responses. Actually it is not completely a GIS file,
> > it is a data file which stores meteorological data of a specific region.
> > But the site allows downloading with grib format and as I searched to read
> > this type of file in R, I found the Raster Package.
> > In python it is possible to do this using cdsapi and xarray library, but I
> > am not familiar with python.
> > Sincerely
> >
> > On Thu, Sep 26, 2024 at 2:33 AM Roy Mendelssohn - NOAA Federal via R-help <
> > r-help@r-project.org> wrote:
> >
> >> At least for me the dataset file did not come through.  I will look at it
> >> if it can be made available.  It does look like the final step of reading
> >> the data into raster failed,  so then did the rest of the commands.
> >>
> >> -Roy
> >>
> >>
> >> > On Sep 25, 2024, at 3:24 PM, CALUM POLWART  wrote:
> >> >
> >> > Noticeable lack of silence in the group on this one.
> >> >
> >> > I've not got time to test currently. But my experience of geo location
> >> > files - they often had more than 2 dimensional data. In other words you
> >> > might have a boundary of a region as an object with long and lat for
> >> maybe
> >> > 100 data points making up the region. So 200 pieces of data. All held
> >> as a
> >> > list or something similar in a single "cell" as excel would refer to it.
> >> >
> >> > My gut feeling is that's likely to make export to excel difficult
> >> without
> >> > data carpentry first?
> >> >
> >> > On Tue, 24 Sep 2024, 21:26 Bert Gunter,  wrote:
> >> >
> >> >> You might try posting on r-sig-geo if you don't get a satisfactory
> >> >> response here. I assume there's a lot of expertise there on handling
> >> >> raster-type data.
> >> >>
> >> >> Cheers,
> >> >> Bert
> >> >>
> >> >> On Mon, Sep 23, 2024 at 11:31 PM javad bayat 
> >> wrote:
> >> >>>
> >> >>> Dear R users;
> >> >>> I have downloaded a grib file format (Met.grib) and I want to export
> >> its
> >> >>> data to excel file. Also I want to do some mathematic on some columns.
> >> >> But
> >> >>> I got error. I would be more than happy if anyone can help me to do
> >> >> this. I
> >> >>> have provided the codes and the Met.grib file in this email.
> >> >>> Sincerely yours
> >> >>>
> >> >>> # Load the necessary libraries
> >> >>>> library(raster)  # For reading GRIB files
> >> >>>> library(dplyr)   # For data manipulation
> >> >>>> library(lubridate)   # For date manipulation
> >> >>>> library(openxlsx)# For writing Excel files
> >> >>>
> >> >>> # Specify the file paths
> >> >>>> grib_file_path <- "C:/Users/Omrab_Lab/Downloads/Met.grib"
> >> >>>> excel_file_path <- "C:/Users/Omrab_Lab/Downloads/Met_updated.xlsx"
> >> >>>
> >> >>> # Open the GRIB file
> >> >>>> raster_data <- stack(grib_file_path)
> >> >>>
> >> >>> # Check the names of the layers to identify which ones to extract
> >> >>>> layer_names <- names(raster_data)
> >> >>>> print(layer_names)  # Prints
> >> >>>
> >> >>>

Re: [R] Problem with converting grib file to excel

2024-09-26 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Javad:

I know a lot about reading GRIB files,  I work with them all the time.  But if 
you don’t make the file available,  or point me to where I can download it,  
there is not much I can do.

Thanks,

-Roy

> On Sep 25, 2024, at 9:41 PM, javad bayat  wrote:
> 
> Dear all;
> Many thanks for your responses. Actually it is not completely a GIS file, it 
> is a data file which stores meteorological data of a specific region. But the 
> site allows downloading with grib format and as I searched to read this type 
> of file in R, I found the Raster Package.
> In python it is possible to do this using cdsapi and xarray library, but I am 
> not familiar with python.
> Sincerely
> 
> On Thu, Sep 26, 2024 at 2:33 AM Roy Mendelssohn - NOAA Federal via R-help 
> mailto:r-help@r-project.org>> wrote:
>> At least for me the dataset file did not come through.  I will look at it if 
>> it can be made available.  It does look like the final step of reading the 
>> data into raster failed,  so then did the rest of the commands.
>> 
>> -Roy
>> 
>> 
>> > On Sep 25, 2024, at 3:24 PM, CALUM POLWART > > <mailto:polc1...@gmail.com>> wrote:
>> > 
>> > Noticeable lack of silence in the group on this one.
>> > 
>> > I've not got time to test currently. But my experience of geo location
>> > files - they often had more than 2 dimensional data. In other words you
>> > might have a boundary of a region as an object with long and lat for maybe
>> > 100 data points making up the region. So 200 pieces of data. All held as a
>> > list or something similar in a single "cell" as excel would refer to it.
>> > 
>> > My gut feeling is that's likely to make export to excel difficult without
>> > data carpentry first?
>> > 
>> > On Tue, 24 Sep 2024, 21:26 Bert Gunter, > > <mailto:bgunter.4...@gmail.com>> wrote:
>> > 
>> >> You might try posting on r-sig-geo if you don't get a satisfactory
>> >> response here. I assume there's a lot of expertise there on handling
>> >> raster-type data.
>> >> 
>> >> Cheers,
>> >> Bert
>> >> 
>> >> On Mon, Sep 23, 2024 at 11:31 PM javad bayat > >> <mailto:j.bayat...@gmail.com>> wrote:
>> >>> 
>> >>> Dear R users;
>> >>> I have downloaded a grib file format (Met.grib) and I want to export its
>> >>> data to excel file. Also I want to do some mathematic on some columns.
>> >> But
>> >>> I got error. I would be more than happy if anyone can help me to do
>> >> this. I
>> >>> have provided the codes and the Met.grib file in this email.
>> >>> Sincerely yours
>> >>> 
>> >>> # Load the necessary libraries
>> >>>> library(raster)  # For reading GRIB files
>> >>>> library(dplyr)   # For data manipulation
>> >>>> library(lubridate)   # For date manipulation
>> >>>> library(openxlsx)# For writing Excel files
>> >>> 
>> >>> # Specify the file paths
>> >>>> grib_file_path <- "C:/Users/Omrab_Lab/Downloads/Met.grib"
>> >>>> excel_file_path <- "C:/Users/Omrab_Lab/Downloads/Met_updated.xlsx"
>> >>> 
>> >>> # Open the GRIB file
>> >>>> raster_data <- stack(grib_file_path)
>> >>> 
>> >>> # Check the names of the layers to identify which ones to extract
>> >>>> layer_names <- names(raster_data)
>> >>>> print(layer_names)  # Prints
>> >>> 
>> >>> 
>> >>>> # Extract layers based on layer names - adjust as necessary
>> >>>> t2m <- raster_data[[grep("t2m", layer_names)]]
>> >>>> d2m <- raster_data[[grep("d2m", layer_names)]]
>> >>>> tcc <- raster_data[[grep("tcc", layer_names)]]
>> >>>> valid_time <- raster_data[[grep("valid_time", layer_names)]]
>> >>>> t2m
>> >>> class  : RasterStack
>> >>> nlayers: 0
>> >>> 
>> >>>> # Check if the raster layers are loaded correctly
>> >>>> if (is.null(t2m) || is.null(d2m) || is.null(tcc) ||
>> >> is.null(valid_time))
>> >>> {
>> >>> + stop("One or more raster layers could not be loaded. Please check
>> >> the

Re: [R] Problem with converting grib file to excel

2024-09-25 Thread Roy Mendelssohn - NOAA Federal via R-help
At least for me the dataset file did not come through.  I will look at it if it 
can be made available.  It does look like the final step of reading the data 
into raster failed, and then so did the rest of the commands.

-Roy


> On Sep 25, 2024, at 3:24 PM, CALUM POLWART  wrote:
> 
> Noticeable lack of silence in the group on this one.
> 
> I've not got time to test currently. But my experience of geo location
> files - they often had more than 2 dimensional data. In other words you
> might have a boundary of a region as an object with long and lat for maybe
> 100 data points making up the region. So 200 pieces of data. All held as a
> list or something similar in a single "cell" as excel would refer to it.
> 
> My gut feeling is that's likely to make export to excel difficult without
> data carpentry first?
> 
> On Tue, 24 Sep 2024, 21:26 Bert Gunter,  wrote:
> 
>> You might try posting on r-sig-geo if you don't get a satisfactory
>> response here. I assume there's a lot of expertise there on handling
>> raster-type data.
>> 
>> Cheers,
>> Bert
>> 
>> On Mon, Sep 23, 2024 at 11:31 PM javad bayat  wrote:
>>> 
>>> Dear R users;
>>> I have downloaded a grib file format (Met.grib) and I want to export its
>>> data to excel file. Also I want to do some mathematic on some columns.
>> But
>>> I got error. I would be more than happy if anyone can help me to do
>> this. I
>>> have provided the codes and the Met.grib file in this email.
>>> Sincerely yours
>>> 
>>> # Load the necessary libraries
 library(raster)  # For reading GRIB files
 library(dplyr)   # For data manipulation
 library(lubridate)   # For date manipulation
 library(openxlsx)# For writing Excel files
>>> 
>>> # Specify the file paths
 grib_file_path <- "C:/Users/Omrab_Lab/Downloads/Met.grib"
 excel_file_path <- "C:/Users/Omrab_Lab/Downloads/Met_updated.xlsx"
>>> 
>>> # Open the GRIB file
 raster_data <- stack(grib_file_path)
>>> 
>>> # Check the names of the layers to identify which ones to extract
 layer_names <- names(raster_data)
 print(layer_names)  # Prints
>>> 
>>> 
 # Extract layers based on layer names - adjust as necessary
 t2m <- raster_data[[grep("t2m", layer_names)]]
 d2m <- raster_data[[grep("d2m", layer_names)]]
 tcc <- raster_data[[grep("tcc", layer_names)]]
 valid_time <- raster_data[[grep("valid_time", layer_names)]]
 t2m
>>> class  : RasterStack
>>> nlayers: 0
>>> 
 # Check if the raster layers are loaded correctly
 if (is.null(t2m) || is.null(d2m) || is.null(tcc) ||
>> is.null(valid_time))
>>> {
>>> + stop("One or more raster layers could not be loaded. Please check
>> the
>>> layer names.")
>>> + }
>>> 
 # Convert raster values to vectors
 t2m_values <- values(t2m)
>>> Error in dimnames(x) <- dn :
>>>  length of 'dimnames' [2] not equal to array extent
 d2m_values <- values(d2m)
>>> Error in dimnames(x) <- dn :
>>>  length of 'dimnames' [2] not equal to array extent
 tcc_values <- values(tcc)
>>> Error in dimnames(x) <- dn :
>>>  length of 'dimnames' [2] not equal to array extent
 valid_time_values <- values(valid_time)
>>> Error in dimnames(x) <- dn :
>>>  length of 'dimnames' [2] not equal to array extent
>>> 
>>> # Check for NA values and dimensions
>>> if (any(is.na(t2m_values)) || any(is.na(d2m_values)) || any(is.na
>> (tcc_values))
>>> || any(is.na(valid_time_values))) {
>>>  warning("One or more layers contain NA values. These will be removed.")
>>> }
>>> 
>>> # Create the data frame, ensuring no NA values are included
>>> df <- data.frame(
>>>  t2m = t2m_values,
>>>  d2m = d2m_values,
>>>  tcc = tcc_values,
>>>  valid_time = valid_time_values,
>>>  stringsAsFactors = FALSE
>>> )
>>> 
>>> # Remove rows with NA values
>>> df <- na.omit(df)
>>> 
>>> # Convert temperatures from Kelvin to Celsius
>>> df$t2m <- df$t2m - 273.15
>>> df$d2m <- df$d2m - 273.15
>>> 
>>> # Calculate relative humidity
>>> calculate_relative_humidity <- function(t2m, d2m) {
>>>  es <- 6.112 * exp((17.67 * t2m) / (t2m + 243.5))
>>>  e <- 6.112 * exp((17.67 * d2m) / (d2m + 243.5))
>>>  rh <- (e / es) * 100
>>>  return(rh)
>>> }
>>> df$RH <- calculate_relative_humidity(df$t2m, df$d2m)
>>> 
>>> # Convert valid_time from numeric to POSIXct assuming it's in seconds
>> since
>>> the epoch
>>> df$valid_time <- as.POSIXct(df$valid_time, origin = "1970-01-01")
>>> 
>>> # Extract year, month, day, and hour from valid_time
>>> df$Year <- year(df$valid_time)
>>> df$Month <- month(df$valid_time)
>>> df$Day <- day(df$valid_time)
>>> df$Hour <- hour(df$valid_time)
>>> 
>>> # Select only the desired columns
>>> df_selected <- df %>% select(Year, Month, Day, Hour, tcc, t2m, RH)
>>> 
>>> # Save the updated DataFrame to an Excel file
>>> write.xlsx(df_selected, excel_file_path, row.names = FALSE)
>>> 
>>> 
>>> 
>>> 
>>> 
>>> 
>>> --
>>> Best Regards
>>> Javad Bayat
>>> M.Sc. Environment Engineering
>>> Alternative Mail: bayat...@yahoo.com
>>> 

Re: [R] Problem with combining monthly nc files into a yearly file (era5 climate data)

2024-06-21 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Leni:

You forgot to post the important part: the errors you have been getting, and 
whether you have isolated them to particular lines in the code.

HTH,

-Roy


> On Jun 21, 2024, at 3:59 AM, Leni Koehnen via R-help  
> wrote:
> 
> Dear R-help List, 
> 
> I am currently trying to run a code which is available on Zenodo  
> (https://zenodo.org/records/10997880  - 02_MicroClimModel.R).
> 
> The code downloads yearly era5 climate data. Unfortunately, the limit to 
> download these nc-files was recently reduced to 6. Therefore, I can not 
> download the yearly file anymore. I have solved this by rewriting the code, 
> so that it downloads 12 monthly files. 
> 
> However, I have not been able to combine these 12 monthly nc-files into one 
> yearly file. The code gives me errors if I continue running it. I assume that 
> the combination was not successful and might have messed up the format. I 
> would greatly appreciate any advice on how to convert these monthly nc-files 
> into one yearly file.
> 
> Thank you very much in advance!
> 
> Here is the full code: 
> 
> 
> ' *
> #' ~~ STEP 01 DOWNLOADING & PROCESSING HOURLY CLIMATE DATA 
> 
> # Install the remotes package if not already installed
> if (!requireNamespace("remotes", quietly = TRUE)) {
>  install.packages("remotes")
> }
> # Install packages from CRAN
> install.packages(c("terra", "raster", "ncdf4", "lubridate"))
> install.packages("lutz")
> #install dependencies for microclima
> remotes::install_github("ropensci/rnoaa")
> 
> # Install packages from GitHub
> remotes::install_github("dklinges9/mcera5")
> remotes::install_github("ilyamaclean/microclima")
> remotes::install_github("ilyamaclean/microclimf")
> 
> #' ~~ Required libraries:
> require(terra)
> require(raster)
> require(mcera5) # https://github.com/dklinges9/mcera5
> require(ncdf4)
> require(microclima) # https://github.com/ilyamaclean/microclima
> require(microclimf) # https://github.com/ilyamaclean/microclimf
> require(ecmwfr)
> require(lutz)
> require(lubridate)
> 
> # Set paths and year of interest
> pathtodata <- "F:/Dat/"
> pathtoera5 <- paste0(pathtodata, "era5/")
> year <- 2019
> 
> # Set user credentials for CDS API (you have to first register and insert 
> here your UID and API key at https://cds.climate.copernicus.eu/user/register 
> and allow downloads)
> uid <- "xxx"
> cds_api_key <- "xxx"
> ecmwfr::wf_set_key(user = uid, key = cds_api_key, service = "cds")
> 
> # Define the spatial extent for your tile
> xmn <- 18.125
> xmx <- 22.875
> ymn <- -1.625
> ymx <- 1.875
> 
> #HERE STARTS THE SECTION WHERE I AM DOWNLOADING MONTHLY FILES 
> 
> # Define the temporal extent of the run
> start_time <- lubridate::ymd(paste0(year, "-01-01"))
> end_time <- lubridate::ymd(paste0(year, "-12-31"))
> 
> # Function to build and send requests for each month
> request_era5_monthly <- function(year, month, uid, xmn, xmx, ymn, ymx, 
> out_path) {
>  # Define the start and end times for the month
>  st_time <- lubridate::ymd(paste0(year, "-", sprintf("%02d", month), "-01"))
>  en_time <- st_time + months(1) - days(1)
> 
>  # Create the file prefix and request
>  file_prefix <- paste0("era5_reanalysis_", year, "_", sprintf("%02d", month))
>  req <- build_era5_request(xmin = xmn, xmax = xmx, ymin = ymn, ymax = ymx, 
>start_time = st_time, end_time = en_time, 
> outfile_name = file_prefix)
> 
>  # Send the request and save the data
>  request_era5(request = req, uid = uid, out_path = out_path, overwrite = TRUE)
> }
> 
> # Loop over each month and request data
> for (month in 1:12) {
>  request_era5_monthly(year, month, uid, xmn, xmx, ymn, ymx, pathtoera5)
> }
> 
> #HERE I AM EXPLORING ONE EXEMPLARY MONTHLY NC FILE
> 
> file_path <- paste0(pathtoera5, "era5_reanalysis_2019_01_2019.nc")
> nc <- nc_open(file_path)
> 
> # List all variables
> print(nc)
> 
> # List all variable names in the NetCDF file
> var_names <- names(nc$var)
> print(var_names)
> 
> checkJan <- raster(paste0(pathtoera5, "era5_reanalysis_2019_01_2019.nc"))
> print(checkJan)  
> opencheckJan <- getValues(checkJan)
> opencheckJan
> 
> #HERE IS THE PROBLEM, I AM TRYING TO COMBINE THESE MONTHL NC FILES 
> 
> combine_era5_yearly <- function(year, pathtoera5, outfile) {
>  # List of monthly files
>  monthly_files <- list.files(pathtoera5, pattern = paste0("era5_reanalysis_", 
> year, "_\\d{2}_", year, "\\.nc"), full.names = TRUE)
> 
>  if (length(monthly_files) == 0) {
>stop("No monthly files found")
>  }
> 
>  # Initialize lists to store data
>  lons <- NULL
>  lats <- NULL
>  time <- NULL
>  t2m <- list()
>  d2m <- list()
>  sp <- list()
>  u10 <- list()
>  v10 <- list()
>  tp <- list()
>  tcc <- list()
>  msnlwrf <- list()
>  msdwlwrf <- list()
>  fdir <- list()
>  ssrd <- list()
>  lsm <- list()
> 
>  # Read each monthly file and extract variables
>  for (file in monthly_files) {
>nc <- nc_open(file)
> 
>if (is.null(lons)) {
>  
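The quoted code above (truncated in the archive) takes the manual route of copying every variable out of each monthly file with ncdf4.  For the underlying task of merging 12 monthly nc files into one yearly file, a much shorter alternative is to let terra stack the files and write a single yearly NetCDF.  This is a sketch, assuming the monthly files share the same extent, resolution and variables; the file-name pattern follows the quoted code and the output name is illustrative:

```r
library(terra)

pathtoera5 <- "F:/Dat/era5/"
year <- 2019

# the 12 monthly files, sorted into calendar order
monthly_files <- sort(list.files(
  pathtoera5,
  pattern = paste0("era5_reanalysis_", year, "_\\d{2}_", year, "\\.nc"),
  full.names = TRUE
))
if (length(monthly_files) == 0) stop("No monthly files found")

# rast() on a vector of files concatenates their layers along time
yearly <- terra::rast(monthly_files)

# write one yearly NetCDF (output name is illustrative)
terra::writeCDF(yearly,
                paste0(pathtoera5, "era5_reanalysis_", year, ".nc"),
                overwrite = TRUE)
```

Whether the downstream mcera5/microclimf steps accept this file depends on the variable and dimension names they expect, so compare the yearly file's structure (e.g. with nc_open() and print()) against an original yearly download before proceeding.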

[R] Fwd: Webinar: How to access ERDDAP data using R

2024-06-11 Thread Roy Mendelssohn - NOAA Federal via R-help



> Note  |  June 2024
> 
> Webinar announcement
> 
> How to access ERDDAP data using R
> 
> 20 June 2024 | 18:00 CEST
> Online
> 
> This seminar will demonstrate the use of ERDDAP™ in R: using multiple 
> ERDDAP™ servers to pull together collocated datasets, extracting detailed 
> metadata, and accessing ERDDAP™-hosted data using R.
> 
> This webinar is a follow-up to the webinar “How to ERDDAP™” using Python.
> 
> Future seminars:
> July - Using ERDDAP to track usage metrics
> August - Abstracting across ERDDAP
> September - Writing data into ERDDAP
> 
> Webinar registration | Flyer | Access previous ERDDAP webinars
> The Global Ocean Observing System is sponsored by
> the Intergovernmental Oceanographic Commission of UNESCO,
> the World Meteorological Organization, the United Nations Environment 
> Programme, and the International Science Council.
> 
> Global Ocean Observing System 
> Intergovernmental Oceanographic Commission
> UNESCO
> 7 place de Fontenoy
> 75352 Paris 07-SP
> France
> 
> g...@unesco.org 
> 
> For inquiries about the GOOS mailing list please contact 
> Laura Stukonytė l.stukon...@unesco.org 
>  
>  
> © 2024 Global Ocean Observing System. All rights reserved.
>  
>  
> 
> 
> 
> -- 
> 
> 
> 
> 
> Ann-Christine Zinkann, PhD (she/her)
> Program Manager
> National Oceanic and Atmospheric Administration 
> Global Ocean Monitoring & Observing Program 
> and Cooperative Programs for the Advancement of Earth System Science 
> , University Corporation for Atmospheric Research
> 803.904.8291 | ann-christine.zink...@noaa.gov 
**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: https://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.


[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Need help testing a problem

2024-02-01 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Rui:

First and foremost, thanks for taking the time to do this.  That was a typo; 
it was httr and is not needed, I was careless.  That request works fine on a 
Mac and a Linux box.  We needed to check whether there was something about our 
internal network settings, our Windows setup or something else that was causing 
the problem.  Bill Dunlop got the same results, so it is something else.  We do 
not know how much Akamai is interfering with this; we have had issues with 
folks running Windows behind Cloudflare that caused problems.

Again, thanks for taking the time; it eliminates one possible cause.

-Roy



> On Feb 1, 2024, at 2:16 AM, Rui Barradas  wrote:
> 
> Às 23:47 de 31/01/2024, Roy Mendelssohn - NOAA Federal via R-help escreveu:
>> HI All:
>> We are trying to figure out a problem that is occurring with a package,  and 
>> we need a non-NOAA person with a Windows computer with the latest R to test 
>> for us what is failing (but works on Macs and Linux from different sites).  
>> Part of the problem is there is an Akamai service in between that we feel 
>> may be messing up things. To do this you would need to be willing to:
>> 1.  install the ‘rerddap’ package from CRAN,  with all dependencies - or if 
>> installed make certain it is the latest
>> 2. make certain that the packages ‘curl’, ‘crul’ and ‘http’ are up to date.
>> 3.  Run the following:
>> # load rerddap
>> library("rerddap”)
>> # delete rerddap cache just in case
>> cache_delete_all()
>> # get the data extract
>> dat <- tabledap('FED_JSATS_detects', url = 
>> "https://oceanview.pfeg.noaa.gov/erddap/";, 'study_id="RBDD_2018"', callopts 
>> = list(verbose = TRUE), store = memory())
>> 4. If it works also do:
>>  head(dat)
>> Either way if all the output could be sent to me it would be very helpful.  
>> The download may take a couple of minutes,
>> the file size is about 3.5MB.
>> If anyone has concerns about what is being downloaded you can check the 
>> ‘rerddap’ docs to see these are all legitimate ‘rerddap' commands,  the 
>> datasets being accessed is 
>> https://oceanview.pfeg.noaa.gov/erddap/tabledap/FED_JSATS_detects.html.  The 
>> downloaded file goes to R temp space, and its location,  if successful is 
>> given in the output.
>> Thanks to anyone willing to help.  One of my issues is I have been unable to 
>> reproduce the error,  but I also do not have ready access to a Windows 
>> machine.
>> -ROy
>>  [[alternative HTML version deleted]]
>> __
>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
> Hello,
> 
> Tested with R 4.3.2 running on Windows 11.
> 
> 1. You mention package http but there is no package http on CRAN, did you 
> mean package httr?
> 
> 2. I wrote a script rhelp.R, ran it on a command window and here is what I 
> got on the command window output.
> 
> 
> 
> C:\Users\ruipb\Documents>R -q -f rhelp.R > rhelp_out.txt
> * WARNING: failed to open cookie file ""
> * Found bundle for host: 0x21cb7410670 [serially]
> * Can not multiplex, even if we wanted to
> * Re-using existing connection with host oceanview.pfeg.noaa.gov
> > GET /erddap/tabledap/FED_JSATS_detects.csv?&study_id%3D%22RBDD_2018%22 
> > HTTP/1.1
> Host: oceanview.pfeg.noaa.gov
> User-Agent: libcurl/8.3.0 r-curl/5.2.0 crul/1.4.0
> Accept-Encoding: gzip, deflate
> Accept: application/json, text/xml, application/xml, */*
> 
> < HTTP/1.1 504 Gateway Time-out
> < Server: AkamaiGHost
> < Mime-Version: 1.0
> < Content-Type: text/html
> < Content-Length: 176
> < Expires: Thu, 01 Feb 2024 09:56:40 GMT
> < Date: Thu, 01 Feb 2024 09:56:40 GMT
> < Connection: keep-alive
> < Strict-Transport-Security: max-age=31536000
> <
> * Connection #0 to host oceanview.pfeg.noaa.gov left intact
> 
> 
> 3. The file rhelp_out.txt was created and its contents are the following. (I 
> installed package "hoardr" separately because for some reason it was not 
> being recognized as a dependency)
> 
> 
> 
> > # pkgs <- c("hoardr", "rerddap", "curl", "crul", "httr")
> > # utils::install.packages(
> > # pkgs = pkgs,
> > # lib = .libPaths()[1],
> > # repos = "https://cloud.r-project.org";,
> > # dependencies = TRUE
> > 

[R] Need help testing a problem

2024-01-31 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi All:

We are trying to figure out a problem that is occurring with a package, and we 
need a non-NOAA person with a Windows computer and the latest R to test for us 
what is failing (it works on Macs and Linux from different sites).  Part of 
the problem is that there is an Akamai service in between that we feel may be 
interfering.  To do this you would need to be willing to:

1.  install the ‘rerddap’ package from CRAN,  with all dependencies - or if 
installed make certain it is the latest

2. make certain that the packages ‘curl’, ‘crul’ and ‘http’ are up to date.  

3.  Run the following:

# load rerddap
library("rerddap")
# delete rerddap cache just in case
cache_delete_all()
# get the data extract
dat <- tabledap('FED_JSATS_detects', url = 
"https://oceanview.pfeg.noaa.gov/erddap/";, 'study_id="RBDD_2018"', callopts = 
list(verbose = TRUE), store = memory())

4. If it works also do:

 head(dat)

Either way, if all the output could be sent to me, it would be very helpful.  
The download may take a couple of minutes; the file size is about 3.5 MB.

If anyone has concerns about what is being downloaded, you can check the 
‘rerddap’ docs to see that these are all legitimate ‘rerddap’ commands; the 
dataset being accessed is 
https://oceanview.pfeg.noaa.gov/erddap/tabledap/FED_JSATS_detects.html.  The 
downloaded file goes to R temp space, and its location, if successful, is given 
in the output.

Thanks to anyone willing to help.  One of my issues is that I have been unable 
to reproduce the error, but I also do not have ready access to a Windows machine.

-Roy
[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Help request: Parsing docx files for key words and appending to a spreadsheet

2023-12-29 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Andy:

I don’t have an answer but I do have what I hope is some friendly advice.  
Generally the more information you can provide,  the more likely you will get 
help that is useful.  In your case you say that you tried several packages and 
they didn’t do what you wanted.  Providing that code,  as well as why they 
didn’t do what you wanted (be specific)  would greatly facilitate things.

Happy new year,

-Roy
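As a starting point for the keyword step described in the message quoted below:  once the text of a "Subject" section has been read in (e.g. with a package such as 'officer'),  the keyword/percentage pairs can be parsed with base R regular expressions.  A sketch only,  using made-up sample strings since I don't have a Lexis+ file at hand:

```r
# hypothetical 'Subject' entries in the "KEYWORD (NN%)" form described below
subjects <- c("FAST FASHION (72%)", "RETAIL (31%)", "CLIMATE CHANGE (55%)")

# capture the keyword text and the percentage separately
m <- regexec("^(.*) \\((\\d+)%\\)$", subjects)
parts <- regmatches(subjects, m)

keywords <- vapply(parts, function(p) p[2], character(1))
coverage <- as.numeric(vapply(parts, function(p) p[3], character(1)))

# keep only keywords meeting the >= 50% coverage threshold
keep <- coverage >= 50
keywords[keep]
```

From there each kept keyword and its coverage can be appended as a row of a data frame and written out with write.csv().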


> On Dec 29, 2023, at 10:14 AM, Andy  wrote:
> 
> Hello
> 
> I am trying to work through a problem, but feel like I've gone down a rabbit 
> hole. I'd very much appreciate any help.
> 
> The task: I have several directories of multiple (some directories, up to 
> 2,500+) *.docx files (newspaper articles downloaded from Lexis+) that I want 
> to iterate through to append to a spreadsheet only those articles that 
> satisfy a condition (i.e., a specific keyword is present for >= 50% coverage 
> of the subject matter). Lexis+ has a very specific structure and keywords are 
> given in the row "Subject".
> 
> I'd like to be able to accomplish the following:
> 
> (1) Append the title, the month, the author, the number of words, and page 
> number(s) to a spreadsheet
> 
> (2) Read each article and extract keywords (in the docs, these are listed in 
> 'Subject' section as a list of keywords with a percentage showing the extent 
> to which the keyword features in the article (e.g., FAST FASHION (72%)) and 
> to append the keyword and the % coverage to the same row in the spreadsheet. 
> However, I want to ensure that the keyword coverage meets the threshold of >= 
> 50%; if not, then pass onto the next article in the directory. Rinse and 
> repeat for the entire directory.
> 
> So far, I've tried working through some Stack Overflow-based solutions, but 
> most seem to use the textreadr package, which is now deprecated; others use 
> either the officer or the officedown packages. However, these packages don't 
> appear to do what I want the program to do, at least not in any of the 
> examples I have found, nor in the vignettes and relevant package manuals I've 
> looked at.
> 
> The first point is, is what I am intending to do even possible using R? If it 
> is, then where do I start with this? If these docx files were converted to 
> UTF-8 plain text, would that make the task easier?
> 
> I am not a confident coder, and am really only just getting my head around R 
> so appreciate a steep learning curve ahead, but of course, I don't know what 
> I don't know, so any pointers in the right direction would be a big help.
> 
> Many thanks in anticipation
> 
> Andy
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Re: [R] strptime with +03:00 zone designator

2023-11-05 Thread Roy Mendelssohn - NOAA Federal via R-help
what if you try lubridate::as_datetime('2017-02-28T13:35:00+03:00')
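If you want to stay in base R,  one workaround is to strip the colon from the zone designator and then use %z.  A sketch (assuming a platform where %z is supported on input):

```r
x <- "2017-02-28T13:35:00+03:00"

# drop the colon inside the ISO 8601 extended-format offset so %z can parse it
x2 <- sub("([+-][0-9]{2}):([0-9]{2})$", "\\1\\2", x)

tm <- as.POSIXct(x2, format = "%Y-%m-%dT%H:%M:%S%z", tz = "UTC")
tm
# 13:35 at +03:00 is 10:35 UTC
```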

-Roy


> On Nov 5, 2023, at 3:45 PM, Richard O'Keefe  wrote:
> 
> I have some data that includes timestamps like this:
> 2017-02-28T13:35:00+03:00
> The documentation for strptime says that %z expects
> an offset like 0300.  I don't see any way in the documentation
> to get it to accept +hh:mm with a colon separator, and
> everything I tried gave me NA as the answer.
> 
> Section 4.2.5.1 of ISO 8601:2004(E) allows both the
> absence of colons in +hh[mm] (basic format) and the
> presence of colons in +hh:mm (extended format).
> Again in section 4.2.5.2 where a zone offset is combined
> with a time of day: if you have hh:mm:ss you are using
> extended format and the offset MUST have a colon; if
> you have hhmmss you are using basic format and the
> offset MUST NOT have a colon.  And again in section
> 4.3.2 (complete representations of date and time of day).
> If you use hyphens and colons in the date and time part
> you MUST have a colon in the zone designator.
> 
> So I am dealing with timestamps in strict ISO 8601
> complete extended representation, and it is rather
> frustrating that strptime doesn't deal with it simply.
> 
> The simplest thing would be for R's own version of
> strptime to allow an optional colon between the hour
> digits and the minute digits of a zone designator.
> 
> I'm about to clone the data source and edit it to
> remove the colons, but is there something obvious
> I am missing?
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Re: [R] aniMoutm/foiegras Assistance

2023-01-29 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Zac:

Two suggestions.  First, in the map statement set silent = FALSE;  hopefully that 
will give you more info.  The second is to try what is suggested in the error 
message:  "Try simplifying the model with the following argument: map = 
list(rho_o = factor(NA))".  I don't know what rho_o does,  but it might be a 
first step to figuring things out.

HTH,

-Roy


> On Jan 27, 2023, at 12:42 PM, ZAC WARHAM  
> wrote:
> 
> Hi all, I am receiving the error *Newton failed to find minimum* whilst
> trying to fit a move persistence model in this package. I think the error
> is related to the optimiser in the TMB package but it is beyond my
> statistics/coding knowledge to track down the specific cause and solution.
> I would appreciate it if anyone knew some resources to assist with fixing
> this. I have put a MRE and sample data over on Stackoverflow if anyone
> wants a closer look -
> https://stackoverflow.com/questions/75253642/how-do-i-fix-newton-failed-to-find-minimum-in-r
> 
> --
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: https://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.



Re: [R] categorizing data

2022-05-29 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Janet:

here is a start to give you the idea;  now you need to loop,  either with a "for" 
loop or one of the apply functions.

1.  Preallocate the new data  (I am lazy so it is an array,  for example of size 
two by three).

2.  Order the data and set the values.

junk <- array(0, dim = c(2, 3))
values <- c(10, 30, 50)
junk[1, order(c(32, 11, 17))] <- values
junk[1, ]
[1] 50 10 30


This works because order() returns the index of the ordering, not the values.
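Putting that together for the whole data frame,  a row-wise version might look like this (a sketch,  assuming the three columns are numeric and there are no ties within a row):

```r
# example data in the shape Janet described; each row sums to 90
veg <- data.frame(tree  = c(32, 23, 49),
                  shrub = c(11, 41, 23),
                  grass = c(47, 26, 18))
values <- c(10, 30, 50)   # low, medium, high

# for each row, the smallest value becomes 10, the middle 30, the largest 50
new_veg <- t(apply(veg, 1, function(row) {
  out <- numeric(length(row))
  out[order(row)] <- values
  out
}))
colnames(new_veg) <- colnames(veg)
new_veg
```

Every row of the result still sums to 90.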

HTH,

-Roy
> On May 29, 2022, at 1:31 PM, Janet Choate  wrote:
> 
> I'm sorry if this has come across as a homework assignment! I was trying to
> provide a simple example.
> There are actually 38323 rows of data, each row is an observation of the
> percent that each of those veg types occupies in a spatial unit - where
> each line adds to 90 - and values are different every line.
> I need a way to categorize the data, so I can reduce the number of unique
> observations.
> 
> So instead of 38323 unique observations - I can reduce this to
> X number of High/Med/Low
> X number of Med/Low/High
> X number of Low/High/Med
> etc... for all combinations
> 
> I hope this makes it more clear..
> thank you all for your responses,
> JC
> 
> On Sun, May 29, 2022 at 1:16 PM Avi Gross via R-help 
> wrote:
> 
>> Tom,
>> You may have a very different impression of what was asked! LOL!
>> Unless Janet clarifies what seems a bit like a homework assignment, it
>> seems to be a fairly simple and straightforward assignment with exactly
>> three rows/columns and asking how to replace the variables, in a sense, by
>> finding the high and low and perhaps thus identifying the medium, but to do
>> this for each row without changing the order of the resulting data.frame.
>> I note most techniques people have used focus on columns, not rows, but an
>> all-numeric data.frame can be transposed, or converted to a matrix and
>> later converted back.
>> If this is HW, the question becomes what has been taught so far and is
>> supposed to be used in solving it. Can they make their own functions
>> perhaps to be called three times, once per row or column, to replace that
>> row/column, or can they use some form of loop to iterate over the columns?
>> Does it need to sort of be done in place or can they create gradually a
>> second data.frame and then move the pointer to it and lots of other similar
>> ideas.
>> I am not sure, other than as a HW assignment, why this transformation
>> would need to be done but of course, there may well be a reason.
>> I note that the particular example shown just happens to create almost a
>> magic square as the sum of rows and columns and the major diagonal happen
>> to be 0, albeit the reverse diagonal is all 50's.
>> Again, there are many solutions imaginable but the goal may be more
>> specific and I shudder to supply one given that too often questions here
>> are not detailed enough and are misunderstood. In this case, I thought I
>> understood until I saw what Tom wrote! LOL!
>> I will add this. Is it guaranteed that no two items in the same row are
>> never equal or is there some requirement for how to handle a tie? And note
>> there are base R functions called min() and max() and you can ask for
>> things like:
>> 
>> if ( current == min(mydata[1,])) ...
>> 
>> 
>> -Original Message-
>> From: Tom Woolman 
>> To: Janet Choate 
>> Cc: r-help@r-project.org
>> Sent: Sun, May 29, 2022 3:42 pm
>> Subject: Re: [R] categorizing data
>> 
>> 
>> Some ideas:
>> 
>> You could create a cluster model with k=3 for each of the 3 variables,
>> to determine what constitutes high/medium/low centroid values for each
>> of the 3 types of plant types. Centroid values could then be used as the
>> upper/lower boundary ranges for high/med/low.
>> 
>> Or utilize a histogram for each variable, and use quantiles or
>> densities, etc. to determine the natural breaks for the high/med/low
>> ranges for each of the IVs.
>> 
>> 
>> 
>> 
>> On 2022-05-29 15:28, Janet Choate wrote:
>>> Hi R community,
>>> I have a data frame with three variables, where each row adds up to 90.
>>> I want to assign a category of low, medium, or high to the values in
>>> each
>>> row - where the lowest value per row will be set to 10, the medium
>>> value
>>> set to 30, and the high value set to 50 - so each row still adds up to
>>> 90.
>>> 
>>> For example:
>>> Data: Orig
>>> tree  shrub  grass
>>> 32  11  47
>>> 23  41  26
>>> 49  23  18
>>> 
>>> Data: New
>>> tree  shrub  grass
>>> 30  10  50
>>> 10  50  30
>>> 50  30  10
>>> 
>>> I am not attaching any code here as I have not been able to write
>>> anything
>>> effective! appreciate help with this!
>>> thank you,
>>> JC
>>> 
>>> --
>>> 
>>>[[alternative HTML version deleted]]
>>> 
>>> __
>>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide
>>> http://www.R-project.org/posti

Re: [R] Extract data from .nc file

2021-02-24 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Shailendra:

You didn't provide the error messages you received,  which makes it difficult 
to answer.  I will say there is at least one typo,  in:

> write.csv(amo_final, "soi.csv", row.names = FALSE)

You have only defined "soi_final".  But I would also be surprised if either the 
"cbind()" or "data.frame()" call works,  as everything is of different 
sizes.
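To illustrate the size problem:  "soi" here is a 3-dimensional array (time, MCrun, members),  while the time vector is length 2001,  so the ensemble dimensions have to be collapsed (for example by averaging) before everything lines up in one data frame.  A sketch with synthetic data in place of the real ncvar_get() extracts:

```r
# synthetic stand-ins for the netCDF variables (real code would use ncvar_get)
tt  <- 1:2001                                   # time, length 2001
soi <- array(rnorm(2001 * 20 * 100),
             dim = c(2001, 20, 100))            # (time, MCrun, members)

# average over the MCrun and members dimensions: one value per time step
soi_mean <- apply(soi, 1, mean)

soi_final <- data.frame(time = tt, soi = soi_mean)
```

Whether averaging over MCrun/members is appropriate depends on what those ensemble dimensions mean for this dataset.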

HTH,

-Roy

> On Feb 24, 2021, at 6:19 AM, Shailendra Pratap  
> wrote:
> 
> Hi,
> Please help me. I am trying to get information on "soi" from the NetCDF
> file (please see the attached).
> 
> The file is showing this info-
> 
> float soi(time=2001, *MCrun=20, members=100*);
>  :description = "soi";
>  :long_name = "Southern Oscillation Index";
>  :units = "";
>  :level = "sfc";
> 
> And Dimentions
> 
> 5 dimensions:
>*time  Size:2001*
>description: time
>long_name: Time
>standard_name: time
>units: *days since -01-01 00:00:00*
>calendar: noleap
>actual_range: 0
> *actual_range: 73*
>  *  MCrun  Size:20*
> 
> 
> I am not aware of *MCrun* and *members* given in the file. I want to
> create csv output and a ggplot (line) in the form of years.
> 
> like this format---
> 
> *Longitude Latitude Year soi*
> 
> 
> Please help me, my script is here-
> 
> rm(list=ls(all=TRUE))
> library(ncdf4)
> library(ggplot2)
> library(ncdf4.helpers)
> setwd("D:/work/")
> ncfile <-nc_open("posterior_climate_indices_MCruns_ensemble_full_LMRv2.1.nc
> ")
> head(ncfile)
> lon <- ncvar_get(ncfile, "lon_npac")
> lat<- ncvar_get(ncfile, "lat_npac", verbose = F)
> tt<- ncvar_get(ncfile, "time")
> units<- ncatt_get(ncfile, "time", "units")
> soi <- ncvar_get(ncfile, "soi")
> soi_final = data.frame(cbind(lon, lat, tt, soi))
> 
> #output
> write.csv(amo_final, "soi.csv", row.names = FALSE)
> 
> #plot
> ggplot(data = soi_final, aes(x = time, y = soi)) +
>  geom_line() +
>  xlab("Year") + ylab("soi") +
>  ggtitle("SOI from 1-2001") +
>  theme_classic()
> 
> 
> thank you
> 
> wish you good health.
> 
> 
> 
> Regards
> S. Singh
> 
> posterior_climate_indices_MCruns_ensemble_full_...
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Re: [R] How to understand the mentality behind tidyverse and ggplot2?

2020-11-18 Thread Roy Mendelssohn - NOAA Federal via R-help
Personally I liked two workshops Thomas Lin Pedersen gave:

https://www.youtube.com/watch?v=h29g21z0a68
https://www.youtube.com/watch?v=0m4yywqNPVY&t=5219s

-Roy

> On Nov 18, 2020, at 3:24 PM, John via R-help  wrote:
> 
> On Tue, 17 Nov 2020 12:43:21 -0500
> C W  wrote:
> 
>> Dear R list,
>> 
>> I am an old-school R user. I use apply(), with(), and which() in base
>> package instead of filter(), select(), separate() in Tidyverse. The
>> idea of pipeline (i.e. %>%) my code was foreign to me for a while. It
>> makes the code shorter, but sometimes less readable?
>> 
>> With ggplot2, I just don't understand how it is organized. Take this
>> code:
>> 
>>> ggplot(diamonds, aes(x=carat, y=price)) +
>>> geom_point(aes(color=cut)) +  
>> geom_smooth()
>> 
>> There are three plus signs. How do you know when to "add" and what to
>> "add"? I've seen more plus signs.
>> 
>> To me, aes() stands for aesthetic, meaning looks. So, anything
>> related to looks like points and smooth should be in aes().
>> Apparently, it's not the case.
>> 
>> So, how does ggplot2 work? Could someone explain this for an
>> old-school R user?
>> 
>> Thank you!
>> 
> A really short form is to consider that ggplot2 syntax defines an
> object, and then additional simply adds to it, which is what all the
> plus signs are.  Ideally, you can start a ggplot call with a
> designation of a target:
> 
> Instead of:
> ggplot(diamonds, aes(x=carat, y=price)) + ...
> 
> use something like"
> 
> fig1 <- ggplot(diamonds, aes(x=carat, y=price)) + ...
> 
> This creates an environment object that can then be further modified.
> Learning the syntax is a chore, but the output tends to be fine,
> especially for publications and final graphics. One the other hand it's
> slower and fussier than some of the more traditional approaches, which
> are what I would prefer for EDA. 
> 
> JWDougherty
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Re: [R] NOAA .grb2 files

2020-09-04 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Philip:

It would help if you gave the complete script you are trying to run,  and the 
name of the file.  

for those unfamiliar with all this,  Philip has already downloaded the grib2 
file,  either using rNOMADS or directly from the NOAA website,  and the 
function he is calling reads the data from that grib2 file by wrapping a system 
call to a C-program called wgrib2.  And if you don't know about grib2 files,  
they are highly compressed files of fields,  using bit-packing to do the 
compression. Forecast grib2 files usually have a plethora of fields,  for 
example since I don't know exactly which  NOAA Rapid Refresh model file he 
obtained,  looking at the inventory of one possible such file,  see:

https://www.nco.ncep.noaa.gov/pmb/products/rap/rap.t00z.awp252pgrbf00.grib2.shtml

Usually given this,  only one or a couple of fields at a time are unpacked,  
unless you have a large computer.  Since no example script was given,  it is 
impossible to tell if he is trying to read in the entire grib2 file or what.  
And each expanded array in R will be much much bigger than the equivalent in 
grib2.

-Roy



> On Sep 4, 2020, at 3:32 PM, Philip  wrote:
> 
> I’m trying to download NOAA Rapid Refresh model weather data but I keep 
> getting the error message below.  Do I just need a computer with more memory?
> 
> Philip
> 
> ***
> Error in paste(gsub("\"", "", csv.str), collapse = ",") : 
>  could not allocate memory (994 Mb) in C function 'R_AllocStringBuffer' 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Re: [R] Solving derivates, getting the minimum of a function, and helpful documentation of the deriv function

2020-08-29 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi John:

Can I ask if this is the specific problem you are after,  or a test for a more 
general problem?  If the former,  the derivative is

 -0.0263 + 0.0020 * B

so the solution for B is:

B = 0.0263 / 0.0020 = 13.15

If you are after a more general way of doing this:

?solve    (or,  for a nonlinear derivative,  ?uniroot)
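For this particular quadratic the answer can also be checked numerically,  e.g. with optimize():

```r
# the function John wants to minimize
f <- function(B) -0.0263 * B + 0.0010 * B^2

# analytic minimum: set the derivative -0.0263 + 0.0020 * B to zero
B_analytic <- 0.0263 / 0.0020   # 13.15

# numerical check over an interval known to contain the minimum
fit <- optimize(f, interval = c(0, 100))
fit$minimum
```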

-Roy

> On Aug 29, 2020, at 2:15 PM, Sorkin, John  wrote:
> 
> I am trying to find the minimum of a linear function:
> 
> y <- (-0.0263*B) + (0.0010*B^2)
> 
> I am having GREAT difficulty with the documentation of the deriv function. I 
> have (after playing for two-hours) been able to get the following to work:
> 
> zoop <- deriv(expression((-0.0263*B)+(0.0010*B^2)),"B",func=TRUE)
> class(zoop)
> zoop(2)
> 
> which appears to give me the value of the derivative of my expression w.r.t. B
> (I am not certain what the func arugment does, but it appears to be necessary)
> 
> Following what one learns in calculus 1, I now need to set the derivative 
> equal to 0 and solve for B. I have no idea how to do this
> 
> Can someone point me in the right direction. Additionally can someone suggest 
> documentation for deriv that is easily intelligible to someone who wants to 
> learn how to use the function, rather that documentation that helps one who 
> is already familiar with the function. (I have a need for derivatives that is 
> beyond finding the minimum of a function)
> 
> Thank you
> John
> 
> P.S. Please don't flame. I spent a good deal of time looking at documentation 
> and searching the internet. There may be something on line, but I clearly am 
> not using the correct search terms.
> 
> 
> 
> 
> 
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Re: [R] Best settings for RStudio video recording?

2020-08-16 Thread Roy Mendelssohn - NOAA Federal via R-help
May I suggest that this discussion is best left for another time and place.  
Some people have very strong opinions about RStudio vis a vis R.  It has been 
discussed here before,  shedding mostly heat and not a lot of light  (nor do I 
think anyone had their mind changed).  Worse,  the discussions have tended to 
move over to twitter,  where twitter mobs have gone after people whose views on 
this subject didn't agree with theirs,  to the extent of digging up any 
transgression that person ever committed and claiming that it discredited 
anything they said about R and RStudio,  rather than dealing with the points 
made.  I have seen people who I have reason to believe are well-meaning,  decent 
people trying to improve R,  be tarred and feathered on this subject  (and this 
is on both sides of the discussion).  I can't believe that this helps improve 
R,  nor does it help anyone use R,  which is the main point of this mail-list.

Thanks,

-Roy
 
PS - Or we can start a discussion on the best editor to use - that should be 
good for a few flames!   :-)

> On Aug 16, 2020, at 3:39 PM, Abby Spurdle  wrote:
> 
>> a) Read about it yourself. It is a legal definition.
> 
> Not quite.
> Your statement implies some sort of universalism, which is unrealistic.
> Legal definitions vary from one legal system to the next.
> 
> I'm not an expert in US company/corporate law.
> But as I understand it, the applicable laws vary from state to state.
> 
> It's unlikely that you or most readers will interpret the original
> statement in a strict legal sense.
> But rather, the term is used to imply something.
> 
> If the criteria is:
> Sacrificing profits (not just theirs, but their *holding/sibling
> companies too*), for some public benefit(s)...
> 
> ...then I would like to see evidence of this.
> 
>> b) Don't "correct" me with misinformation you are clearly inventing. RStudio 
>> the software does not "introduce people to a modified version of R."
> 
> Read this post:
> https://stat.ethz.ch/pipermail/r-help/2020-May/466788.html
> 
> My information is accurate, and RStudio does modify R, unless of
> course something has changed...
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Re: [R] rNOMAD package

2020-08-13 Thread Roy Mendelssohn - NOAA Federal via R-help


Hi Philip:

Both 'ncdf4' and 'RNetCDF' should be able to download data using OPeNDAP.  That 
the package is using OPeNDAP is transparent to the user,  other than the fact 
that the "file" is a URL.  Extracts are just like reading a netCDF file using 
these packages,  so you may have to spend some time learning how to do that.

It is possible that 'tidync' can also do OPeNDAP,  I am just not certain of that.

-Roy

> On Aug 13, 2020, at 8:59 AM, Philip  wrote:
> 
> Daniel Bowman wrote a wonderful package to access National Weather Service 
> data with R.
> 
> Unfortunately I stuck trying to download archived Rapid Update Forecasts 
> (RAP) going back into 2016.  I have been poking around on the Internet for 
> days but keep getting recycled to three or four websites that assume a 
> certain level of background knowledge that I don’t have.  It has something to 
> do with OPeNDAP (a data access protocol), which is a piece of software to grab 
> data over the Internet.
> 
> Can someone give me some direction?
> 
> Thanks,
> Philip
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Re: [R] optim with upper and lower bounds

2020-08-11 Thread Roy Mendelssohn - NOAA Federal via R-help
Thanks to all who responded.  Will take me some time to digest it all.

-Roy
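For anyone finding this thread later,  the bound transform mentioned in my original question (quoted below) can be sketched on a toy problem;  the logistic map and the objective below are made up for illustration:

```r
# toy objective with its unconstrained minimum inside the box [0, 1]
f  <- function(x) (x - 0.3)^2
lo <- 0; hi <- 1

# logistic transform: z on (-Inf, Inf)  <->  x in (lo, hi)
to_x <- function(z) lo + (hi - lo) / (1 + exp(-z))

# unconstrained BFGS on the transformed variable
fit_bfgs <- optim(0, function(z) f(to_x(z)), method = "BFGS")

# the same problem with explicit box constraints
fit_lbfgsb <- optim(0.5, f, method = "L-BFGS-B", lower = lo, upper = hi)

c(to_x(fit_bfgs$par), fit_lbfgsb$par)   # both should be near 0.3
```

As noted in the quoted replies,  the transformed approach can behave badly when the solution is at or near a bound,  since z then heads off toward plus or minus infinity.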


> On Aug 11, 2020, at 6:24 AM, J C Nash  wrote:
> 
> Thanks to Peter for noting that the numerical derivative part of code doesn't 
> check bounds in optim().
> I tried to put some checks into Rvmmin and Rcgmin in optimx package (they 
> were separate packages before, and
> still on CRAN), but I'm far from capturing all the places where numerical 
> derivative steps can go outside bounds.
> 
> And if you have a "production" problem where you are going to run a given 
> optimization over a lot of cases, I'd
> strongly suggest that you write your own derivative code, even if it is a 
> numerical approximation. In the case of
> a specialized derivative code e.g., part analytic, part numeric, with bounds 
> checking, I'll be willing
> to kibbitz, but suggest off-list until something is working, in which case it 
> is probably at least worth
> a vignette, as this sort of situation seems to pop up at least once a year 
> and a good example would really
> be helpful to guide the process. I'm reluctant to prepare an artificial 
> example because, well, it will be
> artificial and not capture the sort of details that have to be addressed.
> 
> Best, JN
> 
> 
> On 2020-08-11 3:48 a.m., peter dalgaard wrote:
>> This stuff is of course dependent on exactly which optimization problem you 
>> have, but optimx::optimr is often a very good drop-in replacement for optim, 
>> especially when bounds are involved (e.g., optim has an awkward habit of 
>> attempting evaluations outside the domain when numerical derivatives are 
>> needed).
>> 
>> You might want to look at the last examples in ?stats4::mle (in R 4.x.x)
>> 
>> -pd
>> 
>>> On 10 Aug 2020, at 22:08 , Roy Mendelssohn - NOAA Federal via R-help 
>>>  wrote:
>>> 
>>> I am running a lot of optimization problems, at the moment using 'optim'  
>>> ('optim' is actually called by another program).  All of the problems have 
>>> variables with simple upper and lower bounds,  which I can easily transform 
>>> into a form that is unconstrained and solve using 'BFGS'.  But I was 
>>> wondering if it is more robust to solve the problem this way,  or to use 
>>> L-BFGS-B instead.
>>> 
>>> Also how much difference can it make using 'optimx' instead 'optim'?  The 
>>> program I am using (KFAS) allows this,  I just have to do some extra 
>>> programming to use it.
>>> 
>>> Thanks,
>>> 
>>> -Roy
>>> 
>>> 
>>> 
>> 



[R] optim with upper and lower bounds

2020-08-10 Thread Roy Mendelssohn - NOAA Federal via R-help
I am running a lot of optimization problems, at the moment using 'optim'  
('optim' is actually called by another program).  All of the problems have 
variables with simple upper and lower bounds,  which I can easily transform 
into a form that is unconstrained and solve using 'BFGS'.  But I was wondering 
if it is more robust to solve the problem this way,  or to use L-BFGS-B 
instead.

Also how much difference can it make using 'optimx' instead 'optim'?  The 
program I am using (KFAS) allows this,  I just have to do some extra 
programming to use it.
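
For what it's worth, the two approaches can be compared on a toy objective (a sketch only; the quadratic below is arbitrary and stands in for the KFAS likelihood):

```r
# Toy box-constrained problem: minimize over [0,1] x [0,1].
fn <- function(x) (x[1] - 0.3)^2 + (x[2] - 0.7)^2
lo <- c(0, 0); hi <- c(1, 1)

# (a) bounded quasi-Newton directly
fit_box <- optim(c(0.5, 0.5), fn, method = "L-BFGS-B", lower = lo, upper = hi)

# (b) logistic transform to an unconstrained problem, then plain BFGS
to_x   <- function(z) lo + (hi - lo) * plogis(z)   # maps R^2 onto the open box
fit_tr <- optim(c(0, 0), function(z) fn(to_x(z)), method = "BFGS")

fit_box$par        # interior solution, near c(0.3, 0.7)
to_x(fit_tr$par)   # the transformed fit should agree in the interior
```

When the optimum sits on a bound the transformed parameter diverges, which is one practical argument for L-BFGS-B in that case.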

Thanks,

-Roy





Re: [R] rNOMADS Package

2020-07-28 Thread Roy Mendelssohn - NOAA Federal via R-help
When you start rNOMADS it says:

> Welcome to rNOMADS 2.4.2 "Pandaemonium Fortress"!
> Questions? Follow @rNOMADS_r on Twitter or send a message to 
> rnomads-u...@lists.r-forge.r-project.org
> 

Likely to get much more knowledgeable answers there.
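
That said, on the narrow question of reading both forecasts: since ReadGrib is already being called on one element of RAPInfo, one generic approach is to loop over the list (a sketch only, assuming the rNOMADS objects behave as in the quoted code below):

```r
# Hypothetical sketch: apply the same ReadGrib() call to every downloaded
# file and keep the results in a list, one element per forecast hour.
all_forecasts <- lapply(RAPInfo, function(info) {
  ReadGrib(info$file.name, levels, variables,
           domain = c(-113.20, -112.78, 33.70, 33.40),
           domain.type = "latlon",
           file.type = "grib2")
})
```

The rNOMADS list would know whether ReadGrib can also take a vector of file names directly.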

-Roy

> On Jul 28, 2020, at 1:45 PM, Philip  wrote:
> 
> Don’t know if anyone out there can help me since the rNOMADS package is so 
> specialized and it requires a download of wgrib2 software from NOAA – 
> National Oceanic and Atmospheric Administration.
> 
> Is there a way to grab multiple forecasts with the ReadGrib function (page 33 
> of the documentation).  I’m trying to get  the rapid update forecast (RAP) on 
> the 13 km scale
> 
> 
>RAPOut <- CrawlModels(abbrev = "rap", depth = 1)
> 
>RAPParameters <- ParseModelPage(RAPOut[1])
> 
>RAPPred <- RAPParameters$pred[grep("t12z", RAPParameters$pred)]#12 Zulu is 
> 5 am in Phoenix
> 
>RAPPred2 <- RAPPred[grep(c("awp130"), RAPPred)] #picks out 13 km forecasts
> 
>RAPPred3 <- RAPPred2[grep(c("pgrbf00|pgrbf01"),RAPPred2)] #Forecast for 12 
> Z and 13 Z  
> 
>levels <- c("800 mb")
>variables <- c("TMP") 
> 
>RAPInfo <- GribGrab(RAPOut[1], RAPPred3, levels, variables) #Everything 
> works perfectly to this point.  The two requested forecasts are listed below. 
>  As you can see [[1]] is for 12 Z and [[2]] is for 13 Z – red in 18 point 
> type.
> 
> RAPInfo
> [[1]]
> [[1]]$file.name
> [1] 
> "C:\\Users\\Owner\\Documents\\Ballooning\\WeatherBriefing/rap.t12z.awp130pgrbf00.grib2.grb"
> 
> [[1]]$url
> [1] 
> "https://nomads.ncep.noaa.gov/cgi-bin/filter_rap.pl?file=rap.t12z.awp130pgrbf00.grib2&lev_800_mb=on&var_TMP=on&dir=%2Frap.20200728";
> 
> 
> [[2]]
> [[2]]$file.name
> [1] 
> "C:\\Users\\Owner\\Documents\\Ballooning\\WeatherBriefing/rap.t12z.awp130pgrbf01.grib2.grb"
> 
> [[2]]$url
> [1] 
> "https://nomads.ncep.noaa.gov/cgi-bin/filter_rap.pl?file=rap.t12z.awp130pgrbf01.grib2&lev_800_mb=on&var_TMP=on&dir=%2Frap.20200728";
> 
> 
> RAPData <- ReadGrib(RAPInfo[[2]]$file.name, levels, variables,
>domain=c(-113.20,-112.78,33.70,33.40), #an area west of 
> Phoenix
>domain.type="latlon",
>file.type="grib2")
> 
> I can grab the 12 Z forecast with [[1]] in the first line above or the 13 Z 
> forecast if I enter [[2]].  But is there a way to get both at the same time?
> 
> Thanks,
> Philip Heinrich
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Re: [R] curl options?

2020-07-24 Thread Roy Mendelssohn - NOAA Federal via R-help
Found it.  On June 9,  in the R-developers mail-list,  a chain under the topic 
"SSL certificate issues".

-Roy


> On Jul 24, 2020, at 1:21 PM, Roy Mendelssohn - NOAA Federal 
>  wrote:
> 
> Thank you very much.  That indeed did work,  more specifically as (to include 
> solution for the record):
> 
>> sshInfo <- rerddap::info('hawaii_soest_f75b_adc6_12ab', url = 
>> 'https://apdrc.soest.hawaii.edu/erddap/', ssl_verifyhost=0, ssl_verifypeer=0)
> 
> BTW - my clock and date are fine, the reason it works on some systems has to 
> do with the SSL library being used, whether openSSL or libreSSL.  And as I 
> said,  similar happened to R itself for a very short time.  I can't find the 
> references to the email exchange,  but Simon Urbanek gave a very nice 
> explanation of why this was happening.  I also wonder whether it really is 
> desirable to ignore an expired Cert in the Cert search path.  Not 
> particularly knowledgeable on this,  but my offhand feeling is that it could 
> lead to problems.
> 
> Thanks again for the solution!
> 
> -Roy
> 
> 
>> On Jul 23, 2020, at 4:20 PM, Rasmus Liland  wrote:
>> 
>> On 2020-07-23 14:56 -0700, Roy Mendelssohn - NOAA Federal via R-help wrote:
>>> I am trying to get the following command to work:
>>> 
>>>> sshInfo <- rerddap::info('hawaii_soest_f75b_adc6_12ab', url = 
>>>> 'https://apdrc.soest.hawaii.edu/erddap/') 
>>> 
>>> On a Mac at least (but I know for a 
>>> fact not necessarily on other OSes) I 
>>> get:
>>> 
>>>> Error in curl::curl_fetch_memory(x$url$url, handle = x$url$handle) : 
>>>> SSL certificate problem: certificate has expired
>>> 
>>> Parenthetically,  this also happened 
>>> with R itself awhile back  where an 
>>> intermediate cert on the search path 
>>> had expired,  not the cert of the 
>>> service itself,  and depending on 
>>> which implementation of ssl was used,  
>>> it either ignored it or threw an 
>>> error,  as  in this case.  Someone I 
>>> am working with on another OS can 
>>> indeed run the same command,  so 
>>> your particular OS may not have an 
>>> issue
>> 
>> I can download the dataset without 
>> problems on my Linux thinkpad.  Perhaps 
>> you can provide some more info (but I do 
>> not know what ...) so I/others can 
>> reproduce this strange error ... 
>> 
>> Have you checked the time on the laptop 
>> being correct and not set to e.g. 
>> 1970-01-01 01:29?  In the past, I have 
>> found ssl errors are caused by the time 
>> being off on my laptop, as in the case 
>> of a flat cmos battery.
>> 
>>> Anyhow, for a problem I am working 
>>> on I need to access this server with 
>>> that and related commands.  
>>> 'rerddap::info()' allows me to pass 
>>> curl options,  and normally with Curl 
>>> you can get around the expired 
>>> certificate using the  '-k' or 
>>> '--insecure' option.  When I look at:
>>> 
>>>> curl::curl_options()
>>> 
>>> I do not see this option. I do not 
>>> understand all the options listed 
>>> there,  so maybe that option is in a 
>>> different form that I am missing. Or 
>>> is there another way around this still 
>>> using 'curl' .  Using another function 
>>> that does similar to 'curl'  is not an 
>>> option, because I need the 
>>> 'rerddap::info()' call which calls a 
>>> package called "crul" which ultimately 
>>> calls 'curl'.
>>> 
>>> Thanks,
>>> 
>>> -Roy
>>> 
>>> PS - And yes I informed the owner of 
>>> the site about the expired 
>>> certificate,  that was a couple of 
>>> weeks ago.
>> 
>> Perhaps setting
>> 
>>  options("ssl_verifyhost"=0, "ssl_verifypeer"=0) 
>> 
>> helps? [1]
>> 
>> Best,
>> Rasmus
>> 
>> [1] 
>> https://stackoverflow.com/questions/47715918/how-to-pass-the-curl-insecure-alternative-when-using-r
> 

Re: [R] curl options?

2020-07-24 Thread Roy Mendelssohn - NOAA Federal via R-help
Thank you very much.  That indeed did work,  more specifically as (to include 
solution for the record):

> sshInfo <- rerddap::info('hawaii_soest_f75b_adc6_12ab', url = 
> 'https://apdrc.soest.hawaii.edu/erddap/', ssl_verifyhost=0, ssl_verifypeer=0)

BTW - my clock and date are fine, the reason it works on some systems has to do 
with the SSL library being used, whether openSSL or libreSSL.  And as I said,  
similar happened to R itself for a very short time.  I can't find the 
references to the email exchange,  but Simon Urbanek gave a very nice 
explanation of why this was happening.  I also wonder whether it really is 
desirable to ignore an expired Cert in the Cert search path.  Not particularly 
knowledgeable on this,  but my offhand feeling is that it could lead to problems.
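
For the record, the same relaxation can be done at the curl level itself (equivalent to curl's '-k'/'--insecure' flag; like the rerddap workaround above, this is a sketch and should only be a temporary measure for a known-expired certificate):

```r
# Sketch: build a curl handle with peer/host verification disabled and fetch.
h   <- curl::new_handle(ssl_verifyhost = 0, ssl_verifypeer = 0)
res <- curl::curl_fetch_memory("https://apdrc.soest.hawaii.edu/erddap/",
                               handle = h)
res$status_code
```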

Thanks again for the solution!

-Roy


> On Jul 23, 2020, at 4:20 PM, Rasmus Liland  wrote:
> 
> On 2020-07-23 14:56 -0700, Roy Mendelssohn - NOAA Federal via R-help wrote:
>> I am trying to get the following command to work:
>> 
>>> sshInfo <- rerddap::info('hawaii_soest_f75b_adc6_12ab', url = 
>>> 'https://apdrc.soest.hawaii.edu/erddap/') 
>> 
>> On a Mac at least (but I know for a 
>> fact not necessarily on other OSes) I 
>> get:
>> 
>>> Error in curl::curl_fetch_memory(x$url$url, handle = x$url$handle) : 
>>>  SSL certificate problem: certificate has expired
>> 
>> Parenthetically,  this also happened 
>> with R itself awhile back  where an 
>> intermediate cert on the search path 
>> had expired,  not the cert of the 
>> service itself,  and depending on 
>> which implementation of ssl was used,  
>> it either ignored it or threw an 
>> error,  as  in this case.  Someone I 
>> am working with on another OS can 
>> indeed run the same command,  so 
>> your particular OS may not have an 
>> issue
> 
> I can download the dataset without 
> problems on my Linux thinkpad.  Perhaps 
> you can provide some more info (but I do 
> not know what ...) so I/others can 
> reproduce this strange error ... 
> 
> Have you checked the time on the laptop 
> being correct and not set to e.g. 
> 1970-01-01 01:29?  In the past, I have 
> found ssl errors are caused by the time 
> being off on my laptop, as in the case 
> of a flat cmos battery.
> 
>> Anyhow, for a problem I am working 
>> on I need to access this server with 
>> that and related commands.  
>> 'rerddap::info()' allows me to pass 
>> curl options,  and normally with Curl 
>> you can get around the expired 
>> certificate using the  '-k' or 
>> '--insecure' option.  When I look at:
>> 
>>> curl::curl_options()
>> 
>> I do not see this option. I do not 
>> understand all the options listed 
>> there,  so maybe that option is in a 
>> different form that I am missing. Or 
>> is there another way around this still 
>> using 'curl' .  Using another function 
>> that does similar to 'curl'  is not an 
>> option, because I need the 
>> 'rerddap::info()' call which calls a 
>> package called "crul" which ultimately 
>> calls 'curl'.
>> 
>> Thanks,
>> 
>> -Roy
>> 
>> PS - And yes I informed the owner of 
>> the site about the expired 
>> certificate,  that was a couple of 
>> weeks ago.
> 
> Perhaps setting
> 
>   options("ssl_verifyhost"=0, "ssl_verifypeer"=0) 
> 
> helps? [1]
> 
> Best,
> Rasmus
> 
> [1] 
> https://stackoverflow.com/questions/47715918/how-to-pass-the-curl-insecure-alternative-when-using-r



[R] curl options?

2020-07-23 Thread Roy Mendelssohn - NOAA Federal via R-help
I am trying to get the following command to work:

> sshInfo <- rerddap::info('hawaii_soest_f75b_adc6_12ab', url = 
> 'https://apdrc.soest.hawaii.edu/erddap/') 

On a Mac at least (but I know for a fact not necessarily on other OSes) I get:

> Error in curl::curl_fetch_memory(x$url$url, handle = x$url$handle) : 
>   SSL certificate problem: certificate has expired

Parenthetically,  this also happened with R itself awhile back  where an 
intermediate cert on the search path had expired,  not the cert of the service 
itself,  and depending on which implementation of ssl was used,  it either 
ignored it or threw an error,  as in this case.  Someone I am working with on 
another OS can indeed run the same command,  so your particular OS may not 
have this issue.

Anyhow, for a problem I am working on I need to access this server with that 
and related commands.  'rerddap::info()' allows me to pass curl options,  and 
normally with Curl you can get around the expired certificate using the  '-k' 
or '--insecure' option.  When I look at:

> curl::curl_options()

I do not see this option. I do not understand all the options listed there,  so 
maybe that option is in a different form that I am missing. Or is there another 
way around this still using 'curl'.  Using another function that does something 
similar to 'curl' is not an option, because I need the 'rerddap::info()' call which 
calls a package called "crul" which ultimately calls 'curl'.

Thanks,

-Roy

PS - And yes I informed the owner of the site about the expired certificate,  
that was a couple of weeks ago.



Re: [R] National Weather Service Data

2020-07-08 Thread Roy Mendelssohn - NOAA Federal via R-help
I would suggest looking at the NOMADS page for information on what is available 
through NOMADS:

https://nomads.ncep.noaa.gov

-Roy


> On Jul 8, 2020, at 8:19 AM, Philip  wrote:
> 
> Thanks again for confirming that the wgrib2 software loaded correctly.  I 
> have been making good progress finding variables related to low level winds 
> such as the HGT series that, as you know, converts millibars to altitude MSL.
> 
> The next step is to look at Rapid Refresh (RAP).  Can you direct me to 
> someplace that has the names of the forecast models - equivalent to gfs_0p50 
> for the Global Forecast System?
> 
> Thanks.
> 
> -Original Message----- From: Roy Mendelssohn - NOAA Federal
> Sent: Monday, July 6, 2020 6:35 PM
> To: Philip
> Subject: Re: [R] National Weather Service Data
> 
> Skimming the docs seems to assume a lot of knowledge of the data.  The best I 
> can see there are two temperature variables:
> 
>> tmpsfc (surface air temperature, K)
>> tmp2m (air temperature at 2m, K)
> 
> and one relative humidity:
> 
>> rh2m (relative humidity at 2m, %)
> 
> Depending on what you are after,  you might find it easier to use the UAF 
> ERDDAP server (https://upwell.pfeg.noaa.gov/erddap/griddap/ncep_global.html) 
> and 'rerddap' to get the data.  'rerddap' will read in the data into a nice 
> tibble,  I also think the documentation is clearer with some nice vignettes 
> and the ability to get info about the data.  For example the command
> 
>> rerddap::info('ncep_global')
> 
> returns:
> 
> 
>>  ncep_global
>> Base URL: https://upwell.pfeg.noaa.gov/erddap/
>> Dimensions (range):
>> time: (2011-05-06T12:00:00Z, 2020-07-08T12:00:00Z)
>> latitude: (-90.0, 90.0)
>> longitude: (0.0, 359.5)
>> Variables:
>> dlwrfsfc:
>> Units: W m-2
>> dswrfsfc:
>> Units: W m-2
>> pratesfc:
>> Units: kg m-2 s-1
>> prmslmsl:
>> Units: Pa
>> rh2m:
>> Units: %
>> tmp2m:
>> Units: K
>> tmpsfc:
>> Units: K
>> ugrd10m:
>> Units: m s-1
>> vgrd10m:
>> Units: m s-1
> 
> which would have answered a lot of your questions.  If you had saved that 
> command to a variable  there would be a lot more information,  that is just 
> the summary.
> 
> HTH,
> 
> -Roy
> 
>> 
>> On Jul 6, 2020, at 5:47 PM, Philip  wrote:
>> 
>> Thanks for getting back to me.  It is good to know that I am on the right 
>> track.
>> 
>> I understand now that the output byte location of the data in the grib2 file 
>> not the actual data which in this case would be the 2 am forecast six hours 
>> into the future.  Can you advise me which of the examples in Dr. Bowman's 
>> rNOMADS documentation will get me the temperature and relative humidity 
>> data?
>> 
>> Philip.
>> 
>> -Original Message- From: Roy Mendelssohn - NOAA Federal
>> Sent: Monday, July 6, 2020 10:43 AM
>> To: Philip
>> Cc: stephen sefick ; r-help
>> Subject: Re: [R] National Weather Service Data
>> 
>> Hi Philip:
>> 
>> Results look correct to me.  This might help you:
>> 
>> https://www.cpc.ncep.noaa.gov/products/wesley/wgrib2/default_inv.html
>> 
>> -Roy
>> 
>> 
>>> On Jul 6, 2020, at 9:29 AM, Philip  wrote:
>>> 
>>> I am trying to access National Weather Service forecasting data through the 
>>> rNOMADS package.  I’m not sure if the Weather Service software – grib2 – 
>>> loaded correctly.  Second, some of the examples in the rNOMADS 
>>> documentation seem to run correctly but I’m not sure what the output means. 
>>>  Any advice would be greatly appreciated.
>>> 
>>> 1 - I tried to load the wgrib2 software from instructions in the following 
>>> website:
>>> 
>>> https://bovineaerospace.wordpress.com/2015/04/26/how-to-install-rnomads-with-grib-file-support-on-windows/
>>> 
>>> 2 – the instructions say that if it loaded correctly I should get a laundry 
>>> list similar to what is below from the command:  >system(“wgrib2”).
>>> 
>>> 
>>> 
>>> The list I get looks different.  Below is the first 20 or so entries. How 
>>> can I check to see if the wgrib2 loaded correctly?
>>> 
>>> wgrib2 v0.1.9.9 9/2013 Wesley Ebisuzaki, Reinoud Bokhorst, Jaakko Hyvätti, 
>>> Dusan Jovic, Kristian Nilssen, Karl Pfeiffer, Pablo Romero, Manfred 
>>> Schwarb, Arlindo da S

Re: [R] National Weather Service Data

2020-07-06 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Philip:

Results look correct to me.  This might help you:

https://www.cpc.ncep.noaa.gov/products/wesley/wgrib2/default_inv.html
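
In particular, that page documents the inventory format behind the numbers you asked about: each colon-separated field is record number, byte offset of the record within the grib2 file, reference date, variable, level, and forecast time. A small base-R illustration:

```r
# Parse one wgrib2 inventory line: the second field (414132) is the byte
# offset where that record starts in the grib2 file, not a data value.
inv    <- "3:414132:d=2020070606:TMP:2 m above ground:6 hour fcst:"
fields <- strsplit(inv, ":", fixed = TRUE)[[1]]
as.numeric(fields[2])  # byte offset of record 3
fields[4]              # variable name, "TMP"
```

So 414132 just tells wgrib2 where to seek when extracting that record.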

-Roy


> On Jul 6, 2020, at 9:29 AM, Philip  wrote:
> 
> I am trying to access National Weather Service forecasting data through the 
> rNOMADS package.  I’m not sure if the Weather Service software – grib2 – 
> loaded correctly.  Second, some of the examples in the rNOMADS documentation 
> seem to run correctly but I’m not sure what the output means.  Any advice 
> would be greatly appreciated.
> 
> 1 - I tried to load the wgrib2 software from instructions in the following 
> website:
> 
>
> https://bovineaerospace.wordpress.com/2015/04/26/how-to-install-rnomads-with-grib-file-support-on-windows/
> 
> 2 – the instructions say that if it loaded correctly I should get a laundry 
> list similar to what is below from the command: >system(“wgrib2”).
> 
> 
> 
> The list I get looks different.  Below is the first 20 or so entries.  How 
> can I check to see if the wgrib2 loaded correctly?
> 
> wgrib2 v0.1.9.9 9/2013 Wesley Ebisuzaki, Reinoud Bokhorst, Jaakko Hyvätti, 
> Dusan Jovic, Kristian Nilssen, Karl Pfeiffer, Pablo Romero, Manfred Schwarb, 
> Arlindo da Silva, Niklas Sondell, Sergey Varlamov
> -0xSec   inv  X  Hex dump of section X (0..8)
> -MM  inv reference time MM
> -N_ens   inv number of ensemble members
> -RT  inv type of reference Time
> -S   inv simple inventory with minutes and seconds 
> (subject to change)
> -Sec0inv contents of section0
> -Sec3inv contents of section 3 (Grid Definition Section)
> -Sec4inv Sec 4 values (Product definition section)
> -Sec5inv Sec 5 values (Data representation section)
> -Sec6inv show bit-map section
> -Sec_len inv length of various grib sections
> -T   inv reference time MMDDHHMMSS
> -V   inv diagnostic output
> -VT  inv verf time = reference_time + forecast_time 
> (MMDDHHMMSS)
> -YY  inv reference time 
> 
> 3 – As I said, some of the documentation examples work and for some I get 
> error messages.  Below is an example of one that seemed to work but I don’t 
> understand the output.
> 
> #GribInfo - page 20
> urlsOut <- CrawlModels(abbrev="gfs_0p50",depth=2)
> ModelParameters <- ParseModelPage(urlsOut[2])#[1] is most recent model
> MyPred <- ModelParameters$pred[grep("06$",ModelParameters$pred)]
>   Levels <- c("2_m_above_ground","800_mb")
>   Variables <- c("TMP","RH")
>   GribInfo <- GribGrab(urlsOut[2],MyPred,Levels,Variables)
> GribInv <- GribInfo(GribInfo[[1]]$file.name,"grib2")
> 
> The command GribInv$inventory returns:
> 
> $inventory
> [1] "1:0:d=2020070606:TMP:800 mb:6 hour fcst:"
> "2:148450:d=2020070606:RH:800 mb:6 hour fcst:"   
> [3] "3:414132:d=2020070606:TMP:2 m above ground:6 hour fcst:" 
> "4:571266:d=2020070606:RH:2 m above ground:6 hour fcst:" 
> 
> This is supposed to be temperature and relative humidity 2 meters above the 
> ground and at 800 milibars for 2020 – July 6 – at ZULU time 0600.  But I have 
> no idea what the numbers 414132 – second line – mean.
> 
> Any advice would be greatly appreciated.
> 
> Philip Heinrich
> 
> From: stephen sefick 
> Sent: Thursday, July 2, 2020 3:20 PM
> To: Philip 
> Cc: r-help 
> Subject: Re: [R] National Weather Service Data
> 
> I am unfamiliar with Rnomads. Could you provide a minimal reproducable 
> example? You are more likely to receive help this way.
> 
> 
> On Thu, Jul 2, 2020, 18:06 Philip  wrote:
> 
>  Is anyone out there familiar with rNOMADS?  It is a package to get into 
> National Weather Service forecasting data with R?
> 
>  I'm not sure the Weather Service software named wgrib2 loaded correctly 
> because some of the stuff won't run and I can't make much sense out of some 
> of the output.
> 
>  Thanks.
>  [[alternative HTML version deleted]]
> 
>  __
>  R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>  https://stat.ethz.ch/mailman/listinfo/r-help
>  PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>  and provide commented, minimal, self-contained, reproducible code.
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.


Re: [R] Package httr::GET() question

2020-02-22 Thread Roy Mendelssohn - NOAA Federal via R-help
Thanks.  Yes that is what I found out.  We are switching our web service to 
require strict encoding  (a lot of services got caught when Apache Tomcat made 
a similar switch)  and are trying to see if that will break any of our R 
packages before we make the switch.

As always,  appreciate all the people who took time to respond.

-Roy
> On Feb 22, 2020, at 5:23 AM, Bob Rudis  wrote:
> 
> curl::curl_escape() —
> https://github.com/jeroen/curl/search?q=curl_escape&unscoped_q=curl_escape
> — uses the underlying libcurl curl_easy_escape() which does proper
> escaping b/c it's, well, curl.
> 
> {httr} uses curl::curl_escape() —
> https://github.com/r-lib/httr/search?q=curl_escape&unscoped_q=curl_escape
> 
> The use it's `url-query.r` is the function compose_query().
> 
> compose_query()  is called from build_url() in url.r. handle_url()
> (from handle-url.r) uses build_url().
> 
> All the "verbs" use handle_url() —
> https://github.com/r-lib/httr/search?q=handle_url&unscoped_q=handle_url
> 
> So {httr} relies on the quintessential standard in URL escaping —
> which is libcurl's — for all URL machinations.
> 
> -boB
> 
> On Wed, Feb 19, 2020 at 10:36 AM Roy Mendelssohn - NOAA Federal via
> R-help  wrote:
>> 
>> Thanks.  Yes.  I did that,  it also has a verbose mode so that I could see 
>> what it was doing.  What I needed was not just escaping but strict escaping. 
>>  My memory from a number of years back was that I had issues with urlencode 
>> from base not being strict.  And of course you don't want to encode twice.
>> 
>> Thanks,
>> 
>> -Roy
>> 
>> 
>>> On Feb 19, 2020, at 7:08 AM, Ben Tupper  wrote:
>>> 
>>> Hi,
>>> 
>>> Perhaps you could test it out by using httr::GET() with and without
>>> escaping using xml2::url_escape()?
>>> 
>>> https://www.rdocumentation.org/packages/xml2/versions/1.2.2/topics/url_escape
>>> 
>>> Cheers,
>>> Ben
>>> 
>>> On Tue, Feb 18, 2020 at 1:29 PM Roy Mendelssohn - NOAA Federal via
>>> R-help  wrote:
>>>> 
>>>> Hi All:
>>>> 
>>>> I have been trying to go through the code for httr::GET() but it is 
>>>> somewhat beyond what I know.  What I am trying to find out is if all urls 
>>>> are automatically percent encoded,  or whether the user needs to do that.
>>>> 
>>>> Thanks,
>>>> 
>>>> -Roy
>>>> 
>>> 
>>> 
>>> 
>>> --
>>> Ben Tupper
>>> Bigelow Laboratory for Ocean Science
>>> West Boothbay Harbor, Maine
>>> http://www.bigelow.org/
>>> https://eco.bigelow.org
>> 

Re: [R] Package httr::GET() question

2020-02-19 Thread Roy Mendelssohn - NOAA Federal via R-help
Thanks.  Yes, I did that; it also has a verbose mode so that I could see what 
it was doing.  What I needed was not just escaping but strict escaping.  My 
memory from a number of years back was that I had issues with URLencode() from 
base not being strict.  And of course you don't want to encode twice.

Thanks,

-Roy
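
For the archive, the difference between base encoding and strict escaping can 
be sketched as follows (a sketch assuming the xml2 package is installed; the 
URL is made up):

```r
u <- "https://example.com/erddap?station=wind & waves"

# Base R: URLencode() is NOT strict by default -- reserved characters
# such as "&", "=", and "/" are left alone.
utils::URLencode(u)

# Strict escaping: percent-encode reserved characters as well.
utils::URLencode(u, reserved = TRUE)

# xml2::url_escape() is strict by default.
xml2::url_escape(u)

# Beware of double-encoding: escaping an already-escaped string
# turns "%20" into "%2520".
xml2::url_escape(xml2::url_escape("a b"))
```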


> On Feb 19, 2020, at 7:08 AM, Ben Tupper  wrote:
> 
> Hi,
> 
> Perhaps you could test it out by using httr::GET() with and without
> escaping using xml2::url_escape()?
> 
> https://www.rdocumentation.org/packages/xml2/versions/1.2.2/topics/url_escape
> 
> Cheers,
> Ben
> 
> On Tue, Feb 18, 2020 at 1:29 PM Roy Mendelssohn - NOAA Federal via
> R-help  wrote:
>> 
>> Hi All:
>> 
>> I have been trying to go through the code for httr::GET() but it is somewhat 
>> beyond what I know.  What I am trying to find out is whether all URLs are 
>> automatically percent encoded,  or whether the user needs to do that.
>> 
>> Thanks,
>> 
>> -Roy
>> 
>> **
>> "The contents of this message do not reflect any position of the U.S. 
>> Government or NOAA."
>> **
>> Roy Mendelssohn
>> Supervisory Operations Research Analyst
>> NOAA/NMFS
>> Environmental Research Division
>> Southwest Fisheries Science Center
>> ***Note new street address***
>> 110 McAllister Way
>> Santa Cruz, CA 95060
>> Phone: (831)-420-3666
>> Fax: (831) 420-3980
>> e-mail: roy.mendelss...@noaa.gov www: https://www.pfeg.noaa.gov/
>> 
>> "Old age and treachery will overcome youth and skill."
>> "From those who have been given much, much will be expected"
>> "the arc of the moral universe is long, but it bends toward justice" -MLK Jr.
>> 
>> __
>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
> 
> 
> 

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: https://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Package httr::GET() question

2020-02-18 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi All:

I have been trying to go through the code for httr::GET() but it is somewhat 
beyond what I know.  What I am trying to find out is whether all URLs are 
automatically percent encoded,  or whether the user needs to do that.

Thanks,

-Roy

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: https://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] [SPAM] Re: The "--slave" option

2019-09-21 Thread Roy Mendelssohn - NOAA Federal via R-help
Please All:

While, as I said in my first post, I am still not convinced that the OP was 
posting in good faith to improve R and not a troll (yours to decide), I also 
don't think attacking a person's research to counter a point that has nothing 
to do with their research is what is wanted on this mailing list.  There is one 
very simple alternative: don't reply.

Ben - members of R-core do read this mailing list, and the fact that not a 
single one has replied probably tells you what you need to know.

-Roy


> On Sep 21, 2019, at 3:56 PM, Abby Spurdle  wrote:
> 
> (excerpts only)
>> slavery being easily justified by the Bible while abolition is not is an 
>> experience.
>> P.S. Do any R developers actually read this?
> 
> I've read one or two verses...
> 
> I also found this (by you):
> https://www.ncbi.nlm.nih.gov/pubmed/20362542
> 
> Which uses embryonic stem cells.
> I recognize that they're mouse embryos.
> However, your article cites at least five other articles (probably, a
> lot more), that use human embryonic stem cells.
> 
> You complain about slavery (that doesn't exist), and then promote
> murder (which does exist).
> What does that say about you...
> 
> And that's ignoring the way you treat animals
> We slice and dice data, you slice and dice living creatures.
> 
> Here's two songs about freedom, if you have ears to hear:
> https://youtu.be/lKw6uqtGFfo
> https://youtu.be/HAIdo707Sac

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: https://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] [SPAM] Re: The "--slave" option

2019-09-19 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Ben:

Without commenting one way or another on your point,  your initial post seemed 
a lot like trolling because of:

> Let me reiterate that it is 2019, i.e. "The Future", rather than 1970 when
> R was presumably developed, based on its atrocious syntax, documentation
> and usability (I think I only need to say "NaN", "NULL", and "NA").
> 

You are certainly welcome to your opinions about R, but these comments are 
entirely beside what I assume is your main point, and because of that my 
first reaction was: don't feed the trolls.

My $0.02.

-Roy

> On Sep 19, 2019, at 2:51 AM, Benjamin Lang  wrote:
> 
> Dear Richard,
> 
> Thank you, that’s interesting. There is also something called an 
> “etymological fallacy”. I think current usage is more useful here than the 
> “science of truth”, i.e. the Ancient Greek idea that the (sometimes inferred) 
> derivation of a word allows us to grasp “the truth of it”. 
> 
> In current usage, a “server” is someone who brings you dishes in a 
> restaurant. A “client” is a customer. A “slave” is a human being forced to 
> perform work under duress and considered nothing more than a machine, say a 
> dishwasher or a tractor. And in some regions, this echoes on and is offensive 
> and hurtful to some.
> 
> A new user, wanting to reduce output from R, would probably reach for “-q” or 
> “—quiet”. This makes sense in the same way that “—stentorian” is not a good 
> alternative to “—verbose”. 
> 
> Best,
> Ben
> 
>> On 19 Sep 2019, at 10:55, Richard O'Keefe  wrote:
>> 
>> One of my grandfathers was from Croatia.  Guess what the word "slave" is 
>> derived
>> from?  That's right, Slavs.  This goes back to the 9th century.  And then of 
>> course
>> my grandfather's people were enslaved by the Ottoman empire, which was only 
>> defeated
>> a little over a hundred years ago.  My other grandfather was from the 
>> British isles,
>> where to this day followers of the same prophet are enslaving people like me
>> (except for being female).  So I'm sorry, but I'm not impressed.
>> 
>> How many computers are "servers"?  There's that whole client-server thing.
>> Guess what "server" comes from?  That's right, the Latin word "servus", which
>> means guess what?  You got it again: "slave".  Are we to abolish the word
>> "server"?  What about the word "client"?  Ah, that's part of the 
>> client-patron
>> system from Rome, so what about the patriarchy, eh?
>> 
>> We are dealing with something called "the genetic fallacy".
>> "The genetic fallacy (also known as the fallacy of origins ...)
>> is a fallacy of irrelevance that is based solely on someone's
>> or something's history, origin, or source rather than its
>> current meaning or context."  (Wikipedia.)
>> 
>> Context matters.
>> 
>> 
>> 
>>> On Thu, 19 Sep 2019 at 17:10, Abby Spurdle  wrote:
 Personally I much prefer backwards compatibility to political correctness.
>>> 
>>> I agree with Rolf, here.
>>> And as someone that's planning to write a Linux Terminal Emulator, in
>>> the medium-term future, I *strongly* defend this approach.
>>> 
>>> And to the original poster.
>>> Haven't you seen The Matrix?
>>> (Second best movie ever, after the Shawshank Redemption).
>>> 
>>> I would prefer the technology to be my slave, than I be a
>>> prisoner/slave to the technology.
>>> 
>>> __
>>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: https://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] [FORGED] Re: Regarding R licensing usage guidance

2019-07-24 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Rolf:

As they say,  do read the posting guide:

> Good manners: Remember that customs differ. Some people are very direct. 
> Others surround everything they say with hedges and apologies. Be tolerant. 
> Rudeness is never warranted, but sometimes `read the manual’ is the 
> appropriate response. Don’t waste time discussing such matters on the list. 
> Ad hominem comments are absolutely out of place.


-Roy


> 
> On Jul 24, 2019, at 2:49 PM, Rolf Turner  wrote:
> 
> 
> On 25/07/19 4:36 AM, Weiwen Ng, MPH wrote:
> 
>> Here's one way to phrase your reply:
>> "I'd recommend you search Google. For example, the search string
>> "proprietary use GPL" produces one hit that's clearly relevant to you:
> 
> 
> 
>> This method is more neutrally worded. It doesn't insult the original
>> poster. It doesn't assume the poster had bad intent.
>> Instead, you chose to phrase it thus:
>> "Your internet skills are pathetic. Search Google for "proprietary use gpl"
>> and the first hit is ...  Note that there are (at least) three obvious
>> alternatives if there is any question in your case ...   I think your
>> desperation to steal the hard work of the various R contributors seems
>> quite odious."
>> Think about the overall tone of your post. Consider also that someone who
>> agrees with you substantive argument said that your comments were "often
>> (almost always?) a bit rough about the edges."
> 
> Yeah, but Jeff's rough-about-the-edges phrasing is much more colourful, and 
> colourful is *GOOD*.  There is far too much bland "S. We *mustn't* offend 
> anybody" content in current discourse.  Tell it like it is!  Ripley into 
> people!  If the recipient can't take the heat, he or she should get out of 
> the kitchen!
> 
> See also fortunes::fortune(87).
> 
> cheers,
> 
> Rolf Turner
> 
> P.S.  Jeff makes a huge and extremely useful contribution to R-help.  He 
> gives generously of time and effort to solve beginners' problems.  They 
> should appreciate the time and effort and not whinge about being offended.
> 
> R. T.
> 
> -- 
> Honorary Research Fellow
> Department of Statistics
> University of Auckland
> Phone: +64-9-373-7599 ext. 88276
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Help with a third ggplot error

2019-06-15 Thread Roy Mendelssohn - NOAA Federal via R-help
If I were a betting man, I would bet that one of the things in your "pipeline" 
isn't returning what you think it is.  You can either break it out step by step 
to check, or consult this page, which lists a variety of resources for 
debugging pipes:

https://www.rostrum.blog/2019/04/07/fix-leaky-pipes/

HTH,

-Roy
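
As a concrete illustration (a sketch using dplyr; the data and steps are 
stand-ins, not Bill's actual pipeline), you can break a pipe apart and inspect 
each intermediate result:

```r
library(dplyr)

# Run the pipeline one step at a time and check each result.
step1 <- mtcars %>% filter(cyl == 4)
str(step1)                        # is this the data you expect?

step2 <- step1 %>% group_by(gear) %>% summarise(m = mean(mpg))
str(step2)                        # and this?

# Or peek mid-pipe without breaking the chain:
mtcars %>%
  filter(cyl == 4) %>%
  { print(dim(.)); . } %>%        # prints dimensions, passes the data on
  head()
```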


> On Jun 15, 2019, at 12:29 PM, Bill Poling  wrote:
> 
> Hello Richard, thank you for your response.
> 
> Here is what I get with your suggestion, however, I am unsure of what I am 
> looking at, perhaps you can interpret.
> 
> I sure appreciate your help Sir!
> 
> g4 <- fcast_arima_tbl1 %>%
>  ggplot(aes(date, NetEditRev, color = key)) + #Now date due to sweep
>  geom_point(data = test_tbl %>% mutate(key = "actual")) +
>  geom_point(alpha = 0.5) +
>  theme_tq() +
>  scale_color_tq() +
>  labs(title = "ARIMA(1,0,2)(0,1,0 with Drift For Net Edit Revenue")
> 
> g4
> 
> #Don't know how to automatically pick scale for object of type function. 
> Defaulting to continuous.
> #Error: All columns in a tibble must be 1d or 2d objects:
> #* Column `x` is function
> 
> ggplotly(g4) %>%
>  layout(xaxis = list(rangeslider = list(type = "date")))
> 
> #Don't know how to automatically pick scale for object of type function. 
> Defaulting to continuous.
> #Error: All columns in a tibble must be 1d or 2d objects:
> #* Column `x` is function
> 
> # Call `rlang::last_error()` to see a backtrace; you will have to figure 
> # out which of your variables is not properly specified.
> 
>  1. (function (x, ...) ...
>  2. ggplot2:::print.ggplot(x)
>  4. ggplot2:::ggplot_build.ggplot(x)
>  5. ggplot2:::by_layer(function(l, d) l$compute_aesthetics(d, plot))
>  6. ggplot2:::f(l = layers[[i]], d = data[[i]])
>  7. l$compute_aesthetics(d, plot)
>  8. ggplot2:::f(..., self = self)
>  9. ggplot2:::as_gg_data_frame(evaled)
> 12. tibble:::as_tibble.list(x)
> 13. tibble:::lst_to_tibble(x, .rows, .name_repair, col_lengths(x))
> 14. tibble:::check_valid_cols(x)
> 
> From: Richard M. Heiberger 
> Sent: Saturday, June 15, 2019 3:17 PM
> To: Bill Poling 
> Cc: r-help (r-help@r-project.org) 
> Subject: Re: [R] Help with a third ggplot error
> 
> you did something like this:
> 
>> mydf <- data.frame(y=1:16,
> + AA=rep(factor(letters[1:8]), 2),
> + BB=rep(factor(LETTERS[12:13]), each=8),
> + CC=rep(factor(rep(LETTERS[9:11], times=c(3,1,4))), 2))
>> ggplot(mydf, aes(ls, y))
> Don't know how to automatically pick scale for object of type
> function. Defaulting to continuous.
> Error: All columns in a tibble must be 1d or 2d objects:
> * Column `x` is function
> Call `rlang::last_error()` to see a backtrace
>> 
> 
> you will have to figure out which of your variables is not properly specified.
> 
> Rich
> 
> On Fri, Jun 14, 2019 at 3:30 PM Bill Poling  
> wrote:
>> 
>> #RStudio Version 1.2.1335
>> sessionInfo()
>> #R version 3.5.3 (2019-03-11)
>> #Platform: x86_64-w64-mingw32/x64 (64-bit)
>> #Running under: Windows >= 8 x64 (build 9200)
>> 
>> Hello I am fitting an Arima model and all appears to go well until I get to 
>> the ggplot, (again, lots of laughs).
>> Deja Vu all over again! (God I hope it's not a typo!)
>> 
>> The error at the point of the plot is:
>> # Don't know how to automatically pick scale for object of type function. 
>> Defaulting to continuous.
>> # Error: All columns in a tibble must be 1d or 2d objects:
>> # * Column `x` is function
>> 
>> I hope someone recognizes my problem.
>> 
>> Thank you for any assistance.
>> 
>> #Here is the code and particulars of the data being plotted
>> 
>> #Fit the arima model
>> 
>> fit_arima2 <- train_tbl %>%
>> tk_ts(select = NetEditRev, frequency = 364) %>%
>> Arima(order = c(1,0,2),
>> seasonal=c(0,1,0),
>> include.drift = TRUE)
>> 
>> #Forecast with Sweep Functions
>> 
>> fcast_arima_tbl <- forecast(fit_arima2, h = nrow(test_tbl)) %>%
>> sw_sweep(timetk_idx = TRUE, rename_index = "date")
>> 
>> #Save the DF
>> 
>> fs::dir_create("00_model")
>> 
>> fcast_arima_tbl %>% write_rds("00_model/fcast_arima_tbl.rds")
>> 
>> fcast_arima_tbl1 <- read_rds("00_model/fcast_arima_tbl.rds")
>> 
>> head(fcast_arima_tbl1)
>> 
>> # A tibble: 6 x 7
>> date key NetEditRev lo.80 lo.95 hi.80 hi.95
>>   
>> 1 2017-01-01 actual -923. NA NA NA NA
>> 2 2017-01-02 actual 19222. NA NA NA NA
>> 3 2017-01-03 actual -8397. NA NA NA NA
>> 4 2017-01-04 actual 37697. NA NA NA NA
>> 5 2017-01-05 actual 46075. NA NA NA NA
>> 6 2017-01-06 actual 38329. NA NA NA NA
>> 
>> str(fcast_arima_tbl1)
>> Classes 'tbl_df', 'tbl' and 'data.frame':892 obs. of 7 variables:
>> $ date : Date, format: "2017-01-01" "2017-01-02" "2017-01-03" "2017-01-04" 
>> ...
>> $ key : chr "actual" "actual" "actual" "actual" ...
>> $ NetEditRev: num -923 19222 -8397 37697 46075 ...
>> $ lo.80 : num NA NA NA NA NA NA NA NA NA NA ...
>> $ lo.95 : num NA NA NA NA NA NA NA NA NA NA ...
>> $ hi.80 : num NA NA NA NA NA NA NA NA NA NA ...
>> $ hi.95 : num NA NA NA NA NA NA NA NA NA NA ...
>> 
>> #Plot the model
>> 
>> g4 <- fcast_arima_tbl1 %>
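
For the record, Richard's diagnosis above is that one of the names inside 
aes() resolves to a function rather than a column of the data.  A minimal 
sketch of the failure and the fix (the column names here are illustrative):

```r
library(ggplot2)
mydf <- data.frame(x = 1:5, y = 1:5)

# Fails: "ls" is not a column of mydf, so ggplot2 finds the base
# function ls() instead -> "Column `x` is function"
# ggplot(mydf, aes(ls, y)) + geom_point()

# Works: reference an actual column of the data
ggplot(mydf, aes(x, y)) + geom_point()
```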

Re: [R] gganimate: A Grammar of Animated Graphics

2019-06-07 Thread Roy Mendelssohn - NOAA Federal via R-help
There may be other ways but you can store the animation in an object and use 
the animate() function.

-Roy
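
Something along these lines (a sketch assuming the gganimate package; the 
plot itself is a stand-in):

```r
library(ggplot2)
library(gganimate)

# Store the animation in an object...
anim <- ggplot(mtcars, aes(mpg, disp)) +
  geom_point() +
  transition_states(factor(cyl))

# ...then control speed in animate(): lower the frames per second,
# or stretch the total duration.
animate(anim, fps = 5)
animate(anim, duration = 20)
```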

> On Jun 7, 2019, at 7:31 PM,  
>  wrote:
> 
> R-Help Forum
> 
> 
> 
> I've been exploring the gganimate package and  am wondering how one might
> adjust the animation speed?
> 
> 
> 
> Jeff Reichman
> 
> 
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Zoom In/Out maps library

2019-03-06 Thread Roy Mendelssohn - NOAA Federal via R-help
Also,  I forgot that tmap can do interactive maps, see:

https://geocompr.robinlovelace.net/adv-map.html#interactive-maps

-Roy

> On Mar 6, 2019, at 2:48 PM, Roy Mendelssohn - NOAA Federal 
>  wrote:
> 
> see https://r-spatial.github.io/mapview/index.html
> 
> The main thing is the data types that mapview supports,  so you must have a 
> raster or a spatial object like an "sf" object.   So points would have to 
> also be an sf object and the two combined  (sf has commands to do this), or 
> perhaps you can do it directly in mapview;  I haven't played with it much.
> 
> The plotly example I sent works with ggplot2,  so if you know how to build up 
> the map in ggplot2 you can try that,  though again as far as I can see plotly 
> for maps needs sf objects.
> 
> Mainly sent you the links to get you started.  I haven't played with either 
> much,  just know that they exist and zoom maps.
> 
> HTH,
> 
> -Roy
> 
>> On Mar 6, 2019, at 2:36 PM,  
>>  wrote:
>> 
>> Roy
>> 
>> Thank you - that's helpful.  Going to have to read up on sf and mapview
>> library. Those are new ones.  Then to add a point feature layer (lat/long)
>> where would I insert that?
>> 
>> library(maps)
>> library(sf) # simple features
>> library(mapview)
>> 
>> world.map <- maps::map("world", plot = FALSE, fill = TRUE) 
>> p <- sf::st_as_sf(world.map, coords = c('x', 'y')) 
>> mapview::mapview(p, legend=FALSE)
>> 
>> 
>> 
>> -Original Message-
>> From: rmendelss gmail  
>> Sent: Wednesday, March 6, 2019 4:11 PM
>> To: reichm...@sbcglobal.net
>> Cc: R help Mailing list 
>> Subject: Re: [R] Zoom In/Out maps library
>> 
>> world.map <- maps::map("world", plot = FALSE, fill = TRUE)
>> p <- sf::st_as_sf(world.map, coords = c('x', 'y'))
>> mapview::mapview(p)
>> 
>> HTH,
>> 
>> -Roy
>> 
>>> On Mar 6, 2019, at 1:44 PM, reichm...@sbcglobal.net wrote:
>>> 
>>> R Help
>>> 
>>> Anyone know if I can add a zoom In/Out function to the maps available via
>> the "maps" library? Or do I need to use a different mapping library?
>>> 
>>> world.map <- map_data("world")
>>> 
>>> ggplot(data = world.map) +
>>> geom_polygon(mapping = aes(x=long, y=lat, group=group))
>>> 
>>> Jeff
>>> 
>>> __
>>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see 
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide 
>>> http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.
>> 
>> __
>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
> 
> 
> 
> **
> "The contents of this message do not reflect any position of the U.S. 
> Government or NOAA."
> **
> Roy Mendelssohn
> Supervisory Operations Research Analyst
> NOAA/NMFS
> Environmental Research Division
> Southwest Fisheries Science Center
> ***Note new street address***
> 110 McAllister Way
> Santa Cruz, CA 95060
> Phone: (831)-420-3666
> Fax: (831) 420-3980
> e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/
> 
> "Old age and treachery will overcome youth and skill."
> "From those who have been given much, much will be expected" 
> "the arc of the moral universe is long, but it bends toward justice" -MLK Jr.
> 

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Zoom In/Out maps library

2019-03-06 Thread Roy Mendelssohn - NOAA Federal via R-help
see https://r-spatial.github.io/mapview/index.html

The main thing is the data types that mapview supports,  so you must have a 
raster or a spatial object like an "sf" object.   So points would have to also 
be an sf object and the two combined  (sf has commands to do this), or perhaps 
you can do it directly in mapview;  I haven't played with it much.

The plotly example I sent works with ggplot2,  so if you know how to build up 
the map in ggplot2 you can try that,  though again as far as I can see plotly 
for maps needs sf objects.

Mainly sent you the links to get you started.  I haven't played with either 
much,  just know that they exist and zoom maps.

HTH,

-Roy
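
To answer the follow-up about where to insert a point layer: one possible 
sketch (the coordinates are made up, and this assumes sf and mapview are 
installed) is to convert the points to an sf object and add the two mapview 
layers together:

```r
library(sf)
library(mapview)

world.map <- maps::map("world", plot = FALSE, fill = TRUE)
world_sf  <- st_as_sf(world.map)

# Hypothetical point data (lon/lat), converted to an sf object
pts <- data.frame(lon = c(-122.0, 2.35), lat = c(36.95, 48.86))
pts_sf <- st_as_sf(pts, coords = c("lon", "lat"), crs = 4326)

# mapview layers can be combined with "+"
mapview(world_sf, legend = FALSE) + mapview(pts_sf)
```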

> On Mar 6, 2019, at 2:36 PM,  
>  wrote:
> 
> Roy
> 
> Thank you - that's helpful.  Going to have to read up on sf and mapview
> library. Those are new ones.  Then to add a point feature layer (lat/long)
> where would I insert that?
> 
> library(maps)
> library(sf) # simple features
> library(mapview)
> 
> world.map <- maps::map("world", plot = FALSE, fill = TRUE) 
> p <- sf::st_as_sf(world.map, coords = c('x', 'y')) 
> mapview::mapview(p, legend=FALSE)
> 
> 
> 
> -Original Message-
> From: rmendelss gmail  
> Sent: Wednesday, March 6, 2019 4:11 PM
> To: reichm...@sbcglobal.net
> Cc: R help Mailing list 
> Subject: Re: [R] Zoom In/Out maps library
> 
> world.map <- maps::map("world", plot = FALSE, fill = TRUE)
> p <- sf::st_as_sf(world.map, coords = c('x', 'y'))
> mapview::mapview(p)
> 
> HTH,
> 
> -Roy
> 
>> On Mar 6, 2019, at 1:44 PM, reichm...@sbcglobal.net wrote:
>> 
>> R Help
>> 
>> Anyone know if I can add a zoom In/Out function to the maps available via
> the "maps" library? Or do I need to use a different mapping library?
>> 
>> world.map <- map_data("world")
>> 
>> ggplot(data = world.map) +
>> geom_polygon(mapping = aes(x=long, y=lat, group=group))
>> 
>> Jeff
>> 
>> __
>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see 
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide 
>> http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Zoom In/Out maps library

2019-03-06 Thread Roy Mendelssohn - NOAA Federal via R-help
Or if you prefer plotly:

world.map <- maps::map("world", plot = FALSE, fill = TRUE)
p <- sf::st_as_sf(world.map, coords = c('x', 'y'))
plotly::ggplotly(
  ggplot2::ggplot(data = p) + ggplot2::geom_sf()
)

> On Mar 6, 2019, at 2:12 PM, Roy Mendelssohn - NOAA Federal 
>  wrote:
> 
> world.map <- maps::map("world", plot = FALSE, fill = TRUE)
> p <- sf::st_as_sf(world.map, coords = c('x', 'y'))
> mapview::mapview(p)
> 
> 
> HTH,
> 
> -Roy
>> On Mar 6, 2019, at 2:10 PM, rmendelss gmail  wrote:
>> 
>> world.map <- maps::map("world", plot = FALSE, fill = TRUE)
>> p <- sf::st_as_sf(world.map, coords = c('x', 'y'))
>> mapview::mapview(p)
>> 
>> HTH,
>> 
>> -Roy
>> 
>>> On Mar 6, 2019, at 1:44 PM, reichm...@sbcglobal.net wrote:
>>> 
>>> R Help
>>> 
>>> Anyone know if I can add a zoom In/Out function to the maps available via 
>>> the "maps" library? Or do I need to use a different mapping library?
>>> 
>>> world.map <- map_data("world")
>>> 
>>> ggplot(data = world.map) +
>>> geom_polygon(mapping = aes(x=long, y=lat, group=group))
>>> 
>>> Jeff
>>> 
>>> __
>>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.
>> 
> 
> **
> "The contents of this message do not reflect any position of the U.S. 
> Government or NOAA."
> **
> Roy Mendelssohn
> Supervisory Operations Research Analyst
> NOAA/NMFS
> Environmental Research Division
> Southwest Fisheries Science Center
> ***Note new street address***
> 110 McAllister Way
> Santa Cruz, CA 95060
> Phone: (831)-420-3666
> Fax: (831) 420-3980
> e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/
> 
> "Old age and treachery will overcome youth and skill."
> "From those who have been given much, much will be expected" 
> "the arc of the moral universe is long, but it bends toward justice" -MLK Jr.
> 

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Zoom In/Out maps library

2019-03-06 Thread Roy Mendelssohn - NOAA Federal via R-help
world.map <- maps::map("world", plot = FALSE, fill = TRUE)
p <- sf::st_as_sf(world.map, coords = c('x', 'y'))
mapview::mapview(p)


HTH,

-Roy
> On Mar 6, 2019, at 2:10 PM, rmendelss gmail  wrote:
> 
> world.map <- maps::map("world", plot = FALSE, fill = TRUE)
> p <- sf::st_as_sf(world.map, coords = c('x', 'y'))
> mapview::mapview(p)
> 
> HTH,
> 
> -Roy
> 
>> On Mar 6, 2019, at 1:44 PM, reichm...@sbcglobal.net wrote:
>> 
>> R Help
>> 
>> Anyone know if I can add a zoom In/Out function to the maps available via 
>> the "maps" library? Or do I need to use a different mapping library?
>> 
>> world.map <- map_data("world")
>> 
>> ggplot(data = world.map) +
>> geom_polygon(mapping = aes(x=long, y=lat, group=group))
>> 
>> Jeff
>> 
> 



[R] cmocean color palette

2019-02-20 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi All:

If it would be of use to anyone,  I have the latest version of the Kristen 
Thyng's beautiful cmocean color palettes  (see https://matplotlib.org/cmocean/ 
) converted to be used in R.  These colormaps have been carefully designed 
given the latest ideas of what makes for a good palette,  and to make palettes 
that are really specific to the type of parameters in oceanography.  They are 
quite popular in the oceanographic community.

The file is a small .RData file.  If you are interested,  email me off-line so 
as not to spam the entire mail-list.

-Roy M.




Re: [R] Problems trying to place a global map with Ncdf data plot

2019-02-17 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi:

> On Feb 16, 2019, at 9:33 AM, rain1290--- via R-help  
> wrote:
> 
>> ggplot()+geom_point(aes(x=nc_lon,y=nc_lat,color="onedaymax"),
> size=0.8)+borders("world",
> colour="black")+scale_color_viridis(name="onedaymax")+theme_void()+coord_quickmap()
> *Error: Aesthetics must be either length 1 or the same as the data (128): x,
> y, colour*

Maybe I am missing something (I am old and it is early on a Sunday),  but I 
don't see where the dataset is defined in either ggplot() or geom_point().  You 
must have a data frame that contains both your expanded grid and the 
precipitation data,  and it has to be passed via the data argument in either 
the ggplot() call or the geom_point() call.

HTH,

-Roy
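A minimal sketch of the fix, using made-up data (nc_lon, nc_lat, and the onedaymax values here are hypothetical stand-ins for the netCDF contents): every variable mapped inside aes() must be a column of the one data frame supplied through the data argument.

```r
# Sketch with made-up data: x, y, and colour must all be columns of a
# single data frame handed to ggplot() (or geom_point()) via `data`.
library(ggplot2)

nc_lon <- seq(-150, 150, by = 30)       # hypothetical grid longitudes
nc_lat <- seq(-60, 60, by = 20)         # hypothetical grid latitudes
df <- expand.grid(lon = nc_lon, lat = nc_lat)
df$onedaymax <- runif(nrow(df))         # stand-in for the precipitation values

ggplot(data = df) +
  geom_point(aes(x = lon, y = lat, color = onedaymax), size = 0.8) +
  borders("world", colour = "black") +  # requires the "maps" package
  coord_quickmap()
```

Note also that in the original call color = "onedaymax" was quoted, which maps a constant string rather than the variable; the sketch maps the bare column name.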



[R] Two gganimate questions.

2019-02-07 Thread Roy Mendelssohn - NOAA Federal via R-help
I have two gganimate questions that I have made some headway on but not too 
much,  and they are actually related.  The questions are:

1.  Suppose I have a list where each element of the list is a pre-defined 
ggplot2 graphic  (in my case each is a map).  Is there a way to animate this, 
and if so,  what is the best way?  I have tried 
gganimate::transition_layers(), which recognized the list and produced an 
animation, albeit very slowly.

2. Is there a way to build up an animation as we go?  It is a long story,  but 
each map in the list above has to be calculated separately;  it is done using 
geom_sf().  Can an initial animation be defined and then frames added to it as 
new maps are made?

Thanks for any help.

-Roy





Re: [R] how to plot gridded data

2018-09-13 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi Lily:

I haven't used it enough to give you specifics,  but I strongly suggest 
you look at the package sf,  it is designed to do these sorts of things.  sf 
can read in the shapefile,  and it has functions to convert the data frame you 
describe to one of its objects,  and to combine objects.  There are plotting 
functions as well I believe,  and there is also ggplot2::geom_sf()

HTH,

-Roy
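For what it's worth, a sketch of the sf approach under assumptions about DF's layout (the coordinate values are made up and the shapefile name is hypothetical): reshape the lat/lon-named columns into long form, promote the result to an sf point object, then clip against the polygon.

```r
# Sketch under assumptions: DF reshaped to long form (one row per grid
# cell and value); coordinates and file name below are hypothetical.
library(sf)

df_long <- data.frame(
  lon   = c(-120.1, -120.2, -120.1),
  lat   = c(35.1, 35.1, 35.2),
  value = c(1.20, 1.30, 1.28)
)
pts <- st_as_sf(df_long, coords = c("lon", "lat"), crs = 4326)
# poly <- st_read("basin.shp")            # hypothetical shapefile
# kept <- st_intersection(pts, poly)      # drop points outside the polygon
plot(pts["value"])                        # quick look; ggplot2::geom_sf() also works
```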


> On Sep 13, 2018, at 7:02 PM, lily li  wrote:
> 
> Hi Petr,
> 
> I have merged the data using cbind. The dataset is like this:
> DF
> lat1_lon1  lat1_lon2  lat1_lon3  ...  lat2_lon1
>  1.20   1.30  2.11  ... 1.28
>  1.50   1.81  3.12  ... 2.34
>  2.41   2.22  1.56  ... 2.50
>  3.11   4.21  2.12  ... 3.21
> 
> The other file is a shapfile, which I can open using readOGR. Then it shows
> a polygon according to geographical latitude and longitude in degrees. How
> to overlay the values in DF onto the polygon? note that DF has the
> coordinates for a rectangular box that includes the shapefile, but is
> larger. I don't know how to do this. Thanks for your help.
> 
> On Wed, Sep 12, 2018 at 3:22 PM, PIKAL Petr  wrote:
> 
>> Hi
>> 
>> 1. Read files/lines into R ?read.table, ?read.lines
>> 2. Merge files according to your specification ?merge, ?rbind
>> 3. Plot values by suitable command(s) ?plot, ?ggplot
>> 4. If you want more specific answer, please post more specific question,
>> preferably with concise and clear example.
>> 5. Avoid posting in HTML
>> 
>> Cheers
>> Petr
>> 
>>> -Original Message-
>>> From: R-help  On Behalf Of lily li
>>> Sent: Wednesday, September 12, 2018 8:55 AM
>>> To: R mailing list 
>>> Subject: [R] how to plot gridded data
>>> 
>>> Hi R users,
>>> 
>>> I have a question about plotting gridded data. I have the files
>> separately, but do
>>> not know how to combine them. For example, each txt file has daily
>>> precipitation data at a specific grid cell, named pr_lat_lon.txt. How to
>> plot all
>>> txt files for one surface (which is rectangular in this case), or how to
>> combine
>>> the txt files together? Thanks.
>>> 
>>> [[alternative HTML version deleted]]
>>> 
>> Osobní údaje: Informace o zpracování a ochraně osobních údajů obchodních
>> partnerů PRECHEZA a.s. jsou zveřejněny na: https://www.precheza.cz/
>> zasady-ochrany-osobnich-udaju/ | Information about processing and
>> protection of business partner’s personal data are available on website:
>> https://www.precheza.cz/en/personal-data-protection-principles/
>> Důvěrnost: Tento e-mail a jakékoliv k němu připojené dokumenty jsou
>> důvěrné a podléhají tomuto právně závaznému prohláąení o vyloučení
>> odpovědnosti: https://www.precheza.cz/01-dovetek/ | This email and any
>> documents attached to it may be confidential and are subject to the legally
>> binding disclaimer: https://www.precheza.cz/en/01-disclaimer/
>> 
>> 
> 
>   [[alternative HTML version deleted]]
> 



[R] How deep into function calls does trycatch() work

2018-08-16 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi All:

I am using another package in a project I have. Because of that,  I have no 
control on how that package behaves or what it returns.  This package has a 
function foo()  that calls httr::GET(),  and if it gets an error from 
httr::GET() it calls the following routine:


err_handle2 <- function(x) {
  if (x$status_code > 201) {
tt <- content(x, "text")
mssg <- xml_text(xml_find_all(read_html(tt), "//h1"))
stop(paste0(mssg, collapse = "\n\n"), call. = FALSE)
  }
}

My question: if I wrap my call to foo() in tryCatch(), will that catch the 
stop() call, or am I a goner?  Or is there another way to intercept it,  given 
that I can't change the code of err_handle2().

Thanks,

-Roy
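For the record, tryCatch() does intercept errors signaled by stop() at any call depth, so wrapping the outer call is enough even when the stop() happens inside a helper you cannot modify. A self-contained sketch, with foo() and the status code standing in for the real package function:

```r
# tryCatch() catches conditions raised anywhere below the wrapped call,
# so the stop() inside err_handle2() is caught even though we can't edit it.
err_handle2 <- function(x) {
  if (x$status_code > 201) {
    stop("Not Found", call. = FALSE)
  }
}
foo <- function() err_handle2(list(status_code = 404))  # hypothetical stand-in

result <- tryCatch(
  foo(),
  error = function(e) paste("caught:", conditionMessage(e))
)
result
# [1] "caught: Not Found"
```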




Re: [R] ggplot2 version 3

2018-07-03 Thread Roy Mendelssohn - NOAA Federal via R-help
Thanks!

-Roy


> On Jul 3, 2018, at 2:40 PM, William Dunlap  wrote:
> 
> One way to test the new ggplot2 is to make a new directory to use as an R 
> library and to install the new ggplot2 there.
>newLibrary <- "C:/tmp/newRLibrary"
>dir.create(newLibrary)
>install.packages("ggplot2", lib=newLibrary)
> Then you can run two R sessions at once, starting one with
>.libPaths("C:/tmp/newRLibrary")
> to use the new ggplot2 and the othe without that line to use the old ggpot2.
> 
> 
> Bill Dunlap
> TIBCO Software
> wdunlap tibco.com
> 
> On Tue, Jul 3, 2018 at 2:08 PM, Roy Mendelssohn - NOAA Federal via R-help 
>  wrote:
> Hi All:
> 
> When I ask about updating packages in my R distribution,  it lists ggplot2 
> version 3.0.0 as being available.  I know that ggplot2 version 3.0.0 has made 
> some significant changes that will break certain things.  I would like to 
> install the new version, to see if it breaks anything that I do,  but I would 
> also like to be able to revert back to the old version if it makes it 
> impossible to do some of the work I need to get done,  and then switch back 
> again to the new version to test some more.  Is there some elegant way of 
> doing this?  If I just drag the appropriate Folder out of my directory and 
> replace it with the one I want,  will that do it,  or are there too many 
> other dependencies that are involved?
> 
> Thanks for any suggestions.
> 
> -Roy
> 
> 
> 
> 
> 



[R] ggplot2 version 3

2018-07-03 Thread Roy Mendelssohn - NOAA Federal via R-help
Hi All:

When I ask about updating packages in my R distribution,  it lists ggplot2 
version 3.0.0 as being available.  I know that ggplot2 version 3.0.0 has made 
some significant changes that will break certain things.  I would like to 
install the new version, to see if it breaks anything that I do,  but I would 
also like to be able to revert back to the old version if it makes it 
impossible to do some of the work I need to get done,  and then switch back 
again to the new version to test some more.  Is there some elegant way of doing 
this?  If I just drag the appropriate Folder out of my directory and replace it 
with the one I want,  will that do it,  or are there too many other 
dependencies that are involved?

Thanks for any suggestions.

-Roy






Re: [R] httr::content without message

2018-01-02 Thread Roy Mendelssohn - NOAA Federal
Thanks to all who replied.  I had just looked through the httr code and sure 
enough for a .csv mime type it calls readr::read_csv().  The httr::content docs 
suggest not using automatic parsing in a package,  but rather determining the 
mime type and parsing yourself;  Ben's suggestion also works if I do:

junk <- readr::read_csv(r1$content, col_types = cols())

Perfect.  Using httr rather than putting the url in any of the read.csv or 
read_csv type code allows me greater control if the request fails.

Thanks again,

-Roy
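Spelled out, that pattern looks roughly like this (same URL as in the thread; the request will fail if the server is unavailable, hence the guard):

```r
# Sketch of the "check the mime type, then parse yourself" approach the
# httr::content() docs recommend for package code.
library(httr)
library(readr)

myURL <- paste0("https://coastwatch.pfeg.noaa.gov/erddap/griddap/",
                "erdMH1sstdmday.csvp?time[0:1:last]")
r1 <- GET(myURL)
if (!http_error(r1) && http_type(r1) == "text/csv") {
  # col_types = cols() silences the "Parsed with column specification" message
  dat <- read_csv(r1$content, col_types = cols())
}
```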

> On Jan 2, 2018, at 9:44 AM, Ben Tupper  wrote:
> 
> Ahoy!
> 
> That's a message generated by the readr::read_table() function (or it's 
> friends).  You can suppress it a number of ways, but this should work as 
> httr::content() will pass through arguments, like col_types = cols(), to the 
> file reader.
> 
> junk <- httr::content(r1, col_types = cols())
> 
> See more here...
> 
> https://blog.rstudio.com/2016/08/05/readr-1-0-0/
> 
> 
> Cheers,
> Ben
> 
> 
> 
>> On Jan 2, 2018, at 12:30 PM, Roy Mendelssohn - NOAA Federal 
>>  wrote:
>> 
>> Hi All:
>> 
>> I am using httr to download files from a service, in this case a .csv file.  
>> When I use httr::content on the result,  I get a message.  Since this will 
>> be in a package,  I want to suppress the message,  but haven't figured out 
>> how to do so.
>> 
>> The following should reproduce the result:
>> 
>> myURL <- 
>> 'https://coastwatch.pfeg.noaa.gov/erddap/griddap/erdMH1sstdmday.csvp?time[0:1:last]'
>> r1 <- httr::GET(myURL)
>> junk <- httr::content(r1)
>> 
>> when the last command is run, you get:
>> 
>> Parsed with column specification:
>> cols(
>>  `time (UTC)` = col_datetime(format = "")
>> )
>> 
>> I want to suppress that output.
>> 
>> Thanks,
>> 
>> -Roy
>> 
> 
> Ben Tupper
> Bigelow Laboratory for Ocean Sciences
> 60 Bigelow Drive, P.O. Box 380
> East Boothbay, Maine 04544
> http://www.bigelow.org
> 
> Ecocast Reports: http://seascapemodeling.org/ecocast.html
> Tick Reports: https://report.bigelow.org/tick/
> Jellyfish Reports: https://jellyfish.bigelow.org/jellyfish/
> 
> 
> 



[R] httr::content without message

2018-01-02 Thread Roy Mendelssohn - NOAA Federal
Hi All:

I am using httr to download files from a service, in this case a .csv file.  
When I use httr::content on the result,  I get a message.  Since this will be 
in a package,  I want to suppress the message,  but haven't figured out how to 
do so.

The following should reproduce the result:

myURL <- 
'https://coastwatch.pfeg.noaa.gov/erddap/griddap/erdMH1sstdmday.csvp?time[0:1:last]'
r1 <- httr::GET(myURL)
junk <- httr::content(r1)

when the last command is run, you get:

Parsed with column specification:
cols(
  `time (UTC)` = col_datetime(format = "")
)

I want to suppress that output.

Thanks,

-Roy



[R] Development versions of xtractomatic and rerddapXtracto

2017-12-22 Thread Roy Mendelssohn - NOAA Federal
If you are a user of the R package "xtractomatic",  I have a new development 
version available,  as well as a test version of the package "rerddapXtracto".  
The  biggest changes are functions that can take the output of any of the data 
download functions and quickly map the data.   These functions use the package 
"plotdap". Also, a lot of the code in the development version of "xtractomatic" 
has been cleaned up and simplified.   If you have used "xtractomatic" before,  
be certain to read the vignette  
(https://rmendels.github.io/Usingxtractomatic_Dev.nb.html) as there have been 
some changes,  most notably in the order that arguments are passed in the 
function calls, what is optional in the call, and the two new plotting 
routines.  These changes were  made to support the idea that "data" should be 
the first argument  in the function calls,  and also to make the xtractomatic 
functions very close to the calls in  "rerddapXtracto".  This is important 
because this will likely be the last version of "xtractomatic", with future 
development being put on "rerddapXtracto".

"rerddapXtracto" has the same functionality as "xtractomatic" but will work 
with any gridded dataset on any ERDDAP server,  by making use of the R package 
"rerddap", while "xtractomatic" only works with certain datasets on the ERD 
ERDDAP server.  The advantage of the "xtractomatic" approach is we chose what 
we thought were the most useful datasets (out of over 1000 datasets), and 
information about those datasets are built into the program.  But that means 
the "xtractomatic" package can not be used to access a large number of 
datasets.  "rerddapXtracto" is much more general,  but  the user must know the 
ERDDAP server they want to access, which dataset, and must first obtain 
information about that dataset by using the function rerddap::info().  
"rerddapXtracto" also now has simple mapping of the data using the package 
"plotdap".  The vignette for "rerddapXtracto" is at  
https://rmendels.github.io/UsingrerddapXtracto.nb.html.

I have not submitted these packages to CRAN because at the moment they depend  
either on packages or versions of packages that are only on Github,  not CRAN.  
While I know there are ways around this when submitting to CRAN,  I feel that 
this defeats a lot of the purpose of CRAN.  CRAN checks consistency of package 
with other packages,  with development versions of R,  and also notifies 
developers when packages they use are changed, and provides an uniform 
installation of compatible packages.  These packages will be submitted to CRAN 
when the packages they depend on are available.  

The best information on installation is the vignettes.  The quick start 
version,  the development version of "xtractomatic" can be installed using:

devtools::install_github("rmendels/xtractomatic",  ref = "development")

and "rerddapXtracto" can be installed using:

devtools::install_github("rmendels/rerddapXtracto")

Several warnings about the installations:

1.  Both packages use the package "plotdap" for graphics.  This package at the 
moment is available only on Github,  at:

https://github.com/ropensci/plotdap

2. "plotdap" itself depends on a fairly large number of packages.  In some 
testing,  people sometimes had to get a more recent,  non-CRAN version of other 
packages to have correct functionality.  If you run into problems let me know. 

3. "rerddapXtracto" depends on the package "rerddap",  but not on the CRAN 
version.  There was some changes in functionality in the Github version, in 
particular the handling of caches,  as well as some code changes.  To use 
"rerddapXtracto" you must install the Github version of "rerddap" available 
from https://github.com/ropensci/rerddap.

Basically the initial installation of either package may not go cleanly,  if 
there are problems let me know, it may take several attempts to get all the 
dependencies correct.  As noted earlier,  this is one of the benefits provided 
by CRAN,  and why I do not want to put these packages on CRAN until all of 
their dependencies are there also.


-Roy

 



Re: [R] dygraphs, multiple graphs and shiny

2017-10-18 Thread Roy Mendelssohn - NOAA Federal
Answering my own question.  It took a lot of trial and error,  but the code 
below will work.  The trick is to form the list of plots, create the html tag 
list from it and use renderUI() for that, and then in the UI part use 
htmlOutput() to output the result.

-Roy


> library(shiny)
> 
> # Define UI for application that draws a histogram
> ui <- fluidPage(
>
># Application title
>titlePanel("Test"),
>
># Sidebar with a slider input for number of bins 
>sidebarLayout(
>   sidebarPanel(
>  sliderInput("bins",
>  "Number of bins:",
>  min = 1,
>  max = 50,
>  value = 30)
>   ),
>   
>   # Show a plot of the generated distribution
>   mainPanel(
> #dygraphs::dygraphOutput("distPlot")
> htmlOutput("distPlot")
>   )
>)
> )
> 
> # Define server logic required to draw a histogram
> server <- function(input, output) {
>   lungDeaths <- cbind(mdeaths, fdeaths)
>  # output$distPlot <- dygraphs::renderDygraph({
> res = list()
> res[[1]] <-  dygraph(lungDeaths[, 1], group = 'lungs') %>% 
> dyRangeSelector()
> res[[2]] <-  dygraph(lungDeaths[, 2], group = 'lungs') %>% 
> dyRangeSelector()
>res <- htmltools::tagList(res)
>output$distPlot <- renderUI({
> res
>})
> }
> 
> # Run the application 
> shinyApp(ui = ui, server = server)
> 





[R] dygraphs, multiple graphs and shiny

2017-10-18 Thread Roy Mendelssohn - NOAA Federal
Hi All:

This is really getting into the weeds,  but I am hoping someone will have a 
solution.  I am trying to use dygraphs for R, within Shiny.

The situation arises when I am combining a number of dygraphs into one plot.  
If I am just in an RNotebook, if you look at:

https://stackoverflow.com/questions/30509866/for-loop-over-dygraph-does-not-work-in-r

the solution to have the plot shown from a RNotebook is code like this:

> library(dygraphs)
> lungDeaths <- cbind(mdeaths, fdeaths)
> res <- lapply(1:2, function(i) dygraph(lungDeaths[, i]))
> htmltools::tagList(res)

and if you put that into an RNotebook and knit it, it works.  Okay in order to 
have a reproducible example,  I now try to create a Shiny App using that 
example,  and the template generated by RStudio.  To use dygraphs in Shiny,  
you replace the normal render and plot routines,  such that the following 
"works" in the sense that, when run, the graph and slider are shown - this is 
showing just a single dygraph:

> library(shiny)
> 
> # Define UI for application that draws a histogram
> ui <- fluidPage(
>
># Application title
>titlePanel("Test"),
>
># Sidebar with a slider input for number of bins 
>sidebarLayout(
>   sidebarPanel(
>  sliderInput("bins",
>  "Number of bins:",
>  min = 1,
>  max = 50,
>  value = 30)
>   ),
>   
>   # Show a plot of the generated distribution
>   mainPanel(
> dygraphs::dygraphOutput("distPlot")
>   )
>)
> )
> 
> # Define server logic required to draw a histogram
> server <- function(input, output) {
>
>output$distPlot <- dygraphs::renderDygraph({
>  lungDeaths <- cbind(mdeaths, fdeaths)
>  res <- lapply(1:2, function(i) dygraph(lungDeaths[, i]))
>  dygraph(lungDeaths[, 1])
>})
> }
> 
> # Run the application 
> shinyApp(ui = ui, server = server)
> 


Now make the single change to try and render the combined plot:

> library(shiny)
> 
> # Define UI for application that draws a histogram
> ui <- fluidPage(
>
># Application title
>titlePanel("Test"),
>
># Sidebar with a slider input for number of bins 
>sidebarLayout(
>   sidebarPanel(
>  sliderInput("bins",
>  "Number of bins:",
>  min = 1,
>  max = 50,
>  value = 30)
>   ),
>   
>   # Show a plot of the generated distribution
>   mainPanel(
> dygraphs::dygraphOutput("distPlot")
>   )
>)
> )
> 
> # Define server logic required to draw a histogram
> server <- function(input, output) {
>
>output$distPlot <- dygraphs::renderDygraph({
>  lungDeaths <- cbind(mdeaths, fdeaths)
>  res <- lapply(1:2, function(i) dygraph(lungDeaths[, i]))
>  htmltools::tagList(res)
>})
> }
> 
> # Run the application 
> shinyApp(ui = ui, server = server)
> 



If you run the second example,  the plot does not appear.
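For the record, the approach that eventually worked - visible in the reply fragment at the top of this page - swaps renderDygraph()/dygraphOutput() for renderUI()/uiOutput(), so the htmltools::tagList() of dygraphs can be rendered as HTML.  A minimal sketch (same lungDeaths data as above; the slider input is dropped here since it is unused):

```r
library(shiny)
library(dygraphs)

ui <- fluidPage(
  titlePanel("Test"),
  # uiOutput instead of dygraphOutput, so multiple dygraphs can be shown
  mainPanel(uiOutput("distPlot"))
)

server <- function(input, output) {
  lungDeaths <- cbind(mdeaths, fdeaths)
  res <- lapply(1:2, function(i) dygraph(lungDeaths[, i]))
  res <- htmltools::tagList(res)
  output$distPlot <- renderUI({
    res
  })
}

shinyApp(ui = ui, server = server)
```

renderDygraph() expects a single dygraph object, which is why handing it a tagList produces no plot; renderUI() accepts arbitrary HTML tag objects.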

Thanks for any help.

-Roy

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] ggridges help

2017-10-17 Thread Roy Mendelssohn - NOAA Federal
Perfect,  thank you.  

I find the Unix-style help usual in R is really only helpful once you know what 
everything is doing.  That makes a good vignette, one that shows what all of the 
options do in a careful way,  really important. 

Thanks again.

-Roy

> On Oct 17, 2017, at 2:01 PM, William Dunlap  wrote:
> 
> The min_height = -0.25 is there to make it show cycle values down to -1/4.  
> You may want to change it to -1 so it shows more of the cycle values.
> 
> Bill Dunlap
> TIBCO Software
> wdunlap tibco.com
> 
> On Tue, Oct 17, 2017 at 1:26 PM, Roy Mendelssohn - NOAA Federal 
>  wrote:
> yes, thanks,  and I was getting close to that.  One thing I found is the 
> manual says the height is the distance above the y-line,  which should be, 
> but doesn't have to be positive.  In fact,  the time series are  estimates of 
> a cycle,  and has negative values,  which unfortunately are not included in 
> my sub-sample.  And the negative values are not handled properly (the series 
> disappears for the negative values)
> 
> Also,  maybe it is my bad eyesight,  but in examples of the older, deprecated 
> ggjoy package,  there seem to be a slight offset added to the depth effect,  
> which doesn't appear to be the case now, or am I missing something.  There is 
> this in the manual:
> 
> 
> > position
> > Position adjustment, either as a string, or the result of a call to a 
> > position adjustment function.
> 
> I assume this refers to the ggplot2 position adjustments.  Would one of those 
> calls have that effect?
> 
> Thanks,
> 
> -Roy
> 
> 
> 
> 
> > On Oct 17, 2017, at 1:09 PM, William Dunlap  wrote:
> >
> > Does the following work for you?
> >
> >ggplot2::ggplot(plotFrame, aes(x = time, y = depth, height = cycle, 
> > group = depth)) + ggridges::geom_ridgeline(fill="red", min_height=-0.25)
> >
> >
> > Bill Dunlap
> > TIBCO Software
> > wdunlap tibco.com
> >
> > On Tue, Oct 17, 2017 at 12:43 PM, Roy Mendelssohn - NOAA Federal 
> >  wrote:
> > I have tried:
> >
> > ggplot(plotFrame, aes(x = time, y = cycle, height = cycle, group = depth)) 
> > + geom_ridgeline()
> > ggplot(plotFrame, aes(x = time, y = depth, height = cycle, group = depth)) 
> > + geom_ridgeline()
> > ggplot(plotFrame, aes(x = time, y = depth, group = depth)) + 
> > geom_density_ridges()
> >
> > none produces a plot that is a ridgeline for each depth showing the 
> > time series at that depth.  The plot should be like the geom_line plot,  
> > but as a ridgeline for each depth.
> >
> > -Roy
> >
> > > On Oct 17, 2017, at 12:39 PM, Bert Gunter  wrote:
> > >
> > > ...and your question is...?
> > > ... and the code you tried that didn't work was?
> > >
> > > Bert
> > >
> > >
> > > On Oct 17, 2017 12:22 PM, "Roy Mendelssohn - NOAA Federal" 
> > >  wrote:
> > > Hi All:
> > >
> > > I am just not understanding ggridges.  The data I have are time series at 
> > > different depths in the ocean.  I want to make a joy plot of the time 
> > > series by depth.
> > >
> > > If I was just doing a ggplot2 line plot I would be doing:
> > >
> > > ggplot(plotFrame, aes(x = time, y = cycle, group = depth)) + geom_line()
> > >
> > > but translating that to ggridges has not worked right.  Below is the 
> > > result from dput() of a simplified data frame that has 3 depths and 2 
> > > years of monthly data.  (In fact I have  20 depths and 30  years of 
> > > monthly data).   The command above will work with this data frame.
> > >
> > > Thanks for any help.
> > >
> > > -Roy
> > >
> > > >dput(plotFrame)
> > > structure(list(time = structure(c(719236800, 721915200, 724507200,
> > > 727185600, 729777600, 732283200, 734961600, 737553600, 740232000,
> > > 742824000, 745502400, 748180800, 750772800, 753451200, 756043200,
> > > 758721600, 761313600, 763819200, 766497600, 769089600, 771768000,
> > > 774360000, 777038400, 779716800, 719236800, 721915200, 724507200,
> > > 727185600, 729777600, 732283200, 734961600, 737553600, 740232000,
> > > 742824000, 745502400, 748180800, 750772800, 753451200, 756043200,
> > > 758721600, 761313600, 763819200, 766497600, 769089600, 771768000,
> > > 774360000, 777038400, 779716800, 719236800, 721915200, 724507200,
> > > 727185600, 729777600, 732283200, 734961600, 737553600, 740232000,
> > > 742824000, 745502400, 748180800, 750772800, 753451200
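Putting together Bill Dunlap's geom_ridgeline() call with his later suggestion to lower min_height to -1 (so the negative cycle values are not clipped), a working sketch for the plotFrame reconstructed from the dput() output would be:

```r
library(ggplot2)
library(ggridges)

# plotFrame as in the dput() output above:
# time (POSIXct), depth (factor: "20", "40", "60"), cycle (numeric)
ggplot(plotFrame, aes(x = time, y = depth, height = cycle, group = depth)) +
  geom_ridgeline(fill = "red", min_height = -1)  # -1 keeps negative cycle values visible
```

The key difference from the geom_line() call is that y maps to the grouping variable (depth) and the series values go to the height aesthetic.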

Re: [R] ggridges help

2017-10-17 Thread Roy Mendelssohn - NOAA Federal
yes, thanks,  and I was getting close to that.  One thing I found is the manual 
says the height is the distance above the y-line,  which should be, but doesn't 
have to be positive.  In fact,  the time series are estimates of a cycle and 
have negative values,  which unfortunately are not included in my sub-sample.  
And the negative values are not handled properly (the series disappears for the 
negative values).

Also,  maybe it is my bad eyesight,  but in examples of the older, deprecated 
ggjoy package,  there seem to be a slight offset added to the depth effect,  
which doesn't appear to be the case now, or am I missing something.  There is 
this in the manual:


> position  
> Position adjustment, either as a string, or the result of a call to a 
> position adjustment function.

I assume this refers to the ggplot2 position adjustments.  Would one of those 
calls have that effect?

Thanks,

-Roy




> On Oct 17, 2017, at 1:09 PM, William Dunlap  wrote:
> 
> Does the following work for you?
>  
>ggplot2::ggplot(plotFrame, aes(x = time, y = depth, height = cycle, group 
> = depth)) + ggridges::geom_ridgeline(fill="red", min_height=-0.25)
> 
> 
> Bill Dunlap
> TIBCO Software
> wdunlap tibco.com
> 
> On Tue, Oct 17, 2017 at 12:43 PM, Roy Mendelssohn - NOAA Federal 
>  wrote:
> I have tried:
> 
> ggplot(plotFrame, aes(x = time, y = cycle, height = cycle, group = depth)) + 
> geom_ridgeline()
> ggplot(plotFrame, aes(x = time, y = depth, height = cycle, group = depth)) + 
> geom_ridgeline()
> ggplot(plotFrame, aes(x = time, y = depth, group = depth)) + 
> geom_density_ridges()
> 
> none produces a plot that is a ridgeline for each depth showing the 
> time series at that depth.  The plot should be like the geom_line plot,  but 
> as a ridgeline for each depth.
> 
> -Roy
> 
> > On Oct 17, 2017, at 12:39 PM, Bert Gunter  wrote:
> >
> > ...and your question is...?
> > ... and the code you tried that didn't work was?
> >
> > Bert
> >
> >
> > On Oct 17, 2017 12:22 PM, "Roy Mendelssohn - NOAA Federal" 
> >  wrote:
> > Hi All:
> >
> > I am just not understanding ggridges.  The data I have are time series at 
> > different depths in the ocean.  I want to make a joy plot of the time 
> > series by depth.
> >
> > If I was just doing a ggplot2 line plot I would be doing:
> >
> > ggplot(plotFrame, aes(x = time, y = cycle, group = depth)) + geom_line()
> >
> > but translating that to ggridges has not worked right.  Below is the result 
> > from dput() of a simplified data frame that has 3 depths and 2 years of 
> > monthly data.  (In fact I have  20 depths and 30  years of monthly data).   
> > The command above will work with this data frame.
> >
> > Thanks for any help.
> >
> > -Roy
> >
> > >dput(plotFrame)
> > structure(list(time = structure(c(719236800, 721915200, 724507200,
> > 727185600, 729777600, 732283200, 734961600, 737553600, 740232000,
> > 742824000, 745502400, 748180800, 750772800, 753451200, 756043200,
> > 758721600, 761313600, 763819200, 766497600, 769089600, 771768000,
> > 774360000, 777038400, 779716800, 719236800, 721915200, 724507200,
> > 727185600, 729777600, 732283200, 734961600, 737553600, 740232000,
> > 742824000, 745502400, 748180800, 750772800, 753451200, 756043200,
> > 758721600, 761313600, 763819200, 766497600, 769089600, 771768000,
> > 774360000, 777038400, 779716800, 719236800, 721915200, 724507200,
> > 727185600, 729777600, 732283200, 734961600, 737553600, 740232000,
> > 742824000, 745502400, 748180800, 750772800, 753451200, 756043200,
> > 758721600, 761313600, 763819200, 766497600, 769089600, 771768000,
> > 774360000, 777038400, 779716800), class = c("POSIXct", "POSIXt"
> > ), tzone = "UTC"), depth = structure(c(1L, 1L, 1L, 1L, 1L, 1L,
> > 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L,
> > 1L, 1L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L,
> > 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 3L, 3L, 3L, 3L, 3L, 3L,
> > 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L,
> > 3L, 3L), .Label = c("20", "40", "60"), class = "factor"), cycle = 
> > structure(c(-0.164397110685046,
> > -0.0639063592004652, -0.124275650584243, 0.232340199700421, 
> > 0.23265929828899,
> > 0.452800479990173, 0.631515844862171, 0.708775442811806, 0.402797787246307,
> > 0.540279471159411, 0.625583653374072, 0.607609707505232, 0.789645459141814,
> > 0.57943178332249, 0.395406041578379, 0.278792362706845, 
> > 0.000158405680203533,

Re: [R] ggridges help

2017-10-17 Thread Roy Mendelssohn - NOAA Federal
I have tried:

ggplot(plotFrame, aes(x = time, y = cycle, height = cycle, group = depth)) + 
geom_ridgeline()
ggplot(plotFrame, aes(x = time, y = depth, height = cycle, group = depth)) + 
geom_ridgeline()
ggplot(plotFrame, aes(x = time, y = depth, group = depth)) + 
geom_density_ridges()

none produces a plot that is a ridgeline for each depth showing the time 
series at that depth.  The plot should be like the geom_line plot,  but as a 
ridgeline for each depth.

-Roy

> On Oct 17, 2017, at 12:39 PM, Bert Gunter  wrote:
> 
> ...and your question is...?
> ... and the code you tried that didn't work was?
> 
> Bert
> 
> 
> On Oct 17, 2017 12:22 PM, "Roy Mendelssohn - NOAA Federal" 
>  wrote:
> Hi All:
> 
> I am just not understanding ggridges.  The data I have are time series at 
> different depths in the ocean.  I want to make a joy plot of the time series 
> by depth.
> 
> If I was just doing a ggplot2 line plot I would be doing:
> 
> ggplot(plotFrame, aes(x = time, y = cycle, group = depth)) + geom_line()
> 
> but translating that to ggridges has not worked right.  Below is the result 
> from dput() of a simplified data frame that has 3 depths and 2 years of 
> monthly data.  (In fact I have  20 depths and 30  years of monthly data).   
> The command above will work with this data frame.
> 
> Thanks for any help.
> 
> -Roy
> 
> >dput(plotFrame)
> structure(list(time = structure(c(719236800, 721915200, 724507200,
> 727185600, 729777600, 732283200, 734961600, 737553600, 740232000,
> 742824000, 745502400, 748180800, 750772800, 753451200, 756043200,
> 758721600, 761313600, 763819200, 766497600, 769089600, 771768000,
> 774360000, 777038400, 779716800, 719236800, 721915200, 724507200,
> 727185600, 729777600, 732283200, 734961600, 737553600, 740232000,
> 742824000, 745502400, 748180800, 750772800, 753451200, 756043200,
> 758721600, 761313600, 763819200, 766497600, 769089600, 771768000,
> 774360000, 777038400, 779716800, 719236800, 721915200, 724507200,
> 727185600, 729777600, 732283200, 734961600, 737553600, 740232000,
> 742824000, 745502400, 748180800, 750772800, 753451200, 756043200,
> 758721600, 761313600, 763819200, 766497600, 769089600, 771768000,
> 774360000, 777038400, 779716800), class = c("POSIXct", "POSIXt"
> ), tzone = "UTC"), depth = structure(c(1L, 1L, 1L, 1L, 1L, 1L,
> 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L,
> 1L, 1L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L,
> 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 3L, 3L, 3L, 3L, 3L, 3L,
> 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L,
> 3L, 3L), .Label = c("20", "40", "60"), class = "factor"), cycle = 
> structure(c(-0.164397110685046,
> -0.0639063592004652, -0.124275650584243, 0.232340199700421, 0.23265929828899,
> 0.452800479990173, 0.631515844862171, 0.708775442811806, 0.402797787246307,
> 0.540279471159411, 0.625583653374072, 0.607609707505232, 0.789645459141814,
> 0.57943178332249, 0.395406041578379, 0.278792362706845, 0.000158405680203533,
> 0.0618374997078718, 0.0942838757427736, 0.046510899158667, 0.13680034298,
> 0.165118554160811, 0.228812312665972, -0.225383761996565, -0.204115732057999,
> -0.0883683879679482, -0.135111844938738, 0.147562070872115, 0.198086355354394,
> 0.386803300687593, 0.684023051288189, 0.69669009829253, 0.381213154479,
> 0.550118050327324, 0.641294267691433, 0.614909878956221, 0.772779409518665,
> 0.581967160929841, 0.381662488154885, 0.293380662335543, 0.0391733417449068,
> -0.0844674860904995, 0.163695040677223, 0.0444585016269223, 0.130561029192561,
> 0.180784990884611, 0.242929491090375, -0.126543843540014, -0.112781525045155,
> -0.1803388763034, -0.0939120153437669, 0.150968491445835, 0.0992298571202001,
> 0.289294645512006, 0.540517483378127, 0.625091194385051, 0.432224338078479,
> 0.504654513110009, 0.62584673393424, 0.56834612321311, 0.789331138620147,
> 0.6389908671341, 0.45156693996368, 0.412578785088203, 0.212440848202924,
> 0.146392303930216, 0.654494252844301, 0.470248736982212, 0.239891529116349,
> 0.200137949677769, 0.2858429346658, -0.121094155739595), .Dim = 72L)), .Names 
> = c("time",
> "depth", "cycle"), row.names = c(NA, -72L), class = "data.frame")
> 
> **
> "The contents of this message do not reflect any position of the U.S. 
> Government or NOAA."
> **
> Roy Mendelssohn
> Supervisory Operations Research Analyst
> NOAA/NMFS
> Environmental Research Division
> Southwest Fisheries Science Center
> ***Note new street address***
> 110 McAllister Way
> Santa Cruz, CA 95060
> Phone: (831)-420-3666
> Fax: (831) 420-3980
>

[R] ggridges help

2017-10-17 Thread Roy Mendelssohn - NOAA Federal
Hi All:

I am just not understanding ggridges.  The data I have are time series at 
different depths in the ocean.  I want to make a joy plot of the time series by 
depth.

If I was just doing a ggplot2 line plot I would be doing:

ggplot(plotFrame, aes(x = time, y = cycle, group = depth)) + geom_line()

but translating that to ggridges has not worked right.  Below is the result 
from dput() of a simplified data frame that has 3 depths and 2 years of monthly 
data.  (In fact I have  20 depths and 30  years of monthly data).   The command 
above will work with this data frame.

Thanks for any help.

-Roy

>dput(plotFrame)
structure(list(time = structure(c(719236800, 721915200, 724507200, 
727185600, 729777600, 732283200, 734961600, 737553600, 740232000, 
742824000, 745502400, 748180800, 750772800, 753451200, 756043200, 
758721600, 761313600, 763819200, 766497600, 769089600, 771768000, 
774360000, 777038400, 779716800, 719236800, 721915200, 724507200, 
727185600, 729777600, 732283200, 734961600, 737553600, 740232000, 
742824000, 745502400, 748180800, 750772800, 753451200, 756043200, 
758721600, 761313600, 763819200, 766497600, 769089600, 771768000, 
774360000, 777038400, 779716800, 719236800, 721915200, 724507200, 
727185600, 729777600, 732283200, 734961600, 737553600, 740232000, 
742824000, 745502400, 748180800, 750772800, 753451200, 756043200, 
758721600, 761313600, 763819200, 766497600, 769089600, 771768000, 
774360000, 777038400, 779716800), class = c("POSIXct", "POSIXt"
), tzone = "UTC"), depth = structure(c(1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L), .Label = c("20", "40", "60"), class = "factor"), cycle = 
structure(c(-0.164397110685046, 
-0.0639063592004652, -0.124275650584243, 0.232340199700421, 0.23265929828899, 
0.452800479990173, 0.631515844862171, 0.708775442811806, 0.402797787246307, 
0.540279471159411, 0.625583653374072, 0.607609707505232, 0.789645459141814, 
0.57943178332249, 0.395406041578379, 0.278792362706845, 0.000158405680203533, 
0.0618374997078718, 0.0942838757427736, 0.046510899158667, 0.13680034298, 
0.165118554160811, 0.228812312665972, -0.225383761996565, -0.204115732057999, 
-0.0883683879679482, -0.135111844938738, 0.147562070872115, 0.198086355354394, 
0.386803300687593, 0.684023051288189, 0.69669009829253, 0.381213154479, 
0.550118050327324, 0.641294267691433, 0.614909878956221, 0.772779409518665, 
0.581967160929841, 0.381662488154885, 0.293380662335543, 0.0391733417449068, 
-0.0844674860904995, 0.163695040677223, 0.0444585016269223, 0.130561029192561, 
0.180784990884611, 0.242929491090375, -0.126543843540014, -0.112781525045155, 
-0.1803388763034, -0.0939120153437669, 0.150968491445835, 0.0992298571202001, 
0.289294645512006, 0.540517483378127, 0.625091194385051, 0.432224338078479, 
0.504654513110009, 0.62584673393424, 0.56834612321311, 0.789331138620147, 
0.6389908671341, 0.45156693996368, 0.412578785088203, 0.212440848202924, 
0.146392303930216, 0.654494252844301, 0.470248736982212, 0.239891529116349, 
0.200137949677769, 0.2858429346658, -0.121094155739595), .Dim = 72L)), .Names = 
c("time", 
"depth", "cycle"), row.names = c(NA, -72L), class = "data.frame")

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.



Re: [R] withr::set_makevars

2017-09-09 Thread Roy Mendelssohn - NOAA Federal
As a follow-up to this,  thanks to Bill Dunlap I was able to resolve what was 
causing this problem (I still had problems with covr::package_coverage() - but 
of a different sort and not directly related to this report).  I had an existing 
.R/Makevars file,  created in Nov. 2014,  related to the installation of the 
rstan package.  Commenting out some of the rstan-specific lines removed this set 
of error messages.
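The duplicate-definition condition that withr:::set_makevars() rejects can be illustrated with a small self-contained check (the file contents below are invented for illustration; they mimic a Makevars that defines CXXFLAGS twice):

```r
# Write a throwaway Makevars with CXXFLAGS defined twice (illustrative values)
mv <- tempfile(fileext = "_Makevars")
writeLines(c("CXXFLAGS = -O3", "CC = clang", "CXXFLAGS = -g"), mv)

# Extract the variable names and report any defined more than once;
# withr:::set_makevars() errors when it finds such a duplicate
defs <- grep("^[A-Za-z_]+ *=", readLines(mv), value = TRUE)
vars <- sub(" *=.*$", "", defs)
unique(vars[duplicated(vars)])   # "CXXFLAGS"
```

Running the same scan against your real ~/.R/Makevars shows which variable is tripping the "Multiple results for CXXFLAGS" error.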

-Roy




> On Sep 6, 2017, at 5:26 PM, William Dunlap  wrote:
> 
> withr:::set_makevars() can give that error if the makefile named by the 
> 'old_path' argument (default "~/.R/Makevars) contains more than one 
> definition of a variable of the form 'name=value'.  You can see what file it 
> is reading and its contents by using the trace() function:
> 
> trace(withr:::set_makevars, quote({ cat(old_path, "\n"); writeLines(paste0("  
>   ", tryCatch(readLines(old_path), error = function(e) conditionMessage(e)))) }))
> 
> Then run your test and see what file set_makevars is complaining about and 
> what in the file might cause trouble for set_makevars.
> 
> 
> Bill Dunlap
> TIBCO Software
> wdunlap tibco.com
> 
> On Wed, Sep 6, 2017 at 3:41 PM, Roy Mendelssohn - NOAA Federal 
>  wrote:
> Hi All;
> 
> This problem has come about from trying to learn some of the review practices 
> recommended by rOpensci.  One of them is to use the package goodpractice.  
> After installing goodpractice, it kept failing on my own packages which are 
> under development, and I was concerned something was funny in my own code,  so I 
> have a fork of the package rerddap,  and I tested goodpractice on that.  I 
> get the error:
> 
> > Error in set_makevars(new, path, makevars_file, assignment = assignment) :
> >   Multiple results for CXXFLAGS found, something is wrong.FALSE
> >
> 
> 
> So after some playing around that is from the very first test,  which uses 
> the covr:package_coverage(), and sure enough running that produces the same 
> error.  Looking at the code,  that error is being thrown by the function 
> withr::set_makevars().  We are now too many layers deep into packages for me 
> to follow what is going on,  but the kicker is Scott Chamberlain can run it 
> without any errors on the same package.  Session_info for both of us follows. 
>  If any one has any suggestions both as to what is causing this and a 
> possible solution,  would appreciate it.
> 
> Roy's sessionInfo is after running the commands:
> 
> Sys.setenv(NOT_CRAN = "true")
> x = goodpractice::gp(path = ".", checks = all_checks()[2:230])
> 
> Scott's is after running:
> 
> Sys.setenv(NOT_CRAN = "true")
> x = goodpractice::gp()
> 
> 
> 
> 
> Roy's_session_info()
> ─ Session info 
> ──
>  setting  value
>  version  R version 3.4.1 (2017-06-30)
>  os   macOS Sierra 10.12.6
>  system   x86_64, darwin15.6.0
>  ui   RStudio
>  language (EN)
>  collate  en_US.UTF-8
>  tz   America/Los_Angeles
>  date 2017-09-06
> 
> ─ Packages  package  * version date   source
>  assertthat 0.2.0   2017-04-11 CRAN (R 3.4.1)
>  backports  1.1.0   2017-05-22 CRAN (R 3.4.0)
>  callr  1.0.0.9000  2017-09-02 Github (r-lib/callr@2dffbbe)
>  clisymbols 1.2.0   2017-09-02 Github (gaborcsardi/clisymbols@e49b4f5)
>  covr   3.0.0   2017-06-26 CRAN (R 3.4.1)
>  crayon 1.3.2.9000  2017-08-25 Github (gaborcsardi/crayon@e4dba3b)
>  cyclocomp  1.1.0   2017-09-02 Github (MangoTheCat/cyclocomp@6156a12)
>  debugme1.0.2   2017-03-01 CRAN (R 3.4.0)
>  desc   1.1.1   2017-08-03 CRAN (R 3.4.1)
>  devtools   1.13.3.9000 2017-08-31 Github (hadley/devtools@91490d1)
>  digest 0.6.12  2017-01-27 CRAN (R 3.4.1)
>  goodpractice * 1.0.0   2017-09-02 Github 
> (MangoTheCat/goodpractice@9969799)
>  httr   1.3.1   2017-08-20 CRAN (R 3.4.1)
>  igraph 1.1.2   2017-07-21 CRAN (R 3.4.1)
>  jsonlite   1.5 2017-06-01 CRAN (R 3.4.0)
>  knitr  1.172017-08-10 CRAN (R 3.4.1)
>  lazyeval   0.2.0   2016-06-12 CRAN (R 3.4.0)
>  lintr  1.0.1   2017-08-10 CRAN (R 3.4.1)
>  magrittr   1.5 2014-11-22 CRAN (R 3.4.0)
>  memoise1.1.0   2017-04-21 CRAN (R 3.4.0)
>  pkgbuild   0.0.0.9000  2017-08-31 Github (r-lib/pkgbuild@6574561)
>  pkgconfig  2.0.1   2017-03-21 CRAN (R 3.4.0)
>  pkgload0.0.0.9000  2017-08-31 Github (r-pkgs/pkgload@80a6493)
>  praise 1.0.0   2015-08-11 CRAN (R 3.4.0)
>  proc

Re: [R] withr::set_makevars

2017-09-06 Thread Roy Mendelssohn - NOAA Federal
Perfect,  thank you very much for the tip.

-Roy
> On Sep 6, 2017, at 5:26 PM, William Dunlap  wrote:
> 
> withr:::set_makevars() can give that error if the makefile named by the 
> 'old_path' argument (default "~/.R/Makevars) contains more than one 
> definition of a variable of the form 'name=value'.  You can see what file it 
> is reading and its contents by using the trace() function:
> 
> trace(withr:::set_makevars, quote({ cat(old_path, "\n"); writeLines(paste0("  
>   ", tryCatch(readLines(old_path), error = function(e) conditionMessage(e)))) }))
> 
> Then run your test and see what file set_makevars is complaining about and 
> what in the file might cause trouble for set_makevars.
> 
> 
> Bill Dunlap
> TIBCO Software
> wdunlap tibco.com <http://tibco.com/>
> On Wed, Sep 6, 2017 at 3:41 PM, Roy Mendelssohn - NOAA Federal 
> mailto:roy.mendelss...@noaa.gov>> wrote:
> Hi All;
> 
> This problem has come about from trying to learn some of the review practices 
> recommended by rOpensci.  One of them is to use the package goodpractice.  
> After installing goodpractice, it kept failing on my own packages which are 
> under development, and I was concerned something was funny in my own code,  so I 
> have a fork of the package rerddap,  and I tested goodpractice on that.  I 
> get the error:
> 
> > Error in set_makevars(new, path, makevars_file, assignment = assignment) :
> >   Multiple results for CXXFLAGS found, something is wrong.FALSE
> >
> 
> 
> So after some playing around that is from the very first test,  which uses 
> the covr:package_coverage(), and sure enough running that produces the same 
> error.  Looking at the code,  that error is being thrown by the function 
> withr::set_makevars().  We are now too many layers deep into packages for me 
> to follow what is going on,  but the kicker is Scott Chamberlain can run it 
> without any errors on the same package.  Session_info for both of us follows. 
>  If any one has any suggestions both as to what is causing this and a 
> possible solution,  would appreciate it.
> 
> Roy's sessionInfo is after running the commands:
> 
> Sys.setenv(NOT_CRAN = "true")
> x = goodpractice::gp(path = ".", checks = all_checks()[2:230])
> 
> Scott's is after running:
> 
> Sys.setenv(NOT_CRAN = "true")
> x = goodpractice::gp()
> 
> 
> 
> 
> Roy's_session_info()
> ─ Session info 
> ──
>  setting  value
>  version  R version 3.4.1 (2017-06-30)
>  os   macOS Sierra 10.12.6
>  system   x86_64, darwin15.6.0
>  ui   RStudio
>  language (EN)
>  collate  en_US.UTF-8
>  tz   America/Los_Angeles
>  date 2017-09-06
> 
> ─ Packages  package  * version date   source
>  assertthat 0.2.0   2017-04-11 CRAN (R 3.4.1)
>  backports  1.1.0   2017-05-22 CRAN (R 3.4.0)
>  callr  1.0.0.9000  2017-09-02 Github (r-lib/callr@2dffbbe)
>  clisymbols 1.2.0   2017-09-02 Github (gaborcsardi/clisymbols@e49b4f5)
>  covr   3.0.0   2017-06-26 CRAN (R 3.4.1)
>  crayon 1.3.2.9000  2017-08-25 Github (gaborcsardi/crayon@e4dba3b)
>  cyclocomp  1.1.0   2017-09-02 Github (MangoTheCat/cyclocomp@6156a12)
>  debugme1.0.2   2017-03-01 CRAN (R 3.4.0)
>  desc   1.1.1   2017-08-03 CRAN (R 3.4.1)
>  devtools   1.13.3.9000 2017-08-31 Github (hadley/devtools@91490d1)
>  digest 0.6.12  2017-01-27 CRAN (R 3.4.1)
>  goodpractice * 1.0.0   2017-09-02 Github 
> (MangoTheCat/goodpractice@9969799)
>  httr   1.3.1   2017-08-20 CRAN (R 3.4.1)
>  igraph 1.1.2   2017-07-21 CRAN (R 3.4.1)
>  jsonlite   1.5 2017-06-01 CRAN (R 3.4.0)
>  knitr  1.172017-08-10 CRAN (R 3.4.1)
>  lazyeval   0.2.0   2016-06-12 CRAN (R 3.4.0)
>  lintr  1.0.1   2017-08-10 CRAN (R 3.4.1)
>  magrittr   1.5 2014-11-22 CRAN (R 3.4.0)
>  memoise1.1.0   2017-04-21 CRAN (R 3.4.0)
>  pkgbuild   0.0.0.9000  2017-08-31 Github (r-lib/pkgbuild@6574561)
>  pkgconfig  2.0.1   2017-03-21 CRAN (R 3.4.0)
>  pkgload0.0.0.9000  2017-08-31 Github (r-pkgs/pkgload@80a6493)
>  praise 1.0.0   2015-08-11 CRAN (R 3.4.0)
>  processx   2.0.0.1 2017-07-30 CRAN (R 3.4.1)
>  R6 2.2.2   2017-06-17 CRAN (R 3.4.0)
>  rcmdcheck  1.2.1   2016-09-28 CRAN (R 3.4.0)
>  Rcpp   0.12.12 2017-07-15 CRAN (R 3.4.1)
>  remotes1.1.0   2017-07-09 CRAN (R 3.4.1)
>  rex1.1.1  

[R] withr::set_makevars

2017-09-06 Thread Roy Mendelssohn - NOAA Federal
Hi All;

This problem has come about from trying to learn some of the review practices 
recommended by rOpensci.  One of them is to use the package goodpractice.  After 
installing goodpractice, it kept failing on my own packages which are under 
development, and I was concerned something was funny in my own code,  so I have a 
fork of the package rerddap,  and I tested goodpractice on that.  I get the 
error:

> Error in set_makevars(new, path, makevars_file, assignment = assignment) : 
>   Multiple results for CXXFLAGS found, something is wrong.FALSE
> 


So after some playing around that is from the very first test,  which uses the 
covr::package_coverage(), and sure enough running that produces the same error.  
Looking at the code,  that error is being thrown by the function 
withr::set_makevars().  We are now too many layers deep into packages for me to 
follow what is going on,  but the kicker is Scott Chamberlain can run it 
without any errors on the same package.  Session_info for both of us follows.  
If any one has any suggestions both as to what is causing this and a possible 
solution,  would appreciate it.

Roy's sessionInfo is after running the commands:

Sys.setenv(NOT_CRAN = "true") 
x = goodpractice::gp(path = ".", checks = all_checks()[2:230])

Scott's is after running:

Sys.setenv(NOT_CRAN = "true") 
x = goodpractice::gp()




Roy's_session_info()
─ Session info 
──
 setting  value   
 version  R version 3.4.1 (2017-06-30)
 os   macOS Sierra 10.12.6
 system   x86_64, darwin15.6.0
 ui   RStudio 
 language (EN)
 collate  en_US.UTF-8 
 tz   America/Los_Angeles 
 date 2017-09-06  

─ Packages ──────────────────────────────────────
 package  * version date   source
 assertthat 0.2.0   2017-04-11 CRAN (R 3.4.1)   
 backports  1.1.0   2017-05-22 CRAN (R 3.4.0)   
 callr  1.0.0.9000  2017-09-02 Github (r-lib/callr@2dffbbe) 
 clisymbols 1.2.0   2017-09-02 Github (gaborcsardi/clisymbols@e49b4f5)  
 covr   3.0.0   2017-06-26 CRAN (R 3.4.1)   
 crayon 1.3.2.9000  2017-08-25 Github (gaborcsardi/crayon@e4dba3b)  
 cyclocomp  1.1.0   2017-09-02 Github (MangoTheCat/cyclocomp@6156a12)   
 debugme1.0.2   2017-03-01 CRAN (R 3.4.0)   
 desc   1.1.1   2017-08-03 CRAN (R 3.4.1)   
 devtools   1.13.3.9000 2017-08-31 Github (hadley/devtools@91490d1) 
 digest 0.6.12  2017-01-27 CRAN (R 3.4.1)   
 goodpractice * 1.0.0   2017-09-02 Github (MangoTheCat/goodpractice@9969799)
 httr   1.3.1   2017-08-20 CRAN (R 3.4.1)   
 igraph 1.1.2   2017-07-21 CRAN (R 3.4.1)   
 jsonlite   1.5 2017-06-01 CRAN (R 3.4.0)   
 knitr  1.17    2017-08-10 CRAN (R 3.4.1)   
 lazyeval   0.2.0   2016-06-12 CRAN (R 3.4.0)   
 lintr  1.0.1   2017-08-10 CRAN (R 3.4.1)   
 magrittr   1.5 2014-11-22 CRAN (R 3.4.0)   
 memoise1.1.0   2017-04-21 CRAN (R 3.4.0)   
 pkgbuild   0.0.0.9000  2017-08-31 Github (r-lib/pkgbuild@6574561)  
 pkgconfig  2.0.1   2017-03-21 CRAN (R 3.4.0)   
 pkgload0.0.0.9000  2017-08-31 Github (r-pkgs/pkgload@80a6493)  
 praise 1.0.0   2015-08-11 CRAN (R 3.4.0)   
 processx   2.0.0.1 2017-07-30 CRAN (R 3.4.1)   
 R6 2.2.2   2017-06-17 CRAN (R 3.4.0)   
 rcmdcheck  1.2.1   2016-09-28 CRAN (R 3.4.0)   
 Rcpp   0.12.12 2017-07-15 CRAN (R 3.4.1)   
 remotes1.1.0   2017-07-09 CRAN (R 3.4.1)   
 rex1.1.1   2016-12-05 CRAN (R 3.4.0)   
 rlang  0.1.2.9000  2017-09-05 Github (tidyverse/rlang@fd64bce) 
 rprojroot  1.2 2017-01-16 CRAN (R 3.4.0)   
 rstudioapi 0.6.0.9000  2017-08-31 Github (rstudio/rstudioapi@e1e466b)  
 sessioninfo1.0.1   2017-08-31 Github (r-lib/sessioninfo@e813de4)   
 stringi1.1.5   2017-04-07 CRAN (R 3.4.0)   
 stringr1.2.0   2017-02-18 CRAN (R 3.4.0)   
 usethis0.0.0.9000  2017-08-31 Github (r-lib/usethis@12e6f95)   
 whoami 1.1.1   2015-07-13 CRAN (R 

Re: [R] RMarkdown question

2017-08-29 Thread Roy Mendelssohn - NOAA Federal
Thanks.  I will try that and see if I can get it to work.  I am working on a 
vignette for a package.  I should have been more careful in my wording,  it was 
really an RMarkdown question,  not specifically an R Notebook question.  I will 
leave it to others to decide if that implies R content, I am just grateful for the 
help.

-Roy
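
For the archive, here is a minimal sketch of the workaround Yihui describes, 
assuming HTML output; the chunk label `my-chunk` and anchor id `foo` are 
made-up names for illustration:

````markdown
<span id="foo"></span>
```{r my-chunk}
summary(cars)
```

Elsewhere in the text: see [this code chunk](#foo) for an example.
````

The empty <span> sits immediately before the chunk, so the link lands the 
reader at the chunk even though Pandoc does not give chunks their own anchors.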

> On Aug 29, 2017, at 2:52 PM, Yihui Xie  wrote:
> 
> Although it is not an elegant solution, if your output format is
> HTML, you can add an arbitrary empty HTML element like <span
> id="foo"></span> before your code chunk. Then you can jump to this
> <span> via a link like "see [this code chunk](#foo)".
> 
> Regards,
> Yihui
> --
> https://yihui.name
> 
> 
> On Tue, Aug 29, 2017 at 1:30 PM, Roy Mendelssohn - NOAA Federal
>  wrote:
>> Hi All:
>> 
>> In creating an R Notebook I know that in the text I can link to a (sub) 
>> section by using the command:
>> 
>> 
>> [Header 1](#anchor)
>> 
>> 
>> and putting the appropriate anchor name at the appropriate header.  But can 
>> the same be done for code chunks,  if the code chunk is named? What I want 
>> to do is say that such and such code chunk is an example of how to do 
>> something,  and have that link to the appropriate code chunk.
>> 
>> Thanks for any help.
>> 
>> -Roy
>> 
>> 
>> 
>> 
>> 
>> **
>> "The contents of this message do not reflect any position of the U.S. 
>> Government or NOAA."
>> **
>> Roy Mendelssohn
>> Supervisory Operations Research Analyst
>> NOAA/NMFS
>> Environmental Research Division
>> Southwest Fisheries Science Center
>> ***Note new street address***
>> 110 McAllister Way
>> Santa Cruz, CA 95060
>> Phone: (831)-420-3666
>> Fax: (831) 420-3980
>> e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/
>> 
>> "Old age and treachery will overcome youth and skill."
>> "From those who have been given much, much will be expected"
>> "the arc of the moral universe is long, but it bends toward justice" -MLK Jr.
>> 
>> __
>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
> 

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] RMarkdown question

2017-08-29 Thread Roy Mendelssohn - NOAA Federal
Hi All:

In creating an R Notebook I know that in the text I can link to a (sub) section 
by using the command:

 
[Header 1](#anchor)


 and putting the appropriate anchor name at the appropriate header.  But can 
the same be done for code chunks,  if the code chunk is named? What I want to 
do is say that such and such code chunk is an example of how to do something,  
and have that link to the appropriate code chunk.

Thanks for any help.

-Roy





**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Extracting subset from netCDF file using lat/lon and converting into .csv in R

2017-08-28 Thread Roy Mendelssohn - NOAA Federal
Two questions:

1.  Is the order of the dimensions the same as what is shown if you look at 
str(ncin) - I mean shown at the end where it describes the variable and its 
dimensions?

2.  Is your problem how to subset the netcdf file,  how to write to the .csv 
file, or both?

-Roy
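
For the archive, here is a rough sketch of one way to do the extraction the 
post asks about -- untested, and it assumes the ncdf4 package plus the 
variable and dimension names shown below; it simply takes the nearest grid 
point to the requested coordinates:

```r
library(ncdf4)

nc  <- nc_open("gfdl_preci.nc")
lon <- ncvar_get(nc, "lon")
lat <- ncvar_get(nc, "lat")
tim <- ncvar_get(nc, "time")

# one of the requested points (lon, lat)
lon0 <- 12.875
lat0 <- -11.625
i <- which.min(abs(lon - lon0))   # nearest longitude index
j <- which.min(abs(lat - lat0))   # nearest latitude index

# read only the (1, 1, all-times) slice at that cell; -1 means "all"
prec <- ncvar_get(nc, "prAdjust",
                  start = c(i, j, 1),
                  count = c(1, 1, -1))

# time units are "days since 1860-1-1 00:00:00"
dates <- as.Date(tim, origin = "1860-01-01")

write.csv(data.frame(date = dates, prAdjust = prec),
          "gfdl_preci_point.csv", row.names = FALSE)
nc_close(nc)
```

The key point is using start/count in ncvar_get() so only the one cell's time 
series is read, rather than pulling the full 720 x 360 x 365 array into memory.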

> On Aug 28, 2017, at 2:21 PM, Eeusha Nafi  wrote:
> 
> I have a series of netCDF files containing global data for a particular
> variable, e.g. tmin/tmax/precipitation/windspeed/relative
> humidity/radiation etc. I get the following information when using
> *nc_open* function in R:
> 
> datafile: https://www.dropbox.com/s/xpo7zklcmtm3g5r/gfdl_preci.nc?dl=0
> 
> File gfdl_preci.nc (NC_FORMAT_NETCDF4_CLASSIC):
> 
> 1 variables (excluding dimension variables):
>float prAdjust[lon,lat,time]
>_FillValue: 1.0002004088e+20
>missing_value: 1.0002004088e+20
>comment: includes all types (rain, snow, large-scale,
> convective, etc.)
>long_name: Bias-Corrected Precipitation
>units: kg m-2 s-1
>standard_name: precipitation_flux
> 
> 3 dimensions:
>lon  Size:720
>standard_name: longitude
>long_name: longitude
>units: degrees_east
>axis: X
>lat  Size:360
>standard_name: latitude
>long_name: latitude
>units: degrees_north
>axis: Y
>time  Size:365   *** is unlimited ***
>standard_name: time
>units: days since 1860-1-1 00:00:00
>calendar: standard
>axis: T
> 
>14 global attributes:
>CDI: Climate Data Interface version 1.7.0 (http://mpimet.mpg.de/cdi)
>Conventions: CF-1.4
>title: Model output climate of GFDL-ESM2M r1i1p1 Interpolated
> to 0.5 degree and bias corrected using observations from 1960 - 1999
> for EU WATCH project
>CDO: Climate Data Operators version 1.7.0 (http://mpimet.mpg.de/cdo)
>product_id: input
>model_id: gfdl-esm2m
>institute_id: PIK
>experiment_id: historical
>ensemble_id: r1i1p1
>time_frequency: daily
>creator: isi...@pik-potsdam.de
>description: GFDL-ESM2M bias corrected impact model input
> prepared for ISIMIP2
> 
> Now I want to extract a subset from this dataset using pair of lon and lat
> points, e.g., (12.875, -11.625) & (8.875, 4.125) and convert the file into
> .csv format.
> 
> so far I could make up to this step:
> 
> # load the ncdf4 package
> library(ncdf4)
> # set path and filename
> setwd("D:/netcdf")
> ncname <- "gfdl_preci"
> ncfname <- paste(ncname, ".nc", sep = "")
> dname <- "prAdjust"
> # open a netCDF file
> ncin <- nc_open(ncfname)
> print(ncin)# get longitude and latitude
> lon <- ncvar_get(ncin,"lon")
> nlon <- dim(lon)
> head(lon)
> 
> lat <- ncvar_get(ncin,"lat")
> nlat <- dim(lat)
> head(lat)
> 
> print(c(nlon,nlat))
> # get time
> time <- ncvar_get(ncin,"time")
> time
> 
> tunits <- ncatt_get(ncin,"time","units")
> nt <- dim(time)
> nt
> tunits
> # get variable
> preci.array <- ncvar_get(ncin,dname)
> 
> dlname <- ncatt_get(ncin,"prAdjust","long_name")
> 
> dunits <- ncatt_get(ncin,"prAdjust","units")
> 
> fillvalue <- ncatt_get(ncin,"prAdjust","_FillValue")
> 
> dim(preci.array)
> # split the time units string into fields
> tustr <- strsplit(tunits$value, " ")
> 
> tdstr <- strsplit(unlist(tustr)[3], "-")
> 
> tmonth = as.integer(unlist(tdstr)[2])
> 
> tday = as.integer(unlist(tdstr)[3])
> 
> tyear = as.integer(unlist(tdstr)[1])
> 
> chron(time, origin = c(tmonth, tday, tyear))
> 
> *Any help would be appreciated!!*
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
> 

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Kalman filter for a time series

2017-07-30 Thread Roy Mendelssohn - NOAA Federal
> structSSM

Is no longer part of KFAS.  All you needed to do was:

library(KFAS)
?KFAS

and you would have seen that if you went to the index.  A structural state 
space model is now built up from its components,  much like in lm().   Look at:

?SSModel

-Roy
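
For the archive, a rough sketch of how the old structSSM() call in the quoted 
code might map onto the current KFAS building blocks -- untested, and the 
Gaussian local-level specification is my assumption about what structSSM() 
built by default (NA marks the variances that fitSSM() estimates):

```r
library(KFAS)

kalmanFilter <- function(x) {
  t <- as.ts(x)
  # local-level structural model: one SSMtrend component, unknown
  # state variance Q and observation variance H
  ssModel <- SSModel(t ~ SSMtrend(degree = 1, Q = list(matrix(NA))),
                     H = matrix(NA))
  ssFit <- fitSSM(ssModel,
                  inits = c(0.5 * log(var(t)), 0.5 * log(var(t))))
  kfs <- KFS(ssFit$model, smoothing = "state")
  vals <- kfs$a            # one-step-ahead predictions of the state
  vals[length(vals)]
}
```

Note that the argument order differs from the old code: fitSSM() now takes 
the model first and the initial values second.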

> On Jul 29, 2017, at 9:26 PM, Staff  wrote:
> 
> I found an example at
> http://www.bearcave.com/finance/random_r_hacks/kalman_smooth.html shown
> below.  But it seems the structSSM function has been removed from KFAS
> library so it won't run.  Does anyone know how to fix the code so that it
> runs?
> 
> 
> 
> library(KFAS)
> library(tseries)
> library(timeSeries)
> library(zoo)
> library(quantmod)
> 
> getDailyPrices = function( tickerSym, startDate, endDate )
> {
>  prices = get.hist.quote( instrument = tickerSym, start = startDate,
> end = endDate,
>   quote="AdjClose", provider="yahoo",
>   compression="d",  quiet=T)
> 
>  prices.ts = ts(prices)
>  return( prices.ts )
> }
> 
> kalmanFilter = function( x )
> {
>  t = x
>  if (class(t) != "ts") {
>t = ts(t)
>  }
>  ssModel = structSSM( y = t, distribution="Gaussian")
>  ssFit = fitSSM(inits=c(0.5*log(var(t)), 0.5*log(var(t))), model = ssModel )
>  kfs = KFS( ssFit$model, smoothing="state", nsim=length(t))
>  vals = kfs$a
>  lastVal = vals[ length(vals)]
>  return(lastVal)
> }
> 
> Start = "2011-01-01"
> End   = "2012-12-31"
> SandP = "^GSPC"
> 
> windowWidth = 20
> tsLength = 100
> 
> SAndP.ts = getDailyPrices( SandP, Start, End )
> SAndP.ts = SAndP.ts[1:tsLength]
> SAndP.smoothed = rollapply( data=SAndP.ts, width=windowWidth, 
> FUN=kalmanFilter)
> 
> par(mfrow=c(1,1))
> prices = coredata( SAndP.ts[windowWidth:length(SAndP.ts)])
> plot(prices, col="blue", type="l")
> lines(coredata(SAndP.smoothed), col="magenta")
> par(mfrow=c(1,1))
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Accessing Pointers

2017-06-22 Thread Roy Mendelssohn - NOAA Federal
Hi Lawrence:
> On Jun 22, 2017, at 4:26 PM, David Winsemius  wrote:
> 
> 
> 
>> is pointing to in the following line of code.  Need some help.
>> 
>> #install.packages('xml2')
>> library('xml2')
>> pg1 <- read_html("www.msn.com")
> 
> Error: 'www.msn.com' does not exist in current working directory 
> ('/Users/davidwinsemius').
> 
> 

I suggest you do:

?read_html

and peruse it carefully.  You are not passing it a URL.

> library('xml2')
> pg1 <- read_html("http://www.msn.com")
> str(pg1)
List of 2
 $ node:<externalptr> 
 $ doc :<externalptr> 
 - attr(*, "class")= chr [1:2] "xml_document" "xml_node"


**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new street address***
110 McAllister Way
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Reversing one dimension of an array, in a generalized case

2017-06-01 Thread Roy Mendelssohn - NOAA Federal
Thanks again.  I am going to try the different versions.   But I probably won't 
be able to get to it till next week.

This is probably at the point where anything further should be sent to me 
privately.

-Roy



> On Jun 1, 2017, at 1:56 PM, David L Carlson  wrote:
> 
> On the off chance that anyone is still interested, here is the corrected 
> function using aperm():
> 
> z <- array(1:120,dim=2:5)
> f2 <- function(a, wh) {
>idx <- seq_len(length(dim(a)))
>dims <- setdiff(idx, wh)
>idx <- append(idx[-1], idx[1], wh-1)
>aperm(apply(a, dims, rev), idx)
> }
> 
> all.equal(f(z, 1), f2(z, 1))
> # [1] TRUE
> all.equal(f(z, 2), f2(z, 2))
> # [1] TRUE
> all.equal(f(z, 3), f2(z, 3))
> # [1] TRUE
> all.equal(f(z, 4), f2(z, 4))
> # [1] TRUE
> 
> David C
> 
> 
> From: Ismail SEZEN [mailto:sezenism...@gmail.com] 
> Sent: Thursday, June 1, 2017 3:35 PM
> To: Roy Mendelssohn - NOAA Federal 
> Cc: David L Carlson ; R-help 
> Subject: Re: [R] Reversing one dimension of an array, in a generalized case
> 
> 
> On 1 Jun 2017, at 22:42, Roy Mendelssohn - NOAA Federal 
>  wrote:
> 
> Thanks to all for responses.  There was a question of exactly what was 
> wanted.  It is the generalization of the obvious example I gave,  
> 
> 
> junk1 <- junk[, rev(seq_len(10)), ]
> 
> 
> so that
> 
> junk[1,1,1 ] = junk1[1,10,1]
> junk[1,2,1]  = junk1[1,9,1]
> 
> etc.
> 
> The genesis of this is the program is downloading data from a variety of 
> sources on (time, altitude, lat, lon) coordinates,  but all the coordinates 
> are not always there, and sometime the latitude coordinates go from north to 
> south and sometimes from south to north.  I want to always return the data 
> going from south to north, so if I find that the data is north to south,  I 
> have to first reverse the array with the coordinate values (easy enough),  
> and then reverse the one dimension in the data array that corresponds to 
> latitude. The downloaded information tells me which dimension is latitude 
> plus how many coordinates are in the data.
> 
> Hello Roy,
> Perhaps you are aware of this but I want to mention it anyway. The basic issue 
> is that you always want latitudes to be monotonically increasing. Let me tell 
> you what I do when I read a ncdf file:
> 
> 1- Set latitudes always monotonically decreasing (from 90 to -90)
> 2- Set longitudes always monotonically increasing but from -180 to 180.
> 3- Set levels always monotonically decreasing (this is not relevant)
> 
> Why? If you plan to plot variables in R, you will need coordinates in this 
> order. For instance, if you set latitudes monotonically increasing, your map 
> will be plotted upside down. To fix this, you will need to reverse the 
> dimension again. And also if your longitudes range from 0 to 360, you will see 
> only the east side of the plot on a world map. West of Greenwich will be 
> empty.  These were the problems that I faced last year when I tried to plot 
> netcdf files using the lattice and rasterVis packages. 
> 
> 
> 
> 
> As I said,  I haven't done extensive testing on what Bert sent,  but on a 
> toy 3-dimensional example I have, it appeared to do what I want.
> 
> Thanks again,
> 
> -Roy
> 
> 
> On Jun 1, 2017, at 12:22 PM, David L Carlson  wrote:
> 
> My error. Clearly I did not do enough testing.
> 
> z <- array(1:24,dim=2:4)
> 
> all.equal(f(z,1),f2(z,1))
> [1] TRUE
> 
> all.equal(f(z,2),f2(z,2))
> [1] TRUE
> 
> all.equal(f(z,3),f2(z,3))
> [1] "Attributes: < Component “dim”: Mean relative difference: 0.444 >"
> [2] "Mean relative difference: 0.6109091"
> 
> # Your earlier example
> 
> z <- array(1:120, dim=2:5)
> all.equal(f(z,1),f2(z,1))
> [1] TRUE
> 
> all.equal(f(z,2),f2(z,2))
> [1] TRUE
> 
> all.equal(f(z,3),f2(z,3))
> [1] "Attributes: < Component “dim”: Mean relative difference: 0.444 >"
> [2] "Mean relative difference: 0.1262209" 
> 
> all.equal(f(z,4),f2(z,4))
> [1] "Attributes: < Component “dim”: Mean relative difference: 0.5714286 >"
> [2] "Mean relative difference: 0.5855162"  
> 
> David C
> 
> -Original Message-
> From: Bert Gunter [mailto:bgunter.4...@gmail.com] 
> Sent: Thursday, June 1, 2017 2:00 PM
> To: David L Carlson 
> Cc: Roy Mendelssohn - NOAA Federal ; R-help 
> 
> Subject: Re: [R] Reversing one dimension of an array, in a generalized case
> 
> ??
> 
> 
> z <- array(1:24,dim=2:4)
> all.equal(f(z,3),f2(z,3))
> 
> [1] "Attributes: < Component “dim”: Mean relative difference: 0.444

Re: [R] Reversing one dimension of an array, in a generalized case

2017-06-01 Thread Roy Mendelssohn - NOAA Federal
Thank you.  That ignores certain standards in the communities I work in,  and it 
also ignores the fact that whether I decide to always return increasing or 
decreasing latitudes,  if the sources aren't consistent,  then I need to 
reverse some of the data,  no matter which way I decide.

Increasing latitudes  (and longitudes) among other things  have the nice 
property that the array indices are also increasing in the same way. I do a lot 
of mapping in R, and a number of the mapping routines actually require you to  
"melt" the data to long-form  in which case the order of the latitudes in the 
array is irrelevant,  as long as the mapping is correct.


-Roy



> On Jun 1, 2017, at 1:35 PM, Ismail SEZEN  wrote:
> 
>> 
>> On 1 Jun 2017, at 22:42, Roy Mendelssohn - NOAA Federal 
>>  wrote:
>> 
>> Thanks to all for responses.  There was a question of exactly what was 
>> wanted.  It is the generalization of the obvious example I gave,  
>> 
>>>>> junk1 <- junk[, rev(seq_len(10)), ]
>> 
>> 
>> so that
>> 
>> junk[1,1,1 ] = junk1[1,10,1]
>> junk[1,2,1]  = junk1[1,9,1]
>> 
>> etc.
>> 
>> The genesis of this is the program is downloading data from a variety of 
>> sources on (time, altitude, lat, lon) coordinates,  but all the coordinates 
>> are not always there, and sometime the latitude coordinates go from north to 
>> south and sometimes from south to north.  I want to always return the data 
>> going from south to north, so if I find that the data is north to south,  I 
>> have to first reverse the array with the coordinate values (easy enough),  
>> and then reverse the one dimension in the data array that corresponds to 
>> latitude. The downloaded information tells me which dimension is latitude 
>> plus how many coordinates are in the data.
> 
> Hello Roy,
> Perhaps you are aware of this but I want to mention it anyway. The basic issue 
> is that you always want latitudes to be monotonically increasing. Let me tell 
> you what I do when I read a ncdf file:
> 
> 1- Set latitudes always monotonically decreasing (from 90 to -90)
> 2- Set longitudes always monotonically increasing but from -180 to 180.
> 3- Set levels always monotonically decreasing (this is not relevant)
> 
> Why? If you plan to plot variables in R, you will need coordinates in this 
> order. For instance, if you set latitudes monotonically increasing, your map 
> will be plotted upside down. To fix this, you will need to reverse the 
> dimension again. And also if your longitudes range from 0 to 360, you will see 
> only the east side of the plot on a world map. West of Greenwich will be 
> empty.  These were the problems that I faced last year when I tried to plot 
> netcdf files using the lattice and rasterVis packages. 
> 
> 
>> 
>> As I said,  I haven't done extensive testing on what Bert sent,  but on 
>> a toy 3-dimensional example I have, it appeared to do what I want.
>> 
>> Thanks again,
>> 
>> -Roy
>> 
>>> On Jun 1, 2017, at 12:22 PM, David L Carlson  wrote:
>>> 
>>> My error. Clearly I did not do enough testing.
>>> 
>>> z <- array(1:24,dim=2:4)
>>>> all.equal(f(z,1),f2(z,1))
>>> [1] TRUE
>>>> all.equal(f(z,2),f2(z,2))
>>> [1] TRUE
>>>> all.equal(f(z,3),f2(z,3))
>>> [1] "Attributes: < Component “dim”: Mean relative difference: 0.444 >"
>>> [2] "Mean relative difference: 0.6109091"
>>> 
>>> # Your earlier example
>>>> z <- array(1:120, dim=2:5)
>>>> all.equal(f(z,1),f2(z,1))
>>> [1] TRUE
>>>> all.equal(f(z,2),f2(z,2))
>>> [1] TRUE
>>>> all.equal(f(z,3),f2(z,3))
>>> [1] "Attributes: < Component “dim”: Mean relative difference: 0.444 >"
>>> [2] "Mean relative difference: 0.1262209" 
>>>> all.equal(f(z,4),f2(z,4))
>>> [1] "Attributes: < Component “dim”: Mean relative difference: 0.5714286 >"
>>> [2] "Mean relative difference: 0.5855162"  
>>> 
>>> David C
>>> 
>>> -Original Message-
>>> From: Bert Gunter [mailto:bgunter.4...@gmail.com] 
>>> Sent: Thursday, June 1, 2017 2:00 PM
>>> To: David L Carlson 
>>> Cc: Roy Mendelssohn - NOAA Federal ; R-help 
>>> 
>>> Subject: Re: [R] Reversing one dimension of an array, in a generalized case
>>> 
>>> ??
>>> 
>>>> z <- array(1:24,dim=2:4)
>>>> all.equal(f(z,3),f2(z,3))
>>> 
>>> [1] &qu

Re: [R] Reversing one dimension of an array, in a generalized case

2017-06-01 Thread Roy Mendelssohn - NOAA Federal
Thanks to all for responses.  There was a question of exactly what was wanted. 
 It is the generalization of the obvious example I gave,  

>>> junk1 <- junk[, rev(seq_len(10)), ]


so that

junk[1,1,1 ] = junk1[1,10,1]
junk[1,2,1]  = junk1[1,9,1]

etc.

The genesis of this is the program is downloading data from a variety of 
sources on (time, altitude, lat, lon) coordinates,  but all the coordinates are 
not always there, and sometime the latitude coordinates go from north to south 
and sometimes from south to north.  I want to always return the data going from 
south to north, so if I find that the data is north to south,  I have to first 
reverse the array with the coordinate values (easy enough),  and then reverse 
the one dimension in the data array that corresponds to latitude. The 
downloaded information tells me which dimension is latitude plus how many 
coordinates are in the data.

As I said,  I haven't done extensive testing on what Bert sent,  but on a 
toy 3-dimensional example I have, it appeared to do what I want.

Thanks again,

-Roy

> On Jun 1, 2017, at 12:22 PM, David L Carlson  wrote:
> 
> My error. Clearly I did not do enough testing.
> 
> z <- array(1:24,dim=2:4)
>> all.equal(f(z,1),f2(z,1))
> [1] TRUE
>> all.equal(f(z,2),f2(z,2))
> [1] TRUE
>> all.equal(f(z,3),f2(z,3))
> [1] "Attributes: < Component “dim”: Mean relative difference: 0.444 >"
> [2] "Mean relative difference: 0.6109091"
> 
> # Your earlier example
>> z <- array(1:120, dim=2:5)
>> all.equal(f(z,1),f2(z,1))
> [1] TRUE
>> all.equal(f(z,2),f2(z,2))
> [1] TRUE
>> all.equal(f(z,3),f2(z,3))
> [1] "Attributes: < Component “dim”: Mean relative difference: 0.444 >"
> [2] "Mean relative difference: 0.1262209" 
>> all.equal(f(z,4),f2(z,4))
> [1] "Attributes: < Component “dim”: Mean relative difference: 0.5714286 >"
> [2] "Mean relative difference: 0.5855162"  
> 
> David C
> 
> -Original Message-
> From: Bert Gunter [mailto:bgunter.4...@gmail.com] 
> Sent: Thursday, June 1, 2017 2:00 PM
> To: David L Carlson 
> Cc: Roy Mendelssohn - NOAA Federal ; R-help 
> 
> Subject: Re: [R] Reversing one dimension of an array, in a generalized case
> 
> ??
> 
>> z <- array(1:24,dim=2:4)
>> all.equal(f(z,3),f2(z,3))
> 
> [1] "Attributes: < Component “dim”: Mean relative difference: 0.444 >"
> [2] "Mean relative difference: 0.6109091"
> 
> In fact,
> 
>> dim(f(z,3))
> [1] 2 3 4
> 
>> dim(f2(z,3))
> [1] 3 4 2
> 
> Have I made some sort of stupid error here? Or have I misunderstood
> what was wanted?
> 
> Cheers,
> Bert
> 
> 
> 
> 
> Bert Gunter
> 
> "The trouble with having an open mind is that people keep coming along
> and sticking things into it."
> -- Opus (aka Berkeley Breathed in his "Bloom County" comic strip )
> 
> 
> On Thu, Jun 1, 2017 at 11:34 AM, David L Carlson  wrote:
>> Here is an alternative approach using apply(). Note that with apply() you 
>> are reversing rows or columns not indices of rows or columns so apply(junk, 
>> 2, rev) reverses the values in each column not the column indices. We 
>> actually need to use rev() on everything but the index we are interested in 
>> reversing:
>> 
>> f2 <- function(a, wh) {
>>dims <- seq_len(length(dim(a)))
>>dims <- setdiff(dims, wh)
>>apply(apply(a, dims, rev), dims, t)
>> }
>> 
>> # Your example
>> j1 <- junk[ , rev(1:10), ]
>> j2 <- f2(junk, 2)
>> all.equal(j1, j2)
>> # [1] TRUE
>> 
>> # Bert's example
>> z1 <- f(z, 2)
>> z2 <- f2(z, 2)
>> all.equal(z1, z2)
>> # [1] TRUE
>> 
>> -
>> David L Carlson
>> Department of Anthropology
>> Texas A&M University
>> College Station, TX 77840-4352
>> 
>> 
>> 
>> 
>> 
>> 
>> -Original Message-
>> From: R-help [mailto:r-help-boun...@r-project.org] On Behalf Of Bert Gunter
>> Sent: Thursday, June 1, 2017 12:46 PM
>> To: Roy Mendelssohn - NOAA Federal 
>> Cc: R-help 
>> Subject: Re: [R] Reversing one dimension of an array, in a generalized case
>> 
>> How about this:
>> 
>> f <- function(a,wh){ ## a is the array; wh is the index to be reversed
>>   l<- lapply(dim(a),seq_len)
>>   l[[wh]]<- rev(l[[wh]])
>>   do.call(`[`,c(list(a),l))
>> }
>> 
>> ## test
>> z <- array(1:120,dim=2:5)
>> 
>>

Re: [R] Reversing one dimension of an array, in a generalized case

2017-06-01 Thread Roy Mendelssohn - NOAA Federal
Thank you very much.  I have a little test example I have been working with,  
and it does seem to work.  I will have to go through and parse this to 
understand what you are doing.

What I had been doing is building up a string with the arguments and calling 
it;  it works but is very kludgy and fragile.

Thanks again.

-Roy
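
To unpack the function quoted below: it builds a complete list of index 
vectors (one per dimension), reverses the chosen one, and then calls `[` 
through do.call().  A small worked example of the same idea, mirroring 
Bert's code:

```r
z  <- array(1:24, dim = 2:4)
l  <- lapply(dim(z), seq_len)        # list(1:2, 1:3, 1:4)
l[[2]] <- rev(l[[2]])                # reverse the second index only
zr <- do.call(`[`, c(list(z), l))    # same as z[1:2, 3:1, 1:4]
identical(zr, z[, 3:1, ])            # TRUE
```

Because the index list is built from dim(z), the same call works for an array 
of any number of dimensions, which is exactly the generality asked for.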


> On Jun 1, 2017, at 10:45 AM, Bert Gunter  wrote:
> 
> How about this:
> 
> f <- function(a,wh){ ## a is the array; wh is the index to be reversed
>   l<- lapply(dim(a),seq_len)
>   l[[wh]]<- rev(l[[wh]])
>   do.call(`[`,c(list(a),l))
> }
> 
> ## test
> z <- array(1:120,dim=2:5)
> 
> ##  I omit the printouts
> 
> f(z,2)
> 
> f(z,3)
> 
> 
> Cheers,
> Bert
> 
> Bert Gunter
> 
> "The trouble with having an open mind is that people keep coming along
> and sticking things into it."
> -- Opus (aka Berkeley Breathed in his "Bloom County" comic strip )
> 
> 
> On Thu, Jun 1, 2017 at 9:51 AM, Roy Mendelssohn - NOAA Federal
>  wrote:
>> Hi All:
>> 
>> I have been looking for an elegant way to do the following,  but haven't 
>> found it,  I have never had a good understanding of any of the "apply" 
>> functions.
>> 
>> A simplified idea is I have an array, say:
>> 
>> junk(5, 10, 3)
>> 
>> where  (5, 10, 3) give the dimension sizes, and I want to reverse the second 
>> dimension, so I could do:
>> 
>> junk1 <- junk[, rev(seq_len(10)), ]
>> 
>> but what I am after is a general function that will do that where the array 
>> could be two, three or four dimensions,  and I pass to the function which 
>> dimension I want to reverse, that is the function can not assume the number 
>> of dimensions of the array nor which dimension to reverse.
>> 
>> For example,  if i try:
>> 
>> junk1 <- apply(junk, 2, rev)
>> 
>> junk1 comes out as two-dimensional,  not three-dimensional.
>> 
>> It is probably something obvious but I am not getting it.
>> 
>> Thanks for any help.
>> 
>> -Roy
>> 
>> 
>> **
>> "The contents of this message do not reflect any position of the U.S. 
>> Government or NOAA."
>> **
>> Roy Mendelssohn
>> Supervisory Operations Research Analyst
>> NOAA/NMFS
>> Environmental Research Division
>> Southwest Fisheries Science Center
>> ***Note new street address***
>> 110 McAllister Way
>> Santa Cruz, CA 95060
>> Phone: (831)-420-3666
>> Fax: (831) 420-3980
>> e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/
>> 
>> "Old age and treachery will overcome youth and skill."
>> "From those who have been given much, much will be expected"
>> "the arc of the moral universe is long, but it bends toward justice" -MLK Jr.
>> 
>> __
>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.



[R] Reversing one dimension of an array, in a generalized case

2017-06-01 Thread Roy Mendelssohn - NOAA Federal
Hi All:

I have been looking for an elegant way to do the following, but haven't found 
it; I have never had a good understanding of any of the "apply" functions.

A simplified idea is I have an array, say:

junk(5, 10, 3)

where  (5, 10, 3) give the dimension sizes, and I want to reverse the second 
dimension, so I could do:

junk1 <- junk[, rev(seq_len(10)), ]

but what I am after is a general function that will do that where the array 
could be two, three or four dimensions,  and I pass to the function which 
dimension I want to reverse, that is the function can not assume the number of 
dimensions of the array nor which dimension to reverse.

For example, if I try:

junk1 <- apply(junk, 2, rev)

junk1 comes out as two-dimensional,  not three-dimensional.

It is probably something obvious but I am not getting it.

Thanks for any help.

-Roy
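The general function asked for can be written by building the complete index list and reversing only the chosen dimension; indexing via do.call() keeps the result the same shape as the input. This is a sketch, and the name flip_dim is an illustrative choice, not something from the thread:

```r
# Reverse one dimension of an array with any number of dimensions.
flip_dim <- function(a, which_dim) {
  idx <- lapply(dim(a), seq_len)             # full index for every dimension
  idx[[which_dim]] <- rev(idx[[which_dim]])  # reverse only the requested one
  # Equivalent to a[idx[[1]], idx[[2]], ..., drop = FALSE] without
  # assuming how many dimensions a has.
  do.call(`[`, c(list(a), idx, list(drop = FALSE)))
}

junk  <- array(seq_len(150), dim = c(5, 10, 3))
junk1 <- flip_dim(junk, 2)   # same result as junk[, rev(seq_len(10)), ]
```

Because drop = FALSE is passed, the result keeps all of its dimensions even when one of them has length 1, which is exactly where apply() loses the shape.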




[R] Xtractomatic version 3.3.2 now available on CRAN

2017-05-23 Thread Roy Mendelssohn - NOAA Federal
The xtractomatic package version 3.3.2 is now available on CRAN. Besides the 
improvements listed below, this release fixes a problem caused by an update to 
the Apache Tomcat used by the ERDDAP server that broke an important function 
in the package.

Many thanks to the CRAN maintainers for help in getting this on CRAN.

List of changes below,  as always the development version is available at 
https://github.com/rmendels/xtractomatic

-Roy

• Fixes a problem with newer versions of Apache Tomcat's handling of 
special characters in URLs
• dtype given as a number is no longer allowed
• searchData() now takes a list of objects of the form 
"searchType:searchString"
• New datasets added
• Inactive or out-of-date datasets removed




[R] plotly example that highlights a line

2017-02-28 Thread Roy Mendelssohn - NOAA Federal
Hi All:

In searching online, I have found examples of using plotly with ggplot2 
graphics, say using geom_line, where there are multiple lines and selecting 
the "factor" in the legend makes the particular line disappear or reappear 
(see https://plot.ly/ggplot2/). I am wondering if anyone has an example, or 
knows how to set it up, so that clicking on the factor in the legend instead 
highlights the line, and then unhighlights it if I click again.

Thanks for any help.

-Roy





[R] testers sought for xtractomatic-like package

2017-02-03 Thread Roy Mendelssohn - NOAA Federal
rerddapXtracto is an R package developed to subset and extract satellite and 
other oceanographic related data from a remote ERDDAP server. The program can 
extract data for a moving point in time along a user-supplied set of longitude, 
latitude and time points; in a 3D bounding box; or within a polygon (through 
time). 

These functions differ from those in the xtractomatic package in that they use 
the rerddap package to access gridded data on any ERDDAP server, but they 
require the user to provide initial information about the data to be extracted. 

I am looking for people to test the package. Right now it is only available on 
GitHub at https://github.com/rmendels/rerddapXtracto. A number of people have 
successfully used the package, but for now consider it test quality only. The 
vignette can be seen at https://rmendels.github.io/UsingrerddapXtracto.nb.html, 
and an R Notebook can be downloaded from there.

Thanks in advance for any help.

-Roy






Re: [R] Error in R_nc4_open

2017-01-11 Thread Roy Mendelssohn - NOAA Federal

> On Jan 11, 2017, at 8:39 PM, Debasish Pai Mazumder  wrote:
> 
> Thanks so much Roy. It works. 
> Thanks Jeff for all your help. 
> As a part of NCAR Command Language help group, I was only concern about the 
> first response I received from this help group which will discourage new user 
> like me to post their problems in this forum.
> 
> I would like apologize if I caused any inconvenience to anyone.
> 
> with regards
> -Deb

You're welcome. The update to RStudio was just a coincidence. It is the 
often-sudden switch to https at many sites, usually with an http redirect, 
that is causing a lot of problems. Not every library handles the redirect 
cleanly. It is a decision made without much thought as to what might break. 
Having https available is a good thing; a forced redirect without any testing 
of what will and will not work, not so good. We had a slew of Python code 
break for the same reason. And oddly enough, the OS version, the version and 
supplier of Python, and the version of OpenSSL greatly affect whether things 
work.

-Roy




Re: [R] Error in R_nc4_open

2017-01-11 Thread Roy Mendelssohn - NOAA Federal
Try replacing http with https.

>  gribfile <- 
> "https://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cfsv2_forecast_ts_9mon/2011/201104/20110401/2011040100/tmax.01.2011040100.daily.grb2"
> nc <- nc_open(gribfile)
> str(nc)
List of 14
 $ filename   : chr 
"https://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cfsv2_forecast_ts_9mon/2011/201104/20110401/2011040100/tmax.01.2011040100.";|
 __truncated__
 $ writable   : logi FALSE
 $ id : int 65536
 $ safemode   : logi FALSE
 $ format : chr "NC_FORMAT_64BIT"
 $ is_GMT : logi FALSE
 $ groups :List of 1
..

-Roy



> On Jan 11, 2017, at 1:46 PM, Debasish Pai Mazumder  wrote:
> 
> Hi Jeff,
> Thanks for your detail response but I am really baffled by this sort of
> response from a help group because I am also a part of a help group
> (NCAR-Command Language).
> 
> Anyway I felt its R/Rstudio issue because my code was working properly
> before..but since the update it isn't working anymore
> 
> Here is few lines of my code
> 
> library("ncdf4")
> gribfile<-"
> http://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cfsv2_forecast_ts_9mon/2011/201104/20110401/2011040100/tmax.01.2011040100.daily.grb2
> "
> ## open connection
> nc <- nc_open(gribfile)
> 
> error-Error in R_nc4_open: NetCDF: DAP server error
> 
> I am getting the error message in nc_open.
> 
> Please excuse me if this not R issue. Please let me know which "*R HELP
> GROUP*" I should post this message.
> 
> -Deb
> 
> 
> 
> 
> On Wed, Jan 11, 2017 at 4:27 PM, Jeff Newmiller 
> wrote:
> 
>> You will do yourself a favor if you pay attention to which software you
>> are using.  When you use RStudio, the console window is a direct connection
>> to R, which is NOT RStudio... it has a separate installer and a different
>> support group (e.g. this mailing list). All of this means you have failed
>> to tell us anything about the software that actually matters in your
>> question... R, nor the CONTRIBUTED package (meaning NOT part of R or
>> RStudio) containing the function you are having trouble with.
>> 
>> Go read the Posting Guide... carefully... and provide a reproducible
>> example that leads to your error... and post in plain text rather than HTML
>> email so your example doesn't get messed up in transit. The output of
>> sessionInfo() after the error would probably be a good idea also. For good
>> measure you should open R directly rather than through RStudio and confirm
>> the results (if it doesn't happen in R then RStudio may be breaking R
>> (rare, but it has happened) in which case you would have to ask them for
>> help.
>> --
>> Sent from my phone. Please excuse my brevity.
>> 
>> On January 11, 2017 12:32:09 PM PST, Debasish Pai Mazumder <
>> pai1...@gmail.com> wrote:
>>> Hi all,
>>> I recently updated my Rstudio to the newer version (Version 1.0.136)
>>> and I
>>> started to getting following error when I am trying to read netcdf
>>> files
>>> 
>>> Error in R_nc4_open: NetCDF: DAP server error
>>> 
>>> Any ideas?
>>> 
>>> with regards
>>> -Deb
>>> 



[R] xtractomatic v3.2.0

2017-01-04 Thread Roy Mendelssohn - NOAA Federal
Hi All:

I am pleased to announce that xtractomatic v3.2.0 is now available on CRAN. 
The changes in this version will be invisible to the user - the major changes 
are the use of https instead of http, and some changes in the vignette so that 
multiple attempts are made to download the data (the vignette as published 
will look the same as in the previous version; only the raw .Rmd file has 
changed).

If your institution requires or recommends the use of https, as is increasingly 
the case, then I recommend upgrading the package.  Existing code using the 
package should not be affected.

Thanks,

-Roy





Re: [R] Lubdridate: subset based on hour and minute

2017-01-02 Thread Roy Mendelssohn - NOAA Federal
The trick is to realize that times are stored internally in a special numeric 
format, though they can be displayed in several ways based on that internal 
representation. When comparing times, make certain that what you are comparing 
against has also been converted to the same internal format. This should make 
clear the basis for the logical tests:

> as.numeric(test)
[1] 34200 36000 37800 39600
> 
> as.numeric(hm("10:00"))
[1] 36000
> 

-Roy


> On Jan 2, 2017, at 12:39 PM, Joe Ceradini  wrote:
> 
> Bingo! Thanks! I somehow couldn't find an example like that via google.
> 
> Joe
> 
> On Mon, Jan 2, 2017 at 1:35 PM, Roy Mendelssohn - NOAA Federal
>  wrote:
>>> test > hm("10:00")
>> [1] FALSE FALSE  TRUE  TRUE
>>> test[test > hm("10:00")]
>> [1] "10H 30M 0S" "11H 0M 0S"
>> 
>> -Roy
>> 
>> 
>>> On Jan 2, 2017, at 12:27 PM, Joe Ceradini  wrote:
>>> 
>>> Thanks for the reply Roy!
>>> 
>>> Perhaps you're showing me the way and I'm missing it - how would I
>>> subset to only 1030 and 1100, excluding 1000? It seems I would need to
>>> say, give me all time greater than 10:00, but the hours and minutes
>>> are in separate slots, which is throwing me off.
>>> 
>>> Thanks again.
>>> 
>>> On Mon, Jan 2, 2017 at 1:13 PM, Roy Mendelssohn - NOAA Federal
>>>  wrote:
>>>> Hi Joe:
>>>> 
>>>> See below.
>>>>> On Jan 2, 2017, at 12:01 PM, Joe Ceradini  wrote:
>>>>> 
>>>>> Hi folks,
>>>>> 
>>>>> I must be missing something obvious/painfully simple here
>>>>> 
>>>>> How do I subset a time vector based on hours AND minutes? So, in this
>>>>> example, I want all time greater than 10:00, i.e., 10:30 and 11:00.
>>>>> I'm working with lubridate which separates the hours and minutes into
>>>>> separate slots.
>>>>> 
>>>>> require(lubridate)
>>>>> 
>>>>> test <- hm(c("9:30", "10:00", "10:30", "11:00"))
>>>>> test
>>>>> [1] "9H 30M 0S"  "10H 0M 0S"  "10H 30M 0S" "11H 0M 0S"
>>>>> 
>>>>> This gets 11 but not 1030
>>>>> test[test@hour > 10]
>>>>> [1] "11H 0M 0S"
>>>>> 
>>>>> This gets 1030 but not 11
>>>>> test[test@hour > 9 & test@minute > 0]
>>>>> [1] "10H 30M 0S"
>>>> 
>>>> test[test@hour > 9]
>>>> [1] "10H 0M 0S"  "10H 30M 0S" "11H 0M 0S"
>>>> 
>>>> 
>>>> You are using a logical "and" in your test - so the condition "test@minute 
>>>> > 0" isn't met for 11:00 and therefore it doesn't show up. as both 
>>>> conditions must be met  You could also do:
>>>> 
>>>>> test[test@hour >= 10]
>>>> [1] "10H 0M 0S"  "10H 30M 0S" "11H 0M 0S"
>>>> 
>>>> -HTH,
>>>> 
>>>> Roy
>>>> 
>>>>> 
>>>>> Thanks and happy new year!
>>>>> Joe
>>>>> 
>>>> 

Re: [R] Lubdridate: subset based on hour and minute

2017-01-02 Thread Roy Mendelssohn - NOAA Federal
> test > hm("10:00")
[1] FALSE FALSE  TRUE  TRUE
> test[test > hm("10:00")]
[1] "10H 30M 0S" "11H 0M 0S" 

-Roy


> On Jan 2, 2017, at 12:27 PM, Joe Ceradini  wrote:
> 
> Thanks for the reply Roy!
> 
> Perhaps you're showing me the way and I'm missing it - how would I
> subset to only 1030 and 1100, excluding 1000? It seems I would need to
> say, give me all time greater than 10:00, but the hours and minutes
> are in separate slots, which is throwing me off.
> 
> Thanks again.
> 
> On Mon, Jan 2, 2017 at 1:13 PM, Roy Mendelssohn - NOAA Federal
>  wrote:
>> Hi Joe:
>> 
>> See below.
>>> On Jan 2, 2017, at 12:01 PM, Joe Ceradini  wrote:
>>> 
>>> Hi folks,
>>> 
>>> I must be missing something obvious/painfully simple here
>>> 
>>> How do I subset a time vector based on hours AND minutes? So, in this
>>> example, I want all time greater than 10:00, i.e., 10:30 and 11:00.
>>> I'm working with lubridate which separates the hours and minutes into
>>> separate slots.
>>> 
>>> require(lubridate)
>>> 
>>> test <- hm(c("9:30", "10:00", "10:30", "11:00"))
>>> test
>>> [1] "9H 30M 0S"  "10H 0M 0S"  "10H 30M 0S" "11H 0M 0S"
>>> 
>>> This gets 11 but not 1030
>>> test[test@hour > 10]
>>> [1] "11H 0M 0S"
>>> 
>>> This gets 1030 but not 11
>>> test[test@hour > 9 & test@minute > 0]
>>> [1] "10H 30M 0S"
>> 
>> test[test@hour > 9]
>> [1] "10H 0M 0S"  "10H 30M 0S" "11H 0M 0S"
>> 
>> 
>> You are using a logical "and" in your test - so the condition "test@minute > 
>> 0" isn't met for 11:00 and therefore it doesn't show up. as both conditions 
>> must be met  You could also do:
>> 
>>> test[test@hour >= 10]
>> [1] "10H 0M 0S"  "10H 30M 0S" "11H 0M 0S"
>> 
>> -HTH,
>> 
>> Roy
>> 
>>> 
>>> Thanks and happy new year!
>>> Joe
>>> 
>> 
>> 
> 
> 
> 
> -- 
> Cooperative Fish and Wildlife Research Unit
> Zoology and Physiology Dept.
> University of Wyoming
> joecerad...@gmail.com / 914.707.8506
> wyocoopunit.org



Re: [R] Lubdridate: subset based on hour and minute

2017-01-02 Thread Roy Mendelssohn - NOAA Federal
Hi Joe:

See below.
> On Jan 2, 2017, at 12:01 PM, Joe Ceradini  wrote:
> 
> Hi folks,
> 
> I must be missing something obvious/painfully simple here
> 
> How do I subset a time vector based on hours AND minutes? So, in this
> example, I want all time greater than 10:00, i.e., 10:30 and 11:00.
> I'm working with lubridate which separates the hours and minutes into
> separate slots.
> 
> require(lubridate)
> 
> test <- hm(c("9:30", "10:00", "10:30", "11:00"))
> test
> [1] "9H 30M 0S"  "10H 0M 0S"  "10H 30M 0S" "11H 0M 0S"
> 
> This gets 11 but not 1030
> test[test@hour > 10]
> [1] "11H 0M 0S"
> 
> This gets 1030 but not 11
> test[test@hour > 9 & test@minute > 0]
> [1] "10H 30M 0S"

 test[test@hour > 9]
[1] "10H 0M 0S"  "10H 30M 0S" "11H 0M 0S" 


You are using a logical "and" in your test - so the condition "test@minute > 0" 
isn't met for 11:00, and therefore it doesn't show up, as both conditions must 
be met. You could also do:

> test[test@hour >= 10]
[1] "10H 0M 0S"  "10H 30M 0S" "11H 0M 0S" 

-HTH,

Roy
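For completeness, the comparison can also be made on the seconds representation explicitly. A small sketch using lubridate's period_to_seconds(), which is a standard lubridate function but not something used in the thread:

```r
library(lubridate)

test <- hm(c("9:30", "10:00", "10:30", "11:00"))
# Convert each period to seconds and compare numerically, which avoids
# reasoning about the hour and minute slots separately.
keep <- period_to_seconds(test) > period_to_seconds(hm("10:00"))
test[keep]   # "10H 30M 0S" "11H 0M 0S"
```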

> 
> Thanks and happy new year!
> Joe
> 



[R] xtractomatic package is now available in CRAN

2016-10-12 Thread Roy Mendelssohn - NOAA Federal
I am pleased to announce that the xtractomatic package is now available from 
CRAN.

xtractomatic is an R package developed to subset and extract satellite and 
other oceanographic related data from a remote server. The program can extract 
data for a moving point in time along a user-supplied set of longitude, 
latitude and time points; in a 3D bounding box; or within a polygon (through 
time). The xtractomatic functions were originally developed for the marine 
biology tagging community, to match up environmental data available from 
satellites (sea-surface temperature, sea-surface chlorophyll, sea-surface 
height, sea-surface salinity, vector winds) to track data from various tagged 
animals or shiptracks (xtracto). The package has since been extended to include 
the routines that extract data a 3D bounding box (xtracto_3D) or within a 
polygon (xtractogon). The xtractomatic package accesses data that are served 
through the ERDDAP server at the NOAA/SWFSC Environmental Research Division in 
Santa Cruz, California. The ERDDAP server can also be directly accessed at 
http://coastwatch.pfeg.noaa.gov/erddap. ERDDAP is a simple-to-use yet powerful 
web data service developed by Bob Simons.


-Roy





Re: [R] OPeNDAP access / OPeNDAP subsetting with R

2016-09-30 Thread Roy Mendelssohn - NOAA Federal
Hi Deb:

> > gribfile <- 
> > 'http://thredds.ucar.edu/thredds/ncss/grib/NCEP/GFS/Global_0p5deg/best?north=47.0126&west=-114.841&east=-112.641&south=44.8534&time_start=present&time_duration=PT3H&accept=netcdf&var=v-component_of_wind_height_above_ground,u-component_of_wind_height_above_ground'
> > download.file(gribfile,'junk.nc',mode = "wb")
> trying URL 
> 'http://thredds.ucar.edu/thredds/ncss/grib/NCEP/GFS/Global_0p5deg/best?north=47.0126&west=-114.841&east=-112.641&south=44.8534&time_start=present&time_duration=PT3H&accept=netcdf&var=v-component_of_wind_height_above_ground,u-component_of_wind_height_above_ground'
> Content type 'application/x-netcdf' length unknown
> 
> downloaded 4360 bytes
> 
> > library(ncdf4)
> > junkFile <- nc_open('junk.nc')
> > str(junkFile)
> List of 14
>  $ filename   : chr "junk.nc"
>  $ writable   : logi FALSE
>  $ id : int 65536
>  $ safemode   : logi FALSE
>  $ format : chr "NC_FORMAT_CLASSIC"
>  $ is_GMT : logi FALSE
>  $ groups :List of 1
>   ..$ :List of 7
>   .. ..$ id   : int 65536
>   .. ..$ name : chr ""
>   .. ..$ ndims: int 4
>   .. ..$ nvars: int 7
>   .. ..$ natts: int 13
>   .. ..$ dimid: int [1:4(1d)] 0 1 2 3
>   .. ..$ fqgn : chr ""
>   .. ..- attr(*, "class")= chr "ncgroup4"
>  $ fqgn2Rindex:List of 1
>   ..$ : int 1
>  $ ndims  : num 4
>  $ natts  : num 13
>  $ dim:List of 4
> 



I cut off the rest as that is not important for your question.

HTH,

-Roy

> On Sep 30, 2016, at 4:21 PM, Debasish Pai Mazumder  wrote:
> 
> Hi 
> Now I am using netcdfSubset and I am able to download the file but not sure 
> how to read the files. here my scripts 
> library("ncdf4")
> 
> gribfile<-"http://thredds.ucar.edu/thredds/ncss/grib/NCEP/GFS/Pacific_40km/best/dataset.html"
> download.file(gribfile,basename(gribfile),mode = "wb")
> x<-nc_open(gribfile)
> 
> gribfile<-"http://thredds.ucar.edu/thredds/ncss/grib/NCEP/GFS/Global_0p5deg/best?north=47.0126&west=-114.841&east=-112.641&south=44.8534&time_start=present&time_duration=PT3H&accept=netcdf&var=v-component_of_wind_height_above_ground,u-component_of_wind_height_above_ground"
>   
> download.file(gribfile,basename(gribfile),mode = "wb")
> x<-nc_open(gribfile)
> 
> 
> nc_open doesn't work.
> 
> which command should I use?
> 
> with regards
> -Deb
> 
> 
> On Tue, Sep 27, 2016 at 9:30 PM, Michael Sumner  wrote:
> Opendap won't work on Windows CRAN build of ncdf4, though the rgdal build 
> does work directly on grib.
> 
> Summary: download the files whole for use on Windows, or set up your own system 
> on Linux.
> 
> Building ncdf4 on Windows is not too hard if you know about doing that.
> 
> Cheers, Mike
> 
> On Wed, 28 Sep 2016, 06:49 Roy Mendelssohn - NOAA Federal 
>  wrote:
> Please post the code of what you tried, as I have no idea otherwise what did 
> or did not work for you.
> 
> -Roy
> 
> > On Sep 27, 2016, at 12:44 PM, Debasish Pai Mazumder  
> > wrote:
> >
> > Hi Roy,
> > Thanks for your response. I have tried according your suggestion but it 
> > doesn't work.
> > the OPeNDAP link of the data
> > http://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cfsv2_forecast_ts_9mon/2014/201404/20140403/2014040312/
> >
> > datafile:
> > tmax.01.2014040312.daily.grb2
> >
> > Thanks
> > -Deb
> >
> > On Tue, Sep 27, 2016 at 11:51 AM, Roy Mendelssohn - NOAA Federal 
> >  wrote:
> > Look at the package ncdf4.  You can use an OPeNDAP URL in place of the file 
> > name to perform subsets.,
> >
> > -Roy
> >
> > > On Sep 27, 2016, at 9:06 AM, Debasish Pai Mazumder  
> > > wrote:
> > >
> > > Hi all,
> > >
> > > I would like to access and subset following OpeNDAP files.
> > > server:
> > > http://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cfsv2_forecast_ts_9mon/2014/201404/20140403/2014040312/
> > >
> > > file name: tmax.01.2014040312.daily.grb2
> > > <http://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cfsv2_forecast_ts_9mon/2014/201404/20140403/2014040312/catalog.html?dataset=modeldata/cfsv2_forecast_ts_9mon/2014/201404/20140403/2014040312/tmax.01.2014040312.daily.grb2>
> > > I would like to access and subset the file. Any help will be appreciated.
> > >
> > > with regards
> > > -Deb

Re: [R] OPeNDAP access / OPeNDAP subsetting with R

2016-09-27 Thread Roy Mendelssohn - NOAA Federal
Please post the code of what you tried, as I have no idea otherwise what did or 
did not work for you.

-Roy

> On Sep 27, 2016, at 12:44 PM, Debasish Pai Mazumder  wrote:
> 
> Hi Roy,
> Thanks for your response. I have tried according to your suggestion but it 
> doesn't work.
> the OPeNDAP link of the data
> http://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cfsv2_forecast_ts_9mon/2014/201404/20140403/2014040312/
> 
> datafile: 
> tmax.01.2014040312.daily.grb2
> 
> Thanks 
> -Deb
> 
> On Tue, Sep 27, 2016 at 11:51 AM, Roy Mendelssohn - NOAA Federal 
>  wrote:
> Look at the package ncdf4.  You can use an OPeNDAP URL in place of the file 
> name to perform subsets.
> 
> -Roy
> 
> > On Sep 27, 2016, at 9:06 AM, Debasish Pai Mazumder  
> > wrote:
> >
> > Hi all,
> >
> > I would like to access and subset following OpeNDAP files.
> > server:
> > http://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cfsv2_forecast_ts_9mon/2014/201404/20140403/2014040312/
> >
> > file name: tmax.01.2014040312.daily.grb2
> > <http://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cfsv2_forecast_ts_9mon/2014/201404/20140403/2014040312/catalog.html?dataset=modeldata/cfsv2_forecast_ts_9mon/2014/201404/20140403/2014040312/tmax.01.2014040312.daily.grb2>
> > I would like to access and subset the file. Any help will be appreciated.
> >
> > with regards
> > -Deb
> >
> >   [[alternative HTML version deleted]]
> >
> > __
> > R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> > https://stat.ethz.ch/mailman/listinfo/r-help
> > PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> > and provide commented, minimal, self-contained, reproducible code.
> 
> **
> "The contents of this message do not reflect any position of the U.S. 
> Government or NOAA."
> **
> Roy Mendelssohn
> Supervisory Operations Research Analyst
> NOAA/NMFS
> Environmental Research Division
> Southwest Fisheries Science Center
> ***Note new address and phone***
> 110 Shaffer Road
> Santa Cruz, CA 95060
> Phone: (831)-420-3666
> Fax: (831) 420-3980
> e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/
> 
> "Old age and treachery will overcome youth and skill."
> "From those who have been given much, much will be expected"
> "the arc of the moral universe is long, but it bends toward justice" -MLK Jr.
> 
> 


__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] OPeNDAP access / OPeNDAP subsetting with R

2016-09-27 Thread Roy Mendelssohn - NOAA Federal
Look at the package ncdf4.  You can use an OPeNDAP URL in place of the file 
name to perform subsets.

-Roy
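A minimal sketch of what such OPeNDAP access looks like with ncdf4 (assumptions: an ncdf4 build with DAP support, which as noted elsewhere in this thread the Windows CRAN binary lacks, and that the server is still reachable; the variable name must be taken from `names(nc$var)` rather than guessed):

```r
# Open the remote dataset by URL; only the requested hyperslab crosses
# the network. (Sketch, not run against the live server.)
library(ncdf4)

url <- paste0("http://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/",
              "cfsv2_forecast_ts_9mon/2014/201404/20140403/2014040312/",
              "tmax.01.2014040312.daily.grb2")
nc <- nc_open(url)
print(names(nc$var))                      # discover the actual variable names
tmax <- ncvar_get(nc, names(nc$var)[1],   # small spatial subset, first time step
                  start = c(1, 1, 1), count = c(10, 10, 1))
nc_close(nc)
```

Reading only a `start`/`count` hyperslab is the point of using OPeNDAP instead of downloading the whole GRIB file.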

> On Sep 27, 2016, at 9:06 AM, Debasish Pai Mazumder  wrote:
> 
> Hi all,
> 
> I would like to access and subset following OpeNDAP files.
> server:
> http://nomads.ncdc.noaa.gov/thredds/dodsC/modeldata/cfsv2_forecast_ts_9mon/2014/201404/20140403/2014040312/
> 
> file name: tmax.01.2014040312.daily.grb2
> 
> I would like to access and subset the file. Any help will be appreciated.
> 
> with regards
> -Deb
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.


__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Variable String formation

2016-09-22 Thread Roy Mendelssohn - NOAA Federal
Hi All:

I am trying to write code to create a string to be executed as a command.  The 
string will be of the form:

"param <- param[,rev(seq_len(dataYLen)),,drop = FALSE]"

Now just creating that string is simple enough.  Where the problem arises is 
the array param could be 2, 3, or 4 dimensions, and the dimension where  
"rev(seq_len(dataYLen))" occurs can vary.  At present I have the following 
solution:

  paramLen <-  3
  latLoc <- 2
  myComma1 <- paste(rep(',', times = (latLoc-1)), 'rev(seq_len(dataYLen))', 
sep="", collapse="")
  myComma2 <- paste(rep(',', times = (paramLen-latLoc+1)),sep="", 
collapse="")
  paramCommand <- paste0('param <- param[', myComma1, myComma2, 'drop = 
FALSE]')

(paramLen can be 2,3,4 and latLoc can be 1,2,3,4)  but this strikes me as 
pretty kludgy.  I am hoping there is a more elegant way of doing this.
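One way to avoid building a command string at all is to build a list of per-dimension indices and hand it to `[` with do.call(). This is a sketch (the helper name and the toy array are illustrative, not from the original post):

```r
# Reverse one dimension of an array of any rank without paste()/eval().
flip_dim <- function(param, latLoc, dataYLen = dim(param)[latLoc]) {
  idx <- lapply(dim(param), seq_len)       # full index for every dimension
  idx[[latLoc]] <- rev(seq_len(dataYLen))  # reverse only the chosen one
  do.call(`[`, c(list(param), idx, list(drop = FALSE)))
}

param <- array(1:24, dim = c(2, 3, 4))
flipped <- flip_dim(param, latLoc = 2)
flipped[1, , 1]   # 5 3 1 -- second dimension reversed
```

Because the index list has one entry per dimension, the same call works for 2-, 3-, or 4-dimensional `param`, which removes the comma-counting logic entirely.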

Thanks,

-Roy


__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Update to the xtractomatic package

2016-08-08 Thread Roy Mendelssohn - NOAA Federal
xtractomatic is an R package developed to subset and extract satellite and 
other oceanographic related data from a remote server. The program can extract 
data for a moving point in time along a user-supplied set of longitude, 
latitude and time points; in a 3D bounding box; or within a polygon (through 
time).  An update to the package has been released.  The major change (besides 
some minor bug fixes and prettifying of code) is the inclusion of some 20 new 
datasets.  These include the reprocessed SeaWIFS chlorophyll data (R2014.0), 
MUR SST v4.1, a MUR-based SST anomaly, a 750m VIIIRS chlorophyll dataset for 
the North Pacific, a HYCOM-based estimate of sea surface height,  and an 
estimate of frontal probability.

I am slowly working on getting xtractomatic suitable for submission to CRAN,  
in the meantime the instructions for installing the package can be found on the 
Github site https://github.com/rmendels/xtractomatic.  Also there is the 
wonderful "rerddap" package from the rOpenSci  folk which is available from 
CRAN.

-Roy
 



__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] open a zip file

2016-08-07 Thread Roy Mendelssohn - NOAA Federal
If I break it into parts, I find that the "GET" fails.

>   Year <- format(Sys.Date(), "%Y")
>Month <- format(Sys.Date(), "%m")
> junk <- paste("https://mbsdisclosure.fanniemae.com/disclosure-docs/monthly/",
+"mbs", 
+as.character(Month), 
+as.character(Year),
+".zip", 
+sep = "")
> junk
[1] "https://mbsdisclosure.fanniemae.com/disclosure-docs/monthly/mbs082016.zip"
> httr::GET("https://mbsdisclosure.fanniemae.com/disclosure-docs/monthly/mbs082016.zip")
Response 
[https://mbsdisclosure.fanniemae.com/disclosure-docs/monthly/mbs082016.zip]
  Date: 2016-08-08 02:02
  Status: 404
  Content-Type: text/html; charset=iso-8859-1
  Size: 235 B


404 Not Found

Not Found
The requested URL /disclosure-docs/monthly/mbs082016.zip was not found on 
this server.



Your saved file will have the ending ".zip" no matter what, because that is 
what you called it, but I wouldn't be surprised if it is just a text file with the 
error message.
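A defensive pattern (a sketch using the httr API; the URL below is the one from the thread and may no longer resolve) is to check the status code before writing anything to disk:

```r
# Check the HTTP status before saving, so a 404 error page is never
# written out under a misleading ".zip" name. (Sketch; assumes the
# 'httr' package is installed.)
library(httr)

url <- "https://mbsdisclosure.fanniemae.com/disclosure-docs/monthly/mbs082016.zip"
resp <- GET(url)
if (status_code(resp) == 200) {
  writeBin(content(resp, as = "raw"), "mbs082016.zip")
} else {
  warning("download failed with HTTP status ", status_code(resp))
}
```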

HTH,

-Roy

> On Aug 7, 2016, at 6:42 PM, Glenn Schultz  wrote:
> 
> paste("https://mbsdisclosure.fanniemae.com/disclosure-docs/monthly/",
>"mbs", 
>as.character(Month), 
>as.character(Year),
>".zip", 
>sep = "")
> junk <- paste("https://mbsdisclosure.fanniemae.com/disclosure-docs/monthly/",
+"mbs", 
+as.character(Month), 
+as.character(Year),
+".zip", 
+sep = "")
> junk
[1] "https://mbsdisclosure.fanniemae.com/disclosure-docs/monthly/mbs082016.zip"
> httr::GET("https://mbsdisclosure.fanniemae.com/disclosure-docs/monthly/mbs082016.zip")
Response 
[https://mbsdisclosure.fanniemae.com/disclosure-docs/monthly/mbs082016.zip]
  Date: 2016-08-08 02:02
  Status: 404
  Content-Type: text/html; charset=iso-8859-1
  Size: 235 B


404 Not Found

Not Found
The requested URL /disclosure-docs/monthly/mbs082016.zip was not found on 
this server.



__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] about netcdf files

2016-08-01 Thread Roy Mendelssohn - NOAA Federal
Hi Lily:

If you download the vignette to my xtractomatic package  
(http://coastwatch.pfeg.noaa.gov/xtracto/index.html) there are any number of 
examples using ggplot2 to make maps from netcdf data,

HTH,

-Roy

> On Aug 1, 2016, at 10:35 AM, lily li  wrote:
> 
> Hi all,
> 
> I can read the data, but how to plot it using ggplot or something? In this
> case, x-axis should be longitude, and y-axis should be latitude. I tried to
> plot using raster function, but the x and y axes are from 0 to 1.
> Thanks again.
> 
> The code is like this:
> pre1 = nc_open('sample_precip_daily.nc')
> pre.3d = ncvar_get(pre1, 'precipitation')
> 
> require(raster)
> rplot = t(pre.3d[, , 1])
> r = raster(rplot[nrow(rplot):1, ])
> plot(r)
> 
> 
> On Tue, Jul 26, 2016 at 1:07 PM, lily li  wrote:
> 
>> Thanks for your reply. But it says "Error in (function (classes, fdef,
>> mtable)):
>> unable to find an inherited method for function 'brick' for signature
>> 'ncdf4' "
>> 
>> The dataset is attached. It contains daily precipitation data for 20
>> years, within a rectangle, so that there are several grid points. I use the
>> code to open it, but don't know how to get csv files, while each file
>> contains continuous daily precipitation data for each grid cell.
>> pre1 = nc_open('sample_precip_daily.nc')
>> pre1
>> pre1_rd = ncvar_get(pre1, 'precipitation')
>> nc_close(pre1)
>> 
>> Thanks for your help.
>> 
>> On Tue, Jul 26, 2016 at 4:08 AM, Jon Skoien 
>> wrote:
>> 
>>> You could try with the brick function from the raster package.
>>> 
>>> bvar = brick(netcdfName)
>>> 
>>> This uses the ncdf4 functions for opening and reading the netcdf, but
>>> makes it easier to extract data for each day:
>>> 
>>> p1 = rasterToPoints(bvar[[1]])
>>> and write p1 to csv.
>>> 
>>> Best,
>>> Jon
>>> 
>>> 
>>> 
>>> On 7/26/2016 6:54 AM, lily li wrote:
>>> 
 Hi all,
 
 I have a problem in opening netcdf files. If one netcdf file contains
 longitude, latitude, and daily precipitation. How to relate each
 precipitation record to its associated location, and export them as csv
 files? Thanks.
 
 I just use nc_open(), ncvar_get(), but it is not very helpful. Thanks for
 any ideas.
 
[[alternative HTML version deleted]]
 
 __
 R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.
 
 
>>> --
>>> Jon Olav Skøien
>>> Joint Research Centre - European Commission
>>> Institute for Space, Security & Migration
>>> Disaster Risk Management Unit
>>> 
>>> Via E. Fermi 2749, TP 122,  I-21027 Ispra (VA), ITALY
>>> 
>>> jon.sko...@jrc.ec.europa.eu
>>> Tel:  +39 0332 789205
>>> 
>>> Disclaimer: Views expressed in this email are those of the individual and
>>> do not necessarily represent official views of the European Commission.
>>> 
>> 
>> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.


__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

Re: [R] about netcdf files

2016-07-26 Thread Roy Mendelssohn - NOAA Federal
Hi Lily:

> On Jul 26, 2016, at 2:00 PM, lily li  wrote:
> 
> Here are the results. Yes, I tried to read netcdf files, but cannot grasp the 
> contents. Thanks for helping out.
> 
> > str(pre1)

A guide to netcdf files and R can be found at 
https://www.image.ucar.edu/GSP/Software/Netcdf/, it would be worth your time to 
read through it.  Not certain why you want .csv files,  but I can give you a 
rough idea of how to read a slice into a variable in R,  it is then up to you 
to save the result as a .csv.

netcdf files have parameters, precipitation in this case, defined on a grid, in 
your case (lon, lat, time).  The file contains dimensions,  which gives the 
number of points in each dimension of the grid, and coordinate variables,  
which give the values of that coordinate variable at each grid point.  So if I 
wanted to subset at a certain lat, lon - I would first need to get those 
values

> lat <- ncvar_get(pre1, "lat")
> lon <- ncvar_get(pre1, "lon")
> 

I have no idea of contents of your file, but let's say you know there is a lat 
of 30, and a lon of 120, so I would have to find where they occur:

> myLat <- which(lat == 30)
> myLon <- which(lon == 120)

then you get the slice at that lat-lon for all time by telling ncvar_get where 
to start on the grid, and how many steps to take:

> myPrecip <- ncvar_get(pre1,"precipitation", start = c(myLat, myLon,1), count 
> = c(1, 1, -1))

where the -1 value tells it to get all the values along the time dimension.  
You now have that slice of the data stored in myPrecip.  I hope this gets you 
started, but i strongly urge you to read through the link above.

-Roy

> 
> On Tue, Jul 26, 2016 at 2:52 PM, Roy Mendelssohn - NOAA Federal 
>  wrote:
> Hi Lily:
> 
> I doubt the mail-list would pass through the netcdf file. Instead, could you 
> do the following, and post the results:
> 
> library(ncdf4)
> pre1 = nc_open('sample_precip_daily.nc')
> str(pre1)
> nc_close(pre1)
> 
> I have a feeling you haven't worked much with netcdf files. I will try to 
> find a tutorial also to help you along.
> 
> Thanks,
> 
> -Roy
> 
> > On Jul 26, 2016, at 12:07 PM, lily li  wrote:
> >
> > Thanks for your reply. But it says "Error in (function (classes, fdef,
> > mtable)):
> > unable to find an inherited method for function 'brick' for signature
> > 'ncdf4' "
> >
> > The dataset is attached. It contains daily precipitation data for 20 years,
> > within a rectangle, so that there are several grid points. I use the code
> > to open it, but don't know how to get csv files, while each file contains
> > continuous daily precipitation data for each grid cell.
> > pre1 = nc_open('sample_precip_daily.nc')
> > pre1
> > pre1_rd = ncvar_get(pre1, 'precipitation')
> > nc_close(pre1)
> >
> > Thanks for your help.
> >
> > On Tue, Jul 26, 2016 at 4:08 AM, Jon Skoien 
> > wrote:
> >
> >> You could try with the brick function from the raster package.
> >>
> >> bvar = brick(netcdfName)
> >>
> >> This uses the ncdf4 functions for opening and reading the netcdf, but
> >> makes it easier to extract data for each day:
> >>
> >> p1 = rasterToPoints(bvar[[1]])
> >> and write p1 to csv.
> >>
> >> Best,
> >> Jon
> >>
> >>
> >>
> >> On 7/26/2016 6:54 AM, lily li wrote:
> >>
> >>> Hi all,
> >>>
> >>> I have a problem in opening netcdf files. If one netcdf file contains
> >>> longitude, latitude, and daily precipitation. How to relate each
> >>> precipitation record to its associated location, and export them as csv
> >>> files? Thanks.
> >>>
> >>> I just use nc_open(), ncvar_get(), but it is not very helpful. Thanks for
> >>> any ideas.
> >>>
> >>>[[alternative HTML version deleted]]
> >>>
> >>> __
> >>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> >>> https://stat.ethz.ch/mailman/listinfo/r-help
> >>> PLEASE do read the posting guide
> >>> http://www.R-project.org/posting-guide.html
> >>> and provide commented, minimal, self-contained, reproducible code.
> >>>
> >>>
> >> --
> >> Jon Olav Skøien
> >> Joint Research Centre - European Commission
> >> Institute for Space, Security & Migration
> >> Disaster Risk Management Unit
> >>
> >>

Re: [R] about netcdf files

2016-07-26 Thread Roy Mendelssohn - NOAA Federal
Hi Lily:

I doubt the mail-list would pass through the netcdf file. Instead, could you do 
the following, and post the results:

library(ncdf4)
pre1 = nc_open('sample_precip_daily.nc')
str(pre1)
nc_close(pre1)

I have a feeling you haven't worked much with netcdf files. I will try to find 
a tutorial also to help you along.

Thanks,

-Roy

> On Jul 26, 2016, at 12:07 PM, lily li  wrote:
> 
> Thanks for your reply. But it says "Error in (function (classes, fdef,
> mtable)):
> unable to find an inherited method for function 'brick' for signature
> 'ncdf4' "
> 
> The dataset is attached. It contains daily precipitation data for 20 years,
> within a rectangle, so that there are several grid points. I use the code
> to open it, but don't know how to get csv files, while each file contains
> continuous daily precipitation data for each grid cell.
> pre1 = nc_open('sample_precip_daily.nc')
> pre1
> pre1_rd = ncvar_get(pre1, 'precipitation')
> nc_close(pre1)
> 
> Thanks for your help.
> 
> On Tue, Jul 26, 2016 at 4:08 AM, Jon Skoien 
> wrote:
> 
>> You could try with the brick function from the raster package.
>> 
>> bvar = brick(netcdfName)
>> 
>> This uses the ncdf4 functions for opening and reading the netcdf, but
>> makes it easier to extract data for each day:
>> 
>> p1 = rasterToPoints(bvar[[1]])
>> and write p1 to csv.
>> 
>> Best,
>> Jon
>> 
>> 
>> 
>> On 7/26/2016 6:54 AM, lily li wrote:
>> 
>>> Hi all,
>>> 
>>> I have a problem in opening netcdf files. If one netcdf file contains
>>> longitude, latitude, and daily precipitation. How to relate each
>>> precipitation record to its associated location, and export them as csv
>>> files? Thanks.
>>> 
>>> I just use nc_open(), ncvar_get(), but it is not very helpful. Thanks for
>>> any ideas.
>>> 
>>>[[alternative HTML version deleted]]
>>> 
>>> __
>>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide
>>> http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.
>>> 
>>> 
>> --
>> Jon Olav Skøien
>> Joint Research Centre - European Commission
>> Institute for Space, Security & Migration
>> Disaster Risk Management Unit
>> 
>> Via E. Fermi 2749, TP 122,  I-21027 Ispra (VA), ITALY
>> 
>> jon.sko...@jrc.ec.europa.eu
>> Tel:  +39 0332 789205
>> 
>> Disclaimer: Views expressed in this email are those of the individual and
>> do not necessarily represent official views of the European Commission.
>> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.


__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

Re: [R] C/C++/Fortran Rolling Window Regressions

2016-07-21 Thread Roy Mendelssohn - NOAA Federal
I have no idea which method produces the fastest results,  but the package KFAS 
has a function to do recursive regressions using the Kalman filter.  One 
difference is that it is not, as far as I can tell, a moving window (so no past 
data are dropped), just a recursively computed regression.
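As a point of comparison, a plain recursive least-squares update (a sketch, not the KFAS API; the function below is illustrative) grows the sample one observation at a time without dropping old rows:

```r
# Plain recursive least squares via the Sherman-Morrison update:
# beta and P ~ (X'X)^{-1} are revised as each new observation arrives,
# and no past observation is ever dropped.
rls <- function(X, y) {
  p <- ncol(X)
  P <- diag(1e6, p)                 # diffuse initial "covariance"
  beta <- rep(0, p)
  for (i in seq_len(nrow(X))) {
    xi <- X[i, ]
    k <- P %*% xi / (1 + drop(t(xi) %*% P %*% xi))   # gain vector
    beta <- beta + drop(k) * (y[i] - drop(xi %*% beta))
    P <- P - k %*% t(xi) %*% P
  }
  beta
}

set.seed(1)
X <- cbind(1, rnorm(200))
y <- drop(X %*% c(1, 2)) + rnorm(200, sd = 0.1)
rls(X, y)   # should be close to c(1, 2)
```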

HTH,

-Roy

> On Jul 21, 2016, at 2:08 PM, Mark Leeds  wrote:
> 
> Hi Jeremiah: another possibly faster way would be to use a Kalman filtering
> framework. I forget the details, but Duncan and Horne have a paper which
> shows how a regression can be re-computed each time a new data point is
> added. I
> forget if they handle taking one off the back as well, which is what you
> need.
> 
> The paper at the link below isn't the paper I'm talking about but it's
> reference[1] in that paper. Note that this suggestion might not be a better
> approach  than the various approaches already suggested so I wouldn't go
> this route unless you're very interested.
> 
> 
> Mark
> 
> https://www.le.ac.uk/users/dsgp1/COURSES/MESOMET/ECMETXT/recurse.pdf
> 
> 
> 
> 
> 
> 
> On Thu, Jul 21, 2016 at 4:28 PM, Gabor Grothendieck wrote:
> 
>> I would be careful about making assumptions regarding what is faster.
>> Performance tends to be nonintuitive.
>> 
>> When I ran rollapply/lm, rollapply/fastLm and roll_lm on the example
>> you provided rollapply/fastLm was three times faster than roll_lm.  Of
>> course this could change with data of different dimensions but it
>> would be worthwhile to do actual benchmarks before making assumptions.
>> 
>> I also noticed that roll_lm did not give the same coefficients as the
>> other two.
>> 
>> set.seed(1)
>> library(zoo)
>> library(RcppArmadillo)
>> library(roll)
>> z <- zoo(matrix(rnorm(10), ncol = 2))
>> colnames(z) <- c("y", "x")
>> 
>> ## rolling regression of width 4
>> library(rbenchmark)
>> benchmark(fastLm = rollapplyr(z, width = 4,
>> function(x) coef(fastLm(cbind(1, x[, 2]), x[, 1])),
>> by.column = FALSE),
>>   lm = rollapplyr(z, width = 4,
>> function(x) coef(lm(y ~ x, data = as.data.frame(x))),
>> by.column = FALSE),
>>   roll_lm =  roll_lm(coredata(z[, 1, drop = F]), coredata(z[, 2, drop =
>> F]), 4,
>> center = FALSE))[1:4]
>> 
>> 
>>      test replications elapsed relative
>> 1  fastLm          100    0.22    1.000
>> 2      lm          100    0.72    3.273
>> 3 roll_lm          100    0.64    2.909
>> 
>> On Thu, Jul 21, 2016 at 3:45 PM, jeremiah rounds
>>  wrote:
>>> Thanks all.  roll::roll_lm was essentially what I wanted.   I think
>> maybe
>>> I would prefer it to have options to return a few more things, but it is
>>> the coefficients, and the remaining statistics you might want can be
>>> calculated fast enough from there.
>>> 
>>> 
>>> On Thu, Jul 21, 2016 at 12:36 PM, Achim Zeileis <
>> achim.zeil...@uibk.ac.at>
>>> wrote:
>>> 
 Jeremiah,
 
 for this purpose there are the "roll" and "RcppRoll" packages. Both use
 Rcpp and the former also provides rolling lm models. The latter has a
 generic interface that let's you define your own function.
 
 One thing to pay attention to, though, is the numerical reliability.
 Especially on large time series with relatively short windows there is a
 good chance of encountering numerically challenging situations. The QR
 decomposition used by lm is fairly robust while other more
>> straightforward
 matrix multiplications may not be. This should be kept in mind when
>> writing
 your own Rcpp code for plugging it into RcppRoll.
 
 But I haven't check what the roll package does and how reliable that
>> is...
 
 hth,
 Z
 
 
 On Thu, 21 Jul 2016, jeremiah rounds wrote:
 
 Hi,
> 
> A not unusual task is performing a multiple regression in a rolling
>> window
> on a time-series.A standard piece of advice for doing in R is
> something
> like the code that follows at the end of the email.  I am currently
>> using
> an "embed" variant of that code and that piece of advice is out there
>> too.
> 
> But, it occurs to me that for such an easily specified matrix operation
> standard R code is really slow.   rollapply constantly returns to R
> interpreter at each window step for a new lm.   All lm is at its heart
>> is
> (X^t X)^(-1) X^t y,  and if you think about doing that with Rcpp in
>> rolling
> window you are just incrementing a counter and peeling off rows (or
> columns
> of X and y) of a particular window size, and following that up with
>> some
> matrix multiplication in a loop.   The psuedo-code for that Rcpp
> practically writes itself and you might want a wrapper of something
>> like:
> rolling_lm (y=y, x=x, width=4).
> 
> My question is this: has any of the thousands of R packages out there
> published anything like that.  Rolling window multiple regressions that
> stay in C/C++ until the rolling window completes?  No sense and
>> writing it
>>>
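The rolling computation described in this thread, (X^t X)^(-1) X^t y over a sliding window, can be sketched in a few lines of base R (this still loops in the interpreter, so it is not the Rcpp version under discussion; function and variable names are illustrative):

```r
# Rolling-window OLS via the normal equations (X'X)^{-1} X'y.
# Shows the index arithmetic an Rcpp version would perform.
rolling_lm <- function(y, x, width = 4) {
  n <- length(y)
  t(sapply(width:n, function(i) {
    w <- (i - width + 1):i                      # the current window
    X <- cbind(1, x[w])                         # intercept + regressor
    drop(solve(crossprod(X), crossprod(X, y[w])))
  }))
}

set.seed(1)
x <- rnorm(10)
y <- 2 + 3 * x + rnorm(10, sd = 0.1)
coefs <- rolling_lm(y, x, width = 4)   # one row (intercept, slope) per window
```

As Achim notes in the thread, forming X'X directly is less numerically robust than the QR decomposition lm uses, so treat this as a sketch rather than a recommendation.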

Re: [R] a package-building opinion question, please

2016-07-08 Thread Roy Mendelssohn - NOAA Federal
Hi Erin:

Everyone's tastes differ,  but when I started out knowing nothing about 
packages,  I found Hadley's guide  (thank you Hadley!!!) indispensable:

 http://r-pkgs.had.co.nz/intro.html


For obvious reasons, his guide really is geared to built-in features of 
RStudio, though I believe you could do the same things without it.  It is just 
easier to do the steps he describes from within RStudio.

HTH,

-Roy


> On Jul 8, 2016, at 11:53 AM, Erin Hodgess  wrote:
> 
> Hello everyone:
> 
> I'm starting to write a package from scratch; having done that in ages.
> 
> My opinion question:  what is the best way, please:  Should I use R studio,
> please?  Back in the back, I used package.skeleton, or just copied over
> other people's stuff.  But I want to do a nice clean one from the beginning.
> 
> For the record, I will have S4 classes.
> 
> And I'm sure I will get many differing answering, but that's good too!
> 
> Sort of like those "if you put 1000 statisticians end to end" jokes.
> 
> Thanks in advance,
> Have a great weekend,
> Sincerely,
> Erin
> 
> 
> -- 
> Erin Hodgess
> Associate Professor
> Department of Mathematical and Statistics
> University of Houston - Downtown
> mailto: erinm.hodg...@gmail.com
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.


__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] netcdf data precision or least significant digit

2016-07-07 Thread Roy Mendelssohn - NOAA Federal
I have moved this over to the netcdf-group mail list, which I think is the more 
appropriate place at this point.  You are copied, and hopefully someone from 
ESRL will see it and provide the proper response.

HTH,

-Roy
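As a sketch of how the least_significant_digit attribute discussed in this thread could be applied programmatically (the file and variable names below are hypothetical; ncatt_get() returns a list with a hasatt flag and the attribute's value):

```r
# Round unpacked values to the file's declared precision, instead of
# hard-coding the number of digits.
library(ncdf4)

nc <- nc_open("uwnd.2015.nc")            # hypothetical local file
uwind <- ncvar_get(nc, "uwnd")
lsd <- ncatt_get(nc, "uwnd", "least_significant_digit")
if (lsd$hasatt) {
  uwind <- round(uwind, digits = lsd$value)
}
nc_close(nc)
```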

> On Jul 7, 2016, at 6:02 PM, Ismail SEZEN  wrote:
> 
> Thank you Roy. If I use "round(uwind, digits = 2)”, all data will have 2 
> decimal places after the decimal point. It’s ok. But how do you know you should 
> round the number to 2 decimal digits? According to definitions of precision 
> and least_significant_digit, should I round to 2 decimal digits or 1 decimal 
> digit? 
> 
> For instance, If you check the header information of omega.2015.nc file it 
> says;
> 
> $ ncdump -h omega.2015.nc
> 
> ...
> omega:precision = 3s;
> omega:least_significant_digit = 3s;
> …
> 
> So, I need to round values to 3 decimal places after point?
> 
> and if you check the output of rhum.2015.nc;
> 
> $ ncdump -h rhum.2015.nc
> ...
> rhum:precision = 2s ;
> rhum:least_significant_digit = 0s ;
> …
> 
> Then I need to round values to 2 decimal places after point?
> 
> Should I accomplish the rounding operation according to precision or 
> least_significant_digit attributes? I think someone put these attributes in 
> netcdf files for some reason. Also I believe that, if required, this kind of 
> operation should be done in the related package, but the author said that it 
> is nothing to do with the ncdf4 package.
> 
> Please, forgive me for taking your time.
> 
> 
>> On 08 Jul 2016, at 03:21, Roy Mendelssohn - NOAA Federal 
>>  wrote:
>> 
>> After looking at the file, doing an extract say into the variable uwind,  if 
>> I do:
>> 
>> str(uwind)
>> 
>> I see what I expect, but if I just do:
>> 
>> uwind
>> 
>> 
>> I see what you are seeing.  Try:
>> 
>> uwindnew <- round(uwind, digits = 2) 
>> 
>> 
>> and see if that gives you the results you would expect.  
>> 
>> HTH,
>> 
>> -Roy
>> 
>>> On Jul 7, 2016, at 4:49 PM, Ismail SEZEN  wrote:
>>> 
>>> Thank you Roy. 
>>> 
>>> I use NCEP/NCAR Reanalysis 2 data [1]. More precisely, u-wind data of the 
>>> year 2015 [2]. I am also pretty sure that the variables like scale_factor 
>>> or add_offset should be precise like 0.01 or 187.65 but somehow (I hope 
>>> this is not an issue originated by me) they are not, including data. Also 
>>> let me note that I already contacted to author of ncdf4 package and also 
>>> sent an email to ESRL, too, but no luck yet.
>>> 
>>> For a vectoral data, absolute and mutual u components of wind speed at the 
>>> poles must be equal. For instance, at “2015-01-01 00 GMT”, u-wind at 
>>> longitude=0 and latitude=90 is 9.179 m/s and u-wind at longitude=180 
>>> and latitude=90 is -9.217 m/s. Minus sign comes from positive north 
>>> direction. Physically, their absolute values must be equal.
>>> 
>>> 1- http://www.esrl.noaa.gov/psd/data/gridded/data.ncep.reanalysis2.html
>>> 2- 
>>> ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis2.dailyavgs/pressure/uwnd.2015.nc
>>> 
>>> 
>>> 
>>>> On 08 Jul 2016, at 02:27, Roy Mendelssohn - NOAA Federal 
>>>>  wrote:
>>>> 
>>>> Hi Ismail:
>>>> 
>>>> Can you point me to a particular netcdf file you are working with? I 
>>>> would like to play with it for a while. I am pretty certain the scale 
>>>> factor is 0.01 and what you are seeing is rounding error (or more 
>>>> precisely, problems with the representation of floating-point numbers), 
>>>> but I would like to see if there is a way around this.
>>>> 
>>>> Thanks,
>>>> 
>>>> -Roy
>>>> 
>>>>> On Jul 7, 2016, at 4:16 PM, Ismail SEZEN  wrote:
>>>>> 
>>>>> Thank you very much Jeff.  I think I’m too far to be able to explain 
>>>>> myself. Perhaps, this is the wrong list for this question but I sent it 
>>>>> in hope there is someone has deep understanding of netcdf data and use R. 
>>>>> Let me tell the story simpler. Assume that you read a numeric vector of 
>>>>> data from a netcdf file:
>>>>> 
>>>>> data <- c(9.179, 8.779, 7.979, 3.080, 6.118, 
>>>>> 10.117, 10.417, 9.217)
>>>>> 
>>>>> you know that the values above are a model output and also you know that, 
>>>>> physically, 

Re: [R] netcdf data precision or least significant digit

2016-07-07 Thread Roy Mendelssohn - NOAA Federal
After looking at the file, doing an extract say into the variable uwind,  if I 
do:

str(uwind)

I see what I expect, but if I just do:

uwind


I see what you are seeing.  Try:

uwindnew <- round(uwind, digits = 2) 


and see if that gives you the results you would expect.  

HTH,

-Roy

> On Jul 7, 2016, at 4:49 PM, Ismail SEZEN  wrote:
> 
> Thank you Roy. 
> 
> I use NCEP/NCAR Reanalysis 2 data [1]. More precisely, u-wind data of the 
> year 2015 [2]. I am also pretty sure that the variables like scale_factor or 
> add_offset should be precise like 0.01 or 187.65 but somehow (I hope this is 
> not an issue originated by me) they are not, including data. Also let me note 
> that I already contacted to author of ncdf4 package and also sent an email to 
> ESRL, too, but no luck yet.
> 
> For a vectoral data, absolute and mutual u components of wind speed at the 
> poles must be equal. For instance, at “2015-01-01 00 GMT”, u-wind at 
> longitude=0 and latitude=90 is 9.179 m/s and u-wind at longitude=180 and 
> latitude=90 is -9.217 m/s. Minus sign comes from positive north 
> direction. Physically, their absolute values must be equal.
> 
> 1- http://www.esrl.noaa.gov/psd/data/gridded/data.ncep.reanalysis2.html
> 2- 
> ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis2.dailyavgs/pressure/uwnd.2015.nc
> 
> 
> 
>> On 08 Jul 2016, at 02:27, Roy Mendelssohn - NOAA Federal 
>>  wrote:
>> 
>> Hi Ismail:
>> 
>> Can you point me to a particular netcdf file you are working with? I would 
>> like to play with it for a while. I am pretty certain the scale factor is 
>> 0.01 and what you are seeing is rounding error (or more precisely, problems 
>> with the representation of floating-point numbers), but I would like to see 
>> if there is a way around this.
>> 
>> Thanks,
>> 
>> -Roy
>> 
>>> On Jul 7, 2016, at 4:16 PM, Ismail SEZEN  wrote:
>>> 
>>> Thank you very much Jeff.  I think I’m too far to be able to explain 
>>> myself. Perhaps, this is the wrong list for this question but I sent it in 
>>> hope there is someone has deep understanding of netcdf data and use R. Let 
>>> me tell the story simpler. Assume that you read a numeric vector of data 
>>> from a netcdf file:
>>> 
>>> data <- c(9.179, 8.779, 7.979, 3.080, 6.118, 
>>> 10.117, 10.417, 9.217)
>>> 
>>> you know that the values above are a model output and also you know that, 
>>> physically, first and last values must be equal but somehow they are not.
>>> 
>>> And now, you want to use “periodic” spline for the values above.
>>> 
>>> spline(1:8, data, method = “periodic”)
>>> 
>>> Voila! spline method throws a warning message: “spline: first and last y 
>>> values differ - using y[1] for both”. Then I go on digging and discover 2 
>>> attributes in netcdf file: “precision = 2” and “least_significant_digit = 
>>> 1”. And I also found their definitions at [1].
>>> 
>>> precision -- number of places to right of decimal point that are 
>>> significant, based on packing used. Type is short.
>>> least_significant_digit -- power of ten of the smallest decimal place in 
>>> unpacked data that is a reliable value. Type is short.
>>> 
>>> Please, do not condemn me, english is not my main language :). At this 
>>> point, as a scientist, what would you do according to explanations above? I 
>>> think I didn’t exactly understand the difference between precision and 
>>> least_significant_digit. One says “significant” and latter says “reliable”. 
>>> Should I round the numbers to 2 decimal places or 1 decimal place after 
>>> decimal point?
>>> 
>>> Thanks,
>>> 
>>> 1- 
>>> http://www.esrl.noaa.gov/psd/data/gridded/conventions/cdc_netcdf_standard.shtml
>>> 
>>> 
>>>> On 08 Jul 2016, at 01:29, Jeff Newmiller  wrote:
>>>> 
>>>> Correction:
>>>> 
>>>> ?options (not par)
>>>> -- 
>>>> Sent from my phone. Please excuse my brevity.
>>>> 
>>>> On July 7, 2016 3:26:06 PM PDT, Jeff Newmiller  
>>>> wrote:
>>>>> Same as with any floating point numeric computation environment... you
>>>>> don't. There is always uncertainty in any floating point number... it
>>>>> is just larger in this data than you might be used to.
>>>>> 
>>>>> Once you get to the stage where you want to output values, read up on
>>>>> ?round
>>>>> ?par (digits)

Re: [R] netcdf data precision or least significant digit

2016-07-07 Thread Roy Mendelssohn - NOAA Federal
Hi Ismail:

Can you point me to a particular netcdf file you are working with? I would 
like to play with it for a while. I am pretty certain the scale factor is 0.01 
and what you are seeing is rounding error (or more precisely, problems with 
the representation of floating-point numbers), but I would like to see if 
there is a way around this.

Thanks,

-Roy
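A self-contained sketch (not from the thread) of why unpacked values pick up ragged decimals: the scale factor is stored as a 32-bit float, so a nominal 0.01 becomes 0.0099999997..., and round() to the stated precision recovers clean values. The packed integers here are invented for illustration.

```r
# Hypothetical packed short integers, as they might sit in a netCDF file.
packed <- c(918L, 878L, 798L, 308L, 1012L, 922L)

# The float32 nearest to 0.01 -- this is why ncdump shows "0.01f" while the
# attribute read into R prints with many trailing digits.
scale_factor <- 0.009999999776482582

unpacked <- packed * scale_factor
print(unpacked)                     # values like 9.18, 8.78 ... with float noise

# Rounding to the "precision = 2" attribute cleans the values up for output.
print(round(unpacked, digits = 2))
```

The underlying uncertainty does not go away; rounding is only appropriate at the presentation stage, as noted later in the thread.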

> On Jul 7, 2016, at 4:16 PM, Ismail SEZEN  wrote:
> 
> Thank you very much Jeff.  I think I’m too far to be able to explain myself. 
> Perhaps, this is the wrong list for this question but I sent it in hope there 
> is someone has deep understanding of netcdf data and use R. Let me tell the 
> story simpler. Assume that you read a numeric vector of data from a netcdf 
> file:
> 
> data <- c(9.179, 8.779, 7.979, 3.080, 6.118, 10.117, 
> 10.417, 9.217)
> 
> you know that the values above are a model output and also you know that, 
> physically, first and last values must be equal but somehow they are not.
> 
> And now, you want to use “periodic” spline for the values above.
> 
> spline(1:8, data, method = “periodic”)
> 
> Voila! spline method throws a warning message: “spline: first and last y 
> values differ - using y[1] for both”. Then I go on digging and discover 2 
> attributes in netcdf file: “precision = 2” and “least_significant_digit = 1”. 
> And I also found their definitions at [1].
> 
> precision -- number of places to right of decimal point that are significant, 
> based on packing used. Type is short.
> least_significant_digit -- power of ten of the smallest decimal place in 
> unpacked data that is a reliable value. Type is short.
> 
> Please, do not condemn me, english is not my main language :). At this point, 
> as a scientist, what would you do according to explanations above? I think I 
> didn’t exactly understand the difference between precision and 
> least_significant_digit. One says “significant” and latter says “reliable”. 
> Should I round the numbers to 2 decimal places or 1 decimal place after 
> decimal point?
> 
> Thanks,
> 
> 1- 
> http://www.esrl.noaa.gov/psd/data/gridded/conventions/cdc_netcdf_standard.shtml
> 
> 
>> On 08 Jul 2016, at 01:29, Jeff Newmiller  wrote:
>> 
>> Correction:
>> 
>> ?options (not par)
>> -- 
>> Sent from my phone. Please excuse my brevity.
>> 
>> On July 7, 2016 3:26:06 PM PDT, Jeff Newmiller  
>> wrote:
>>> Same as with any floating point numeric computation environment... you
>>> don't. There is always uncertainty in any floating point number... it
>>> is just larger in this data than you might be used to.
>>> 
>>> Once you get to the stage where you want to output values, read up on
>>> 
>>> ?round
>>> ?par (digits)
>>> 
>>> and don't worry about the incidental display of extra digits prior to
>>> presentation (output). 
>>> -- 
>>> Sent from my phone. Please excuse my brevity.
>>> 
>>> On July 7, 2016 12:50:54 AM PDT, Ismail SEZEN 
>>> wrote:
 Hello,
 
 I use ncdf4 and ncdf4.helpers packages to get wind data from ncep/ncar
 reanalysis ncetcdf files. But data is in the form of (9.18,
 8.78, 7.98, 3.08, -6.818, …). I’m aware of precision
 and least_significant_digit attributes of ncdf4 object [1]. For uwnd
 data, precision = 2 and least_significant_digits = 1. My doubt is that
 should I round data to 2 decimal places or 1 decimal place after
 decimal point?
 
 Same issue is valid for some header info.
 
 Output of ncdf4 object:
 
 
 Output of ncdump on terminal:
 
 
 for instance, ncdump's scale factor is 0.01f but ncdf4 object’s
 scale_factor is 0.0099977648258. You can notice same issue for
 actual_range and add_offset. Also a similar issue exist for the data.
 How can I truncate those extra unsignificant decimal places or round
 the numbers to significant decimal places?
 
 1 -
 http://www.esrl.noaa.gov/psd/data/gridded/conventions/cdc_netcdf_standard.shtml
 
 __
 R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.
>>> 
>>> __
>>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide
>>> http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.
>> 
> 
> 

Re: [R] dplyr : row total for all groups in dplyr summarise

2016-07-06 Thread Roy Mendelssohn - NOAA Federal
Thanks muchly.  I hate the smart quotes!

-Roy


> On Jul 6, 2016, at 11:42 AM, David Winsemius  wrote:
> 
> 
>> On Jul 6, 2016, at 9:45 AM, rmendelss gmail  wrote:
>> 
>> 
>>> On Jul 6, 2016, at 9:36 AM, David Winsemius  wrote:
>>> 
>>> n this case the text was cut from the R session console text and pasted 
>>> without modification into Mail.app version 8.2. In replicating this action, 
>>> I see now that hitting "return" then unfortunately converts the final 
>>> double-quote to a "smart-quote”.
>> 
>> Speaking of this, this happens to me a lot when I post, so any example code 
>> can not be copied and paste to be reproducible.  Does anyone know of a 
>> workaround for this?
> 
> Google search: Mail.app changes quotes
> 
> https://discussions.apple.com/thread/7065405?tstart=0
> 
> If this is on a Mac then the answer it to change setting on the Keyboard/text 
> Control Panel: uncheck the Use smart quotes and dashes:
> 
> 
> 
> 
> Test:
> 
> mtcars %>%
>group_by (am, gear) %>%
>summarise (n=n()) %>%
>mutate(rel.freq = paste0(round(100 * n/sum(n), 0), "%")) %>%
>ungroup() %>% plyr::rbind.fill(data.frame( n=nrow(mtcars),rel.freq="100%"))
> 
> 
> Seems to work.
> -- 
> David
>> 
>> Thanks,
>> 
>> -Roy
>> 
>> 
> 


Re: [R] Extracting matrix from netCDF file using ncdf4 package

2016-07-02 Thread Roy Mendelssohn - NOAA Federal
Sending this to Hemant a second time as i forgot to reply to list.

Hi Hemant:

Well, technically the code you give below shouldn’t work, because “start” and 
“count” are supposed to have the same number of dimensions as the variable. I 
guess Pierce’s code must be very forgiving if that is working. One thing you 
can do to speed things up is to pre-allocate the array you want to create, say

> dX <- array(NA_real_, dim=c(5,365))


and then have the ncvar_get call write directly to the array:

> dX[i,] <- ncvar_get(nc=myNC, varid="myVar", start=c(reqX[i],reqY[i],1), 
> count=c(1,1,-1)) 

The second thing you can do is to use “lapply” instead of the “for” loop, but 
I don’t know how much faster that will make your code. The fastest, however, 
if you have the memory, is to just read the whole array into memory:

> dX <-  ncvar_get(nc=myNC, varid=“myVar”)


and then use R’s subsetting abilities. You can do fancier subsetting of arrays 
in memory than you can on arrays on disk.
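The advice above can be sketched end-to-end with an in-memory stand-in for the netCDF variable (dimensions and names here are invented; with a real file, the commented ncvar_get calls replace the plain subsetting):

```r
# Stand-in for a 3-D netCDF variable (X = 10, Y = 20, T = 36).
myVar <- array(rnorm(10 * 20 * 36), dim = c(10, 20, 36))

reqX <- c(3, 3, 4, 6, 9)
reqY <- c(2, 5, 10, 12, 20)
nD   <- length(reqX)

# Pre-allocate the result instead of growing it with rbind() inside the loop.
dX <- array(NA_real_, dim = c(nD, 36))
for (i in seq_len(nD)) {
  # With ncdf4: ncvar_get(nc = myNC, varid = "myVar",
  #                       start = c(reqX[i], reqY[i], 1), count = c(1, 1, -1))
  dX[i, ] <- myVar[reqX[i], reqY[i], ]
}

# If the whole variable fits in memory, read it once and subset in R:
whole <- myVar   # with ncdf4: ncvar_get(nc = myNC, varid = "myVar")
dX2   <- t(sapply(seq_len(nD), function(i) whole[reqX[i], reqY[i], ]))
stopifnot(all.equal(dX, dX2, check.attributes = FALSE))
```

Pre-allocation avoids the repeated copying that rbind() inside a loop causes, which is where most of the slowdown comes from.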

HTH,

-Roy


> On Jul 2, 2016, at 3:43 PM, Hemant Chowdhary via R-help 
>  wrote:
> 
> I am working with a 3-dimensional netCDF file having dimensions of X=100, 
> Y=200, T=365. 
> My objective is to extract time vectors of a few specific grids that may not 
> be contiguous on X and/or Y. 
> 
> For example, I want to extract a 5x365 matrix where 5 rows are each vectors 
> of length 365 of 5 specific X,Y combinations. 
> 
> For this, I am currently using the following 
> 
> reqX = c(35,35,40,65,95); 
> reqY = c(2,5,10,112,120,120); 
> nD = length(reqX) 
> for(i in 1:nD){ 
> idX = ncvar_get(nc=myNC, varid="myVar", start=c(reqX[i],reqY[i]), 
> count=c(1,1)) 
> if(i==1){dX = idX} else {dX = rbind(dX,idX)} 
> } 
> 
> Is there more elegant/faster way other than to using a For Loop like this? It 
> seems very slow when I may have to get much larger matrix where nD can be 
> more than 1000. 
> 
> Thank you HC
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



Re: [R] How to get bathymetry data using R

2016-06-11 Thread Roy Mendelssohn - NOAA Federal
Use the “rerddap” package, which accesses our ERDDAP server.  You can see the 
data available at:

http://upwell.pfeg.noaa.gov/erddap

In the search box type in “bathymetry”.  You can subset as you want: using 
“rerddap” you get back a netCDF file that is already read into your R 
workspace, or you can write your own two lines of code (since the URL 
completely defines the request) and get the data back in any number of file 
formats.  I don’t want to get into a format war, but for large gridded 
datasets like this I much prefer netCDF files.

If you have any questions at all about either the “rerddap” package  (developed 
by the wonderful people at ROpenSci) or about our ERDDAP service, don’t 
hesitate to ask.

-Roy

> On Jun 11, 2016, at 12:30 AM, Michael Sumner  wrote:
> 
> On Sat, 11 Jun 2016 at 15:43 javad bayat  wrote:
> 
>> Dear R users;
>> I am searching for a package to extract bathymetry data from topography map
>> to produce the control file for CE-Qual-w2 model.
>> Is there anyone to know how to do it?
>> many thanks.
>> 
>> 
> There are few things around but I highly recommend find your own data
> source, suitable for your purpose - and reading it directly with the raster
> package.  If anyone knows a reliable source I'd like to hear it.
> 
> As a global go-to I use Etopo1, but you may want some more detailed (like
> Gebco 2014) or a non-global one. (I can't tell where you need this for or
> what resolution would be suitable from what you've asked though).
> 
> (The best format when you get a choice generally is GeoTIFF, but it depends
> who creates them. I tend to use the NetCDF files from Etopo).
> 
> Raster uses rgdal for many formats, but ncdf4 exclusively for NetCDF - if
> it's not called "*.nc" you can get around that).
> 
> Cheers, Mike.
> 
> 
> 
>> --
>> Best Regards
>> Javad Bayat
>> M.Sc. Environment Engineering
>> Alternative Mail: bayat...@yahoo.com
>> 
>>[[alternative HTML version deleted]]
>> 
>> __
>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide
>> http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
>> 
> -- 
> Dr. Michael Sumner
> Software and Database Engineer
> Australian Antarctic Division
> 203 Channel Highway
> Kingston Tasmania 7050 Australia
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.


Re: [R] system() command not working

2016-06-04 Thread Roy Mendelssohn - NOAA Federal
Hi John:

When El Capitan first came out there was a discussion on the R-SIG-Mac list 
about environment variables not being passed down to applications (not just R, 
but in general). I believe a workaround was suggested, but I would search the 
archives for that.

So here is what is happening: when you run from the command line, the 
variables MRT_DATA_DIR and MRTDATADIR, which are defined somewhere in your 
environment, are found in the terminal, but when the same command is run from 
the application they are not found. I would search the R-SIG-Mac archive or 
post to that list, because I can’t remember what the workaround was.

HTH,

-Roy
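One common workaround (a sketch, not a confirmed fix; the MRT paths are copied from the question and may differ on your machine) is to set the variables from within R before calling system(), since child processes inherit R's environment:

```r
# Make the environment variables visible to processes R spawns.
Sys.setenv(MRT_DATA_DIR = "/Applications/Modis_Reprojection_Tool/data",
           MRTDATADIR   = "/Applications/Modis_Reprojection_Tool/data")

# Confirm they are set in R's environment before calling system().
print(Sys.getenv(c("MRT_DATA_DIR", "MRTDATADIR")))

# system() children inherit R's environment, so the tool should now find them
# (command copied from the question; uncomment to run on a machine with MRT):
# system(paste("/Applications/Modis_Reprojection_Tool/bin/mrtmosaic",
#              "-i ~/temp/input.list -o ~/temp/output.hdf"))
```

This sidesteps the GUI-launch problem entirely, because the variables are defined inside the R session rather than relying on the shell profile being read.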

> On Jun 4, 2016, at 11:59 AM, J Payne  wrote:
> 
> I’ve posted this question on StackExchange at 
> http://stackoverflow.com/questions/37604466/r-system-not-working-with-modis-reprojection-tool,
>  but haven’t received any replies.  I’m hoping that someone who understands 
> the operation of the R system() command can help.  
> 
> 
> 
> I have a command that works when typed into the Terminal on a Mac (OSX El 
> Cap), but exactly the same command fails when called from R using `system()`. 
> 
>  
> 
> I am trying to batch-process MODIS satellite files using a small program 
> called the MODIS Reprojection Tool 
> (https://lpdaac.usgs.gov/tools/modis_reprojection_tool).  My software is all 
> up to date.
> 
> 
> 
> This is a simple example in which I mosaic two files.  The names of the two 
> files are in a text input file called `input.list`.  The command just tells 
> the `mrtmosaic` routine where to find the input list and where to put the 
> output. 
> 
> 
> 
> This command works correctly in the Terminal:  
> 
> 
> 
> /Applications/Modis_Reprojection_Tool/bin/mrtmosaic -i ~/temp/input.list 
> -o ~/temp/output.hdf
> 
> 
> 
> However, if I put exactly the same string into a variable and run it from R 
> (using RStudio), it fails:  
> 
> 
> 
> comstring<-"/Applications/Modis_Reprojection_Tool/bin/mrtmosaic -i 
> ~/temp/input.list -o ~/temp/output.hdf"  
> 
> system(comstring)
> 
> 
> 
>> Warning: gctp_call : Environmental Variable Not Found:   
> 
> MRT_DATA_DIR nor MRTDATADIR not defined
> 
> Error: GetInputGeoCornerMosaic : General Processing Error converting 
> lat/long coordinates to input projection coordinates.  
> 
> Fatal Error, Terminating...
> 
> 
> 
> The strange thing is that the system knows what the environment variables 
> are.  In the terminal, the command
> 
> `echo $MRT_DATA_DIR`
> 
> shows the correct directory: /Applications/Modis_Reprojection_Tool/data
> 
> 
> 
> I don't see why it would have trouble finding the variables from an `R 
> system()` call when it has no trouble in the Terminal.  I'm very stumped!  
> 
> 
> 
> 
> 
> 
> 
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.


Re: [R] Request for help

2016-06-03 Thread Roy Mendelssohn - NOAA Federal
Hi All:


> On Jun 3, 2016, at 11:33 AM, jlu...@ria.buffalo.edu wrote:
> 
> There is a video tutorial on the RStudio web site showing how to create R 
> packages within RStudio.  Hadley Wickham also has a book on creating R 
> packages.
> 


And I would add that Hadley has kindly put the book online, at 
http://r-pkgs.had.co.nz/intro.html

-Roy



Re: [R] Matrix multiplications

2016-05-21 Thread Roy Mendelssohn - NOAA Federal
>  str(t(y-X %*% b))
 num [1, 1:10] 0.595 -1.7538 -0.0498 -1.651 -0.6328 ...
> str((y-X %*% b))
 num [1:10, 1] 0.595 -1.7538 -0.0498 -1.651 -0.6328 …

-Roy
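To make the str() output above concrete: t(y - X %*% b) %*% (y - X %*% b) is a 1x1 matrix, not a scalar, so elementwise `*` against the 3x3 matrix solve(t(X) %*% X) fails as non-conformable. Wrapping the quadratic form in drop() (or as.numeric()) collapses it to a true scalar; a sketch reproducing the question's data:

```r
# Data exactly as posted in the question.
x <- c(10, 11, 12, 13, 14, 17, 15, 16, 10, 11,
       41, 25, 26, 14, 12, 14, 15, 20, 14, 22)
x <- matrix(x, ncol = 2)
X <- cbind(matrix(1, nrow(x), 1), x)
y <- c(12.00, 11.00, 13.00, 12.50, 14.00, 18.50, 15.00, 12.50, 13.75, 15.00)
b <- solve(t(X) %*% X) %*% t(X) %*% y

# t(e) %*% e is a 1x1 matrix; drop() turns it into a plain scalar.
e  <- y - X %*% b
s2 <- drop(t(e) %*% e / (length(y) - ncol(X)))  # about 3.62, as printed in the question

vcov_b <- s2 * solve(t(X) %*% X)                # scalar * matrix now conforms
print(vcov_b)
```

as.numeric() in place of drop() works equally well here; the point is only that R will not recycle a 1x1 matrix against a larger one.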


> On May 21, 2016, at 12:00 PM, george brida  wrote:
> 
> Dear R users:
> 
> I have written the following lines :
> 
>> x=c(10,11,12,13,14,17,15,16,10,11,41,25,26,14,12,14,15,20,14,22)
>> x=matrix(x,ncol=2)
>> a=matrix(1,nrow(x),1)
>> X=cbind(a,x)
>> y=c(12.00, 11.00, 13.00, 12.50, 14.00, 18.50, 15.00, 12.50, 13.75, 15.00)
> 
>> b=solve(t(X)%*% X)%*% t(X)%*% y
> 
> when I wrote the following line
>> (t(y-X %*% b)%*%(y-X %*% b)/(length(y)-ncol(X)))*solve(t(X)%*% X)
> I have obtained an error message, I don't know why namely (t(y-X %*%
> b)%*%(y-X %*% b)/(length(y)-ncol(X))) is a scalar:
>> (t(y-X %*% b)%*%(y-X %*% b)/(length(y)-ncol(X)))
> [,1]
> [1,] 3.620354
> 
> 
> Can you please help me.
> 
> Thank you
> 
> George
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.


Re: [R] Unexpected values obtained when reading in data using ncdf and ncdf4

2016-04-22 Thread Roy Mendelssohn - NOAA Federal
Hi Louise:

If Dave can’t figure it out, I can give a look also.  A couple of things I 
would suggest:

1.  Don’t use the name “data” for the result of the nc_open call; “data” is 
already the name of a base R function, and you never know what problems 
masking it can cause.

2. You are doing calculations to set the start and count values in the 
ncvar_get commands; print those values out before you make the calls, to make 
certain they are valid.

HTH,

-Roy

> On Apr 22, 2016, at 8:08 AM, David W. Pierce  wrote:
> 
> On Fri, Apr 22, 2016 at 1:32 AM, Louise Mair  wrote:
> 
>> Dear R Users,
>> 
>> I am encountering a problem when reading nc files into R using the ncdf
>> and ncdf4 libraries. The nc files are too large to attach an example (but
>> if someone is interested in helping out I could send a file privately via
>> an online drive), but the code is basic:
>> 
> ​[...]​
> 
> 
> ​Hi Louise,
> 
> I'm the author of the ncdf and ncdf4 libraries. What are the details --
> what operating system are you running on, what version of R and the netcdf
> library are you using?
> 
> If you make the files available to me I can take a look.
> 
> Regards,
> 
> --Dave Pierce
> ​
> 
> 
> 
> 
> 
>> for(i in 1:length(thesenames[,1])){
>>   data <- nc_open(paste(INDIR, thesenames[i,c("wholename")], sep=""),
>> write=F)
>>   d.vars <- names(data$var)
>>   d.size <- (data$var[[length(d.vars)]])$size
>> 
>>   # Obtaining longitude and latitude values
>>   d.lon <- as.vector(ncvar_get(data, varid="lon", start=c(1,1),
>> count=c(d.size[1],d.size[2])))
>>   d.lat <- as.vector(ncvar_get(data, varid="lat", start=c(1,1),
>> count=c(d.size[1],d.size[2])))
>> 
>>   # Obtaining climate data values
>>   df.clim <- data.frame(rn=seq(1:length(d.lon)))
>>   for(y in 1:d.size[3]){
>> df.clim[,1+y] <- as.vector(ncvar_get(data,
>> varid=d.vars[length(d.vars)], start=c(1,1,y),
>> count=c(d.size[1],d.size[2],1)))
>>  names(df.clim)[1+y] <- paste("y",y,sep="")  }
>>   tosummarise[,,i] <- as.matrix(df.clim[,-1])
>> }
>> 
>> The data are temperature or precipitation, across space and time.
>> 
>> For most of the >250 files I have, there are no problems, but for around 8
>> of these files, I get strange values. The data should be within a
>> relatively narrow range, yet I get values such as -8.246508e+07  or
>> 7.659506e+11. The particularly strange part is that these kind of values
>> occur at regularly spaced intervals across the data, usually within a
>> single time step.
>> 
>> I have the same problem (including the exact same strange values) when
>> using ArcMap, yet the data provider assures me that the data look normal
>> when using CDO (climate data operators) to view them, and that there are no
>> missing values.
>> 
>> I realise this is very difficult to diagnose without the nc files
>> themselves, so my questions are (1) Has anyone encountered something like
>> this before?, (2) Is there something I am failing to specify in the code
>> when reading in?, and (3) Is anyone interested in digging into this and
>> willing to play around with the nc files if I make them available privately?
>> 
>> Thanks very much in advance!
>> Louise
>> 
>> 
>> 
>> 
>> 
>> 
>> __
>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide
>> http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
>> 
> 
> 
> 
> -- 
> David W. Pierce
> Division of Climate, Atmospheric Science, and Physical Oceanography
> Scripps Institution of Oceanography, La Jolla, California, USA
> (858) 534-8276 (voice)  /  (858) 534-8561 (fax)   dpie...@ucsd.edu
> 
> 

**
"The contents of this message do not reflect any position of the U.S. 
Government or NOAA."
**
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new address and phone***
110 Shaffer Road
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: roy.mendelss...@noaa.gov www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected" 
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.


Re: [R] Memory usage in prcomp

2016-03-22 Thread Roy Mendelssohn - NOAA Federal

> On Mar 22, 2016, at 10:00 AM, Martin Maechler  
> wrote:
> 
>>>>>> Roy Mendelssohn - NOAA Federal
>>>>>> on Tue, 22 Mar 2016 07:42:10 -0700 writes:
> 
>> Hi All:
>> I am running prcomp on a very large array, roughly [50, 3650].  The 
>> array itself is 16GB.  I am running on a Unix machine and am running “top” 
>> at the same time and am quite surprised to see that the application memory 
>> usage is 76GB.  I have the “tol” set very high  (.8) so that it should only 
>> pull out a few components.  I am surprised at this memory usage because 
>> prcomp uses the SVD if I am not mistaken, and when I take guesses at the 
>> size of the SVD matrices they shouldn’t be that large.   While I can fit 
>> this  in, for a variety of reasons I would like to reduce the memory 
>> footprint.  Some questions:
> 
>> 1.  I am running with “center=FALSE” and “scale=TRUE”.  Would I save memory 
>> if I scaled the data first myself, saved the result, cleared out the 
>> workspace, read the scaled data back in and did the prcomp call?  Basically 
>> are the intermediate calculations for scaling kept in memory after use.
> 
>> 2. I don’t know how prcomp memory usage compares to a direct call to “svd”, 
>> which allows me to explicitly set how many singular vectors to compute (I 
>> only need the first five at most).  prcomp is convenient because it 
>> does a lot of the other work for me.
> 
> For your example, where p := ncol(x)  is 3650  but you only want
> the first 5 PCs, it would be *considerably* more efficient to
> use svd(..., nv = 5) directly.
> 
> So I would take  stats:::prcomp.default  and modify it
> correspondingly.
> 
> This seems such a useful idea in general that I consider
> updating the function in R with a new optional 'rank.'  argument which
> you'd set to 5 in your case.
> 
> Scrutinizing R's underlying svd() code however, I now see that
> there are typically still two other [n x p] matrices created (one
> in R's La.svd(), one in C code) ... which I think should be
> unnecessary in this case... but that would really be another
> topic (for R-devel , not R-help).
> 
> Martin
> 


Thanks.  It is easy enough to recode using svd(), and I think I will.  It gives 
me a little more control over what the algorithm does.
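A minimal sketch of that rewrite, on a small dummy matrix standing in for the 
real 16GB array: scale() by hand, then svd() with nv set to the number of 
components wanted.  Since prcomp(center = FALSE, scale. = TRUE) does the same 
scaling internally before its SVD, the scores agree up to column signs:

```r
set.seed(1)
x  <- matrix(rnorm(200 * 10), nrow = 200)   # stand-in for the big array
k  <- 5                                     # number of PCs actually wanted
xs <- scale(x, center = FALSE, scale = TRUE)
s  <- svd(xs, nu = 0, nv = k)               # compute only k right singular vectors
scores <- xs %*% s$v                        # the first k principal component scores

# same answer (up to column signs) as the full prcomp() call
p <- prcomp(x, center = FALSE, scale. = TRUE)
stopifnot(isTRUE(all.equal(abs(unname(scores)), abs(unname(p$x[, 1:k])))))
```

The saving comes from never forming the full 3650-column rotation and score 
matrices that prcomp() would otherwise return.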

-Roy





[R] Memory usage in prcomp

2016-03-22 Thread Roy Mendelssohn - NOAA Federal
Hi All:

I am running prcomp on a very large array, roughly [50, 3650].  The array 
itself is 16GB.  I am running on a Unix machine and am running “top” at the 
same time and am quite surprised to see that the application memory usage is 
76GB.  I have the “tol” set very high  (.8) so that it should only pull out a 
few components.  I am surprised at this memory usage because prcomp uses the 
SVD if I am not mistaken, and when I take guesses at the size of the SVD 
matrices they shouldn’t be that large.   While I can fit this in, for a 
variety of reasons I would like to reduce the memory footprint.  Some questions:

1.  I am running with “center=FALSE” and “scale=TRUE”.  Would I save memory if 
I scaled the data first myself, saved the result, cleared out the workspace, 
read the scaled data back in and did the prcomp call?  Basically are the 
intermediate calculations for scaling kept in memory after use.

2. I don’t know how prcomp memory usage compares to a direct call to “svd”, 
which allows me to explicitly set how many singular vectors to compute (I only 
need the first five at most).  prcomp is convenient because it does a lot 
of the other work for me.
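One way to try question 1 in miniature (dummy data here; the real array is 
16GB, and the file name is just a tempfile() placeholder): scale once, save, 
clear the workspace, reload, and call prcomp() with its own scaling switched 
off.  Because prcomp() simply calls scale() internally, the result is the same 
either way.

```r
set.seed(1)
x  <- matrix(rnorm(1000 * 20), nrow = 1000)  # stand-in for the big array
xs <- scale(x, center = FALSE, scale = TRUE) # do the scaling once, up front
f  <- tempfile(fileext = ".rds")             # placeholder for a real path
saveRDS(xs, f)
rm(x, xs); gc()                              # drop originals and intermediates

xs <- readRDS(f)                             # reload only the scaled data
p  <- prcomp(xs, center = FALSE, scale. = FALSE, tol = 0.8)
```

Whether this helps in practice depends on whether scale()'s intermediates are 
what is driving the peak usage; watching "top" across the two variants would 
settle it.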



