Dear Bert,
You have made my day!! Your post is a great help and very useful in my
field.
The paper is not off-the-shelf research output. Some of us, who
get into unenviable conflicts and disputes with revered
authorities in our field, understand the weight of the article. I ha
; to %>% everywhere.
>
> I really think that in June 2024 you (Ogbos) should not run
> "productively" an R version that is older than May 2021 (when R
> 4.1.0 was released):
>
> $ R-4.1.0 --version | head -1
> R version 4.1.0 (2021-05-18) -- "Camp
.1.0 or later.
>
> Good luck
>
>
> On Wed, Jun 19, 2024 at 10:00 AM Ogbos Okike
> wrote:
>
>> The output of the command is, indeed, 55. See below:
>>
>> > library(magrittr)
>> > 1:10 %>% sum()
>> [1] 55
>>
>> On Wed, Jun 19, 2024
This works:
>
> 1:10 |>
> mean()
>
> but this fails:
>
> 1:10
> |> mean()
>
> Duncan Murdoch
>
> >
> > If you don't want to do that, install and load the 'magrittr' package
> > and change |> to %>% everywhe
stall and load the 'magrittr' package
> and change |> to %>% everywhere.
>
> On 2024-06-18 12:13 p.m., Ogbos Okike wrote:
> > Greetings to everyone and thank you for your readiness to help.
> >
> > I have problems using the pipe command (|>).
> &
Greetings to everyone and thank you for your readiness to help.
I have problems using the pipe command (|>).
Once I have it in any script on my system, it won't run.
The error message will say:
unexpected '>'
I loaded the two packages below to see if it would handle it. But the
problem remains.
lib
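For anyone skimming the thread: the native pipe `|>` arrived in R 4.1.0, and the parser requires it to continue a line rather than begin one. A minimal sketch of both the failure mode and the magrittr fallback (guarded, in case the package is not installed):

```r
# '|>' must continue the previous line; on a line of its own the parser
# has already closed the expression and reports "unexpected '>'".
m <- 1:10 |>
  mean()            # fine: the pipe ends the first line

# On R < 4.1.0 the magrittr pipe is the usual substitute:
if (requireNamespace("magrittr", quietly = TRUE)) {
  `%>%` <- magrittr::`%>%`
  s <- 1:10 %>% sum()
}
```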
This is great!
Many thanks to all for helping to further resolve the problem.
Best wishes
Ogbos
On Fri, Mar 29, 2024 at 6:39 AM Rui Barradas wrote:
> Às 01:43 de 29/03/2024, Ogbos Okike escreveu:
> > Dear Rui,
> > Thanks again for resolving this. I have already started us
>   Date    count
> 1 2024-03-23 5.416667
> 2 2024-03-24 5.50
> 3 2024-03-25 6.00
> 4 2024-03-26 4.476190
> 5 2024-03-27 6.538462
> 6 2024-03-28 5.20
>
> Best,
> -Deepayan
>
> On Wed, 27 Mar 2024 at 13:15, Rui Barradas wrote:
> >
> >
ound
> # note that the row names are still tapply's names vector
> # and that the columns order is not Date/count. Both are fixed
> # after the calculations.
> res
You can see that the error message is on the pipe. Please let me know
where I am going wrong.
Thanks.
On Wed
Rui Barradas wrote:
> Às 04:30 de 27/03/2024, Ogbos Okike escreveu:
> > Warm greetings to you all.
> >
> > Using the tapply function below:
> > data<-read.table("FD1month",col.names = c("Dates","count"))
> > x=data$count
> >
Warm greetings to you all.
Using the tapply function below:
data<-read.table("FD1month",col.names = c("Dates","count"))
x=data$count
f<-factor(data$Dates)
AB<- tapply(x,f,mean)
I made a simple calculation. The result, stored in AB, is of the form
below. But an effort to write AB to a file as a
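A common way around the names-only output is to rebuild a two-column data frame before writing; a sketch with toy rows standing in for the FD1month file (the file and column names below are stand-ins):

```r
# tapply() returns a named vector; its names are the dates, so put them
# back into a column before writing to file.
Dates <- c("2024-03-23", "2024-03-23", "2024-03-24")
count <- c(5, 6, 7)
AB <- tapply(count, factor(Dates), mean)
res <- data.frame(Date = names(AB), count = as.vector(AB))
rownames(res) <- NULL
write.table(res, file.path(tempdir(), "AB.txt"),
            row.names = FALSE, quote = FALSE)
```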
I'm yet to accept this. I have sent an email to his wife.
Ogbos.
On Thu, Oct 5, 2023, 06:35 Rui Barradas wrote:
>
> My sympathies for your loss.
> Jim Lemon was a dedicated contributor to the R community and his answers
> were always welcome.
> Jim will be missed.
>
> Rui Barradas
>
> Às 23:36 d
Dear Experts,
Happy New Year!! And thank you very much for all the assistance you
rendered to me last year.
I ran a PCA (Principal component analysis) and would like to combine PC1
and PC2 in order to define a new space. The variances associated with PC1
and PC2 are respectively 51% and 14%.
The
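For the PC1/PC2 question, a minimal prcomp sketch (random stand-in data; the real measurements are not shown in the thread): the first two score columns span the new space, and the variance shares come from the squared singular values.

```r
set.seed(1)
dat <- matrix(rnorm(100 * 5), ncol = 5)        # stand-in for the real variables
pca <- prcomp(dat, scale. = TRUE)
var_explained <- pca$sdev^2 / sum(pca$sdev^2)  # shares such as 51%, 14%
scores <- pca$x[, 1:2]                         # coordinates in the PC1-PC2 plane
```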
Dear Friends,
I have been scarce here. This is because I am busy implementing what I have
learned from you. You seem to have answered all my queries and I have not
got any new ones. I will quickly contact you as soon as I encounter
challenges in my analysis.
I hope you are well.
I have two recent
Dear Experts,
In the first few months of last year, I asked a question based on one of
my codes. I received useful help here. Many thanks again.
I am glad to share one of the results of the script. I acknowledged the
entire list and some specific individuals that contributed.
Best regards.
O
Dear Friends,
I am really glad to share this link with you. Some of you have been
instrumental to the success of the work. You are acknowledged accordingly.
I am ever indebted.
Best wishes
Ogbos
-- Forwarded message -
From: Elsevier - Article Status
Date: Wed, Oct 7, 2020 at 10:50
it might be easier for you to
spot where I ran into the error (my plot was just empty).
Thanks again.
Best regards
Ogbos
On Wed, Jun 3, 2020 at 8:47 PM Jeff Newmiller
wrote:
> df[[ 5 ]][ 0 == df[[ 5 ]] ] <- NA
>
> On June 3, 2020 1:59:06 AM PDT, Ogbos Okike
> wrote:
> >Dear R
eff Newmiller
wrote:
> Perhaps read ?mean...
>
> On June 3, 2020 6:15:11 PM PDT, Ogbos Okike
> wrote:
> >Dear Jeff,
> >Thank you so much for your time.
> >I tried your code. It successfully assigned NA to the zeros.
> >
> >But the main code seems not to w
une 3, 2020 6:15:11 PM PDT, Ogbos Okike
> wrote:
> >Dear Jeff,
> >Thank you so much for your time.
> >I tried your code. It successfully assigned NA to the zeros.
> >
> >But the main code seems not to work with the NAs. The mean, for
> >example,
> >resul
Dear R-Experts,
I have cosmic ray data that span several years. The data frame is of the
form:
03 01 01 003809
03 01 01 013771
03 01 01 023743
03 01 01 033747
03 01 01 043737
03 01 01 053751
03 01 01 063733
03 01 01 073732.
where the columns 1 to 5 stand for year,
trix)
> barp(rbind(FD1counts,FD2counts),names.arg=1996:2016,
> main="Observation counts for FD1 and FD2",
> xlab="Year",ylab="Observations",col=barcol)
> legend(12,80,c("FD1 only","FD1 & FD2","FD2 & FD1"),
> fill=c("
e frequency of observations in your two
> vectors by calendar year. If this is what you want, and you can
> explain how you would like "overlap" to be displayed, we can probably
> provide better help.
>
> Jim
>
>
> On Fri, May 8, 2020 at 7:01 AM Ogbos Okike
> wrot
Dear Experts,
Greetings.
I am trying to display two datasets in a histogram. I have been able to
plot the graph and added the legend for the two colors. I am, however,
having difficulties adding a legend to represent the regions of overlap
(the third legend). Below are my data and code.
Thank yo
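One way to get a third legend entry for the overlap is semi-transparent colours, so the overlap region shows as the blended colour; a sketch with random stand-in data (the real FD1/FD2 vectors are not reproduced here):

```r
set.seed(5)
fd1 <- rnorm(200, mean = 0)
fd2 <- rnorm(200, mean = 1)
brk <- seq(-6, 7, by = 0.5)
h1 <- hist(fd1, breaks = brk, col = rgb(1, 0, 0, 0.5),
           main = "Observation counts", xlab = "value")
hist(fd2, breaks = brk, col = rgb(0, 0, 1, 0.5), add = TRUE)
legend("topright", legend = c("FD1 only", "FD2 only", "overlap"),
       fill = c(rgb(1, 0, 0, 0.5), rgb(0, 0, 1, 0.5),
                rgb(0.5, 0, 0.5, 0.75)))   # blended colour for the overlap
```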
t;)
> axis(2)
> axis.POSIXct(1,dta$datetime,format="%a")
> mtext("A1", side = 1, outer = TRUE, line = 2.2)
> mtext("B1", side = 2, outer = TRUE, line = 2.2,at =0.2)
> mtext("B2", side = 2, outer = TRUE, line = 2.2,at =0.5)
> mtext("B2&qu
> > values_to = "value"
> >) %>%
> >ggplot(aes(x, value)) +
> >geom_line() +
> >xlab("") + ylab("") +
> >scale_x_datetime(
> > breaks = seq(min(x), max(x), by = "days"),
> &
Dear Contributors,
I am trying to do a plot with multiple y-axes on a common x-axis. I have
made some progress, but I am having difficulties with the datetime appearing
on the x-axis. Instead of dates, it appears as large numbers.
Sample of my data:
78 09 28 0 6.7 -40.4 -3.5 2.3 -3.6 278036 5.8 612
78 0
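Large numbers on the x-axis usually mean the POSIXct values were drawn with a plain numeric axis; suppressing it and calling axis.POSIXct restores readable dates. A sketch assuming the year/month/day/hour columns have already been read in (toy values):

```r
dta <- data.frame(year = 1978, month = 9, day = 28, hour = 0:23,
                  value = rnorm(24))
dta$datetime <- as.POSIXct(ISOdatetime(dta$year, dta$month, dta$day,
                                       dta$hour, 0, 0, tz = "GMT"))
plot(dta$datetime, dta$value, type = "l", xaxt = "n",
     xlab = "", ylab = "value")
axis.POSIXct(1, dta$datetime, format = "%d %b %H:%M")  # dates, not numbers
```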
> dates, and look for NA values in the data.
>
> Sarah
>
> On Sun, Mar 1, 2020 at 4:12 PM Ogbos Okike wrote:
> >
> > Dear Friends,
> > I have two data frames of the form:
> > 1997-11-2219 -2.54910135429339
> > 1997-11- -2.66865640466636
> > 19
s that can be used in further calculations.
>
> Jim
>
> On Mon, Mar 2, 2020 at 8:12 AM Ogbos Okike wrote:
> >
> > Dear Friends,
> > I have two data frames of the form:
> > 1997-11-2219 -2.54910135429339
> > 1997-11- -2.66865640466636
> > 1997-11-2305 -2.6
Dear Experts,
I asked about turning-point identification a few weeks ago. I got much help
from the list and was really happy.
When part of the work was published, I was happy to show you and
further thank the contributors. Unfortunately, the moderator didn't
allow the pdf to go. A link was rathe
ards
Ogbos
On Sun, Mar 1, 2020 at 10:42 PM Jeff Newmiller wrote:
>
> a) Your use of HTML is corrupting your data. Post using plain text, and use
> dput output instead of trying to insert tables.
>
> b) You should read about ?setdiff.
>
> On March 1, 2020 1:12:04 PM PST, Ogbos Oki
Dear Friends,
I have two data frames of the form:
1997-11-2219 -2.54910135429339
1997-11- -2.66865640466636
1997-11-2305 -2.60761691358946
1997-12-1104 -2.5323738405159
1997-12-1106 -2.6020470080341
1998-05-0204 -4.49745020062937
1998-05-0209 -4.9462725263541
1998-05-0213 -4.60533021405541
1998-
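Jeff's pointer to ?setdiff earlier in the thread can be sketched on dates like these; note that setdiff() drops the Date class, hence the round trip through character (toy frames standing in for the real data):

```r
d1 <- data.frame(date = as.Date(c("1997-11-22", "1997-11-23", "1997-12-11")))
d2 <- data.frame(date = as.Date(c("1997-11-23", "1997-12-11")))
# Dates present in d1 but not in d2:
only_d1 <- as.Date(setdiff(as.character(d1$date), as.character(d2$date)))
```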
> There are no residuals less than -100.
> So you'll need to fix that too.
>
>
> B.
>
>
> On Sun, Feb 16, 2020 at 10:32 PM Ogbos Okike wrote:
> >
> > Dear Abby,
> > Thank you. I will look at your stimulated data and then run the code with
> &
6258
365512 31 6150
Best regards
Ogbos
On Sun, Feb 16, 2020 at 8:57 AM Ogbos Okike wrote:
>
> Dear Abby,
> Thank you. I will look at your simulated data and then run the code with it.
>
> But since I am dealing with real data and also have volumes of it, I would
> li
Dear Abby,
Thank you. I will look at your simulated data and then run the code with
it.
But since I am dealing with real data and also have volumes of it, I would
like to send my real data to you.
The OULU05 is attached with dput function. It is labeled Ogbos_dput. I
would be surprised if it
n that tp$pits is a logical index, trying to apply it to a vector
> (or data.frame) of a different size is problematic.
> Assuming that you're dealing with datasets of different sizes, the
> simplest solution is to modify your code, such that they're the same
> size.
>
>
> O
Dear Friends,
Wishing you the best of the day.
I have data (cosmic ray) which exhibit the flow pattern of a
sine/cosine wave, i.e. decreasing/increasing and registering crests
(points of maximal increase) and troughs/pits (points of maximal
decrease). These turning points are of interest to me. With pa
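A common base-R sketch for such crests and troughs looks at sign changes of the first difference (noisy sine as stand-in data; a real series may need smoothing first):

```r
set.seed(42)
x <- sin(seq(0, 4 * pi, length.out = 200)) + rnorm(200, sd = 0.05)
d <- diff(sign(diff(x)))
crests  <- which(d == -2) + 1   # rising then falling: local maxima
troughs <- which(d ==  2) + 1   # falling then rising: local minima
```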
> 2018 365 22
> 2018 365 23")
> oodates$Pdate<-strptime(paste(oodates[,1],oodates[,2],oodates[,3]),
> format="%Y %j %H")
>
> Jim
>
> On Thu, Jan 23, 2020 at 8:24 PM Ogbos Okike wrote:
> >
> > Dear Experts,
> > I have a data spanning 56 yea
Dear Experts,
I have a data spanning 56 years from 1963 to 2018.
The datetime format is in DOY hour:
1963 335 0
1963 335 1
1963 335 2
1963 335 3
1963 335 4
1963 335 5
1963 335 6
1963 335 7
1963 335 8
1963 335 9
1996 202 20
1996 202 21
1996 202 22
1996 202 23
1996 203 0
1996 203 1
1996 203 2
1996 20
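Jim's %j recipe from the reply above, spelled out on a few of the rows (time zone assumed GMT for illustration):

```r
oodates <- data.frame(year = c(1963, 1963, 1996),
                      doy  = c(335, 335, 202),
                      hour = c(0, 1, 20))
# %j is day-of-year, so paste the three columns and let strptime convert.
oodates$Pdate <- strptime(paste(oodates$year, oodates$doy, oodates$hour),
                          format = "%Y %j %H", tz = "GMT")
```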
Dear ALL,
I am really happy. Enrico has pointed out the error.
Of the 5 stations' cosmic ray data I am stacking together to
show some similar effects, I wrongly (and I am sorry for taking your
time) copied one of the dates, entering January 1, 1998 instead of
May.
Others were correct and
ese are still not correct and I am yet longing for further help.
Best regards
Ogbos
On Wed, Jan 8, 2020 at 4:22 PM Jeff Newmiller wrote:
>
> In your first email you said you were using
>
> Sys.setenv( TZ="GMT" )
>
> in your code, which defines the default
ance for additional suggestions.
Best wishes
Ogbos
On Wed, Jan 8, 2020 at 1:07 PM Enrico Schumann wrote:
>
>
> Quoting Ogbos Okike :
>
> > Dear Friends,
> > A sample of my data is:
> > 98 05 01 028541
> > 98 05 01 038548
> > 98 05 01 048512
>
s
On Wed, Jan 8, 2020 at 12:55 PM Ogbos Okike wrote:
>
> Dear Jim,
> Thank you for coming to my assistance.
> I have tried all you suggested but the same result keeps coming.
> I tried, for example:
> dta <- read.table("Ohr1may98", col.names = c("year", &qu
Dear Jim,
Thank you for coming to my assistance.
I have tried all you suggested but the same result keeps coming.
I tried, for example:
dta <- read.table("Ohr1may98", col.names = c("year", "month", "day",
"hour", "counts"))
dta$year <- with( dta, ifelse(year < 50, year + 2000, year + 1900))
dta$date<-s
Dear Friends,
A sample of my data is:
98 05 01 028541
98 05 01 038548
98 05 01 048512
98 05 01 058541
98 05 01 068509
98 05 01 078472
98 05 01 088454
98 05 01 098461
98 05 01 108462
98 05 01 118475
98 05 01 128433
98 05 01 138479
98 05 01 148417
9
Dear Friends,
Happy New Year.
Best regards
Ogbos
[[alternative HTML version deleted]]
__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R
line", side = 2, line=2.5, at=0)
>
> par(new=TRUE)
> plot(x2, y2, pch = 1,type="b",col="blue",
> xaxt="n", yaxt="n",
> xlab="", ylab="")
> axis(side=4, at=c(-5,-1,0), labels=c("98%",
Dear Contributors,
I have two datasets. A is of the form:
05 01 01 -0.00376058013285748
05 01 02 -0.0765481943910918
05 01 03 -1.28158758599964
05 01 04 -1.51612545416506
05 01 05 -1.39481276373467
05 01 06 -1.17644992095997
05 01 07 -0.788249311582716
05 01 08 -0.925737027403825
05 01 09 -1.0
> xorigin<-as.POSIXct("1970-01-01 0:0:0","%Y-%m-%d %H:%M:%S")
> xticks<-as.POSIXct(pretty(range(as.numeric(dta$date))),origin=xorigin)
> twoord.plot(dta$date,dta$Li,dta$date,dta$CR,xlim=xlim,xaxt="n",
> main="Lightning and GCR frequency",
Dear Members,
I have some hourly data. Using:
dta$year <- with( dta, ifelse(year < 50, year + 2000, year + 1900))
dta$datetime <- with( dta, as.POSIXct(ISOdatetime(year, month,day,hour,0,0)))
I converted the hour to time format and stored in Year.
The data consists of two different observations and
05-02 -6.4357083781542
> 1998-05-17 -2.25359807972754
> 1998-05-21 -2.55799006865995
> 1999-08-22 -2.25114162617707
> 1999-08-25 -1.47905397376409
> 1999-09-05 -0.641589808755325
> 1999-09-09 -0.648954682695949
> 1999-09-13 -0.726364489272492
> 1999-09-16 -1.28445236942011
&g
at results you get. I highly recommend using the reprex
> package to verify that your example is reproducible.
>
> On September 9, 2019 12:14:50 PM PDT, Ogbos Okike
> wrote:
> >Dear Bert and Jeff,
> >Thank you for looking at this.
> >I am surprised my message was no
aybe others may have greater
> >insight.
> >
> >Bert Gunter
> >
> >"The trouble with having an open mind is that people keep coming along
> >and
> >sticking things into it."
> >-- Opus (aka Berkeley Breathed in his "Bloom County" comic
Dear Contributors,
I have a data frame of the form:
1997-11-23 -2.91709629064653
1997-12-07 -0.960255426066815
1997-12-11 -1.98210752999868
1997-12-20 -1.10800598439855
1998-01-01 -1.00090115428118
1998-01-29 -1.03056081882709
1998-03-27 -0.873243859498216
1998-04-09 -2.06378384750109
1998-04-12 -2
72-03-02"),-9,"Total FDs = 1575")
#box()
par(new=TRUE)
plot(date_x2,CR, pch=16,
xlab="",ylab="",ylim=c(-0.5030138,-14.39884),axes=FALSE,type="l",col="red")
points(date_x2,CR,col="red")
mtext("FD (%)",side=4,col="red&
rt Gunter
>
> "The trouble with having an open mind is that people keep coming along and
> sticking things into it."
> -- Opus (aka Berkeley Breathed in his "Bloom County" comic strip )
>
>
> On Sat, Aug 24, 2019 at 9:49 PM Ogbos Okike wrote:
>>
>&
Dear Contributors,
I have two datasets of different lengths, each containing year, month,
day and counts.
After converting the dates using the as.Date function, I plotted the two
datasets on one graph. That was fine.
I am, however, having problems with axis 1, where I am trying to
put the dates.
Since
ypothesis and a good understanding
> of how the data was generated, it is impossible for you (and anyone else) to
> say how such a test might be designed.
>
> Michael
>
>
> > -Original Message-
> > From: Ogbos Okike
> > Sent: Mittwoch, 27. Februar 2019 22:53
>
ated numbers to test the
statistical significance level of the signal generated by
plot(-5:10,oomean,type="l",ylim=c(8890,9100), )?
I wish to test for 90% and 99% percentile.
I am sorry that this is too long.
Many thanks for your kind contributions
Best
Ogbos
On Sun, Feb 10, 201
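A randomization band of the kind described can be sketched as: draw many random 71-event means and read off the 90% and 99% quantiles (the counts here are simulated stand-ins for the real series):

```r
set.seed(7)
counts <- rnorm(1000, mean = 9000, sd = 60)   # stand-in count series
rand_means <- replicate(2000, mean(sample(counts, 71)))
limits <- quantile(rand_means, probs = c(0.90, 0.99))
# An observed epoch mean outside 'limits' is significant at that level.
```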
's not going to
> work.
>
> Jim
>
> On Sun, Feb 17, 2019 at 12:55 PM Ogbos Okike wrote:
> >
> > Dear Jim,
> > Thank you and welcome back. It seems you have been away as you have not
> > been responding to people's questions as before.
> >
> &g
an actual minimal working example of what you had working
> before you changed to POSIXct.
>
> On February 16, 2019 1:08:38 PM PST, Ogbos Okike
> wrote:
> >Dear Jeff,
> >One more problem please.
> >
> >When I used as.Date(ISOdate(dta$year, dta$month, dta$day,dta$hour
ll me how to text "b" on the
point corresponding with 1960-05-04 09:00:00 on my plot.
Many thanks for your extra time.
Best wishes
Ogbos
On Fri, Feb 15, 2019 at 8:25 AM Ogbos Okike wrote:
>
> Dear Jeff,
>
> Please hold.
> It is beginning to work. There was an error somewher
Dear Jeff,
I am alright now
Please accept my indebtedness!!!
Warmest regards
Ogbos
On Fri, Feb 15, 2019 at 8:25 AM Ogbos Okike wrote:
>
> Dear Jeff,
>
> Please hold.
> It is beginning to work. There was an error somewhere. One ")" is
> missing and as I went back t
h.
Thanks in a hurry.
Best regards
Ogbos
On Fri, Feb 15, 2019 at 8:15 AM Ogbos Okike wrote:
>
> Dear Jeff,
>
> Thank you so much.
>
> I ran the code but got an error message. I then tried to run it line by line.
>
> The problem is in:
> dta$datetime <- with( dta, as.
the data frame
> into x, but that is your business.
>
> Appropriate timezone values can be reviewed with the OlsonNames() function.
>
>
> On February 14, 2019 10:29:58 PM PST, Ogbos Okike
> wrote:
> >Dear List,
> >I have a simple code with which I convert year,
Dear List,
I have a simple code with which I convert year, month, and day to a date format.
My data looks like:
67 01 2618464
67 01 2618472
67 01 2618408
67 01 2618360
67 01 2618328
67 01 2618320
67 01 2618296
while my code is:
data <- read.table("CALG.txt", col.names
her you adhered to the principles or not, which
> is impossible for me to judge based on the information provided (and I won't
> be able to look at excessive code to check either).
>
> Michael
>
> > -Original Message-
> > From: R-help On Behalf Of Ogbos Okike
Dear Contributors,
I am conducting epoch analysis. I tried to test the significance of my
result using a randomization test.
Since I have 71 events, I randomly selected another 71 events, making
sure that none of the dates in the random events corresponds with the
dates in the real events.
Following th
Dear Jim,
Good news to me!! Welcome.
I am fine. The code elegantly displayed the color.
I also tried to adjust the line:
draw.circle(lonmids[lon],latmids[lat],radius=sqrt(counts[lat,lon])/100,
border=countcol[lat,lon],col=countcol[lat,lon]) in order to reduce
the radius of the circle in order
ec 11, 2018 at 11:14 AM Jim Lemon wrote:
>
> Hi Ogbos,
> I have been off the air for a couple of days. Look at the color.legend
> function in the plotrix package.
>
> Jim
> On Tue, Dec 11, 2018 at 12:39 PM Ogbos Okike wrote:
> >
> > Dear Jim,
> > I a
"blue","red"))
> map("world",xlim=c(-90,160),ylim=c(20,45))
> for(lon in 1:length(lonmids)) {
> for(lat in 1:length(latmids)) {
> if(counts[lat,lon] > 0)
>draw.circle(lonmids[lon],latmids[lat],radius=sqrt(counts[lat,lon]),
> border=countcol[
Dear Contributors,
I have data of the form:
Lat Lon
30.1426 104.7854
30.5622 105.0837
30.0966 104.6213
29.9795 104.8430
39.2802 147.7295
30.2469 104.6543
26.4428 157.7293
29.4782 104.5590
32.3839 105.3293
26.4746 157.8411
25.1014 159.6959
25.1242 159.6558
30.1607 104.9100
31.4900 -71.89
h you could modify or you
> are free to modify the code itself.
>
> Cheers
> Petr
>
> > -Original Message-
> > From: Ogbos Okike
> > Sent: Thursday, November 29, 2018 5:17 PM
> > To: PIKAL Petr
> > Cc: r-help
> > Subject: Re: [R] Correct
if (linky)
> lines(x, yleft, col = col[2], lty = 2, ...)
> if (smooth != 0)
> lines(supsmu(x, yleft, span = smooth), col = col[2],lty = 2, lwd
> = lwds, ...)
> }
>
> something like
>
> plot.yy(Year, Li, CR)
>
> Cheers
> Petr
>
> > -Origin
Dear Contributors,
I have data of the form:
4 8 10 8590 12516
4 8 11 8641 98143
4 8 12 8705 98916
4 8 13 8750 89911
4 8 14 8685 104835
4 8 15 8629 121963
4 8 16 8676 77655
4 8 17 8577 81081
4 8 18 8593 83385
4 8 19 8642 112164
4 8 20 8708 103684
4 8 21 8622 83982
4 8 22 8593 75944
4 8 23 8600 97
9.4889267
> #> 39 4 8561 122195 6 -0.028359802 16.6179943
> #> 40 5 8532 100945 6 -0.367009209 -3.6621512
> #> 41 6 8560 108552 6 -0.040037368 3.5976637
> #> 42 7 8634 108707 6 0.824102496 3.7455895
> #> 43 1 8646 117420 7 -0.816125860 14.4890796
> #> 44 2 8633 113
a4<-((d4$CR-mean(d4$CR))/mean(CR))*100
a5<-((d5$CR-mean(d5$CR))/mean(CR))*100
a6<-((d6$CR-mean(d6$CR))/mean(CR))*100
a7<-((d7$CR-mean(d7$CR))/mean(CR))*100
a1-a7 actually gives percentage change in the data.
Instead of doing this one after the other, can you please give an
indication on
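The repeated a4..a7 lines can be collapsed with lapply, assuming each frame has a CR column and CR is the pooled series used in the denominator (toy frames here; d4/d5 stand in for the real ones):

```r
d4 <- data.frame(CR = c(8600, 8650, 8700))
d5 <- data.frame(CR = c(8500, 8550, 8600))
CR <- c(d4$CR, d5$CR)                  # pooled series for the denominator
pct <- lapply(list(d4, d5),
              function(d) ((d$CR - mean(d$CR)) / mean(CR)) * 100)
```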
8)
> col2means<-by(oodf[,2],oodf[,4],mean)
> col3means<-by(oodf[,3],oodf[,4],mean)
>
> Jim
>
> On Wed, Nov 28, 2018 at 2:06 PM Ogbos Okike
> wrote:
> >
> > Dear List,
> > I have three columns of data. The data is of the form:
> > 1 8590 12516
>
Dear List,
I have three columns of data. The data is of the form:
1 8590 12516
2 8641 98143
3 8705 98916
4 8750 89911
5 8685 104835
6 8629 121963
7 8676 77655
1 8577 81081
2 8593 83385
3 8642 112164
4 8708 103684
5 8622 83982
6 8593 75944
7 8600 97036
1 8650 104911
2 8730 114098
3 8731 99421
4 871
Hi David,
That's it!!! The outcome is attached.
Many thanks please.
Best
Ogbos
On Wed, Sep 19, 2018 at 11:34 PM David Winsemius
wrote:
>
> > On Sep 19, 2018, at 7:55 AM, Ogbos Okike
> wrote:
> >
> > Dear Experts,
> > I generated the plot attached. Every
Dear Experts,
I generated the plot attached. Everything else is OK except the black
horizontal lines, which should appear as points or dots like the coloured
ones. I can't understand why.
I tried to change it to look like dots by calling empty plots so that I
will add them as points.
Since I hav
your
> confusion. Beware that if you don't read them first and mention why they
> didn't answer your question then you may get less helpful responses
> when you do ask questions.
>
> On September 17, 2018 10:49:43 PM PDT, Ogbos Okike <
> giftedlife2...@gmail.co
been looking for.
xtable(data, digits=12) and I am fine.
Thank you so much.
Ogbos
On Tue, Sep 18, 2018 at 6:27 AM Jeff Newmiller
wrote:
> Have you read
>
> ?xtable
>
>
> On September 17, 2018 7:24:37 PM PDT, Ogbos Okike <
> giftedlife2...@gmail.com> wrote:
> >
Dear Volunteers,
I have a table involving many decimal places:
2005-01-04 -2.13339817688037
2005-01-19 -6.86312349695117
2005-01-22 -4.33662370554386
2005-02-10 -1.40789214441639
2005-02-13 -1.1334121785854
2005-02-19 -1.28411233010119
2005-05-09 -1.6895978161324
2005-05-16 -3.07664523496947
2005-
t( n )
> a <- data.frame( d1_date=d1$date[ix], d2_date=d2$date[ix],
> d3_date=d3$date[ix] )
>
> On September 17, 2018 12:17:03 AM PDT, Ogbos Okike <
> giftedlife2...@gmail.com> wrote:
> >Dear Contributors,
> >
> >I have two data frame of different column l
Dear Contributors,
I have two data frames of different column lengths. I am trying to have them
in one data frame.
Using
A<-d1$date
B<-d2$date
a<-data.table(A )[ , I := .I][data.table(B )[ , I := .I], on = "I"]
I got
1: 2005-01-04 1 2005-01-04
2: 2005-01-19 2 2005-01-19
3: 2005-01-22 3 2005-01
s:
>
> range(oose)
> [1] 1728.234 6890.916
>
> What was the range of oose values for the data in the plot you included
> with your message?
>
>
> David L Carlson
> Department of Anthropology
> Texas A&M University
&
Dear List,
I have a dataset of high variability. I conducted epoch analysis and
attempted to plot the standard error bars alongside.
I am, however, surprised that the error bars are of equal length. I do not
think that the variability in the data is captured, except there is a kind
of averaging th
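Equal-length bars usually mean a single pooled standard error was reused; computing one per epoch day and drawing them with arrows() keeps the per-group variability visible. A sketch on simulated groups (stand-ins for the real epochs):

```r
set.seed(3)
g <- rep(1:5, each = 20)
y <- rnorm(100, mean = 10 * g, sd = g)       # spread grows with the group
m  <- tapply(y, g, mean)
se <- tapply(y, g, function(x) sd(x) / sqrt(sum(!is.na(x))))
plot(1:5, m, pch = 16, ylim = range(m - se, m + se),
     xlab = "group", ylab = "mean")
arrows(1:5, m - se, 1:5, m + se, angle = 90, code = 3, length = 0.05)
```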
sumb2 <- sumb2 + coef(mod)[[2]]
> }
> print(sumb2/C, digits = 3)
>
> Best,
> Eric
>
>
>
> On Wed, Aug 22, 2018 at 7:28 PM, Ogbos Okike
> wrote:
>
>> Hello Erick,
>>
>> Thanks again.
>> Another line indicated error:
>>
>> source
comma ... it should be
>
> Li[sample(1:N, size = S, replace = TRUE)]
>
> i.e. no comma after the closing parenthesis
>
>
>
> On Wed, Aug 22, 2018 at 7:20 PM, Ogbos Okike
> wrote:
>
>> Hello Eric,
>> Thanks for this.
>>
>> I tried it. It
se
>
> N <- length(Li)
>
> HTH,
> Eric
>
>
> On Wed, Aug 22, 2018 at 6:02 PM, Ogbos Okike
> wrote:
>
>> Kind R-users,
>> I ran a simple regression. I am interested in using the Monte Carlo to
>> test
>> the slope parameter.
>> Here i
Kind R-users,
I ran a simple regression. I am interested in using the Monte Carlo to test
the slope parameter.
Here is what I have done:
d1<-read.table("Lightcor",col.names=c("a"))
d2<-read.table("CRcor",col.names=c("a"))
Li<-d1$a
CR<-d2$a
fit<-lm(Li~CR)
a<-summary(fit)
a gives the slope as 88.
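One standard Monte Carlo version of a slope test is a permutation test: shuffle the response, refit, and compare the real slope with the null distribution. A sketch with simulated stand-in data (the real Lightcor/CRcor files are not reproduced here):

```r
set.seed(11)
CR <- rnorm(100)
Li <- 88 * CR + rnorm(100, sd = 10)            # stand-in with slope near 88
obs <- coef(lm(Li ~ CR))[2]
null_slopes <- replicate(999, coef(lm(sample(Li) ~ CR))[2])
p_value <- mean(abs(null_slopes) >= abs(obs))  # two-sided permutation p
```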
8 107074
> 799 108103
> 80 10 7576",
> header=TRUE)
> library(plotrix)
> std.error<-function(x) return(sd(x)/(sum(!is.na(x
> oomean<-as.vector(by(oodf$B,oodf$A,mean))
> oose<-as.vector(by(oodf$B,oodf$A,std.error))
> plot(-5:10,oomean,type="b&quo
,std.error))
> plot(-5:10,oomean,type="b",ylim=c(5,110000),
> xlab="days (epoch is the day of Fd)",ylab="strikes/km2/day")
> dispersion(-5:10,oomean,oose)
>
> I get the expected plot.
>
> Jim
>
>
> On Sat, Jun 23, 2018 at 9:36 P
Dear List,
I am happy to report that the problem is fixed. as.Date("1998-02-10"), as
suggested by David, handled the problem with ease. Many thanks to everybody.
as.Date(1998-02-10) really resulted in an error; it was my oversight. I really
tried many things the day I was working on that and have forgott
Dear workers,
I have data of length 1136. Below is the code I use to get the means, B.
It worked fine and I had the mean calculated and plotted.
I wish to plot the error bars as well. I have already plotted such means with
error bars before; please see the attachment for an example.
I tried to redo the same p
Dear Contributors,
I am surprised that I cannot add a legend to a certain plot. Although the
x-axis indicates years, the actual data was in year, month and day format.
I then used as.Date to play around and get what I am looking for. I am,
however, worried that I cannot add a legend to the plot no mat
"2018-01-22 12:00:00 GMT"
> > ISOdate(2018,01,22,18,17)
> [1] "2018-01-22 18:17:00 GMT"
>
> Add something like:
>
> if(is.null(data$hour)) data$hour <- 12
>
> then pass data$hour as it will default to the same value as if you
> hadn't passed it.
&g
Dear Members,
Compliments of the Season!!
Below is part of the code I use for Fourier analysis of signals. The code
handles data with the format 05 01 018628 (year, month, day and count)
05 01 028589 (year, month, day and count)
The sample data is attached a
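A minimal Fourier sketch for a daily count series: demean, take the amplitude spectrum, and read off the dominant period (toy 27-day signal; the attached sample data is not reproduced here):

```r
set.seed(9)
n <- 365
t <- 1:n
counts <- 8600 + 50 * sin(2 * pi * t / 27) + rnorm(n, sd = 5)
amp  <- Mod(fft(counts - mean(counts)))[1:(n %/% 2)]
freq <- (0:(n %/% 2 - 1)) / n                 # cycles per day
peak_period <- 1 / freq[which.max(amp)]       # days per cycle
```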
Hi Saba,
Your main worry may be the non-zero exit status and hence your attempt to
load what the system claims you have. I have encountered such problems
several times. You can try two things: run it again several times (network
issues might play a role) or try different CRAN mirrors.
Ogbos
On May 19, 2016