This is a follow-up to my mail "the effect of blocking on the size of
confidence intervals - analysis using aov".

In both mails I pursue the idea of using blocking factors in order to
reduce the width of confidence intervals.

My dataset comprises a quantitative response variable, "response", and
three categorical explanatory variables: "method", "target", and
"query".
The method variable has 4 levels: XXX.pg.trained, XXX.rpg.trained,
MACCS, and FTREES.
The target variable has 28 levels.
The query variable has 84 levels.
For every target, three queries are selected; thus "query" is nested
within "target".
The design is completely balanced.

"response" "method" "target" "query"
0.68   "XXX.pg.trained" "06243" "06243:218379"
0.96   "XXX.pg.trained" "06243" "06243:218384"
0.72   "XXX.pg.trained" "06243" "06243:246192"
0.7696 "XXX.pg.trained" "06245" "06245:235027"
...

For the complete dataset, see the file data.txt appended at the end of
this mail.
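
To reproduce, something like the following should set up the data frame
and check the balance (reading from data.txt in the working directory
is my assumption):

> d = read.table( "data.txt", header = TRUE,
+        colClasses = c("numeric", "factor", "factor", "factor") )
> # every method x target cell should hold exactly 3 queries
> all( xtabs( ~ method + target, data = d ) == 3 )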

Using lm and lme, I analyze three models:
> m0.lm  = lm ( response ~ method -1, data = d)
> m1.lme = lme( response ~ method -1, random = ~ 1 | target, data = d)
> m2.lme = lme( response ~ method -1, random = ~ 1 | query , data = d)

Using lmer, the corresponding calls are:
> m1.lmer = lmer( response ~ method -1 + ( 1 | target ), data = d)
...

> summary.lm( m0.lm )

...
                      Estimate Std. Error t value Pr(>|t|)
methodFTREES           0.34163    0.03221   10.61   <2e-16 ***
methodMACCS            0.45525    0.03221   14.13   <2e-16 ***
methodXXX.pg.trained   0.49498    0.03221   15.37   <2e-16 ***
methodXXX.rpg.trained  0.45049    0.03221   13.99   <2e-16 ***
...
Residual standard error: 0.2952 on 332 degrees of freedom
...

> summary( m1.lme )

...
                         Value  Std.Error  DF   t-value p-value
methodFTREES          0.3416262 0.04871665 305  7.012514       0
methodMACCS           0.4552452 0.04871665 305  9.344756       0
methodXXX.pg.trained  0.4949833 0.04871665 305 10.160455       0
methodXXX.rpg.trained 0.4504917 0.04871665 305  9.247180       0
...

> summary( m2.lme )

...
                          Value  Std.Error  DF  t-value p-value
methodFTREES          0.3416262 0.03220994 249 10.60624       0
methodMACCS           0.4552452 0.03220994 249 14.13369       0
methodXXX.pg.trained  0.4949833 0.03220994 249 15.36741       0
methodXXX.rpg.trained 0.4504917 0.03220994 249 13.98611       0
...

Thus the 95% confidence interval for "methodFTREES" is given by:

0.34163   +- qt( 0.975, 332 )*0.03221    = ( 0.2782686, 0.4049914 )  # m0.lm
0.3416262 +- qt( 0.975, 305 )*0.04871665 = ( 0.2457629, 0.4374895 )  # m1.lme
0.3416262 +- qt( 0.975, 249 )*0.03220994 = ( 0.2781875, 0.4050649 )  # m2.lme
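
For m0.lm the same interval can also be obtained directly (confint on
the lm fit; output omitted here):

> confint( m0.lm, "methodFTREES" )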

Thus the interval for m1.lme is wider than the interval for m0.lm,
which seems to contradict the utility of blocking...

Using lmer instead of lme, I obtain identical results.
Note that for the lme fits the intervals can alternatively be obtained by:

> intervals( m1.lme )

...
methodFTREES          0.2457629 0.3416262 0.4374895
methodMACCS           0.3593820 0.4552452 0.5511085
methodXXX.pg.trained  0.3991201 0.4949833 0.5908466
methodXXX.rpg.trained 0.3546284 0.4504917 0.5463549
...
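
For the lmer fits the fixed-effects standard errors can be extracted in
much the same way, e.g. (a sketch; vcov here should return the
covariance matrix of the fixed effects):

> sqrt( diag( as.matrix( vcov( m1.lmer ) ) ) )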


I have the following questions:

- How are the std. errors of lme calculated?
  It seems that the std. errors of lme/lmer are more conservative than
  the std. errors reported by lm. This would explain the increase of
  the std. error (and hence of the width of the confidence interval)
  from m0.lm to m1.lme. (A sketch for comparing the standard errors
  follows this list.)
- Is it possible to use lme without a random term, or alternatively
  with a trivial random term?
- Is the overall approach (using blocking factors to reduce the
  residual variance and hence to decrease the width of the confidence
  intervals) valid?
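
For reference, a small sketch that should put the three standard errors
side by side (this assumes the vcov methods for lm and lme, which
return the covariance matrix of the fixed effects):

> fits = list( m0.lm = m0.lm, m1.lme = m1.lme, m2.lme = m2.lme )
> sapply( fits, function(m) sqrt( diag( vcov(m) ) )["methodFTREES"] )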


Using lmer, credible intervals can be calculated as an alternative to
the confidence intervals:

> m1.sample = mcmcsamp( m1.lmer, n = 10000 )
> m1.ci = apply( m1.sample, 2, function(x) quantile( x, probs = c(0.025, 0.975) ) )
> m1.ci[,"methodFTREES"]
     2.5%     97.5%
0.2405115 0.4424920

...
> m2.ci[,"methodFTREES"]
     2.5%     97.5%
0.2767626 0.4042516

For my intended use, the comparison of the average performance of
different methods, credible intervals are probably as good as
confidence intervals. However, how do I calculate credible intervals
for m0.lm?
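
Would something like the following be a valid approximation? It
simulates from the normal approximation to the posterior of the lm
coefficients under a flat prior (via MASS::mvrnorm; the prior and the
approximation are my assumptions):

> library( MASS )    # for mvrnorm
> sims = mvrnorm( 10000, coef( m0.lm ), vcov( m0.lm ) )
> quantile( sims[, "methodFTREES"], probs = c(0.025, 0.975) )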

Thanks for your help,

Lars

Contents of data.txt:

"response" "method" "target" "query"
0.68 "XXX.pg.trained" "06243" "06243:218379"
0.96 "XXX.pg.trained" "06243" "06243:218384"
0.72 "XXX.pg.trained" "06243" "06243:246192"
0.7696 "XXX.pg.trained" "06245" "06245:235027"
0.5806 "XXX.pg.trained" "06245" "06245:261128"
0.8111 "XXX.pg.trained" "06245" "06245:293245"
0.5217 "XXX.pg.trained" "06280" "06280:201466"
0.6087 "XXX.pg.trained" "06280" "06280:201666"
0.1304 "XXX.pg.trained" "06280" "06280:299665"
0.84 "XXX.pg.trained" "07712" "07712:184034"
0.736 "XXX.pg.trained" "07712" "07712:197475"
0.608 "XXX.pg.trained" "07712" "07712:210421"
0.1429 "XXX.pg.trained" "08400" "08400:155143"
0.0476 "XXX.pg.trained" "08400" "08400:224050"
0.2857 "XXX.pg.trained" "08400" "08400:268253"
0.1795 "XXX.pg.trained" "08410" "08410:210117"
0.2308 "XXX.pg.trained" "08410" "08410:210929"
0.0513 "XXX.pg.trained" "08410" "08410:308663"
0.5724 "XXX.pg.trained" "09221" "09221:153896"
0.5586 "XXX.pg.trained" "09221" "09221:183162"
0.6069 "XXX.pg.trained" "09221" "09221:191600"
0.25 "XXX.pg.trained" "09226" "09226:210893"
0.3026 "XXX.pg.trained" "09226" "09226:213199"
0.4079 "XXX.pg.trained" "09226" "09226:285643"
0.8429 "XXX.pg.trained" "09229" "09229:271866"
0.8143 "XXX.pg.trained" "09229" "09229:271875"
0.8571 "XXX.pg.trained" "09229" "09229:276675"
0.7879 "XXX.pg.trained" "09249" "09249:172199"
0.5786 "XXX.pg.trained" "09249" "09249:195195"
0.7619 "XXX.pg.trained" "09249" "09249:219579"
1 "XXX.pg.trained" "10500" "10500:215751"
0.9412 "XXX.pg.trained" "10500" "10500:217463"
1 "XXX.pg.trained" "10500" "10500:217470"
0.4374 "XXX.pg.trained" "12455" "12455:155194"
0.4168 "XXX.pg.trained" "12455" "12455:173259"
0.0164 "XXX.pg.trained" "12455" "12455:180623"
0.1774 "XXX.pg.trained" "12457" "12457:192666"
0.5 "XXX.pg.trained" "12457" "12457:210683"
0.4355 "XXX.pg.trained" "12457" "12457:230734"
0.8235 "XXX.pg.trained" "12458" "12458:184134"
0.8235 "XXX.pg.trained" "12458" "12458:253641"
0.0588 "XXX.pg.trained" "12458" "12458:265099"
1 "XXX.pg.trained" "12462" "12462:196715"
1 "XXX.pg.trained" "12462" "12462:196717"
1 "XXX.pg.trained" "12462" "12462:214059"
0.6256 "XXX.pg.trained" "12464" "12464:239587"
0.0529 "XXX.pg.trained" "12464" "12464:245484"
0.3172 "XXX.pg.trained" "12464" "12464:275235"
0.2763 "XXX.pg.trained" "16200" "16200:142458"
0.25 "XXX.pg.trained" "16200" "16200:142464"
0.5263 "XXX.pg.trained" "16200" "16200:144257"
0.0992 "XXX.pg.trained" "43210" "43210:153391"
0.0248 "XXX.pg.trained" "43210" "43210:190179"
0.4793 "XXX.pg.trained" "43210" "43210:289932"
0.1484 "XXX.pg.trained" "71522" "71522:192905"
0.1484 "XXX.pg.trained" "71522" "71522:258173"
0.3203 "XXX.pg.trained" "71522" "71522:285071"
0.8182 "XXX.pg.trained" "78329" "78329:228569"
0.8636 "XXX.pg.trained" "78329" "78329:380243"
0.8485 "XXX.pg.trained" "78329" "78329:380245"
0.0606 "XXX.pg.trained" "78348" "78348:148763"
0.1667 "XXX.pg.trained" "78348" "78348:198962"
0.1364 "XXX.pg.trained" "78348" "78348:235987"
0.1708 "XXX.pg.trained" "78351" "78351:164446"
0.0854 "XXX.pg.trained" "78351" "78351:164447"
0.1326 "XXX.pg.trained" "78351" "78351:237303"
0.0294 "XXX.pg.trained" "78370" "78370:141095"
0.1863 "XXX.pg.trained" "78370" "78370:182764"
0.1176 "XXX.pg.trained" "78370" "78370:214757"
0.5556 "XXX.pg.trained" "78376" "78376:177269"
0.5556 "XXX.pg.trained" "78376" "78376:182777"
0.7778 "XXX.pg.trained" "78376" "78376:182778"
0.6667 "XXX.pg.trained" "78387" "78387:143257"
0.2222 "XXX.pg.trained" "78387" "78387:170329"
0.6667 "XXX.pg.trained" "78387" "78387:270359"
0.2192 "XXX.pg.trained" "78394" "78394:275118"
0.3014 "XXX.pg.trained" "78394" "78394:329964"
0.3151 "XXX.pg.trained" "78394" "78394:329965"
0.8182 "XXX.pg.trained" "78396" "78396:160580"
0.8182 "XXX.pg.trained" "78396" "78396:160587"
0.8182 "XXX.pg.trained" "78396" "78396:160589"
0.75 "XXX.pg.trained" "78406" "78406:172305"
0.6667 "XXX.pg.trained" "78406" "78406:172306"
0.6667 "XXX.pg.trained" "78406" "78406:172307"
0.76 "XXX.rpg.trained" "06243" "06243:218379"
0.88 "XXX.rpg.trained" "06243" "06243:218384"
0.92 "XXX.rpg.trained" "06243" "06243:246192"
0.5241 "XXX.rpg.trained" "06245" "06245:235027"
0.5379 "XXX.rpg.trained" "06245" "06245:261128"
0.4839 "XXX.rpg.trained" "06245" "06245:293245"
0.5217 "XXX.rpg.trained" "06280" "06280:201466"
0.0435 "XXX.rpg.trained" "06280" "06280:201666"
0.087 "XXX.rpg.trained" "06280" "06280:299665"
0.864 "XXX.rpg.trained" "07712" "07712:184034"
0.808 "XXX.rpg.trained" "07712" "07712:197475"
0.88 "XXX.rpg.trained" "07712" "07712:210421"
0.1429 "XXX.rpg.trained" "08400" "08400:155143"
0.0476 "XXX.rpg.trained" "08400" "08400:224050"
0.5238 "XXX.rpg.trained" "08400" "08400:268253"
0.1538 "XXX.rpg.trained" "08410" "08410:210117"
0.1282 "XXX.rpg.trained" "08410" "08410:210929"
0.0256 "XXX.rpg.trained" "08410" "08410:308663"
0.4345 "XXX.rpg.trained" "09221" "09221:153896"
0.1655 "XXX.rpg.trained" "09221" "09221:183162"
0.1172 "XXX.rpg.trained" "09221" "09221:191600"
0.2763 "XXX.rpg.trained" "09226" "09226:210893"
0.3158 "XXX.rpg.trained" "09226" "09226:213199"
0.3026 "XXX.rpg.trained" "09226" "09226:285643"
0.7286 "XXX.rpg.trained" "09229" "09229:271866"
0.7 "XXX.rpg.trained" "09229" "09229:271875"
0.7429 "XXX.rpg.trained" "09229" "09229:276675"
0.316 "XXX.rpg.trained" "09249" "09249:172199"
0.368 "XXX.rpg.trained" "09249" "09249:195195"
0.6537 "XXX.rpg.trained" "09249" "09249:219579"
0.8824 "XXX.rpg.trained" "10500" "10500:215751"
0.8824 "XXX.rpg.trained" "10500" "10500:217463"
0.8824 "XXX.rpg.trained" "10500" "10500:217470"
0.345 "XXX.rpg.trained" "12455" "12455:155194"
0.3511 "XXX.rpg.trained" "12455" "12455:173259"
0.037 "XXX.rpg.trained" "12455" "12455:180623"
0.1935 "XXX.rpg.trained" "12457" "12457:192666"
0.4677 "XXX.rpg.trained" "12457" "12457:210683"
0.4677 "XXX.rpg.trained" "12457" "12457:230734"
0.8235 "XXX.rpg.trained" "12458" "12458:184134"
0.8235 "XXX.rpg.trained" "12458" "12458:253641"
0.0588 "XXX.rpg.trained" "12458" "12458:265099"
1 "XXX.rpg.trained" "12462" "12462:196715"
1 "XXX.rpg.trained" "12462" "12462:196717"
1 "XXX.rpg.trained" "12462" "12462:214059"
0.2159 "XXX.rpg.trained" "12464" "12464:239587"
0.0617 "XXX.rpg.trained" "12464" "12464:245484"
0.4273 "XXX.rpg.trained" "12464" "12464:275235"
0.25 "XXX.rpg.trained" "16200" "16200:142458"
0.4737 "XXX.rpg.trained" "16200" "16200:142464"
0.3289 "XXX.rpg.trained" "16200" "16200:144257"
0.2066 "XXX.rpg.trained" "43210" "43210:153391"
0.1157 "XXX.rpg.trained" "43210" "43210:190179"
0.5372 "XXX.rpg.trained" "43210" "43210:289932"
0.2188 "XXX.rpg.trained" "71522" "71522:192905"
0.1094 "XXX.rpg.trained" "71522" "71522:258173"
0.2578 "XXX.rpg.trained" "71522" "71522:285071"
0.7727 "XXX.rpg.trained" "78329" "78329:228569"
0.7424 "XXX.rpg.trained" "78329" "78329:380243"
0.197 "XXX.rpg.trained" "78329" "78329:380245"
0.1667 "XXX.rpg.trained" "78348" "78348:148763"
0.2576 "XXX.rpg.trained" "78348" "78348:198962"
0.1667 "XXX.rpg.trained" "78348" "78348:235987"
0.2539 "XXX.rpg.trained" "78351" "78351:164446"
0.0786 "XXX.rpg.trained" "78351" "78351:164447"
0.2831 "XXX.rpg.trained" "78351" "78351:237303"
0.1078 "XXX.rpg.trained" "78370" "78370:141095"
0.1961 "XXX.rpg.trained" "78370" "78370:182764"
0.1765 "XXX.rpg.trained" "78370" "78370:214757"
0.3333 "XXX.rpg.trained" "78376" "78376:177269"
0.3333 "XXX.rpg.trained" "78376" "78376:182777"
0.2222 "XXX.rpg.trained" "78376" "78376:182778"
0.4444 "XXX.rpg.trained" "78387" "78387:143257"
0.4444 "XXX.rpg.trained" "78387" "78387:170329"
0.6667 "XXX.rpg.trained" "78387" "78387:270359"
0.1507 "XXX.rpg.trained" "78394" "78394:275118"
0.3425 "XXX.rpg.trained" "78394" "78394:329964"
0.3836 "XXX.rpg.trained" "78394" "78394:329965"
1 "XXX.rpg.trained" "78396" "78396:160580"
1 "XXX.rpg.trained" "78396" "78396:160587"
1 "XXX.rpg.trained" "78396" "78396:160589"
0.75 "XXX.rpg.trained" "78406" "78406:172305"
0.75 "XXX.rpg.trained" "78406" "78406:172306"
0.75 "XXX.rpg.trained" "78406" "78406:172307"
0.64 "FTREES" "06243" "06243:218379"
0.8 "FTREES" "06243" "06243:218384"
0.8 "FTREES" "06243" "06243:246192"
0.0323 "FTREES" "06245" "06245:235027"
0.2258 "FTREES" "06245" "06245:261128"
0.4055 "FTREES" "06245" "06245:293245"
0.5652 "FTREES" "06280" "06280:201466"
0.5652 "FTREES" "06280" "06280:201666"
0.6087 "FTREES" "06280" "06280:299665"
0.632 "FTREES" "07712" "07712:184034"
0.368 "FTREES" "07712" "07712:197475"
0.176 "FTREES" "07712" "07712:210421"
0.1429 "FTREES" "08400" "08400:155143"
0.0476 "FTREES" "08400" "08400:224050"
0.381 "FTREES" "08400" "08400:268253"
0.0513 "FTREES" "08410" "08410:210117"
0.0513 "FTREES" "08410" "08410:210929"
0.0513 "FTREES" "08410" "08410:308663"
0.4345 "FTREES" "09221" "09221:153896"
0.0621 "FTREES" "09221" "09221:183162"
0.2966 "FTREES" "09221" "09221:191600"
0.1447 "FTREES" "09226" "09226:210893"
0.1579 "FTREES" "09226" "09226:213199"
0.3193 "FTREES" "09226" "09226:285643"
0.2429 "FTREES" "09229" "09229:271866"
0.6571 "FTREES" "09229" "09229:271875"
0.6857 "FTREES" "09229" "09229:276675"
0.2295 "FTREES" "09249" "09249:172199"
0.0505 "FTREES" "09249" "09249:195195"
0.3983 "FTREES" "09249" "09249:219579"
0.3529 "FTREES" "10500" "10500:215751"
0.3529 "FTREES" "10500" "10500:217463"
0.2353 "FTREES" "10500" "10500:217470"
0.2136 "FTREES" "12455" "12455:155194"
0.2998 "FTREES" "12455" "12455:173259"
0.1704 "FTREES" "12455" "12455:180623"
0.0806 "FTREES" "12457" "12457:192666"
0.5968 "FTREES" "12457" "12457:210683"
0.5806 "FTREES" "12457" "12457:230734"
0.3529 "FTREES" "12458" "12458:184134"
0.4706 "FTREES" "12458" "12458:253641"
0.5882 "FTREES" "12458" "12458:265099"
1 "FTREES" "12462" "12462:196715"
1 "FTREES" "12462" "12462:196717"
1 "FTREES" "12462" "12462:214059"
0.2863 "FTREES" "12464" "12464:239587"
0.0749 "FTREES" "12464" "12464:245484"
0.2159 "FTREES" "12464" "12464:275235"
0.1053 "FTREES" "16200" "16200:142458"
0.1184 "FTREES" "16200" "16200:142464"
0.3421 "FTREES" "16200" "16200:144257"
0.0992 "FTREES" "43210" "43210:153391"
0.0331 "FTREES" "43210" "43210:190179"
0.3306 "FTREES" "43210" "43210:289932"
0.2344 "FTREES" "71522" "71522:192905"
0.125 "FTREES" "71522" "71522:258173"
0.2266 "FTREES" "71522" "71522:285071"
0.4091 "FTREES" "78329" "78329:228569"
0.3788 "FTREES" "78329" "78329:380243"
0.4091 "FTREES" "78329" "78329:380245"
0.0606 "FTREES" "78348" "78348:148763"
0.1667 "FTREES" "78348" "78348:198962"
0.0606 "FTREES" "78348" "78348:235987"
0.0944 "FTREES" "78351" "78351:164446"
0.0899 "FTREES" "78351" "78351:164447"
0.1514 "FTREES" "78351" "78351:237303"
0.0392 "FTREES" "78370" "78370:141095"
0.2059 "FTREES" "78370" "78370:182764"
0.0686 "FTREES" "78370" "78370:214757"
0.3333 "FTREES" "78376" "78376:177269"
0.3333 "FTREES" "78376" "78376:182777"
0.3333 "FTREES" "78376" "78376:182778"
0.6667 "FTREES" "78387" "78387:143257"
0 "FTREES" "78387" "78387:170329"
0.1111 "FTREES" "78387" "78387:270359"
0.1096 "FTREES" "78394" "78394:275118"
0.3699 "FTREES" "78394" "78394:329964"
0.3151 "FTREES" "78394" "78394:329965"
0.8182 "FTREES" "78396" "78396:160580"
0.8182 "FTREES" "78396" "78396:160587"
0.7273 "FTREES" "78396" "78396:160589"
0.5833 "FTREES" "78406" "78406:172305"
0.6667 "FTREES" "78406" "78406:172306"
0.6667 "FTREES" "78406" "78406:172307"
0.36 "MACCS" "06243" "06243:218379"
0.52 "MACCS" "06243" "06243:218384"
0.3383 "MACCS" "06243" "06243:246192"
0.3293 "MACCS" "06245" "06245:235027"
0.3006 "MACCS" "06245" "06245:261128"
0.2627 "MACCS" "06245" "06245:293245"
0.6522 "MACCS" "06280" "06280:201466"
0.7391 "MACCS" "06280" "06280:201666"
0.3043 "MACCS" "06280" "06280:299665"
0.793 "MACCS" "07712" "07712:184034"
0.7976 "MACCS" "07712" "07712:197475"
0.688 "MACCS" "07712" "07712:210421"
0.1429 "MACCS" "08400" "08400:155143"
0.048 "MACCS" "08400" "08400:224050"
0.0952 "MACCS" "08400" "08400:268253"
0.0848 "MACCS" "08410" "08410:210117"
0.1026 "MACCS" "08410" "08410:210929"
0.1424 "MACCS" "08410" "08410:308663"
0.4552 "MACCS" "09221" "09221:153896"
0.1345 "MACCS" "09221" "09221:183162"
0.2483 "MACCS" "09221" "09221:191600"
0.4737 "MACCS" "09226" "09226:210893"
0.5 "MACCS" "09226" "09226:213199"
0.1755 "MACCS" "09226" "09226:285643"
0.614 "MACCS" "09229" "09229:271866"
0.5756 "MACCS" "09229" "09229:271875"
0.5598 "MACCS" "09229" "09229:276675"
0.2987 "MACCS" "09249" "09249:172199"
0.2476 "MACCS" "09249" "09249:195195"
0.196 "MACCS" "09249" "09249:219579"
0.7647 "MACCS" "10500" "10500:215751"
0.7647 "MACCS" "10500" "10500:217463"
0.7059 "MACCS" "10500" "10500:217470"
0.351 "MACCS" "12455" "12455:155194"
0.2761 "MACCS" "12455" "12455:173259"
0.0287 "MACCS" "12455" "12455:180623"
0.3013 "MACCS" "12457" "12457:192666"
0.5323 "MACCS" "12457" "12457:210683"
0.6087 "MACCS" "12457" "12457:230734"
0.8235 "MACCS" "12458" "12458:184134"
0.8235 "MACCS" "12458" "12458:253641"
0.0588 "MACCS" "12458" "12458:265099"
1 "MACCS" "12462" "12462:196715"
1 "MACCS" "12462" "12462:196717"
1 "MACCS" "12462" "12462:214059"
0.2584 "MACCS" "12464" "12464:239587"
0.2505 "MACCS" "12464" "12464:245484"
0.2 "MACCS" "12464" "12464:275235"
0.9868 "MACCS" "16200" "16200:142458"
0.748 "MACCS" "16200" "16200:142464"
0.8547 "MACCS" "16200" "16200:144257"
0.0579 "MACCS" "43210" "43210:153391"
0.0666 "MACCS" "43210" "43210:190179"
0.3554 "MACCS" "43210" "43210:289932"
0.1333 "MACCS" "71522" "71522:192905"
0.129 "MACCS" "71522" "71522:258173"
0.1719 "MACCS" "71522" "71522:285071"
0.5021 "MACCS" "78329" "78329:228569"
0.8401 "MACCS" "78329" "78329:380243"
0.8753 "MACCS" "78329" "78329:380245"
0 "MACCS" "78348" "78348:148763"
0.2974 "MACCS" "78348" "78348:198962"
0.0455 "MACCS" "78348" "78348:235987"
0.1893 "MACCS" "78351" "78351:164446"
0.1958 "MACCS" "78351" "78351:164447"
0.2635 "MACCS" "78351" "78351:237303"
0.2999 "MACCS" "78370" "78370:141095"
0.3196 "MACCS" "78370" "78370:182764"
0.0995 "MACCS" "78370" "78370:214757"
1 "MACCS" "78376" "78376:177269"
1 "MACCS" "78376" "78376:182777"
1 "MACCS" "78376" "78376:182778"
0.5556 "MACCS" "78387" "78387:143257"
0 "MACCS" "78387" "78387:170329"
0.5556 "MACCS" "78387" "78387:270359"
0.0255 "MACCS" "78394" "78394:275118"
0.5205 "MACCS" "78394" "78394:329964"
0.5192 "MACCS" "78394" "78394:329965"
0.8182 "MACCS" "78396" "78396:160580"
0.8182 "MACCS" "78396" "78396:160587"
0.8182 "MACCS" "78396" "78396:160589"
0.75 "MACCS" "78406" "78406:172305"
0.75 "MACCS" "78406" "78406:172306"
0.75 "MACCS" "78406" "78406:172307"