George Duncan wrote:
To make sure our votes count ...

...
For the Our Vote Live website:
http://www.ourvotelive.org/

For the EFF press release about OurVoteLive.org:
http://www.eff.org/press/archives/2008/10/26
George,
Thanks for the link.  I'd be interested in a professional statistician's opinion of the numbers and presentation on the site.

I am very pleased to have EFF on this task.   I think transparency is critical to any open system or society.  Unfortunately, too many "grassroots transparency" initiatives are initiated and operated by unprofessional enthusiasts seeking evidence to support their existing point of view.  Their attempts at "transparency" often yield, shall we say, "rose-" or "shit-"colored lenses.

On this particular topic, I can imagine two independent efforts led by our two opposing parties managing to collect widely disagreeing data.   The further one faction (or party) leans out their side of the boat, the further the other has to lean the other way to keep things from capsizing, but the rest of us are left at their mercy, trying to make a safe passage in an otherwise sound boat.

On closer inspection of ourvotelive.org, the only significantly misleading thing I found was that the DB (and map) reflect the raw number of contacts made with them, which in the states I drilled into ran roughly 4-to-1 inquiries (about early voting rules, polling places) versus reports of problems.  On casual inspection, it would be easy to assume there were 4-5 times as many reports of potential misconduct as there actually were.

Another minor point is that the map symbology (color saturation for each state) reflects the total number of contacts without normalization.   California and Florida show up looking like hotbeds of problems when in fact (especially for California) what is shown is simply that the state has a large population.
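The effect of normalizing by population is easy to demonstrate. Here is a small sketch with made-up contact counts and rough 2008 state populations (all numbers are illustrative, not taken from the site):

```python
# Illustrative only: hypothetical contact counts and rough 2008 populations.
state_pop = {"CA": 36_800_000, "FL": 18_300_000, "NM": 2_000_000}
contacts = {"CA": 400, "FL": 250, "NM": 40}  # made-up numbers

# Ranking by raw count: the big states dominate.
by_raw = sorted(contacts, key=contacts.get, reverse=True)

# Ranking per capita: the picture can invert entirely.
by_per_capita = sorted(contacts,
                       key=lambda s: contacts[s] / state_pop[s],
                       reverse=True)

print("By raw count:   ", by_raw)           # CA looks like the hotspot
print("By per capita:  ", by_per_capita)    # the small state leads
print("Contacts per 100k:",
      {s: round(1e5 * contacts[s] / state_pop[s], 2) for s in contacts})
```

With these toy numbers, California leads the raw ranking but falls to last per capita, which is exactly the distortion an unnormalized choropleth invites.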

I was surprised at how low the numbers are.  From the media talk, one would think there were at least ten times as many reports of potential misconduct out there.   By their numbers: 2600 reports of problems across the whole nation vs 14000 inquiries.  I do not know how to easily evaluate how many people know of this website, or how likely they are to use it if they in fact have a problem.  If my order-of-magnitude calculation is correct, between August and now we are seeing .01 percent of the population weighing in with a question and .001 percent reporting a (potential) problem.   Perhaps these numbers will spike next week.
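The order-of-magnitude calculation can be sketched as follows, using the contact counts quoted above; the population base of roughly 300 million (2008 US population) is my assumption:

```python
# Back-of-envelope: what fraction of the population contacted OurVoteLive?
POPULATION = 300_000_000   # rough 2008 US population (assumption)
inquiries = 14_000         # contacts that were questions (from the email)
problem_reports = 2_600    # contacts reporting a potential problem

inquiry_pct = 100 * inquiries / POPULATION
report_pct = 100 * problem_reports / POPULATION

print(f"Inquiries:       {inquiry_pct:.4f}% of population")
print(f"Problem reports: {report_pct:.5f}% of population")
print(f"Inquiries per problem report: {inquiries / problem_reports:.1f}")
```

Using the full population as the denominator, problem reports land near the .001 percent figure; a smaller base such as the voting-eligible population would roughly double both percentages.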

I also don't know how to evaluate how sensitive people are to the appearance of problems.  In my own circle of acquaintances, I know there is a wide range of "sensitivities": some would report an attempt at intimidation simply because they'd had a disagreement over politics with someone within a week of casting their vote, while others would shrug off someone openly making threats to them on their way to the polls.

That said, while I trust and respect EFF's motives and skill in these matters, I fear this DB has many of the same problems any "poll" would have.   The people who report are a self-selected group, which I assume also skews the numbers.

George, other statisticians?  Any thoughts on how to interpret these numbers?

- Steve
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org
