Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread Nikola Smolenski
Parul Vora wrote:
 Thanks for all of the feedback, comments, and support. I just wanted to 
 let you know that our full report (including highlight videos!!) is now 
 up on the Usability Initiative's project wiki:
 
 http://usability.wikimedia.org/wiki/UX_and_Usability_Study

It may be too late for this now, but when I read that "The cheat sheet 
was the only item in the help section that led to a subject's successful 
edit", I wanted to know how well people would edit if they had the 
printed cheat sheet. Maybe if you have more tests in the future you could do 
this - it would tell us exactly what the value of the printed 
promotional material is.

I am wary of this: "Users often missed the ‘edit’ buttons next to each 
section, clicking on ‘edit this page’ all the way at the top." In my 
experience, users do exactly the opposite, and I have seen new users who 
know how to edit sections asking how to edit the top section; some 
Wikipedias (e.g. ruwiki) have even added an [edit] link to the top of the 
article that mimics the section edit links. What could be the cause of this 
discrepancy?

Would it be possible to get some conclusions and recommendations from 
this study so that Wikipedia's interface could be improved even before a 
full WYSIWYG interface is developed? (Because, from past experience, 
that will take years.) For example, one obvious thing is that editing 
help should be moved to the top right of the edit box. (Maybe it could also 
pop up next to the editing box instead of in a new window?) Another obvious 
thing: a "Create a new article" link should be added somewhere. Yet 
another: toolbar images should be replaced with text.

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread Tomasz Ganicz
2009/5/8 Nikola Smolenski smole...@eunet.yu:

 I am wary of this: "Users often missed the ‘edit’ buttons next to each
 section, clicking on ‘edit this page’ all the way at the top." In my
 experience, users do exactly the opposite, and I have seen new users who
 know how to edit sections asking how to edit the top section; some
 Wikipedias (e.g. ruwiki) have even added an [edit] link to the top of the
 article that mimics the section edit links. What could be the cause of this
 discrepancy?

Yes. This is also quite a common question sent to OTRS. People quite
often ask, "I know how to edit a section, but I don't know how to
edit the top part of the article." By the way: maybe it is a good idea
to ask the OTRS English team to mark, for one week or a month, all e-mails
asking for help with editing and then to write up some summary of them.

-- 
Tomek Polimerek Ganicz
http://pl.wikimedia.org/wiki/User:Polimerek
http://www.ganicz.pl/poli/
http://www.ptchem.lodz.pl/en/TomaszGanicz.html

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread Delirium
Brian wrote:
 Quite frankly the advice that you should only use five subjects makes no
 sense. The appeal to Nielsen's authority is not going to work on me or
 anyone else who understands why the scientific method exists. It's
 unscientific thinking and it's going to cause you to waste money. You're going
 to draw conclusions based on results that simply aren't valid, and you won't
 know it until the study is over and you didn't make progress.
   

I would caution people unfamiliar with the scientific literature in the 
area against making overbroad claims about fields outside their area of 
expertise. There are pros and cons to both quantitative and qualitative 
studies in human-computer interaction (HCI), and in many cases qualitative 
studies give more useful information, especially for fairly early-stage 
design questions. In particular, it's quite easy to get statistically 
rigorous but in-practice useless data from quantitative studies. Those are 
very good when you have precisely one variable you want to measure (or 
maybe a few), but in many HCI studies, especially with open-ended interface 
design questions (like "make Wikipedia better"), there are thousands of 
possible variables, and a major purpose of a study is to better 
understand the concerns of the domain and figure out what the relevant 
variables even *are*. Quantitative studies are much more useful when 
you've fixed your interface but have a handful of final decisions to 
make---do we want a particular feature to have slight variant A or B, 
while everything else stays fixed?

-Mark


___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread geni
2009/5/8 Brian brian.min...@colorado.edu:
 I will have no part in your efforts to redefine the scientific method on its
 talk page.

Fortunately you don't need to. People who have put far more effort
into the subject than you are ever likely to put in have pretty much
shot apart the idea of a single, hard scientific method that scientists
actually use.

-- 
geni

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread Brian
Ok, I'll agree that the motivations and size of this pilot study are
reasonable. Then I'd just like to know how much money was spent getting
these answers.  If you're not planning to measure the subjects
scientifically and you just want to figure out what the big issues are,
then the premise of the lab itself comes into question.

On Thu, May 7, 2009 at 10:32 PM, Robert Rohde raro...@gmail.com wrote:

 On Thu, May 7, 2009 at 8:58 PM, Brian brian.min...@colorado.edu wrote:
  Quite frankly the advice that you should only use five subjects makes no
  sense. The appeal to Nielsen's authority is not going to work on me or
  anyone else who understands why the scientific method exists. It's
  unscientific thinking and it's going to cause you to waste money. You're
 going
  to draw conclusions based on results that simply aren't valid, and you
 won't
  know it until the study is over and you didn't make progress.
 
  Careful analysis of site data could allow you to draw some conclusions.
 I'm
  curious how you're planning to go about that. Dependent/independent
  variables?

 If five subjects, chosen at random, all have the same problem, then
 with 95% confidence you can predict that at least half of the
 population will report having this problem.

 This kind of work generally focuses on BIG problems, and you don't
 need a huge sample to identify some of the most common issues.  In
 things like UI development it would be surprising if there weren't
 complaints reported by most of the subjects.  You may overlook some
 other problems, but when coming up with a list of common problems to
 work on, I would say that 15 subjects is plenty.

 -Robert Rohde

 ___
 foundation-l mailing list
 foundation-l@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread Nathan
On Fri, May 8, 2009 at 10:10 AM, Brian brian.min...@colorado.edu wrote:

 You don't have to be an expert (more formally defined as someone with ten
 years of experience in a field) to spot unscientific thinking. I don't
 think
 you're an expert either so maybe you should just leave expertise out of it.
 More to the point, the question in my mind is, how much money did the
 Foundation waste on this study that you can't draw valid, generalizable
 conclusions from?
 The distinction between quantitative and qualitative is for naught -
 you can, and should, quantify what you mean when making qualitative
 assessments. Especially if you will use them to bolster your arguments
 about how to spend millions of dollars of donated money.


Honestly, you as much as admit that you have no idea what you're talking
about. You could at least be a little more collegial and a little less
accusatory if you are looking for good faith responses to your complaints.

Nathan
___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread Brian
Really, I admit that? Where?

On Fri, May 8, 2009 at 8:30 AM, Nathan nawr...@gmail.com wrote:

 On Fri, May 8, 2009 at 10:10 AM, Brian brian.min...@colorado.edu wrote:

  You don't have to be an expert (more formally defined as someone with ten
  years of experience in a field) to spot unscientific thinking. I don't
  think
  you're an expert either so maybe you should just leave expertise out of
 it.
  More to the point, the question in my mind is, how much money did the
  Foundation waste on this study that you can't draw valid, generalizable
  conclusions from?
  The distinction between quantitative and qualitative is for naught -
  you can, and should, quantify what you mean when making qualitative
  assessments. Especially if you will use them to bolster your arguments
  about how to spend millions of dollars of donated money.


 Honestly, you as much as admit that you have no idea what you're talking
 about. You could at least be a little more collegial and a little less
 accusatory if you are looking for good faith responses to your complaints.

 Nathan
 ___
 foundation-l mailing list
 foundation-l@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread phoebe ayers
On Fri, May 8, 2009 at 1:35 AM, Nikola Smolenski smole...@eunet.yu wrote:
 Parul Vora wrote:
 Thanks for all of the feedback, comments, and support. I just wanted to
 let you know that our full report (including highlight videos!!) is now
 up on the Usability Initiative's project wiki:

 http://usability.wikimedia.org/wiki/UX_and_Usability_Study

 It may be too late for this now, but when I read that "The cheat sheet
 was the only item in the help section that led to a subject's successful
 edit", I wanted to know how well people would edit if they had the
 printed cheat sheet. Maybe if you have more tests in the future you could do
 this - it would tell us exactly what the value of the printed
 promotional material is.

 I am wary of this: "Users often missed the ‘edit’ buttons next to each
 section, clicking on ‘edit this page’ all the way at the top." In my
 experience, users do exactly the opposite, and I have seen new users who
 know how to edit sections asking how to edit the top section; some
 Wikipedias (e.g. ruwiki) have even added an [edit] link to the top of the
 article that mimics the section edit links. What could be the cause of this
 discrepancy?

About this: on en:wp, at least, under user preferences/gadgets, users
can turn this on themselves by checking the "Add an [edit] link for
the lead section of a page" box. Is there any particular reason not to
turn this on by default for everyone? That could be (one small) problem
(temporarily) solved.

Regarding this whole discussion in general, and comments made by Brian
(whom I respect quite a lot) -- as I understand it, usability and HCI
are something of an inexact science. It seems like several approaches
and studies would be helpful, to see whether different methods come to more
or less the same conclusions about what's broken, or not.

There are also a number of world-respected HCI experts who are
interested in this community and whom we could probably draw on for input;
see past Wikimania speaker lists.

-- Phoebe

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread Aryeh Gregor
On Thu, May 7, 2009 at 11:58 PM, Brian brian.min...@colorado.edu wrote:
 Quite frankly the advice that you should only use five subjects makes no
 sense. The appeal to Nielsen's authority is not going to work on me or
 anyone else who understands why the scientific method exists.

Experience shows that most people end up being very similar when it
comes to usability.  Most problems show up repeatedly even with groups
of five people.  If you run the tests on a hundred people, you're
going to get a somewhat more accurate picture, but not enough to
justify the extra expense.  It's much better to run a five-person
study, assume that any objections raised by (say) at least three are
representative, fix those, and run another few five-person studies on
the fixed software for the same cost.

You don't need large sample sizes if something is regular enough.  You
only need high sample sizes if the object of your study is variable
enough to require it.  That's usually the case in pharmacological
studies, for instance, but that doesn't mean it's true everywhere.  If
you have a tiny standard deviation, then a study of five people could
provide very clear conclusions.  It all depends on your data.
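
To make that last point concrete, here is a rough sketch (the sigma
values below are made-up illustrations, not study data): the 95%
confidence interval for a sample mean has a half-width of roughly
1.96*sigma/sqrt(n), so a tiny spread buys you far more precision than a
big n does.

    # Rough sketch of the standard-deviation-vs-sample-size trade-off.
    # The sigma values are illustrative assumptions, not study numbers.
    from math import sqrt

    def ci_halfwidth(sigma, n, z=1.96):
        """Approximate 95% confidence-interval half-width for a sample mean."""
        return z * sigma / sqrt(n)

    for sigma in (0.05, 0.50):      # "tiny" vs. "large" spread in the measure
        for n in (5, 100):
            print("sigma=%.2f  n=%3d  ->  +/- %.3f" % (sigma, n, ci_halfwidth(sigma, n)))

    # sigma=0.05 with n=5 already gives about +/-0.044, while sigma=0.50
    # needs n=100 just to reach about +/-0.098.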

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread Brion Vibber
On 5/7/09 5:36 PM, geni wrote:
 2009/5/8 Brian brian.min...@colorado.edu:
 This usability study is so tiny. I want MediaWiki to be really, really good.
 Please tell me you guys hope to go large scale with the remote testing
 setup.

 Not just MediaWiki. Looks like we need to improve the paths through
 the help namespace.

The primary target on the help pages is to make key pieces like the 
markup cheat-sheet available quickly, easily, and universally.

Wikia's got a cute little editing-help-sidebar version of this which 
we've considered adapting or taking as a model for that purpose -- it 
gets you some important stuff without taking you out of your editing 
environment.

Style guides, licensing recommendations, etc. are another level of help 
page which needs more consideration, but they remain of secondary importance.

-- brion

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-08 Thread Brion Vibber
On 5/8/09 9:21 PM, phoebe ayers wrote:
 About this: on en:wp, at least, under user preferences/gadgets, users
 can turn this on themselves by checking the "Add an [edit] link for
 the lead section of a page" box. Is there any particular reason not to
 turn this on by default for everyone? That could be (one small) problem
 (temporarily) solved.

We disabled the section-0 edit link a couple of years ago because it 
interfered with the floating infoboxes and images which appear at the 
top of most pages.

Finding a way to put it back that's reliable and interferes with neither 
floating boxes nor the various badge icons that a lot of templates put 
in above the line is on the task list. :) Once it's back permanently we 
can dump the temporary gadgets.

-- brion

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread Parul Vora
Hi All!

Thanks for all of the feedback, comments, and support. I just wanted to 
let you know that our full report (including highlight videos!!) is now 
up on the Usability Initiative's project wiki:

http://usability.wikimedia.org/wiki/UX_and_Usability_Study

- The Usability Team


Parul Vora wrote:
 Hi all!

 The Wikipedia Usability Initiative conducted a user research study 
 with SF-based Bolt Peters in late March to uncover barriers new 
 editors face. We are in the process of completing a full report on our 
 methodology, process and analysis, but wanted to share with you some 
 of the major themes and findings in the meantime


 Some quotes from our participants that illustrate these findings:

 “Usually it’s the most information in the easiest spot to access. It 
 always looks very well put together….it boggles my mind how many 
 people can contribute and it still looks like an encyclopedia.” – ‘Galen’

 “I like Wikipedia because it’s plain text and nothing flashes” – 
 ‘Claudia’

 “Rather than making a mess, I’d rather take some time to figure out 
 how to do it right.
 (later) There sure is a lot of stuff to read.” – ‘Dan’

 “ [I felt] kind of stupid.” – ‘Galen’

 “It’d be nice to have a GUI, so you could see what you’re editing. 
 You’ve made these changes and you’re looking at it, and you don’t know 
 how it’s going to look on the page. It’s a little clumsy to see how 
 it’s going to look.” – ‘Bryan’

 “[This is] where I’d give up.” – ‘Shaun’


 Check out the full post on the foundation blog:
 http://blog.wikimedia.org/2009/04/24/usability-study-results-sneak-preview/ 



 We would love to hear any initial thoughts, opinions, and reactions. 
 If you have any similar or dissimilar experiences - either personally 
 or in your own work/research, we'd love to hear about that too!


 Always on your side,
 The Usability Team



___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread Brian
This usability study is so tiny. I want MediaWiki to be really, really good.
Please tell me you guys hope to go large scale with the remote testing
setup.

On Thu, May 7, 2009 at 5:41 PM, Parul Vora pv...@wikimedia.org wrote:

 Hi All!

 Thanks for all of the feedback, comments, and support. I just wanted to
 let you know that our full report (including highlight videos!!) is now
 up on the Usability Initiative's project wiki:

 http://usability.wikimedia.org/wiki/UX_and_Usability_Study

 - The Usability Team


 Parul Vora wrote:
  Hi all!
 
  The Wikipedia Usability Initiative conducted a user research study
  with SF-based Bolt Peters in late March to uncover barriers new
  editors face. We are in the process of completing a full report on our
  methodology, process and analysis, but wanted to share with you some
  of the major themes and findings in the meantime
 
 
  Some quotes from our participants that illustrate these findings:
 
  “Usually it’s the most information in the easiest spot to access. It
  always looks very well put together….it boggles my mind how many
  people can contribute and it still looks like an encyclopedia.” – ‘Galen’
 
  “I like Wikipedia because it’s plain text and nothing flashes” –
  ‘Claudia’
 
  “Rather than making a mess, I’d rather take some time to figure out
  how to do it right.
  (later) There sure is a lot of stuff to read.” – ‘Dan’
 
  “ [I felt] kind of stupid.” – ‘Galen’
 
  “It’d be nice to have a GUI, so you could see what you’re editing.
  You’ve made these changes and you’re looking at it, and you don’t know
  how it’s going to look on the page. It’s a little clumsy to see how
  it’s going to look.” – ‘Bryan’
 
  “[This is] where I’d give up.” – ‘Shaun’
 
 
  Check out the full post on the foundation blog:
 
 http://blog.wikimedia.org/2009/04/24/usability-study-results-sneak-preview/
 
 
 
  We would love to hear any initial thoughts, opinions, and reactions.
  If you have any similar or dissimilar experiences - either personally
  or in your own work/research, we'd love to hear about that too!
 
 
  Always on your side,
  The Usability Team
 


 ___
 foundation-l mailing list
 foundation-l@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread David Gerard
2009/5/8 Brian brian.min...@colorado.edu:

 You went from 2,500 subjects to just 10?


For a software test, which this mostly was, 5 is enough for excellent
results in most cases.


- d.

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread Brian
That may be true. This study does not allow you to draw that conclusion,
however.

On Thu, May 7, 2009 at 6:36 PM, geni geni...@gmail.com wrote:

 2009/5/8 Brian brian.min...@colorado.edu:
  This usability study is so tiny. I want MediaWiki to be really, really
 good.
  Please tell me you guys hope to go large scale with the remote testing
  setup.

 Not just MediaWiki. Looks like we need to improve the paths through
 the help namespace.

 --
 geni

 ___
 foundation-l mailing list
 foundation-l@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread Erik Moeller
2009/5/7 Brian brian.min...@colorado.edu:

 Based on these criteria, the 2,500 users that responded to our survey were
 filtered down to 500 viable subjects based on their answers to these
 questions. The team, along with B|P, partnered with Davis Recruiting to
 contact, filter, and screen these 500 participants based on their Wikipedia
 contribution history, Wikipedia usage patterns, their given reasons for not
 contributing, and their talkativeness and openness to discuss their thoughts
 and actions. From 2,500 users, we ended up with 10 study participants and
 3-5 waitlisted participants.

 You went from 2,500 subjects to just 10?

The purpose of a study like this is focused observation of the
behavior of individual human beings. As David has pointed out, for any
study like this there are laws of diminishing returns, and any serious
observation of an individual is time-consuming and costly (raw data is
worthless if you can't analyze it). That's why usability gurus like
Nielsen suggest 5 is enough for most tests:
http://www.useit.com/alertbox/2319.html - due to our highly
diverse audience, we chose a larger group, and we split between remote
and lab testing to compensate for biases of both methods. This has
worked well to identify plenty of very obvious usability barriers to
focus on.

There are alternative data collection methods such as large scale
quantitative testing where the level of individual engagement is
limited; those can give you behavioral patterns etc. They can be
useful, too, but are an entirely different thing.
-- 
Erik Möller
Deputy Director, Wikimedia Foundation

Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread Brian
This all goes back to how you aim to quantify improvement in usability.
These sample sizes are so small that it will be hard (or even impossible)
to evaluate your progress based on statistical significance. You've got to
prove to us that it's really getting better, and doesn't just look prettier.

On Thu, May 7, 2009 at 7:44 PM, Erik Moeller e...@wikimedia.org wrote:

 2009/5/7 Brian brian.min...@colorado.edu:
 
  Based on these criteria, the 2,500 users that responded to our survey
 were
  filtered down to 500 viable subjects based on their answers to these
  questions. The team, along with B|P, partnered with Davis Recruiting to
  contact, filter, and screen these 500 participants based on their
 Wikipedia
  contribution history, Wikipedia usage patterns, their given reasons for
 not
  contributing, and their talkativeness and openness to discuss their
 thoughts
  and actions. From 2,500 users, we ended up with 10 study participants
 and
  3-5 waitlisted participants.

  You went from 2,500 subjects to just 10?

 The purpose of a study like this is focused observation of the
 behavior of individual human beings. As David has pointed out, for any
 study like this there are laws of diminishing returns, and any serious
 observation of an individual is time-consuming and costly (raw data is
 worthless if you can't analyze it). That's why usability gurus like
 Nielsen suggest 5 is enough for most tests:
 http://www.useit.com/alertbox/2319.html - due to our highly
 diverse audience, we chose a larger group, and we split between remote
 and lab testing to compensate for biases of both methods. This has
 worked well to identify plenty of very obvious usability barriers to
 focus on.

 There are alternative data collection methods such as large scale
 quantitative testing where the level of individual engagement is
 limited; those can give you behavioral patterns etc. They can be
 useful, too, but are an entirely different thing.
 --
 Erik Möller
 Deputy Director, Wikimedia Foundation

 Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate

 ___
 foundation-l mailing list
 foundation-l@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread Erik Moeller
2009/5/7 Brian brian.min...@colorado.edu:
 This all goes back to how you aim to quantify improvement in usability.
 These sample sizes are so small that it will be hard (or even impossible)
 to evaluate your progress based on statistical significance. You've got to
 prove to us that it's really getting better, and doesn't just look prettier.

That's why, in addition to follow-up tests, the usability team is
working closely with Erik Zachte to make sure we've got good editing
metrics that show whether our changes are actually making a
substantial difference in engaging new contributors.
-- 
Erik Möller
Deputy Director, Wikimedia Foundation

Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread Robert Rohde
On Thu, May 7, 2009 at 8:58 PM, Brian brian.min...@colorado.edu wrote:
 Quite frankly the advice that you should only use five subjects makes no
 sense. The appeal to Nielsen's authority is not going to work on me or
 anyone else who understands why the scientific method exists. It's
 unscientific thinking and it's going to cause you to waste money. You're going
 to draw conclusions based on results that simply aren't valid, and you won't
 know it until the study is over and you didn't make progress.

 Careful analysis of site data could allow you to draw some conclusions. I'm
 curious how you're planning to go about that. Dependent/independent
 variables?

If five subjects, chosen at random, all have the same problem, then
with 95% confidence you can predict that at least half of the
population will report having this problem.
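
To spell out the arithmetic behind that as a quick sketch (using only
the 50% threshold from the claim itself): if fewer than half of all
users actually had the problem, the chance of drawing five affected
subjects in a row would be under 0.5^5, i.e. about 3%, so observing it
puts you past the 95% mark.

    # Back-of-the-envelope check of the "five out of five" claim above.
    # Only a sketch; 0.5 is the threshold from the claim, not measured data.
    p = 0.5                     # hypothesised fraction of users affected
    n = 5                       # subjects observed, all reporting the problem
    prob_all_affected = p ** n  # chance of 5/5 if the true fraction were only 0.5
    print(prob_all_affected)    # 0.03125 -- under 5%, hence ~95% confidence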

This kind of work generally focuses on BIG problems, and you don't
need a huge sample to identify some of the most common issues.  In
things like UI development it would be surprising if there weren't
complaints reported by most of the subjects.  You may overlook some
other problems, but when coming up with a list of common problems to
work on, I would say that 15 subjects is plenty.

-Robert Rohde

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread Sage Ross
On Thu, May 7, 2009 at 11:58 PM, Brian brian.min...@colorado.edu wrote:
 Quite frankly the advice that you should only use five subjects makes no
 sense. The appeal to Nielsen's authority is not going to work on me or
 anyone else who understands why the scientific method exists. It's
 unscientific thinking and it's going to cause you to waste money. You're going
 to draw conclusions based on results that simply aren't valid, and you won't
 know it until the study is over and you didn't make progress.

 Careful analysis of site data could allow you to draw some conclusions. I'm
 curious how you're planning to go about that. Dependent/independent
 variables?


An exercise in statistical thinking: when everyone or almost everyone
cites problem X, how many people does it take to reach statistical
significance that X is a problem worth addressing?  Even if the
results are a statistical fluke and in reality only 20% of new users
run into trouble with problem X, that's still a problem worth
addressing.

The fact that so many of the 15 people had the same problems, and
those problems also align with common sense, is a strong indication
that the study has found some things worth fixing.
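
To put a purely hypothetical number on that (the count below is an
illustration, not a figure from the study): if, say, 12 of the 15
subjects hit problem X while only 20% of new users were really
affected, that level of agreement would be vanishingly unlikely.

    # Illustrative sketch only: the "12 of 15" count is hypothetical,
    # not a number taken from the usability study.
    from math import comb

    def binom_tail(n, k, p):
        """P(X >= k) for X ~ Binomial(n, p)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    # Chance that 12 or more of 15 subjects report problem X when only
    # 20% of users are really affected: roughly 1e-06.
    print(binom_tail(15, 12, 0.20))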

There is more than one way to come to reliable conclusions.  Any time
I see someone invoking the [singular] scientific method, as if there
is only one and it is set in stone and universally agreed upon by all
rational people, I have trouble taking them seriously.  See
[[Talk:Scientific method]].

-Sage (User:Ragesoss)

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread Brian
I will have no part in your efforts to redefine the scientific method on its
talk page.

On Thu, May 7, 2009 at 10:35 PM, Sage Ross
ragesoss+wikipe...@gmail.com wrote:

 On Thu, May 7, 2009 at 11:58 PM, Brian brian.min...@colorado.edu wrote:
  Quite frankly the advice that you should only use five subjects makes no
  sense. The appeal to Nielsen's authority is not going to work on me or
  anyone else who understands why the scientific method exists. It's
  unscientific thinking and it's going to cause you to waste money. You're
 going
  to draw conclusions based on results that simply aren't valid, and you
 won't
  know it until the study is over and you didn't make progress.
 
  Careful analysis of site data could allow you to draw some conclusions.
 I'm
  curious how you're planning to go about that. Dependent/independent
  variables?
 

 An exercise in statistical thinking: when everyone or almost everyone
 cites problem X, how many people does it take to reach statistical
 significance that X is a problem worth addressing?  Even if the
 results are a statistical fluke and in reality only 20% of new users
 run into trouble with problem X, that's still a problem worth
 addressing.

 The fact that so many of the 15 people had the same problems, and
 those problems also align with common sense, is a strong indication
 that the study has found some things worth fixing.

 There is more than one way to come to reliable conclusions.  Any time
 I see someone invoking the [singular] scientific method, as if there
 is only one and it is set in stone and universally agreed upon by all
 rational people, I have trouble taking them seriously.  See
 [[Talk:Scientific method]].

 -Sage (User:Ragesoss)

 ___
 foundation-l mailing list
 foundation-l@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-05-07 Thread Brian
Sounds easy. I wonder why this study doesn't mention a p-value. The grant
must not have been large enough to fund someone with any experience using R,
or, god forbid, a pencil.

On Thu, May 7, 2009 at 10:32 PM, Robert Rohde raro...@gmail.com wrote:

 On Thu, May 7, 2009 at 8:58 PM, Brian brian.min...@colorado.edu wrote:
  Quite frankly the advice that you should only use five subjects makes no
  sense. The appeal to Nielsen's authority is not going to work on me or
  anyone else who understands why the scientific method exists. It's
  unscientific thinking and it's going to cause you to waste money. You're
 going
  to draw conclusions based on results that simply aren't valid, and you
 won't
  know it until the study is over and you didn't make progress.
 
  Careful analysis of site data could allow you to draw some conclusions.
 I'm
  curious how you're planning to go about that. Dependent/independent
  variables?

 If five subjects, chosen at random, all have the same problem, then
 with 95% confidence you can predict that at least half of the
 population will report having this problem.

 This kind of work generally focuses on BIG problems, and you don't
 need a huge sample to identify some of the most common issues.  In
 things like UI development it would be surprising if there weren't
 complaints reported by most of the subjects.  You may overlook some
 other problems, but when coming up with a list of common problems to
 work on, I would say that 15 subjects is plenty.

 -Robert Rohde

 ___
 foundation-l mailing list
 foundation-l@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-04-24 Thread Magnus Manske
On Fri, Apr 24, 2009 at 2:08 AM, Parul Vora pv...@wikimedia.org wrote:
 Hi all!

 The Wikipedia Usability Initiative conducted a user research study with
 SF based Bolt Peters in late March to uncover barriers new editors face.
 We are in the process of completing a full report on our methodology,
 process and analysis, but wanted to share with you some of the major
 themes and findings in the meantime

From what I read, the main problem is that new, eager, serious
contributors surrender between our markup and an overwhelming flood of
descriptions.

I know a new GUI is being worked on. For the moment I hacked the
following JavaScript suggestion:

http://en.wikipedia.org/wiki/User:Magnus_Manske/newbiehelp.js

This adds a "how?" link to the edit tab, and launches a floating
panel with some extremely general content:

http://en.wikipedia.org/wiki/File:Edit_how.png

Never mind the wording, the color scheme, or important points I missed :-)

If that were added for all anons by default, it might save the willing
and able some grief.

Just a thought.

Cheers,
Magnus

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-04-24 Thread Gregory Kohs
Will the final report include a note about how unwelcome User:NawlinWiki
made the study participants feel when he indefinitely blocked their accounts
for "abusing Wikipedia"?

http://en.wikipedia.org/w/index.php?title=Special:Log&page=User%3AUsability_Tester_3
___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-04-24 Thread Naoko Komura


 I know a new GUI is being worked on. For the moment I hacked the
 following JavaScript suggestion:

 http://en.wikipedia.org/wiki/User:Magnus_Manske/newbiehelp.js

 This adds a "how?" link to the edit tab, and launches a floating
 panel with some extremely general content:

 http://en.wikipedia.org/wiki/File:Edit_how.png

 Never mind the wording, the color scheme, or important points I missed :-)

 If that were added for all anons by default, it might save the willing
 and able some grief.

 Just a thought.


Great suggestion.  One of the repeated sentiments from the study participants
was "what is the editing process?"  This will give a quick overview to
anons.

Will you post your idea to our project page?  :-)
http://usability.wikimedia.org/wiki/Main_Page


Best,

- Naoko

-- 
Support Free Knowledge:  http://wikimediafoundation.org/wiki/Donate
___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Usability Study Results (Sneak Preview)

2009-04-24 Thread Naoko Komura
On Fri, Apr 24, 2009 at 10:26 AM, Gregory Kohs thekoh...@gmail.com wrote:

 Will the final report include a note about how unwelcome User:NawlinWiki
 made the study participants feel when he indefinitely blocked their
 accounts
 for "abusing Wikipedia"?


We, the usability team, with lots of help from stewards and admins, worked
behind the scenes so that such blocks would not be visible to the study
participants.  I am sure NawlinWiki was doing his part in protecting
Wikipedia.  We notified this list and WikiEn about the usability study and
described the usage of these accounts on the user pages, hoping to avoid
such blocks.  But some folks are doubly cautious, and I think that is a good
thing.  We confirmed that it really was the WMF usability team doing the
test, and that cleared up the concern.  So it was not a problem.

Best,

- Naoko





-- 
Support Free Knowledge:  http://wikimediafoundation.org/wiki/Donate
___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


[Foundation-l] Usability Study Results (Sneak Preview)

2009-04-23 Thread Parul Vora
Hi all!

The Wikipedia Usability Initiative conducted a user research study with 
SF-based Bolt Peters in late March to uncover barriers new editors face. 
We are in the process of completing a full report on our methodology, 
process and analysis, but wanted to share with you some of the major 
themes and findings in the meantime


Some quotes from our participants that illustrate these findings:

“Usually it’s the most information in the easiest spot to access. It 
always looks very well put together….it boggles my mind how many people 
can contribute and it still looks like an encyclopedia.” – ‘Galen’

“I like Wikipedia because it’s plain text and nothing flashes” – ‘Claudia’

“Rather than making a mess, I’d rather take some time to figure out how 
to do it right.
(later) There sure is a lot of stuff to read.” – ‘Dan’

“ [I felt] kind of stupid.” – ‘Galen’

“It’d be nice to have a GUI, so you could see what you’re editing. 
You’ve made these changes and you’re looking at it, and you don’t know 
how it’s going to look on the page. It’s a little clumsy to see how it’s 
going to look.” – ‘Bryan’

“[This is] where I’d give up.” – ‘Shaun’


Check out the full post on the foundation blog:
http://blog.wikimedia.org/2009/04/24/usability-study-results-sneak-preview/


We would love to hear any initial thoughts, opinions, and reactions. If 
you have any similar or dissimilar experiences - either personally or in 
your own work/research, we'd love to hear about that too!


Always on your side,
The Usability Team

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l