KG01 - See comments inline

On Wed, Jun 6, 2012 at 8:00 PM, Rob Weir <robw...@apache.org> wrote:

> Keep in mind that we already have a large recent survey of sorts,
> based on Google Analytics data from those who have visited the website
> and downloaded AOO.
>

KG01 - Do we ask for email addresses of people who download? How might we
engage downloaders to complete a survey?


> It won't tell us some of the detailed stuff, like whether they use AOO
> at home or at work, but there is more info available here than might
> be generally known.
>
> KG01 - Re-use is great. Can you send along a link to where I can review
the specific data we capture? Yes, I think we can eliminate many of our
demographic questions and pull the data from other sources, such as the
download info.


> For example:
>

> - what countries users are mainly from.   Can also get detail to the
> level of what cities are most often downloading AOO.
> - what languages
> - what operating systems and versions they are using
> - what screen resolution they have
> - what browser they are using
> - if they found our website from searching Google, what were the most
> used search strings
> - if they came to our website via a link from another website, what
> were the most common "referring" sites
> - what social networking sites lead them most to the website
> - what pages on the website are most frequently read
> - what paths through the website most often lead to a download
>
> and any of these can be correlated against download conversion rate.
> So for example we can look at what % of visitors download AOO based on
> country, or language, or OS or browser or whatever.
>

KG01 - Despite the overlap, I suspect that we will need to include some
basic demographic questions in the user surveys to ensure we can correlate
the results. For example, if users from a certain geography, or users in a
certain role, have issues, we need to associate their task prioritization
and satisfaction ratings with their demographic data. Data from disparate
data sources would not support such analysis.
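The correlation KG01 describes above can be sketched in code. This is a minimal, illustrative example only: the respondent IDs, field names, and values are all hypothetical, not drawn from any actual AOO survey or analytics export. The point is simply that survey answers and demographic records must share a join key before ratings can be broken down by geography or role.

```python
from collections import defaultdict

# Hypothetical data: in practice these would come from two separate
# sources (a survey-tool export and demographic/download records),
# joined on a shared respondent identifier.
demographics = {
    "r1": {"country": "DE", "role": "home"},
    "r2": {"country": "DE", "role": "work"},
    "r3": {"country": "BR", "role": "home"},
}
survey = {
    "r1": {"satisfaction": 4},
    "r2": {"satisfaction": 2},
    "r3": {"satisfaction": 5},
}

def mean_satisfaction_by(field):
    """Average satisfaction rating grouped by a demographic field."""
    buckets = defaultdict(list)
    for rid, answers in survey.items():
        group = demographics[rid][field]
        buckets[group].append(answers["satisfaction"])
    return {g: sum(v) / len(v) for g, v in buckets.items()}

print(mean_satisfaction_by("country"))  # {'DE': 3.0, 'BR': 5.0}
print(mean_satisfaction_by("role"))     # {'home': 4.5, 'work': 2.0}
```

Without the shared key, the two data sets can only be compared in aggregate, which is exactly the limitation KG01 points out.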

>
> Obviously this is not a replacement for a survey that looks at the
> habits and preferences of users' in-application behavior.  But
> this information is "low hanging fruit" that is based on data already
> collected.
>
> KG01 - Indeed, task assessment research is another category altogether.



> -Rob
>
>
> On Tue, Jun 5, 2012 at 1:51 AM, Graham Lauder <y...@apache.org> wrote:
> >> KG01 - see comments inline.
> >>
> >> On Tue, Jun 5, 2012 at 8:26 AM, Graham Lauder <y...@apache.org> wrote:
> >> > > > Hi.
> >> > > >
> >> > > > Questions relating to research!
> >> >
> >> > [....]
> >> >
> >> > > Perhaps the first survey we should conduct is a survey about what
> sort
> >> > > of surveys our users would respond to.
> >>
> >> KG01 - Thanks for your feedback and interest in the user research
> effort.
> >> While I agree we could deploy different types of surveys to gather
> >> different types of data, I feel that a survey of surveys might induce
> >> premature survey fatigue.
> >
> > Survey fatigue has already set in; it is not a new thing, and it is
> > recognisable simply from the surveys conducted by SUN.  We haven't caused
> > it; it is a factor of the modern marketing malaise.  The cost of incentives
> > these days, the kind one needs to hand out to get a significant sampling in a
> > timely manner, is huge.
> >
> >> User research, especially surveys, consumes
> >> people's time and energy.
> >
> > Indeed, as I myself pointed out earlier in this thread.
> >
> >> Rather, I propose we work from the other
> >> direction. If the goal of the research activity is to gather data that
> will
> >> help us build insight and drive informed design and development
> decisions,
> >> then we should focus the surveys on the information we need to do that.
> I
> >> have captured some comments in the wiki discussion page.
> >
> > Indeed, however if the sample of respondents is ridiculously small, as
> has
> > historically been the case, then the data is useless.
> >
> > You cannot use corporate methodologies in an open source environment.  We
> > have no ability to offer incentives, so we need to make the survey process
> > as pleasant and enjoyable as possible, or we need to find out from people
> > what would encourage them to participate.
> >
> > That requires research.  I doubt it will require as big a sample as a UX
> > survey, but that is only because a limited number of answers are needed.
> >
> > Every good research organisation I have worked with does short surveys
> to find
> > out what they're doing right or wrong.  For the most part they do these
> at the
> > end of another survey, but that is because the group of respondents they
> are
> > questioning will probably never do the same survey again.  For us the
> problem
> > has been getting respondents to finish.  Lose them once and they won't come
> > back again, and we will need to talk to our user community, if not often,
> > then at least regularly.
> >
> > I would prefer to do things right first time up so people will happily
> respond
> > to any surveys we need to put out.  Remember that there are not only UX
> > surveys to be done but Marketing as well.
> >
> > We already know two things that get people to complete surveys:
> > Brevity and Fun.
> >
> > If we do a light-hearted, quick survey that tells us the reasons people
> > will participate, I think that's a really good use of resources.
> >
> > The surveys already put up are boring, generic, and not likely to inspire
> > people to complete them.
> >
> > OOo has a user base in the hundreds of millions; a few hundred completions
> > is not a sample.  We need tens of thousands of responses across scores of
> > languages to get a reasonable sample.
> >
> > So first we need to figure out how to get that sample.
> >
> > Cheers
> > G
> >
> >
>
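On the sample-size question raised above, the standard proportion-estimate formula gives a rough lower bound, and it also shows why slicing results across scores of languages pushes the total toward the tens of thousands Graham mentions: each subgroup needs its own sample. This is a textbook calculation, not anything from the thread; the 95% confidence level and worst-case p = 0.5 are conventional defaults.

```python
import math

def required_sample(margin_of_error, z=1.96, p=0.5):
    """Minimum responses to estimate a proportion within the given
    margin of error at 95% confidence (z = 1.96), assuming the
    worst-case proportion p = 0.5.  The finite-population correction
    is negligible for a user base in the hundreds of millions."""
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

print(required_sample(0.05))  # 385 responses for +/- 5%
print(required_sample(0.01))  # 9604 responses for +/- 1%
```

A few hundred random completions would in fact bound an overall proportion to about 5%, but breaking the results down by language or geography multiplies the requirement by the number of subgroups, which is where the tens-of-thousands figure comes from.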
