[Wikimedia-l] Fundraising updates?

2012-12-13 Thread Itzik Edri
Hi,

Could we get some updates about how the fundraising is going (or have I
missed them)? Updates from both the WMF and the chapters would be great.
Since we are only focusing on a few countries this year, there has been
very little discussion about it, but it is still very interesting to know
how much we have collected and how the banners are performing (and which
of them are running).

Thanks,

Itzik
___
Wikimedia-l mailing list
Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l


Re: [Wikimedia-l] Fundraising updates?

2012-12-13 Thread Theo10011

http://wikimediafoundation.org/wiki/Special:FundraiserStatistics is a
good start.

-Theo


Re: [Wikimedia-l] Fundraising updates?

2012-12-13 Thread Till Mletzko
and: http://meta.wikimedia.org/wiki/Fundraising_2012

Till




-- 
Kind regards,

Till Mletzko
Fundraiser
-
Wikimedia Fördergesellschaft
Obentrautstr. 72
10963 Berlin

Telephone 030 - 219 158 26 -19
www.wikimedia.de

Help WIKIPEDIA become the first digital World Heritage site recognized by
UNESCO. Sign the online petition at
https://wke.wikimedia.de/wke/Main_Page!

Imagine a world in which every person has free access to the sum of all
human knowledge. Help us achieve this!
http://spenden.wikimedia.de/

Gemeinnützige Wikimedia Fördergesellschaft mbH.
Registered with the Amtsgericht Berlin-Charlottenburg under number 130183 B.
Recognized as charitable by the Finanzamt für Körperschaften I Berlin,
tax number 27/603/54814.




Re: [Wikimedia-l] Fundraising updates?

2012-12-13 Thread Itzik Edri
Indeed, a good start, and one that I had already checked.

Those statistics show only WMF data. Last year we also had a Google Docs
file in which the chapters shared their numbers every day.

And there is no indication of which banners are running or how they are
performing, which I would like to see.

The fundraising team made some interesting changes this year, and it will
be interesting to see how they perform.

Itzik.



Re: [Wikimedia-l] Fundraising updates?

2012-12-14 Thread Zack Exley
Hi Itzik -

I can give a short update -- and there will be more details in the
fundraising report after the campaign.

The banners from last year with the faces of editors, staff or Jimmy and
"Please read a personal appeal from..." stopped working between last year
and this year. We tried very hard to figure out why, but I still can't say
exactly why. It could be a mix of underlying issues that are pulling down
the performance of every kind of banner, and the fact that everyone on the
internet now knows exactly what's in that "personal appeal" and is no
longer curious enough to click.

What saved us was taking text from the personal appeals and putting it into
the banner itself. These banners did very well. These new message-driven
banners are what made us split the campaign in two -- because we knew we
were going to develop a lot of new messages and not have time to translate
them well. The campaign that started on the 27th of November ran only in
five countries: Australia, Britain, Canada, New Zealand and the United
States.

At first, we had a short version of the new banner:

http://en.wikipedia.org/wiki/Main_Page?banner=B12_1123_Smallinfo_fix

We could have run that for 46 days (the length of last year's campaign) and
probably made our goal. But it performed better the more information we put
into it. Through a series of tests we became confident that, while greater
banner height improved performance, that wasn't as big a factor as the
additional information we put into the banner. We tested many new versions
of messages in the banners and found many improvements. It looked like we
might be able to run a 25-day fundraiser.

We launched on the 27th. A few days into the campaign, we were still
paranoid about the goal. We were afraid that maybe the new banners would
burn out faster than the old ones did. Maybe they were good at getting
donations faster, but maybe we were not increasing the overall pool of
donors. We were constantly testing to boost performance. Out of curiosity,
we tested making the banners stick to the top of the screen while the page
scrolled. We knew that would be a dramatic step in a more annoying
direction, but like I said, we were still worried about the goal and just
wanted to know what our options were. The "sticky" banners did about 30%
better for donations. So we decided to keep them and see if we could
finish the campaign in a very short time.

After 8 days of having the banners up, we were able to take the banners
down and display them only to people who had not seen them before (or
rather, to browsers and computers that had not seen them before). We had
never been able to do this before, and only had this feature fully
developed several days into the campaign. Since we took the banners down
for everyone, we've mostly been displaying them only 1 or 2 times to
people who've never seen them. Though yesterday we pushed that cap up to
10, because we're hoping to reach our US$25 million goal in the next few
days.
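The per-browser capping logic described above could be sketched roughly as follows. This is a hypothetical illustration only: the class name, the in-memory dict, and the browser IDs are invented (in production the count would live in a cookie, and CentralNotice's real implementation differs in detail).

```python
# Minimal sketch of a per-browser banner impression cap. Assumption:
# views are counted per browser (a stand-in for a cookie); names are
# illustrative, not CentralNotice's actual code.

class ImpressionCapper:
    def __init__(self, max_impressions=2):
        self.max_impressions = max_impressions   # e.g. raised from 2 to 10
        self._seen = {}                          # browser_id -> views so far

    def should_show(self, browser_id):
        """Show the banner only while this browser is under the cap."""
        return self._seen.get(browser_id, 0) < self.max_impressions

    def record_impression(self, browser_id):
        self._seen[browser_id] = self._seen.get(browser_id, 0) + 1

capper = ImpressionCapper(max_impressions=2)
for _ in range(2):
    if capper.should_show("browser-a"):
        capper.record_impression("browser-a")

print(capper.should_show("browser-a"))  # False: cap of 2 reached
capper.max_impressions = 10             # "pushed that up to 10"
print(capper.should_show("browser-a"))  # True again
```

Raising `max_impressions` mid-campaign, as in the last two lines, re-enables banners for browsers that had already hit the old cap.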

We also made the banners stop sticking after the first 8 days, and
hopefully we'll never feel we have to use sticky banners again. In total,
we had sticky banners up for 4 or 5 days.

We hope that next week we'll be able to start a sort of "Thank you
campaign." We will feature a thank you message, a video of Wikimedia
editors talking about their experience, interviews and written messages
from editors collected at Wikimania, and an invitation to all our readers
to become editors. The purpose of this campaign is mainly to raise
awareness among readers about how Wikimedia projects work and who is behind
them. The purpose is also to take time to explicitly thank donors for
helping us reach our goal so quickly this year. Thanking is a very
important part of fundraising -- but we've always been so eager to take
down banners that we've never taken enough time to thank donors in the
past. This year we feel it's OK since almost no one saw banners for more
than 8 days.

I know there will probably be a lot of detailed questions about income from
different countries, comparisons to last year, to the chapters, etc... We
can't answer those now because we're too busy trying to wrap up the
campaign -- and also because a lot of transactions take time to settle.
Checks flow in slowly. And accurate comparisons take time to prepare. We
don't have truly accurate numbers until later in January.

But the basic result we have is that all five countries where we ran
banners saw a huge increase in "donations per banner impression" over last
year. I can also say that in our Nov 15th 24-hour "dress rehearsal," in
which we ran an earlier version of the new banners (which didn't perform
nearly as well as the one we eventually discovered), we also saw a very
big increase in almost every country in the world. The exceptions were
countries where we made some error (such as not turning banners on for
half the day). Even in a country like Greece, where the economy is about
as bad as it can get, we saw something like a 300% increase.

Re: [Wikimedia-l] Fundraising updates?

2012-12-15 Thread Itzik Edri
Zack, thanks for the detailed email. It was very interesting, and I
appreciate that you took the time to write it. Good work!

I still hope to see information from the chapters about the amounts
collected so far, to help us see the full picture of this year's
fundraising.


Itzik


Re: [Wikimedia-l] Fundraising updates?

2012-12-15 Thread James Salsman
Hi Zack,

Thanks very much for your updates:

> What saved us was taking text from the personal appeals and putting it into
> the banner itself. These banners did very well. These new message-driven
> banners are what made us split the campaign in two -- because we knew we
> were going to develop a lot of new messages and not have time to translate
> them well

As you know I've been saying for years that the variance among the
volunteer-supplied messages, originally submitted in 2009 and hundreds
of which have not yet been tested (as far as I know), was large enough
to suggest that some messages would certainly outperform the
traditional banners and appeals. While it's refreshing to be
validated, as you might imagine I feel like Cassandra much of the time
for reasons that have nothing to do with the underlying mathematical
reasoning involved.

The last time I heard from you, you said that you intended to test the
untried messaging from 2009 with multivariate analysis. However,
http://meta.wikimedia.org/wiki/Fundraising_2012/We_Need_A_Breakthrough
shows only three very small-N multivariate tests, the last of which
was in October, and no recent testing.

Do you still intend to test the untried volunteer-submitted messages
with multivariate analysis? If so, when? Thank you.

Sincerely,
James Salsman



Re: [Wikimedia-l] Fundraising updates?

2012-12-15 Thread Thomas Dalton
On Dec 14, 2012 3:37 PM, "Zack Exley"  wrote:
> Since we took banners down for everyone, we've
> mostly been displaying them only 1 or 2 times to people who've never seen
> them. Though yesterday we pushed that up to 10 because we're hoping to
> reach our US$25 million goal in the next few days.

Obviously it's early, but that change doesn't seem to have caused a
noticeable break from the existing trend on the fundraising graph. Have you
been gathering data on how banners perform on the first, second, third,
etc. time of viewing? Presumably the same cookies that let you stop banners
after a certain number of views would let you gather such data. If so, what
kind of an increase were you expecting to see?


Re: [Wikimedia-l] Fundraising updates?

2012-12-17 Thread Zack Exley
On Sat, Dec 15, 2012 at 11:16 AM, Thomas Dalton wrote:

> On Dec 14, 2012 3:37 PM, "Zack Exley"  wrote:
> > Since we took banners down for everyone, we've
> > mostly been displaying them only 1 or 2 times to people who've never seen
> > them. Though yesterday we pushed that up to 10 because we're hoping to
> > reach our US$25 million goal in the next few days.
>
> Obviously it's early, but that change doesn't seem to have caused a
> noticeable break from the existing trend on the fundraising graph.


The graph can be misleading because it counts email revenue too. Over the
several days when we were showing just 2 banners to new people, we were
seeing a steady decline in revenue each day. We don't know where that will
bottom out -- whether at $50K per day or $1,000 per day. The answer to
that question will have a big impact on our fundraiser for next year.


> Have you
> been gathering data on how banners perform on the first, second, third,
> etc. time of viewing?


Yes, we have some data, but it's noisy and confusing. The overall picture
is that when we turn on banners, a huge majority of donors give upon their
first banner view. Every day, the distribution spreads out. After a week
of solid banners, it was more like 50% of donations coming after the first
banner view -- and most of the rest coming before the 10th banner view.
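The analysis implied above amounts to tallying donations by the banner-view count at which each occurred. A toy sketch, with invented log values (not real donation data):

```python
# Tally invented donation records by the banner-view number at which each
# donation happened, and report the cumulative share per view count.
from collections import Counter

# view count at which each (hypothetical) donation occurred
donation_views = [1, 1, 1, 2, 1, 3, 1, 2, 5, 1, 1, 4, 2, 1, 9, 1]

tally = Counter(donation_views)
total = len(donation_views)
cumulative = 0
for view in sorted(tally):
    cumulative += tally[view]
    print(f"by view {view}: {cumulative / total:.0%} of donations")
```

The same per-browser counter used for frequency capping would supply the view number for each donation.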



> Presumably the same cookies that let you stop banners
> after a certain number of views would let you gather such data. If so, what
> kind of an increase were you expecting to see?
>



-- 
Zack Exley
Chief Revenue Officer
Wikimedia Foundation
415 506 9225


Re: [Wikimedia-l] Fundraising updates?

2012-12-17 Thread Zack Exley
James -

We can only do big multivariate tests for banner click rates. But banner
click rates have very little to do with donations in our present context.

For example, the new banners have about 30% of the click rate of the old
ones, but they make about 3 or 4 times as much money.

To determine how well a banner message does for donations, we usually need
a sample size between 500 and 5,000 donations per banner, depending on the
difference in performance between the banners. That takes from 30 minutes
to several hours to collect -- if we're only testing two banners at a time.
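The 500-5,000 figure is consistent with a standard two-proportion power calculation. A rough stdlib-only sketch follows; the donation rates are made-up illustrative values, not WMF figures, and the function name is invented:

```python
# Approximate donations per banner needed to detect a lift between two
# donation rates, via the normal-approximation sample-size formula for
# two proportions. Rates below are invented for illustration.
from statistics import NormalDist

def donations_needed(p_control, p_variant, alpha=0.05, power=0.8):
    """Expected donations per banner arm to detect the given difference."""
    z = NormalDist().inv_cdf
    z_a, z_b = z(1 - alpha / 2), z(power)
    p_bar = (p_control + p_variant) / 2
    numerator = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_b * (p_control * (1 - p_control)
                          + p_variant * (1 - p_variant)) ** 0.5) ** 2
    impressions_per_arm = numerator / (p_control - p_variant) ** 2
    # convert impressions to expected donations (using the pooled rate)
    return impressions_per_arm * p_bar

# A 10% lift on a 0.05% donation rate needs roughly 1,700 donations per
# banner -- inside the 500-5,000 range quoted above; a 30% lift needs far
# fewer, matching "depending on the difference in performance".
print(round(donations_needed(0.0005, 0.00055)))
print(round(donations_needed(0.0005, 0.00065)))
```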

Regarding the banners suggested in past years: I've explained this before,
and will repeat: We tested tons of those banners. I think that we tested
virtually every different (serious) theme that was suggested. They all had
BOTH far lower click rates and even lower donation rates -- usually by
orders of magnitude. This was also true for the new short slogans that we
came up with ourselves on the fundraising team.

Now we're pretty clear on why: A short slogan isn't enough to get people
over all their questions about why they should support Wikipedia. More text
was needed. In our marketing-slogan-obsessed culture, the idea that we'd
have to present people with a long paragraph was very counterintuitive. We
didn't think of it on the fundraising team and none of the volunteers who
submitted suggestions thought of it either. Several marketing
professionals who contacted us with advice even told us to get rid of the
appeal on the landing page altogether, because "people don't read!"

As it turns out, Wikipedia users DO like to read -- and want all the facts
before they donate.

Where we're at today, just to emphasize my previous point, is that with
the new banners, changes in messages affect donations totally
independently of click rate. And we typically need an hour or two -- or
five -- to detect even a 10-15% difference in message performance. That's
why we're not running big multivariate tests with tons of different
banners.

You'll be happy to know, though, that we are running multivariate tests
when we're able. For example, if we have a tweak to the landing pages that
we think is fairly independent of the banner effect, then we sometimes run
a multivariate test. Or if we have a design tweak (like color) that we're
confident will always affect click rate in the same direction as donations,
then we can combine that with message testing.







Re: [Wikimedia-l] Fundraising updates?

2012-12-17 Thread James Salsman
Zack,

Thanks very much for helping me understand this.

> We can only do big multivariate tests for banner click rates.

The multivariate tests you ran in May and October list total
donations. What am I missing?

> For example, the new banners have about 30% the click rate of the old ones,
> but they make about 3 or 4 times as much money.

If you have any reason to believe that this is not because of the new
pull-down format, please let me know.

> To determine how well a banner message does for donations, we usually need a
> sample size between 500 and 5,000 donations per banner [requiring] from 30
> minutes to several hours to collect -- if we're only testing two banners at a 
> time.

How about using a much quicker 90% confidence interval on a larger number
of tests, and then confirming the top, say, ten performers at the 95%
confidence level you've been using in subsequent tests?
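That two-stage idea can be sketched on simulated data: screen many variants against a control with a looser one-sided 90% cut and a small sample, then re-test only the survivors at 95% with a larger one. Everything below is invented for illustration (rates, sample sizes, function names); it is not the fundraising team's methodology.

```python
# Two-stage screening sketch on simulated banner data: a loose 90% screen
# over many variants, then a 95% confirmation of the survivors.
import random
from statistics import NormalDist

def z_better(successes_a, n_a, successes_b, n_b):
    """One-sided z statistic for 'variant B converts better than A'."""
    p = (successes_a + successes_b) / (n_a + n_b)
    se = (p * (1 - p) * (1 / n_a + 1 / n_b)) ** 0.5
    return ((successes_b / n_b) - (successes_a / n_a)) / se if se else 0.0

def screen(rates, control_rate, n, confidence):
    """Return indices of variants whose simulated lift clears the cut."""
    z_cut = NormalDist().inv_cdf(confidence)
    control = sum(random.random() < control_rate for _ in range(n))
    kept = []
    for i, r in enumerate(rates):
        variant = sum(random.random() < r for _ in range(n))
        if z_better(control, n, variant, n) > z_cut:
            kept.append(i)
    return kept

random.seed(1)
rates = [0.0005] * 18 + [0.0012, 0.0015]          # last two genuinely better
stage1 = screen(rates, 0.0005, 50_000, 0.90)       # quick, permissive screen
finalists = [rates[i] for i in stage1]
stage2 = screen(finalists, 0.0005, 200_000, 0.95)  # confirm at 95%
```

The permissive first stage lets through some false positives (about 10% of the null variants), which is exactly what the larger second-stage sample is there to weed out.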

> We tested tons of those banners. I think that we tested
> virtually every different (serious) theme that was suggested.

I'm sorry, but "tons" is not a number. You didn't even test a third of
them, and a quick perusal shows that you missed hundreds of sincere,
serious submissions.

> They all had BOTH far lower click rates and even lower donation rates

I believe using the word "far" is a factual mistake. I plotted a sample
distribution, and I know that the variance is large enough that the
population maximum would exceed the top of the sample. Moreover, many of
the messages that were discovered came very close to the "Please read an
appeal from Jimmy" top performer in 2009 and 2010. Isn't that how you
discovered "If everyone donated $x this fundraiser would be over in y",
which was the top performer for a time?

> usually by orders of magnitude.

That is technically true, but as they fit a lognormal distribution,
it's not surprising.

> Now we're pretty clear on why: A short slogan isn't enough to get people
> over all their questions about why they should support Wikipedia. More text
> was needed. In our marketing-slogan-obsessed culture, the idea that we'd
> have to present people with a long paragraph was very counterintuitive
> As it turns out, Wikipedia users DO like to read -- and want all the facts
> before they donate.

I completely agree with this, but I note that the longer banners you
started with this month were composed of three shorter statements, one of
which was an appeal statement.

Let me just cut to the chase and ask it this way: If you were to test
the remaining volunteer submissions in the appeal statement slot in
the existing top-performing, fact-based, pull-down appeal:
(1) What would you need that you don't already have?
(2) How long would it take at the 90% confidence level?
(3) How can volunteers help make it easier and faster for you?



Re: [Wikimedia-l] Fundraising updates?

2012-12-17 Thread Thomas Dalton
Have you considered doing some longer tests? Lasting a week, say. It would
enable you to do proper multivariate testing, including dependencies
between variables (which I don't think you have done any real tests of
yet). It would also let you test time dependence. E.g., does a particular
message work better in the morning than in the afternoon? (Different types
of people browse at different times, so it wouldn't surprise me.) You
could also model banner fatigue properly, which could be very useful.
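The time-of-day question boils down to breaking donation rate out by hour shown. A toy sketch, with invented log records (not real impression data):

```python
# Compare a message's donation rate by hour of day from invented
# (hour_shown, donated) impression records, to probe time dependence.
from collections import defaultdict

log = [(9, True), (9, False), (9, False), (10, False), (14, True),
       (14, False), (15, False), (15, False), (9, True), (14, True)]

impressions = defaultdict(int)
donations = defaultdict(int)
for hour, donated in log:
    impressions[hour] += 1
    donations[hour] += donated

for hour in sorted(impressions):
    rate = donations[hour] / impressions[hour]
    print(f"{hour:02d}:00  {rate:.0%} of impressions donated")
```

With a week-long test, each hour bucket would accumulate enough donations for the per-hour rates to be meaningful rather than noise.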

Re: [Wikimedia-l] Fundraising updates?

2012-12-17 Thread Samuel Klein
Thomas writes:
> You could also model banner fatigue properly, which could be very useful.

Yes, a detailed model of banner fatigue would be fascinating.

It's certainly something studied by many groups in different contexts;
ideally we'd learn from published analysis, and then see deviations from
the norm in our own context.  It's quite likely that the context changes
between donation appeals and other messages; understanding this better
would also help us rotate global sitenotices more effectively.

Zack - thank you for sharing so much detail about the process.
James - thank you for your nuanced statistical comments; something we could
use more of.

SJ


On Mon, Dec 17, 2012 at 12:23 PM, Thomas Dalton wrote:

> Have you considered doing some longer tests? Lasting a week, say. It would
> enable you to do proper multivariate testing, including dependencies
> between variables (which I don't think you have done any real tests of
> yet). It would also let you test time dependence. E.g., does a particular
> message work better in the morning than in the afternoon? (Different types
> of people browse at different times, so it wouldn't surprise me.) You could
> also model banner fatigue properly, which could be very useful.
> On Dec 17, 2012 4:28 PM, "Zack Exley"  wrote:
>
> > On Sat, Dec 15, 2012 at 11:01 AM, James Salsman 
> > wrote:
> >
> > > Hi Zack,
> > >
> > > Thanks very much for your updates:
> > >
> > > > What saved us was taking text from the personal appeals and putting
> > > > it into the banner itself. These banners did very well. These new
> > > > message-driven banners are what made us split the campaign in two --
> > > > because we knew we were going to develop a lot of new messages and
> > > > not have time to translate them well
> > >
> > > As you know I've been saying for years that the variance among the
> > > volunteer-supplied messages, originally submitted in 2009 and hundreds
> > > of which have not yet been tested (as far as I know), was large enough
> > > to suggest that some messages would certainly outperform the
> > > traditional banners and appeals. While it's refreshing to be
> > > validated, as you might imagine I feel like Cassandra much of the time
> > > for reasons that have nothing to do with the underlying mathematical
> > > reasoning involved.
> > >
> > > The last time I heard from you, you said that you intended to test the
> > > untried messaging from 2009 with multivariate analysis. However,
> > > http://meta.wikimedia.org/wiki/Fundraising_2012/We_Need_A_Breakthrough
> > > shows only three very small-N multivariate tests, the last of which
> > > was in October, and no recent testing.
> > >
> > > Do you still intend to test the untried volunteer-submitted messages
> > > with multivariate analysis? If so, when? Thank you.
> > >
> > >
> > James -
> >
> > We can only do big multivariate tests for banner click rates. But banner
> > click rates have very little to do with donations in our present context.
> >
> > For example, the new banners have about 30% the click rate of the old
> > ones, but they make about 3 or 4 times as much money.
> >
> > To determine how well a banner message does for donations, we usually
> > need a sample size between 500 and 5,000 donations per banner, depending
> > on the difference in performance between the banners. That takes from 30
> > minutes to several hours to collect -- if we're only testing two banners
> > at a time.
> >
> > Regarding the banners suggested in past years: I've explained this
> > before, and will repeat: We tested tons of those banners. I think that
> > we tested virtually every different (serious) theme that was suggested.
> > They all had BOTH far lower click rates and even lower donation rates --
> > usually by orders of magnitude. This was also true for the new short
> > slogans that we came up with ourselves on the fundraising team.
> >
> > Now we're pretty clear on why: A short slogan isn't enough to get people
> > over all their questions about why they should support Wikipedia. More
> > text was needed. In our marketing-slogan-obsessed culture, the idea that
> > we'd have to present people with a long paragraph was very
> > counterintuitive. We didn't think of it on the fundraising team and none
> > of the volunteers who submitted suggestions thought of it either. Several
> > marketing professionals who contacted us with advice even told us to get
> > rid of the appeal on the landing page altogether because "people don't
> > read!"
> >
> > As it turns out, Wikipedia users DO like to read -- and want all the
> > facts before they donate.
> >
> > Where we're at today, just to emphasize my previous point, is that with
> > the new banners, changes in messages affect donations totally
> > independently of click rate. And we typically need an hour or two -- or
> > five -- to detect even a 10% to 15% difference in message performance.
> > That's why we're not running big multivariate tests with tons of
> > different banners.
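Thomas's week-long test idea would let you look for exactly the kind of dependency he mentions, such as message by time-of-day. A hedged sketch of one way to check for that interaction with a plain chi-square test of independence; all counts are invented and equal impressions per cell are assumed:

```python
# Hypothetical donations from a week-long test of two messages, split by
# time of day (counts invented; assumes equal impressions per cell).
counts = {
    ("A", "morning"): 120, ("A", "afternoon"): 150,
    ("B", "morning"): 180, ("B", "afternoon"): 130,
}

def interaction_chi2(counts):
    """Chi-square statistic for independence of message and time of day."""
    total = sum(counts.values())
    rows = {}  # donations per message
    cols = {}  # donations per time slot
    for (msg, slot), obs in counts.items():
        rows[msg] = rows.get(msg, 0) + obs
        cols[slot] = cols.get(slot, 0) + obs
    chi2 = 0.0
    for (msg, slot), obs in counts.items():
        expected = rows[msg] * cols[slot] / total
        chi2 += (obs - expected) ** 2 / expected
    return chi2

chi2 = interaction_chi2(counts)
# df = 1 for a 2x2 table; the 5% critical value is about 3.84
print(f"chi2 = {chi2:.2f} -> interaction {'likely' if chi2 > 3.84 else 'not shown'}")
```

In this invented data, message B wins in the morning but loses in the afternoon, so the interaction term dominates -- the kind of effect a short test averaged over a day would never surface.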

Re: [Wikimedia-l] Fundraising updates?

2012-12-17 Thread Thomas Dalton
On 17 December 2012 17:28, Samuel Klein  wrote:
> Thomas writes:
>> You could also model banner fatigue properly, which could be very useful.
>
> Yes, a detailed model of banner fatigue would be fascinating.
>
> It's certainly something studied by many groups in different contexts;
> ideally we'd learn from published analysis, and then see deviations from
> the norm in our own context.  It's quite likely that the context changes
> between donation appeals and other messages; understanding this better
> would also help us rotate global sitenotices more effectively.

Published analyses would certainly be interesting, but it wouldn't
surprise me if they were completely non-applicable for us. There is
really nothing else like our fundraiser - nobody else uses their own
top 5 website for their fundraising, since no other non-profits have a
top 5 website!

___
Wikimedia-l mailing list
Wikimedia-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l
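On the banner-fatigue point SJ and Thomas raise: a minimal sketch of one way to start modelling it, fitting an exponential decay of response rate against how many times a reader has already seen a banner. All rates below are invented for illustration:

```python
import math

# Hypothetical donation rate per impression, by how many times the reader
# has already seen the banner (all numbers invented for illustration).
impressions_seen = [0, 1, 2, 3, 4, 5]
donate_rate = [0.0030, 0.0019, 0.0013, 0.0008, 0.0006, 0.0004]

# Fit rate = r0 * exp(-k * n) by least squares on log(rate).
xs, ys = impressions_seen, [math.log(r) for r in donate_rate]
x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))
k = -slope                               # fatigue rate (per extra view)
r0 = math.exp(y_mean - slope * x_mean)   # fitted fresh-reader rate

print(f"fresh rate ~{r0:.4f}; each extra view cuts response ~{1 - math.exp(-k):.0%}")
```

A fitted decay constant like this would also give a principled answer to how often global sitenotices should rotate: once the expected response has dropped below some threshold, swap the banner.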


Re: [Wikimedia-l] Fundraising updates?

2012-12-17 Thread Megan Hernandez
Here's the link to the chapter daily reporting numbers. Please keep in
mind that these numbers are still preliminary: they will change as
donations continue to come in, and some take a while to settle (checks,
bank transfers, etc.).

https://docs.google.com/a/wikimedia.org/spreadsheet/ccc?key=0Alem5893XFmUdDNITmtPTlpmcVlETnphOUEwdFlCUHc#gid=0

Megan

On Mon, Dec 17, 2012 at 9:28 AM, Samuel Klein  wrote:

> Thomas writes:
> > You could also model banner fatigue properly, which could be very useful.
>
> Yes, a detailed model of banner fatigue would be fascinating.
>
> It's certainly something studied by many groups in different contexts;
> ideally we'd learn from published analysis, and then see deviations from
> the norm in our own context.  It's quite likely that the context changes
> between donation appeals and other messages; understanding this better
> would also help us rotate global sitenotices more effectively.
>
> Zack - thank you for sharing so much detail about the process.
> James - thank you for your nuanced statistical comments; something we could
> use more of.
>
> SJ
>
>
> On Mon, Dec 17, 2012 at 12:23 PM, Thomas Dalton wrote:
>
> > Have you considered doing some longer tests? Lasting a week, say. It
> > would enable you to do proper multivariate testing, including
> > dependencies between variables (which I don't think you have done any
> > real tests of yet). It would also let you test time dependence. E.g.,
> > does a particular message work better in the morning than in the
> > afternoon? (Different types of people browse at different times, so it
> > wouldn't surprise me.) You could also model banner fatigue properly,
> > which could be very useful.
> > On Dec 17, 2012 4:28 PM, "Zack Exley"  wrote:
> >
> > > On Sat, Dec 15, 2012 at 11:01 AM, James Salsman 
> > > wrote:
> > >
> > > > Hi Zack,
> > > >
> > > > Thanks very much for your updates:
> > > >
> > > > > What saved us was taking text from the personal appeals and
> > > > > putting it into the banner itself. These banners did very well.
> > > > > These new message-driven banners are what made us split the
> > > > > campaign in two -- because we knew we were going to develop a lot
> > > > > of new messages and not have time to translate them well
> > > >
> > > > As you know I've been saying for years that the variance among the
> > > > volunteer-supplied messages, originally submitted in 2009 and
> > > > hundreds of which have not yet been tested (as far as I know), was
> > > > large enough to suggest that some messages would certainly
> > > > outperform the traditional banners and appeals. While it's
> > > > refreshing to be validated, as you might imagine I feel like
> > > > Cassandra much of the time for reasons that have nothing to do with
> > > > the underlying mathematical reasoning involved.
> > > >
> > > > The last time I heard from you, you said that you intended to test
> > > > the untried messaging from 2009 with multivariate analysis. However,
> > > > http://meta.wikimedia.org/wiki/Fundraising_2012/We_Need_A_Breakthrough
> > > > shows only three very small-N multivariate tests, the last of which
> > > > was in October, and no recent testing.
> > > >
> > > > Do you still intend to test the untried volunteer-submitted messages
> > > > with multivariate analysis? If so, when? Thank you.
> > > >
> > > >
> > > James -
> > >
> > > We can only do big multivariate tests for banner click rates. But
> > > banner click rates have very little to do with donations in our
> > > present context.
> > >
> > > For example, the new banners have about 30% the click rate of the old
> > > ones, but they make about 3 or 4 times as much money.
> > >
> > > To determine how well a banner message does for donations, we usually
> > > need a sample size between 500 and 5,000 donations per banner,
> > > depending on the difference in performance between the banners. That
> > > takes from 30 minutes to several hours to collect -- if we're only
> > > testing two banners at a time.
> > >
> > > Regarding the banners suggested in past years: I've explained this
> > > before, and will repeat: We tested tons of those banners. I think
> > > that we tested virtually every different (serious) theme that was
> > > suggested. They all had BOTH far lower click rates and even lower
> > > donation rates -- usually by orders of magnitude. This was also true
> > > for the new short slogans that we came up with ourselves on the
> > > fundraising team.
> > >
> > > Now we're pretty clear on why: A short slogan isn't enough to get
> > > people over all their questions about why they should support
> > > Wikipedia. More text was needed. In our marketing-slogan-obsessed
> > > culture, the idea that we'd have to present people with a long
> > > paragraph was very counterintuitive. We didn't think of it on the
> > > fundraising team and none of the volunteers who submitted su