Methodology: How the Journal Carried Out Its Analysis

John West
Nov. 15, 2019 7:22 am ET

The Wall Street Journal compiled and compared auto-complete and organic search 
results on Google, Bing and DuckDuckGo in three phases: July 23-Aug. 8, 
Aug. 26-31 and Sept. 12-19.

We created a set of computers in the cloud using Amazon Web Services EC2 
(Elastic Compute Cloud), which presented a new IP address for each search. An 
IP address is the unique network identifier that many websites use to 
associate one browsing session with another. The computers were, however, 
identifiable as operating from a server in Virginia, and that location could 
have been a factor in our results.

We deployed code onto those computers that would mimic a human typing a phrase 
into a query box, such as “Joe Biden is.” The resulting auto-complete 
suggestions from each search engine were captured by recording the HTML, the 
code that represents the content of a webpage.
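The suggestion-capture step can be sketched as follows. This is a hypothetical illustration, not the Journal's actual code: the `li` tag and `class="suggestion"` attribute are assumptions, since each search engine structures its suggestion list in its own markup.

```python
from html.parser import HTMLParser

class SuggestionParser(HTMLParser):
    """Pull auto-complete suggestion text out of captured HTML.

    Assumes (hypothetically) that each suggestion is an <li> element
    carrying class="suggestion"; real engines use different markup.
    """

    def __init__(self):
        super().__init__()
        self.in_item = False
        self.suggestions = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs parsed from the tag.
        if tag == "li" and ("class", "suggestion") in attrs:
            self.in_item = True
            self.suggestions.append("")

    def handle_endtag(self, tag):
        if tag == "li":
            self.in_item = False

    def handle_data(self, data):
        # Accumulate text only while inside a suggestion item.
        if self.in_item:
            self.suggestions[-1] += data

# Stand-in for HTML recorded after typing a phrase such as "Joe Biden is".
captured = ('<ul><li class="suggestion">joe biden is how old</li>'
            '<li class="suggestion">joe biden is from</li></ul>')
parser = SuggestionParser()
parser.feed(captured)
print(parser.suggestions)  # ['joe biden is how old', 'joe biden is from']
```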

Additionally, we mimicked a human searching for a term, such as “Joe Biden,” 
and captured the HTML to record the first page of search results on each search 
engine. On Google, we also collected the news results that appeared in the news 
module on the first page of search results.
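One simple way to compare first-page organic results across engines is the share of URLs two ranked lists have in common. The function and example data below are purely illustrative; the article does not describe the Journal's actual comparison metric.

```python
def overlap(results_a, results_b):
    """Fraction of result URLs shared by two first-page result lists.

    A hypothetical comparison measure: intersection size divided by the
    length of the longer list, so the value falls between 0 and 1.
    """
    shared = set(results_a) & set(results_b)
    return len(shared) / max(len(results_a), len(results_b))

# Illustrative stand-ins for captured first-page results on two engines.
engine_one = ["a.com", "b.com", "c.com", "d.com"]
engine_two = ["b.com", "c.com", "e.com", "f.com"]
print(overlap(engine_one, engine_two))  # 0.5
```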

The Journal tested 17 words and phrases that covered a range of political 
issues and candidates, cultural phrases and names in the news. During each 
testing cycle, one computer would search one phrase roughly every two hours for 
the duration of the cycle, with a new IP address for each search, on each 
search engine. For example, during the 17-day cycle, each phrase was tested 181 
times on each search engine. Some of the terms were repeated in later cycles.
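The stated cadence checks out as a back-of-the-envelope calculation, using only the figures in the text: 181 searches spread over a 17-day cycle works out to one search roughly every two and a quarter hours.

```python
from datetime import timedelta

# Figures from the text: a 17-day cycle with 181 searches per phrase per engine.
cycle = timedelta(days=17)
runs = 181

# Average spacing between searches of a given phrase on a given engine.
interval = cycle / runs
hours = interval.total_seconds() / 3600
print(round(hours, 2))  # 2.25 -- i.e., "roughly every two hours"
```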

...

https://www.wsj.com/articles/methodology-how-the-journal-carried-out-its-analysis-11573820552



