Video! The search quality meeting, uncut (annotated)

3/12/12 | 10:30:00 AM

It took eight video cameras and 16 microphones, but we’ve done something new and special to give you another inside look at how search works. Today we’ve published, for the first time, a video with the uncut discussion of a proposed algorithm change (in this case, an upcoming change to our spell correction system). The language can be technical, so we've included annotations to provide some context for the discussion (and have a little fun!).

The footage was captured on December 1, 2011 at our weekly “Quality Launch Review” meeting. We hold the meeting on Thursdays to discuss possible algorithmic improvements and make decisions about what to launch. As usual, meeting participants gathered in Mountain View and joined on videoconference from remote offices around the globe, including our offices in Moscow, New York, Zurich, Seoul, Haifa and Tokyo. Check out the video for a flavor of the kinds of topics and data the team discusses before making many of the important changes to our system.

A few things you’ll observe:

  • Even relatively subtle changes get intense scrutiny from our search evaluation and ranking teams. The specific change discussed in this video improves spelling suggestions for searches with more than 10 words, and it impacts only 0.1% of our traffic. Still, you can see the scrutiny and thoughtfulness that go into approving this change.
  • Every change has a dedicated search quality analyst assigned to study the impact. This analyst is not part of the engineering team building the change, but instead offers a separate opinion on whether the change is good for users.
  • The search team relies heavily on experimental data to make decisions. During the meeting, we rely on detailed analyst reports, including the results of click evaluations and side-by-side experiments. These reports can sometimes be more than 25 pages long.
  • Launch reports include specific examples to illustrate broader trends in the data. Rather than manually change one example, our engineers look for algorithmic ways to improve millions of queries.
  • Search algorithm improvements often rely on and impact many different systems, so engineers with expertise in different areas all need to come together to make the best decision for the user, balancing all the tradeoffs involved (relevance, spam, latency, cost, language impact, etc.).

As I said in the video, this is an experiment, and we’re interested to hear what you think. For all the search geeks out there, we hope you enjoy it! For a video summary of our process, I can also recommend the video we posted last August.