According to Google’s Director of Research, Peter Norvig, if you look at Google Trends for terms like “full moon” or “ice cream”, you’ll see that searches for those terms mirror actual physical trends in the world. Because so many queries are performed for those terms, the periodicity is clear: searches for “full moon” peak every 28 days, and searches for “ice cream” peak every summer, about 365 days apart. Large amounts of data make interesting things possible.
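That kind of cycle can be recovered directly from a daily query-volume series. As a minimal sketch (using a synthetic series with a built-in 28-day cycle, since real Trends data isn’t reproduced here), plain autocorrelation finds the lag at which the series best lines up with itself:

```python
import math

# Hypothetical daily "query volume": a 28-day cycle (the full-moon figure
# from the article) on top of a constant baseline. Real Trends data would
# be noisier, but the same approach applies.
days = 365
period = 28
series = [100 + 20 * math.cos(2 * math.pi * d / period) for d in range(days)]

def autocorr(xs, lag):
    """Mean-removed autocorrelation of a series at a given lag."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    cov = sum((xs[i] - mean) * (xs[i + lag] - mean) for i in range(n - lag))
    return cov / var

# The lag (beyond zero) with the strongest positive autocorrelation
# recovers the cycle length.
best_lag = max(range(14, 60), key=lambda lag: autocorr(series, lag))
print(best_lag)  # 28
```

With enough queries, the signal dominates the noise, which is exactly why scale matters for this kind of inference.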
If you’re interested in how search engines work, and how large amounts of data can help them do what they do more effectively, I highly recommend reading the paper The Unreasonable Effectiveness of Data (pdf), written by Alon Halevy, Peter Norvig, and Fernando Pereira of Google. Even more highly recommended is a presentation by Peter Norvig of the same name, from a Distinguished Lecture Series at the University of British Columbia last fall, which sadly has fewer than 1,000 views on YouTube at present:
In the early days of Google, when you performed a search, the results you received were just links to pages found on the Web, showing page titles, snippets, and URLs. Google started adding other types of searches to its Web search, such as:
While these launched as separate search repositories, they weren’t going to stay that way, and may never have been intended to remain standalone. In May 2007, at a presentation called Searchology, Google announced Universal Search, which incorporated video, news, books, image, and local results into Web search results. According to the Official Google Blog post, the roots of Universal Search can be traced back to 2001, with a lot of effort leading to its launch:
Over several years, with the help of more than 100 people, we’ve built the infrastructure, search algorithms, and presentation mechanisms to provide what we see as just the first step in the evolution toward universal search. Today, we’re making that first step available on google.com by launching the new architecture and using it to blend content from Images, Maps, Books, Video, and News into our web results.
A few days ago, I asked the question, Is Google Aiming at Building Faster Networks and Data Transmissions? Google had acquired some interesting patent applications that have the potential to increase the speed and quality of data transmissions. An even more recent intellectual property acquisition by Google points to a growing interest in networking technology.
Google is planning on bringing ultra high speed broadband access to Kansas City, with fiber optic connections to homes that Google promised will deliver 1 gigabit-per-second speeds, or a speed that’s “20,000 times faster than dial-up and more than 100 times faster than a typical broadband connection.” That’s pretty fast. According to the Official Google Blog post, Google may be in talks with other cities to bring them this kind of high speed internet access as well.
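Those multipliers check out as rough arithmetic. Google Fiber’s promised speed is 1 gigabit per second; assuming a 56 kbit/s dial-up modem and a 10 Mbit/s “typical broadband” connection as baselines (my assumptions, not figures from Google’s post):

```python
# Sanity-checking the speed-comparison claims, all rates in bits per second.
fiber = 1_000_000_000      # 1 gigabit per second
dialup = 56_000            # assumed 56k dial-up modem
broadband = 10_000_000     # assumed typical broadband connection

print(round(fiber / dialup))   # 17857 -- roughly the "20,000x" claim
print(fiber / broadband)       # 100.0 -- the "100x" claim
```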
The Google Fiber Blog hasn’t been updated too frequently, but may be something to watch. If Google is successful in Kansas City, it’s quite possible that they will be installing fiber elsewhere.
How Geographic Relevance Might be Spread Across a Site
How much might one page on a website influence the rankings of other pages? When I joined an agency in 2005, our focus was on rankings for individual pages – optimizing their content for specific terms and phrases, and making sure that they had links from other pages, both onsite and off. I found myself unable to color just within those lines. It was impossible to ignore the impact of site-wide issues when trying to optimize individual pages for terms. Every page on a site can affect how other pages are crawled, indexed, and displayed by search engines.
For example, if the home page of a site was accessible at multiple URLs, there was the very real risk that PageRank for that page could be split multiple ways, such as among: