eBrandz Blog

The thought and process behind changes to Google search algorithms

Any change to Google's algorithms, whether minor or major, is made only after a thorough quality evaluation. A typical change starts as a basic idea from one of its search scientists, who are constantly developing better ways of evaluating the quality of search results.

According to Google, different search queries call for different degrees of freshness in their results. The latest algorithmic improvement is meant to better distinguish between the various kinds of searches and the specific level of freshness users need, and to ensure that they get the most precise and up-to-the-minute answers.

This is the thinking behind the latest changes introduced by Google, such as the one related to hot topics and recent events that start trending on the web. Users understandably want to find the latest and best information as fast as possible. Now when you search for a current event such as the Occupy Oakland protest, or for the latest news about sports or politics, more high-quality pages will surface, even if they are only minutes old.

Some events happen on a regularly recurring basis, like annual conferences, sports competitions, or the presidential election. For these, even without specifying it in your keywords, it is implied that you expect the most recent developments. Then there are things that recur even more frequently.

So now when you look up the latest NFL scores, you will see the most recent information. Likewise, if you are researching the best digital cameras, or you are in the market for a new bike and want reviews, you probably want the most up-to-date data, something which is now taken care of.

In spite of these improvements, a section of search experts remains apprehensive about the efficacy of Google searches. According to a recent Experian Hitwise report (released this August), over 81 percent of searches on Yahoo led to a visit to a site, with Bing coming in a close second at over 80 percent. By contrast, Google's success rate was much lower, hovering at around 68 percent.

The report attributed the 'not-so-accurate' performance of the search major partly to the huge number of library books housed in its database. Google and its library partners used optical character recognition (OCR) programs to scan literally millions of older works. However, these programs are not foolproof, especially when processing old texts set in archaic fonts or containing unusual foreign-language characters.

The report noted that Google Books contains a large number of word misidentifications caused by OCR errors, which can potentially lead Internet users down false trails, especially when they carry out 'one-word searches'. However, Google engineers are not convinced by this conclusion or by how the report arrived at its findings.