Google gives the world a new peek at its infamous algorithm and warns of crackdown on content scrapers
It also comes as the company is tweaking that very same algorithm to combat spam, with the head of its spam-fighting group tweeting a request for users to help locate content "scrapers", which experts say have become a growing problem in recent months.
"Scrapers getting you down? Tell us about blog scrapers you see... we need datapoints for testing," Matt Cutts, the head of Google's spam group, said on Twitter.
Google released a short video last week detailing parts of how its infamous algorithm works, and describing what happens when changes are made: more than 500 of them every year, the company said, with some happening every day.
"The Google Search algorithm is made up of several hundred signals that we try to put together to serve the best results for the user," Rajan Patel said in the video.
The algorithm is infamous within the search marketing industry. It powers how search rankings work, and dictates whether one site is ranked above another when both have similar content. Search marketers must understand the effects of the algorithm in order to make their businesses appear higher in rankings for certain search strings.
The video explains that Google uses two methods to see whether new sites deserve to be ranked higher after an algorithm change. In the first, a group of employees manually compares the two sites; in the second, a group of typical users has its internet browsing behaviour monitored.
Before any change to the algorithm is made, search engineers hold a meeting.
"While an improvement to the algorithm may start with a creative idea, it always goes through a process of rigorous scientific testing," Google fellow Amit Singhal wrote in the company's blog. "Simply put: if the data from our experiments doesn't show that we're helping users, we won't launch the change."
The timing of the video is curious, as Google rarely expands on the details of its algorithm, instead choosing to keep such information secret. However, the company is now speaking with the Department of Justice, reportedly over a number of search competition issues, and releasing more public information about its algorithm may be a way to relieve political and legal pressure.
Stewart Media chief executive Jim Stewart has another idea: that Google now relies on so many different ways of ranking sites, the algorithm may actually be losing relevance.
"I think there are so many factors for ranking sites, they are not as concerned with people trying to scam the algorithm anymore."
"With all of these other things like Google authorship and so on, there's a bigger issue here that they are going to try and reward quality content. I don't think this is very surprising, then."
Stewart says the push from Google towards authorship, associating content with particular authors instead of random sites, is one factor. Another is that Google now personalises search rankings for each user based on whether friends in their social circles have shared that content.
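In broad terms, that kind of social personalisation can be thought of as a score boost for results shared by a user's friends. The toy sketch below is purely illustrative (the function, field names and boost value are assumptions, not Google's actual method):

```python
# Illustrative only: a naive model of socially personalised ranking,
# where results shared by a user's friends get a fixed score boost.
def personalised_rank(results, shared_by_friends, boost=0.5):
    """Sort (url, base_score) pairs by base score plus a social boost."""
    def score(result):
        url, base = result
        # Add the boost if someone in the user's circles shared this URL.
        return base + (boost if url in shared_by_friends else 0.0)
    return sorted(results, key=score, reverse=True)

results = [("site-a.com/guide", 1.0), ("site-b.com/guide", 0.8)]
# site-b was shared by a friend, so it overtakes site-a for this user.
print(personalised_rank(results, shared_by_friends={"site-b.com/guide"}))
```

Under this sketch, two users searching the same string can see different orderings depending solely on what their circles have shared.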
"Google is now becoming a monolith. Things like Google+ and authorship are creating this system, whereby if you are determined to be scamming the algorithm you could just lose your entire Google profile. They've got a much bigger stick now."
However, Google is taking a stand on one issue: scraping. Cutts said on Twitter this weekend that the company wants people who have experienced scraping to fill out a document and inform it.
Scraping is when one site copies content from another page and republishes it as its own. These sites are usually of low quality and, according to Google, offer users no value.
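Mechanically, a scraper just fetches a page's HTML and lifts the body text wholesale. The minimal sketch below shows the idea; the page is inlined rather than fetched over the network, and the markup is a made-up example:

```python
# A minimal sketch of what a content scraper does: parse a page's HTML
# and extract every paragraph of body text, ready to republish elsewhere.
from html.parser import HTMLParser

class ParagraphScraper(HTMLParser):
    """Collects the text inside every <p> tag as it parses the page."""
    def __init__(self):
        super().__init__()
        self.in_paragraph = False
        self.paragraphs = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.in_paragraph = True
            self.paragraphs.append("")

    def handle_endtag(self, tag):
        if tag == "p":
            self.in_paragraph = False

    def handle_data(self, data):
        if self.in_paragraph:
            self.paragraphs[-1] += data

page = ("<html><body><h1>Original post</h1>"
        "<p>First paragraph.</p><p>Second paragraph.</p></body></html>")
scraper = ParagraphScraper()
scraper.feed(page)
print(scraper.paragraphs)  # prints ['First paragraph.', 'Second paragraph.']
```

A real scraper would fetch the HTML automatically and republish the extracted text, which is exactly the behaviour Google says it wants reported.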
Stewart says scraping has become more of a problem lately, and with Google's newfound emphasis on quality, relevant content, it is no surprise the company is cracking down.
"Even with our own site, I put up a post with the word 'infographic', and within seconds the image had been scraped and put onto an infographic website. There is a real push now for spammers to be caught, and scrapers are using other content to sell products and so on."
"That's why Google has put out a multi-pronged attack with emphasis on authorship to find more original content. If a scraper beats their bot, they want to find out how that happens."