With both John Mueller and Gary Illyes active on Twitter, it is great to hear their insights, but the 140-character limit can lead to speculation about what is actually meant (280 characters may be too few as well). We want to avoid turning the monthly roundup into "What did John & Gary say on Twitter this month?". It is important that we can verify information from Google itself, so let's get started with the fact checking...
Fact Checking Schema
There’s no doubt that 2017 has been a year of questionable news and information.
With the backlash from the reporting of the 2016 US Presidential Election still in full swing, Google and other companies such as Bing, Facebook, and Twitter, are actively searching for ways to cut through dubious or “fake” news and facts.
Google has been sponsoring fact checking projects and working on ways to improve its fact checking credibility in its real time results. Websites can now add a ClaimReview structured data element to a webpage.
This is particularly important for websites that regularly present facts to users, and this element enables Google Search to provide a summarised version of a fact check when a page appears in search results for a particular claim.
In addition to the general guidelines for structured data markup, fact checking has its own set of guidelines that should be noted before implementation.
If your company or website supplies facts to its audience, then ClaimReview is something that you should consider adding to your website.
Avoid publishing inaccurate facts, or presenting others' fact checks as your own, as these can be flagged as "false" by Google Search. Every business should strive to publish reliable information.
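As a rough illustration, a ClaimReview block can be generated as JSON-LD. The claim, URLs, organisation name, and rating scale below are entirely hypothetical; Google's structured data documentation should be checked for the currently required properties:

```python
import json

# Hypothetical fact-check; every value here is illustrative only.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example.com/fact-checks/moon-cheese",
    "claimReviewed": "The moon is made of cheese",
    "author": {"@type": "Organization", "name": "Example Fact Checker"},
    "datePublished": "2017-11-01",
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,  # 1 = "False" on this hypothetical 1-5 scale
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "False",
    },
}

# Emit as a JSON-LD script tag ready to paste into the page head.
markup = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(claim_review, indent=2)
)
print(markup)
```

Once the markup is live, it can be validated with Google's structured data testing tool before relying on it in search results.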
Google Mobile Indexing Starts
On November 4 last year Google announced that it was to begin a mobile first initiative, where it would start to index websites from a mobile perspective first.
A year later it seems that we are beginning to see this implementation on a very narrow basis.
When asked during a Google Hangouts session, John Mueller confirmed that a very small number of websites might already have been moved to the mobile first index, but that everything was still at an experimental stage.
Some webmasters have already claimed to have seen changes in mobile rankings and it should be noted that Google intends to roll out the mobile first index in batches.
According to Mueller, Gary Illyes is writing a blog post on the subject, which indicates that an official announcement is close at hand - webmasters should prepare for a more full rollout soon.
Make sure your site is ready for the mobile first index. It is very close to launching (we expect Q1 2018). Use Google's mobile friendly testing tool to test whether your website is responsive and mobile friendly.
Tabbed Content for Mobile First Index
Google has previously stated that when it comes to mobile first indexing, tabbed, accordion, and any other content that requires a click to view it, would not face demotion. This is because such a tabbing structure makes for a better user experience on mobile sites.
Earlier in the month, however, John Mueller stated that tabbed content must be loaded within the page source on the initial pageview. Content that is only loaded upon later interaction with the page will not be indexed by Google.
This is something to bear in mind, as it has not been mentioned before and some site owners might not be aware of how mobile content is being indexed by Google.
It’s important to remember that there are legitimate cases for hiding content within tabbed content. Ecommerce stores often have delivery, returns and sizing information on those pages. By moving that information to a tab and calling via AJAX, the amount of duplicate information across product pages can be reduced.
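Since only content present in the initial page source is indexed, a quick sanity check is to confirm that key tab text appears in the raw HTML before any JavaScript runs. A minimal sketch (the HTML snippets are hypothetical; in practice you would fetch the page source with an HTTP client):

```python
def content_in_initial_html(html, snippet):
    """Return True if `snippet` appears in the raw HTML source.

    Content injected later via AJAX is absent from the initial source
    and, per John Mueller, will not be indexed.
    """
    return snippet.lower() in html.lower()

# Tab content present in the source (indexable, even if hidden by CSS):
served = '<div class="tab" hidden>Free returns within 30 days</div>'
# Tab content fetched only on click (not in the source, so not indexed):
ajax_stub = '<div class="tab" data-src="/returns-info"></div>'

print(content_in_initial_html(served, "Free returns"))     # True
print(content_in_initial_html(ajax_stub, "Free returns"))  # False
```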
Google Search Console Index Coverage Report
This was first mentioned in our August 2017 technical SEO roundup.
Google’s Index Coverage report has long been in a private beta, but this changed back in September, when it went into public beta.
You can check whether you now have access to it in your search console. If you do, you should see a ‘Try the new Search Console’ link in the top left corner.
The new Index Coverage report provides detailed and easy to understand visuals that help webmasters grasp index coverage and site performance.
The report also offers a lot more detailed information when it comes to sitemaps, but changes might be needed to sitemap strategy to take full advantage of the new metrics and data.
Further reading: you can read more about why sitemap strategies might need to change in this detailed write-up by AJ Kohn.
Wait for the new Search Console to be fully released. If you are using data from Google Search Console, note that you may see fewer pages reported, which could trigger alerts for business units.
AMP Ecommerce Templates
On October 13, the AMP Project released a range of new ecommerce templates with customisable elements that allow designers to showcase full product libraries with product sorting, image galleries, and checkout flows.
It’s worth noting that although AMP templates mean fast pages, webmasters must maintain an additional and very limited functional version of a website.
The AMP specification is strict: custom CSS, for example, is limited to 50,000 bytes, which means that you can't import the entire Bootstrap library without the page becoming invalid.
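A simple pre-flight check is to measure the byte size of your custom CSS before inlining it, since the limit counts bytes of the encoded stylesheet rather than characters. A minimal sketch, with a stand-in stylesheet:

```python
def amp_css_ok(css_text, limit=50000):
    """Check whether custom CSS fits within AMP's 50,000-byte limit.

    Measures the UTF-8 encoding, since the limit is in bytes.
    """
    size = len(css_text.encode("utf-8"))
    return size <= limit, size

css = "body{margin:0}" * 100  # stand-in for a real stylesheet
ok, size = amp_css_ok(css)
print(ok, size)  # True 1400
```

Running this in a build step can fail the deploy before an oversized stylesheet invalidates your AMP pages.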
Further reading: a great blog article by the AMP team (E-commerce at the speed of AMP) shows the stats behind AMP ecommerce along with how to get started.
Monitor competitors for adoption of ecommerce AMP. Consider enabling AMP on your product pages first, then rolling it out to the rest of the website. AMP requires a lot of change to views, functionality, and style implementation, so weigh the cost of development and maintenance against the reward in revenue.
Paywalls for Publishers
In a Google Webmaster Central office-hours hangout, John Mueller put kneejerk cloaking fears aside and confirmed that it is possible to show paywalled content to Googlebot, provided the correct schema markup is in place.
With this in mind, publishers should verify user-agents that claim to be Googlebot. Some publishers currently allow any user-agent containing "googlebot" to view their content, which means anyone could pretend to be Googlebot and get free access to it.
You can verify a web crawler accessing your server by running a reverse DNS lookup on the incoming IP address, for example with the host command on a Unix server, and checking that the resulting domain name is either googlebot.com or google.com. A forward DNS lookup on that hostname should then return the original IP address.
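A sketch of that verification in Python, splitting the pure hostname check from the network lookups. The forward-confirmation step follows Google's published advice; the example hostnames are illustrative:

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(hostname):
    """Check that a reverse-DNS hostname ends in googlebot.com or google.com."""
    host = hostname.rstrip(".").lower()
    return host.endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip):
    """Reverse DNS lookup, then forward-confirm the name maps back to the IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)            # reverse lookup
        if not hostname_is_google(hostname):
            return False
        forward_ips = socket.gethostbyname_ex(hostname)[2]   # forward confirm
        return ip in forward_ips
    except socket.error:
        return False

print(hostname_is_google("crawl-66-249-66-1.googlebot.com"))  # True
print(hostname_is_google("fake-googlebot.example.com"))       # False
```

The suffix check alone is not enough: without the forward confirmation, an attacker controlling their own reverse DNS could present a spoofed googlebot.com hostname.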
Allow Googlebot to view the entire web page content behind paywalls, and implement a Googlebot verification system so that human readers cannot spoof Googlebot. For high-traffic sites, verifying in real time requires caching verification results rather than performing DNS lookups on every request.
If the system is under heavy load, allow clients claiming to be Googlebot a limited number of requests before enforcing a hard requirement for bot verification. A cache of 1,000,000 records would take approximately 12.1 MB of RAM, with each record consisting of a 32-bit IPv4 address, a 1-bit flag for whether the IP belongs to Google, and a 64-bit UNIX timestamp.
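A quick back-of-envelope check of that memory figure, assuming bit-packed records and decimal megabytes:

```python
# 32-bit IPv4 address + 1-bit Google flag + 64-bit UNIX timestamp.
BITS_PER_RECORD = 32 + 1 + 64   # 97 bits per cached verification record
records = 1_000_000

# Convert bits -> bytes -> decimal megabytes.
total_mb = records * BITS_PER_RECORD / 8 / 1_000_000
print(round(total_mb, 1))  # 12.1
```

In practice a real implementation would likely round each record up to whole bytes for alignment, costing slightly more than the packed estimate.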
XML Sitemaps for Large Numbers of Hreflang Directives
John Mueller has confirmed that an XML sitemap is a valid hreflang implementation where there is a high number of directives.
When hreflang is placed in the <head> of a webpage, it can take more time to parse and download. With some companies targeting many countries and locales, it can significantly bloat the <head>.
An alternative is to use an XML sitemap to indicate alternate language pages, as Mueller stated during the confirmation.
You can use an unlimited number of hreflang entries on a site, but for large sets use a sitemap to indicate the alternate languages to Google.
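A minimal sketch of generating such sitemap entries, using hypothetical example.com locale URLs. Each <url> block must list every alternate, including itself, via xhtml:link elements:

```python
# Hypothetical locale -> URL mapping for one product page.
LOCALES = {
    "en-gb": "https://example.com/uk/widget",
    "en-us": "https://example.com/us/widget",
    "de-de": "https://example.com/de/widget",
}

def sitemap_url_entries(locales):
    """Build one <url> block per locale, each listing all alternates."""
    alternates = "\n".join(
        '    <xhtml:link rel="alternate" hreflang="{}" href="{}"/>'.format(lang, href)
        for lang, href in sorted(locales.items())
    )
    return "\n".join(
        "  <url>\n    <loc>{}</loc>\n{}\n  </url>".format(href, alternates)
        for _, href in sorted(locales.items())
    )

print(sitemap_url_entries(LOCALES))
```

The output would be wrapped in a standard urlset element declaring the xhtml namespace; this keeps potentially hundreds of alternate annotations out of every page's <head>.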
Removing Low Quality Pages
The removal of low quality content is viewed by many as a quick fix for improved performance.
The reality, however, is that you first need to check whether Google is sending traffic to these pages, as traffic gives some indication that Google considers them to be of reasonable quality.
At the State of Search Conference in Dallas on October 10, Gary Illyes was asked whether there was any benefit in removing old content from a website.
He gave the following answer:
“It's not guaranteed that you will see any positive effect from that. Basically if you have lots of [bad] content pages and hopefully you are not going to rank for those pages, but if you do and you noindex those pages, then you are lowering your own traffic by noindexing those pages. “I don’t like the idea of noindexing pages, I would much rather see site owners improve the pages that show up in the search results. For those that don’t show up in the search results, those are not indexed, and if they are not indexed then typically they are not affecting your site.”
Consolidation of content, therefore, might be a better option than total removal for content that is still receiving traffic.
However, John Mueller stated in a Google Hangout at the end of October that both pruning and improving content are valid strategies. Improvement is preferred, but if this is not possible removal is fine.
Here is a transcript from the video:
“On the other hand if you can't improve the quality of that content because it's just so much. Or maybe you auto-generated all of this content at some point and you can kind of like improve some small fraction of it but a large part is something you can't really change at all. “Then that makes sense perhaps to be consistent there as well and say well I can't do anything about this so I'm just going to get rid of this and clean up. And cleaning up can be done with noindex with a 404 kind of whatever you like to do that.”
Run a backlink analysis to see if the lower quality pages have acquired external backlinks. If so, 301 redirect them to a suitable webpage to prevent a broken user experience and recycle the authority that they have collected.
Alternatively, to remove a page entirely from the index, we recommend using a 410 header response to remove content faster. Using 404 header responses can leave pages showing in Search Console for years, whilst using a 410 removes them much faster.
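The pruning decision above can be sketched as a simple rule. The helper name and messages here are our own, and in practice the status codes and backlink data would be collected with an HTTP client and a backlink tool:

```python
def classify_removed_page(status, has_backlinks):
    """Recommend an action for a pruned low-quality page.

    Follows the advice above: redirect pages with external backlinks,
    prefer 410 over 404 for pages being removed from the index.
    """
    if has_backlinks:
        return "301 redirect to a relevant page to recycle authority"
    if status == 404:
        return "switch to 410 so Google drops it from the index faster"
    if status == 410:
        return "ok: will fall out of the index quickly"
    return "unexpected status; investigate"

print(classify_removed_page(404, has_backlinks=True))
print(classify_removed_page(404, has_backlinks=False))
print(classify_removed_page(410, has_backlinks=False))
```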
Relevant Local Search Results for Users Travelling Abroad
Google announced on October 27 that it has updated the way in which it labels country services on mobile web, the Google app for iOS, and desktop Search and Maps.
This means that if a user travels abroad, instead of the country service being chosen via a country code top-level domain (ccTLD), they will by default be served the country service that corresponds to their current location.
If a user wishes, they can change back to their country of preference in settings.
This means that it will change the way that Google Search and Maps services are labelled in the future, although it won’t affect how products work.
Monitor keyword volumes to see if the change affects monthly stats.
Ranking Identical Products Pages
For large ecommerce websites, it’s pretty common to have products that are nearly identical to one another, with variations only perhaps occurring in colour or size.
Sites that choose to give these products their own individual pages often worry about this, despite there being no duplicate content penalty within Google.
In another Google Webmaster Office Hours session, it was revealed that Google will index such product pages, but when serving results it will try to pick a single page and filter the others when it recognises that a user is searching for something covered by a set of near-identical pages.
Do not worry about duplicate content issues for sites with near-identical products, but also assess whether these products might be best fitted onto one unique page.
HTTPS Migrations & [No] Traffic Loss
A recent Twitter debate broke out over whether moving from HTTP to HTTPS can cause traffic loss.
Although the debate happened back in August, the issue has been a longstanding cause of worry; traffic loss from the switch has been strenuously denied by Google and its employees.
The fact is that if a migration is done correctly, and all at once, there should be no drop in long-term traffic when switching protocols.
Further reading on the subject is ample and a solid FAQ post can be found here.
Migrate to HTTPS as soon as possible. You should still take care to have a well-thought-out migration plan to help prevent losses.
Domain Renewal
Dell forgot to renew a cloud hosting domain that was linked to customer backup and recovery. Read the full article on Sophos.
Are you expecting your business to last more than 5 years? Why not renew your domains for that entire period? Run bi-annual checks to make sure the payment card details on file with your registrar are correct.
Internal Anchor Context
Ensure link anchor text provides context for the page being linked to. Generic anchors such as "click here" do not provide any context about what is being linked to. The Google Hangout response can be found here.