Technical SEO Roundup August 2017

Development News

07 Sep 2017
7 minutes
By Ryan Siddle

We’re bringing back our technical SEO monthly roundup.

There were a few notable announcements in August, but for the most part little action is required.

Google Search Console

Benefit: Medium
Difficulty: Low

Google announced in early August on the Webmaster Blog that two experimental features were being released to a group of beta testers.

One beta tester was given access to the features and sent screenshots of their account to Search Engine Land. Sharing the screenshots violated the Non-Disclosure Agreement they had signed with Google, and their beta privileges have since been revoked.

Action Point

Important Note: If you're lucky enough to find yourself a beta user, never share screenshots or non-public information with third parties, and be cautious even within your own organisation. Always check the agreement before signing it and before discussing details with others.

John Mueller confirmed on 2nd August, in a Google Webmaster Hangouts session, that the beta contains user interface and data improvements.

As with all beta testing, not all features may become public-facing. Google has not formally provided timelines for a public release; these can vary from a week to a few months, depending on the complexity of the changes involved.

We assume that Google's Webmaster API (v3) may soon be deprecated in favour of Search Console (v1 Beta). The Search Console API currently covers the "Mobile-Friendly Test", but this could be expanded to include Search Queries and Error Reporting, along with other new data sources. A transition to the new naming convention has already started.

Developer documentation for the Search Console API is available on Google's developer site.

Action Point

Wait for the new Search Console to become publicly available (or for Google to extend the beta invite to users outside this particular test).

Crawl Time & Resource Allocation

Benefit: High
Difficulty: High

On 2nd August, John Mueller confirmed in a Google Webmaster Hangouts session that Googlebot limits the amount of time it spends crawling a website if the server has a slow time to first byte (TTFB). The crawler backs off to prevent disruption to users on the site. This has been a long-standing suspicion of ours.

Action Point

Make sure your load speeds are being monitored. Production environments can behave differently from Staging/UAT environments. The simplest approach is to group pages by template, e.g. category pages, facet pages, brand pages, static pages and the homepage.

Estimate your maximum number of concurrent users, then use load testing tools such as loader.io or Apache Bench to see how each template performs under that load.

Consider upgrading the web application infrastructure (dedicated or cloud servers), caching requests, Domain Name System (DNS) resolution and other low-level elements to improve the initial load time.

Canonical Tags & Noindexing

Benefit: Low
Difficulty: Low

Using a canonical <link> element together with a robots <meta> element set to noindex will cause the canonical <link> to be ignored. Google assumes that the intent to keep a page out of the index prevails over canonicalising it to another page.

Action Point

Avoid placing a canonical <link> element and a robots <meta> noindex rule on the same webpage.
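For clarity, this is the conflicting combination to avoid (the URL is illustrative):

```html
<!-- Conflicting signals on the same page: the noindex rule wins
     and the canonical hint is ignored. -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
<meta name="robots" content="noindex">
```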

JavaScript Hacks

Benefit: Medium
Difficulty: Low

Googlebot will only index content that is visible once the Document Object Model (DOM) has loaded and the page load event has fired. After that, Googlebot does not index additional content, although we have seen evidence of it following JavaScript links.

Action Point

If you have duplicate content such as Delivery & Returns information or sizing charts, trigger that content to load after the initial page load has completed. Tabbed content is indexable in the mobile view, so fetching it with an AJAX request when the tab is clicked removes the duplication.
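A minimal sketch of that pattern, assuming a hypothetical fragment URL and tab element id; the fetch function is injected so the snippet can be exercised outside a browser:

```javascript
// Wire a tab so its (duplicate) content is only fetched on click,
// i.e. after the initial page load that Googlebot indexes.
// doc and fetchFn are injected; tabId and fragmentUrl are hypothetical.
function deferTabContent(doc, fetchFn, tabId, fragmentUrl) {
  var tab = doc.getElementById(tabId);
  tab.addEventListener('click', function () {
    fetchFn(fragmentUrl).then(function (html) {
      tab.innerHTML = html;
    });
  });
}

// In a browser you would call it once the page has loaded:
// window.addEventListener('load', function () {
//   deferTabContent(document,
//     function (u) { return fetch(u).then(function (r) { return r.text(); }); },
//     'returns-tab', '/fragments/delivery-returns.html');
// });
```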

Mobile First Indexing

Benefit: High
Difficulty: Medium

Google is still working on mobile-first indexing and it will take some time to roll out fully. John Mueller stated that websites deemed "ready" would be moved over first, followed by the rest. Webmasters will not be notified when their sites are switched over.

Action Point

Keep designing for the mobile experience first. Fast webpage load times, click-through rates and interaction are all important parts of improving revenue.

Reverse Tabnabbing

Benefit: High
Difficulty: Low

Although not strictly technical SEO, an important security flaw known as reverse tabnabbing deserves a mention. It enables phishing attacks that persuade users to submit personal details to what appear to be well-known websites. We are resurfacing it because many sites are still susceptible.

It can occur when using the attribute target="_blank" on anchor elements. The issue first came to the mass public's attention when WordPress 4.7.4 added functionality that automatically appends rel="noopener noreferrer" to all links with a target="_blank" attribute. However, it had been described a year earlier by Alex at JitBit in "Target="_blank" - the most underestimated vulnerability ever", and was first found by Aza Raskin in 2012.

Using those two additional values in the rel attribute is a security fix and will not affect SEO.

Action Point

Update all target="_blank" links to include rel="noreferrer noopener". If a link already carries a rel="nofollow" attribute, append the two values, separated by spaces.

Search web application source code and database entries for links that are affected.
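For bulk updates, the append step can be expressed as a small helper; the function name is ours, not part of any library:

```javascript
// Return a rel attribute value guaranteed to contain both security
// values, preserving any existing values such as "nofollow".
function secureRel(rel) {
  var values = (rel || '').split(/\s+/).filter(Boolean);
  ['noopener', 'noreferrer'].forEach(function (value) {
    if (values.indexOf(value) === -1) {
      values.push(value);
    }
  });
  return values.join(' ');
}
```

For example, secureRel('nofollow') returns "nofollow noopener noreferrer", and running it over an already-fixed link is a no-op.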

Alex (JitBit) also provided a fix for windows opened via JavaScript's window.open():

```javascript
var newWindow = window.open();
newWindow.opener = null;
```

Avoid "Not Secure" Warning in Chrome 62

Benefit: High
Difficulty: Low

In April 2017, the Google Security team stated that from Chrome 62, any web page containing an input form must use HTTPS, to protect the privacy of users who may be submitting data to the site. Examples of such forms include search boxes and fields for contact or billing information. The "Not Secure" warning only appears once the user interacts with the form elements; in Incognito mode, Chrome displays the warning by default.

The "Not Secure" mark is not a direct Google ranking signal. However, it could affect click-through rates, bounces and other aspects of user experience, which are known to influence rankings.

Action Point

Ensure all form submissions are served over an HTTPS connection. Note that enabling HTTPS is just one part of keeping user data secure.

Robots.txt Monitoring

Benefit: Medium
Difficulty: Low

At the beginning of August, we launched a free robots.txt checker. It turns out that quite a lot of websites change their robots.txt files on a regular basis. Some companies have a good handle on their requirements, while others appear to be running rogue. We will be posting a learning session later in September.

Action Point

Grab a free account and make sure you are monitoring all your robots.txt files on your subdomain(s) and Content Delivery Networks.
