As you might imagine, May 2018 was a hectic month, especially with the Google I/O conference taking place in California, where many exciting announcements were made.

Priority Hints Specification

It has been announced that Priority Hints (a draft specification) will soon be coming to Chrome beta. The addition will give developers a way to signal a resource’s importance to the browser.

This will enable developers to influence the order in which resources are loaded. The attribute is named importance and its valid values are “high”, “low”, and “auto”. HTML elements can be marked up as follows:

<link rel="stylesheet" href="https://assets.example.com/style.css" importance="high" />
<script src="https://assets.example.com/app.js" importance="low" />

Alternatively, HTTP headers could be used.

HTTP Headers

Link: <https://assets.example.com/style.css>; importance=high
Link: <https://assets.example.com/app.js>; importance=low

The JavaScript fetch function supports priority hints:

<script>
 // Critical Fetch request for article content
 fetch('https://api.example.com/article/1.json', { importance: 'high' }).then(/*...*/)

 // Request for related content now reduced in priority
 // reducing the opportunity for contention
 fetch('https://api.example.com/article/1/related.json', { importance: 'low' }).then(/*...*/)
</script>

Action Point

List all of your assets and prioritise them by how critical they are for first paint. This will vary from business to business.

For example, with carousels, the first image should be a high priority, while all the others can be low priority. Treat interactive JavaScript as low priority, as it rarely contributes to the first paint.
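
A minimal sketch of how this might look for a carousel, assuming the importance attribute is applied to images and scripts as proposed (the file names and domain are placeholders):

<!-- First slide is needed for first paint, so hint it as high priority -->
<img src="https://assets.example.com/slide-1.jpg" alt="Slide 1" importance="high" />

<!-- Remaining slides and the interactive carousel script can wait -->
<img src="https://assets.example.com/slide-2.jpg" alt="Slide 2" importance="low" />
<img src="https://assets.example.com/slide-3.jpg" alt="Slide 3" importance="low" />
<script src="https://assets.example.com/carousel.js" importance="low"></script>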

Chrome No Longer Render Blocking on Stylesheets in the Body

Tweeting on 29 May, Yoav Weiss noted that in Chrome Canary, developers can now load non-critical CSS via <link> elements in the <body> of the document, and these no longer block rendering of the content that precedes them.

The original idea was discussed by Jake Archibald back in 2016, and he was pleased to see the feature enabled by default. You can read more about his original thoughts on the subject here.

Action Point

Although stylesheets in the body are no longer render blocking, you should still group them into a reasonable number of files, as each extra stylesheet means another request to download. It may also be some time before other browsers stop render blocking on stylesheets in the body, so consider lazy loading stylesheets for less important styling.
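
A minimal sketch of lazy loading a non-critical stylesheet with JavaScript, assuming your less important styles have been grouped into a single file (the URL is a placeholder):

<script>
  // Inject the non-critical stylesheet once the page has finished loading,
  // so it never competes with resources needed for first paint.
  window.addEventListener('load', function () {
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = 'https://assets.example.com/non-critical.css'; // placeholder
    document.head.appendChild(link);
  });
</script>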

Chrome to Roll Back “Secure” Indicator

In the Chromium Blog on 17 May, Emily Schechter stated that users should expect the internet to be safe by default, and that as a consequence, Chrome’s positive security indicators will be removed so that the default unmarked state is secure.

Image: Chrome rolling back the “Secure” indicator in the address bar.

The change will be rolled out over time; the first phase will remove the “Secure” wording and HTTPS scheme in September 2018, with the release of Chrome 69.

All HTTP pages will then be marked as “not secure” with the release of Chrome 70, and users will be shown a red warning when entering data on any HTTP page.

Action Point

No action required.

Googlebot Only Follows Links with href Attributes

It was revealed during I/O that Googlebot only analyses anchor tags with href attributes.

With this in mind, JavaScript can be used to build link-like behaviour by changing the window location from an onClick callback function.

If the <a> does not contain an href attribute at the time of the crawl, Google will not follow it.

Example

The following gist file has two linking examples. Save the raw file to your desktop (or use the JS Bin live demo) and view the source code.

The first link uses the jQuery JavaScript library to bind a function called jsUrl to all HTML elements with a class value of “js-link”. When such an element is clicked, jsUrl executes and opens a new window using the value of its data-href attribute, giving the appearance of a standard hyperlink. This functionality is not restricted to <a> elements; you could use it on a <span> too. Googlebot will not crawl this.

NB: Turning off JavaScript will prevent the element from reacting to clicks. The example also guards against reverse tabnabbing by setting w.opener = null whenever a target attribute value of “_blank” is found.

The second link is a standard hyperlink, with no magic. Googlebot will crawl this.
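
A minimal sketch of the two approaches described above, using the class, attribute, and function names from the gist (the URLs are placeholders):

<!-- Example 1: JavaScript-driven link. No href, so Googlebot will not follow it. -->
<span class="js-link" data-href="https://www.example.com/page" target="_blank">Open page</span>

<script src="https://code.jquery.com/jquery-3.3.1.min.js"></script>
<script>
  // Open the URL stored in data-href, guarding against reverse tabnabbing
  // when the element requests a new window via target="_blank".
  function jsUrl() {
    var href = $(this).data('href');
    if ($(this).attr('target') === '_blank') {
      var w = window.open(href);
      w.opener = null; // reverse tabnabbing protection
    } else {
      window.location = href;
    }
  }
  $('.js-link').on('click', jsUrl);
</script>

<!-- Example 2: a standard hyperlink. Googlebot will follow it. -->
<a href="https://www.example.com/page">Open page</a>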

Action Point

Use this technique when you want to stop web crawlers from following specific links.

Friendly GDPR Notice

Speaking during a Google Webmaster hangout on 29 May, John Mueller answered a question regarding GDPR pop-ups and how they might affect SEO and usability.

Although Mueller admitted that pop-ups can indeed be annoying, they are fine as long as they are not intrusive and do not prevent Googlebot from crawling a page, which you can test using the mobile-friendly testing tool.

He also stated that a page’s content shouldn’t be replaced with just an interstitial, or a redirect to an interstitial, which would force Googlebot to click a button to reach the actual content.

Because Googlebot would not click that button, Google would index the interstitial content instead.

Action Point

Don’t use a paywall-style block or automatically redirect users to the privacy policy.

Parameters & Canonicals

Speaking in another Webmaster Hangout a little earlier, on 18 May, Mueller stated that using the URL parameters tool in Google Search Console affects how the crawl budget is spent.

However, he also said that if a site uses rel canonical, the URLs still need to be crawled, although this wouldn’t drain crawl budgets too much.

He also said that both can be used together.
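
For reference, a minimal sketch of a rel canonical on a parameterised URL (the URLs are placeholders):

<!-- Served at https://www.example.com/shoes?sort=price -->
<link rel="canonical" href="https://www.example.com/shoes" />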

Action Point

Download the list of parameters shown in Search Console and tag them accordingly. It will not greatly affect your “crawl budget”, but it will help keep the house in order.

Google Crawls JS Several Days Later

Live tweeting from the Google I/O conference, John Shehata confirmed that JavaScript rendering does not, in fact, occur until several days after the initial page crawl.

This is because JavaScript rendering requires a lot of computation, and it cannot happen until Googlebot has the necessary resources available to process the content.

Action Point

Following the advice from a few roundups ago:

1) Reduce DOM elements as much as possible.
2) Check the page speed using sitespeed.io to find opportunities.
3) Make sure to render server-side as much as possible.

Google Structured Data Testing Tool & Tag Manager

Towards the end of May, Joe Kelly reported on Twitter that the Google Structured Data Testing Tool had stopped working with Google Tag Manager.

This meant that if a webmaster deployed markup through Google Tag Manager, the Structured Data Testing Tool wouldn’t validate some of it.

Although complaints had been flowing for several days before the issue was properly picked up, no official explanation came from Google beyond John Mueller confirming that he would “check some things”.

On 29 May Simo Ahava tweeted that the issue can be fixed by “programmatically adding the JSON LD instead of adding it as a <script> block directly.”
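
A minimal sketch of that workaround, building the JSON-LD with JavaScript (for example inside a Custom HTML tag in Tag Manager) rather than pasting it as a static <script> block; the structured data values are placeholders:

<script>
  // Build the structured data object and append it as a JSON-LD script element.
  var data = {
    '@context': 'https://schema.org',
    '@type': 'Organization',
    'name': 'Example Ltd',
    'url': 'https://www.example.com/'
  };
  var el = document.createElement('script');
  el.type = 'application/ld+json';
  el.text = JSON.stringify(data);
  document.head.appendChild(el);
</script>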

Action Point

Always aim to use server-side rendered HTML, or native JavaScript, for important information where possible.

Reporting of New Rich Data

On 27 May Google added support for both “TV and Movie” and “Event” content types.

This means that when a user searches for the name of a TV show or a movie, a Knowledge Graph Card might appear in the Search results.

The cards can also include actions that can enable users to watch media from partnered streaming services and apps.

Organised events can also be marked up so that users can discover them through Search and other products such as Google Maps.
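
A minimal sketch of Event markup using JSON-LD (all names, dates, and URLs below are placeholder values):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Web Conference 2018",
  "startDate": "2018-07-14T09:00",
  "location": {
    "@type": "Place",
    "name": "Example Hall",
    "address": "123 Example Street, London"
  },
  "offers": {
    "@type": "Offer",
    "url": "https://www.example.com/tickets",
    "price": "30.00",
    "priceCurrency": "GBP"
  }
}
</script>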

Action Point

All types of events can be added through markup, and [you can use the Data Highlighter](https://support.google.com/webmasters/answer/2774099) if you only have a few events on your website.

It’s worth noting, however, that a warning was issued a few months back for people abusing the Event markup. Make sure you use it wisely.

New Google My Business API Features Announced

On 1 May Google released a host of new API features for Google My Business, including a brand new agency dashboard.

The new features within version 4.1 of the Google My Business API are well worth investigating, and include:

  • The ability to retrieve and view insights for user-generated photos and video for managed locations.
  • The option to list, accept, and decline account and location-related invitations via admin managed APIs.
  • The creative ability to describe your business in your own voice and share its unique offerings and history with users.
  • The option to share the date that your location opened for business.
  • The ability to prepare a new type of Post on Google so that users can find deals at your locations.

The new agency dashboard also has a host of interesting features and is designed for managing multiple businesses. Its features include:

  • The ability to manage more than 100 locations from a single account.
  • The option of User Groups to manage teams and control access to specific locations.
  • Improved search functionality so that users can efficiently search for locations within an account.
  • Easier workflows to send and receive invitations.

Action Point

You could integrate the Google My Business API with your content management system, and import user-generated images and videos along with their insights.

Anonymous Reviews Ignored

Writing on his blog late in May, Mike Blumenthal confirmed that Google no longer counts anonymous reviews from “A Google User” in the total shown in Knowledge Panels.

This means that some businesses could see a drop in their review count within Knowledge Panels, especially if a company started the review process before the Google+ era.

Action Point

Encourage customers and clients to leave reviews if they have not done so already.

Lighthouse 3.0

Google Developers announced Lighthouse 3.0 on 10 May via Twitter. The new version brings faster audits, less variance between runs, a new report UI, and new features.

The Chrome extension should automatically update to 3.0, and for Chrome DevTools it will be available in Chrome 69.

Otherwise, to update to 3.0:

  • CLI: run npm install -g lighthouse@next
  • Node: run npm install lighthouse@next (see the sketch after this list)
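
A minimal sketch of running the Node module programmatically once installed, assuming the chrome-launcher package is also installed (the audited URL is a placeholder):

const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

(async () => {
  // Launch a headless Chrome instance for Lighthouse to audit against.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

  // Audit a placeholder URL and pull out the Lighthouse Result (lhr) object.
  const { lhr } = await lighthouse('https://www.example.com/', { port: chrome.port });

  // In 3.0, category scores are reported on a 0-1 scale.
  console.log('Performance score:', lhr.categories.performance.score * 100);

  await chrome.kill();
})();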

The new version also includes simulated throttling via a new internal auditing engine named Lantern. This runs audits under normal network and CPU settings, then estimates how long the page would take to load under mobile conditions.

To decide when a page has finished loading, the new version also shortens the period of network and CPU inactivity that Lighthouse waits for.

In collaboration with the Chrome UX (Research & Design) teams, 3.0 also has a new, easier-to-understand UI, which you can view below:

https://developers.google.com/web/updates/images/2018/05/lighthouse3.png

Action Point

Read more about some of the important updates here.

Lighthouse Badge for Open Source

As you might be aware, open source projects power much of the internet in various capacities.

In an effort to help show how well a web application performs, Google has created Lighthouse badges for open source projects.

Action Point

Ensure your website runs as fast as possible. Benchmark your projects and dependencies and use a relatively large sample size to see a good mean average (at least 50 tests per page type).

Web Performance Made Easy

Action Point

Watch the Web Performance Made Easy talk from Google I/O. You can view all of the Google I/O videos here.

Takeaways:
1) New web performance audits (Lighthouse).
2) Find things to remove from the page without affecting overall performance.
3) Use a Cache-Control policy to set how long a resource can be cached before it must be re-downloaded.
4) Regularly check the source code to remove technical debt.
5) Optimise images and videos (https://images.guide):
5a) For animated backgrounds, use videos rather than GIF files.
5b) Provide an equivalent static image, so that the navigator.connection.effectiveType API can be used to decide whether to show the background animation on slower connections (2G-3G) (see the sketch after this list).
5c) Lazy load images to fetch them on demand.
6) Self-host web fonts for maximum control and speed.
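
A minimal sketch of point 5b, swapping a background video for a static image on slower connections (the element ID and image file are placeholders):

<script>
  // navigator.connection is not available in every browser, so feature-detect it.
  var connection = navigator.connection || navigator.mozConnection || navigator.webkitConnection;
  var isSlow = connection && ['slow-2g', '2g', '3g'].indexOf(connection.effectiveType) !== -1;

  var video = document.getElementById('bg-video'); // placeholder background video element
  if (isSlow && video) {
    var img = document.createElement('img');
    img.src = 'background-static.jpg'; // placeholder static equivalent
    video.replaceWith(img);
  }
</script>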

One of the initial takeaways was that Google does not read or abide by canonical <link /> elements that are not present in the server-rendered HTML (i.e. those inserted via JavaScript).

Some consultants have since found that there may be a bug whereby Google does read and abide by canonical <link /> elements that are not in the server-rendered HTML. We should have more clarity throughout June.

Firefox Focus Offers GDPR Tracking Protection

Writing on The Mozilla Blog, Firefox Focus announced that it is offering additional tracking protection against advertisers in light of the new GDPR laws.

Previously, Firefox Focus had blocked first-party trackers from sites that were known to follow users across the web (otherwise known as cross-site tracking).

On 23 May however, the company announced a new cookie management feature that gives users full control over the source of trackers following them.

This allows people to protect their online activity from being visible across sites, including to third-party sites.

Users can make their decision under Settings > Privacy & Security > Cookies and Site Data.

Action Point

The GDPR implementation will most likely have a great impact on analytics data, and month-on-month and year-on-year comparisons may be down as a result of opt-in consent. Firefox Focus gives users greater privacy on sites that have not yet implemented GDPR compliance. Now the deadline has passed, it would be a good time to clear all of your browser history for a clean session (that will mean logging in again, which isn’t a bad thing).

Chrome Autoplay Policy Moved to Release 70

On 23 May, via Twitter, Chrome Developers announced that applying the new autoplay policy to the Web Audio API has been postponed until Chrome 70, which is due for release in October 2018.

Writing on the Chrome updates site, it was noted that the autoplay policy for audio and video elements in M66 Stable is currently blocking around half of unwanted media autoplays in Chrome.

The Web Audio API policy, when launched, will affect web games, some WebRTC applications, and other web pages that use audio features.
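
When the policy does land, the generally recommended pattern is to start or resume audio only after a user gesture. A minimal sketch, assuming a page that creates its own AudioContext:

<script>
  // Under the autoplay policy the AudioContext may start in the "suspended"
  // state, so resume it on the first user interaction.
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();

  document.addEventListener('click', function () {
    if (audioCtx.state === 'suspended') {
      audioCtx.resume();
    }
  }, { once: true });
</script>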

Action Point

No action required.

Yandex

Improving Local Results

Writing in Yandex Blog for Webmasters, on 25 May the Russian search engine announced that it had revised how business ratings affect rankings in Yandex Maps.

The company states that it has paid particular attention to the reliability of reviews (such as whether people have visited a business or area), as well as the total number of ratings that a person has previously given.

The search engine has also implemented a series of updates in the search engine itself so that users have more opportunity to leave feedback on companies or organisations.

Users might be asked whether they have visited a particular place and, should they confirm that they have, be given the option to rate it.

Action Point

If relevant, check your rating in Yandex.Reference and make sure you actively monitor and respond to feedback and ratings, as Yandex is planning more updates in the near future.

Additional Reading

Research Finds that 77% of websites have JS vulnerabilities

Google I/O – AI conversation.

The AI conversation from Google I/O caused excitement and fear. Will every phone call start with a Turing test?

Search Engine Roundtable Webmaster Report

Search Engine Roundtable provides a great summary of all SEO conversations that occur online. Make sure you check out the May recap.