Problem with CSS Custom Properties

Google engineer Eric Bidelman tweeted on 28 February that Chrome 41 does not, in fact, support CSS Custom Properties.

This is because Chrome 41 was released in March 2015, whereas Custom Properties only became a W3C Candidate Recommendation in December 2015.

This could prevent web pages from rendering correctly for Googlebot, creating a considerable deviation between what Google sees and what users see.

Details about Custom Properties can be found in the Mozilla (MDN) documentation.

The current Bootstrap 4.0.0 release and its dev branch make use of Custom Properties, which are generated during Sassy CSS (SCSS) compilation to CSS.

GitHub advanced search of the Bootstrap repository for Custom Properties ("--") in SCSS files

The _root.scss file shows that Custom Properties are used for colours, breakpoints (responsive widths), and font families.

:root {
  // Custom variable values only support SassScript inside `#{}`.
  @each $color, $value in $colors {
    --#{$color}: #{$value};
  }

  @each $color, $value in $theme-colors {
    --#{$color}: #{$value};
  }

  @each $bp, $value in $grid-breakpoints {
    --breakpoint-#{$bp}: #{$value};
  }

  // Use `inspect` for lists so that quoted items keep the quotes.
  // See https://github.com/sass/sass/issues/2383#issuecomment-336349172
  --font-family-sans-serif: #{inspect($font-family-sans-serif)};
  --font-family-monospace: #{inspect($font-family-monospace)};
}
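
After compilation, the generated CSS contains plain custom property declarations. A sketch of the sort of output produced (the values are taken from Bootstrap's default settings and should be treated as illustrative):

:root {
  /* Generated from the $colors map */
  --blue: #007bff;
  /* Generated from the $grid-breakpoints map */
  --breakpoint-md: 768px;
  /* Generated via inspect() so the quoted font names keep their quotes */
  --font-family-monospace: SFMono-Regular, Menlo, Monaco, Consolas, "Liberation Mono", "Courier New", monospace;
}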

The browser support site caniuse.com reports that Chrome 41 is now used by only 0.01% of all users.

Custom Properties support was introduced in Chrome 48, but it was only enabled by default in Chrome 49, which was released in March 2016.

UPDATE: 12 MARCH 2018

Eric Bidelman has created a Puppeteer script in the GoogleChromeLabs GitHub repository that you can use to detect whether a page relies on features that Chrome 41 does not support.

Action Point

Review your CSS files for properties that start with "--". Make sure you perform Fetch & Render tests for Desktop and Mobile on each of your page templates to ensure those pages render as expected.
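
Where older engines such as Chrome 41 must still be supported, one common approach is to declare a static fallback value immediately before the declaration that uses a custom property; browsers that do not understand var() drop that declaration and keep the fallback. A minimal sketch (the class name and colour values are illustrative only):

:root {
  --brand-primary: #007bff;
}

.btn-primary {
  /* Static fallback for browsers without Custom Properties support (e.g. Chrome 41) */
  background-color: #007bff;
  /* Overrides the fallback wherever var() is understood */
  background-color: var(--brand-primary);
}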

CSS Paint API and the ServerTiming API

The Chrome team has released the CSS Paint API in Chrome 65, allowing developers to programmatically generate an image wherever a CSS property expects one, with the intention of consuming less data than traditional image assets.

Unfortunately, however, such images might not display correctly when rendered by Google's web crawler, because it relies on Chrome 41, which was released in early 2015.
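
For context, the Paint API is consumed from CSS once a paint worklet has been registered in JavaScript via CSS.paintWorklet.addModule(); the selector and painter name below are hypothetical. Browsers without support, including a Chrome 41-based renderer, ignore the paint() declaration and keep the plain fallback:

.panel {
  /* Fallback for browsers (and crawlers) without the CSS Paint API */
  background-image: linear-gradient(#e0e0e0, #ffffff);
  /* 'my-painter' is a hypothetical painter registered by a paint worklet */
  background-image: paint(my-painter);
}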

The new Server Timing API can pass performance information from the server to the browser through a designated response header, and is designed to offer a more complete performance picture covering both the client and the server.
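
For reference, Server Timing metrics are sent as comma-separated entries in a single response header; the metric names and durations below are purely illustrative:

Server-Timing: db;dur=53, cache;desc="Cache read";dur=23.2, app;dur=47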

But this is not a good idea for production servers, as it could expose more information than you would like.

Action Point

Wait for Google to update its crawler to Chrome 65 or above before implementing the Paint API.

We do not recommend using the Server Timing API in production. Instead, use a dedicated service such as New Relic for performance monitoring.

Google Drops Support for Meta News Keywords

After six years of encouraging publishers to use the news_keywords meta tag, John Mueller stated in a conversation on Twitter that Google stopped using it some time ago.

The search engine introduced the tag in 2012 so that publishers could enjoy more creativity in both headlines and copy without having to cram keywords into articles.

The admission means that news organisations can now safely stop placing the news_keywords meta tag in articles, as Google no longer pays attention to it, just as it ignores the normal meta keywords tag.

Action Point

No action required.

(However, make sure you check out the Further Reading section below for Bing's PubHub).

Google Mobile Speed Scorecard

On 26 February Google introduced the mobile Speed Scorecard and Impact Calculator, easy-to-use tools that allow you to compare your mobile site speed against that of other companies.

Powered by the Chrome User Experience Report, the tool also allows site owners to calculate how much revenue could be lost due to slow page load times.

Such factors are particularly important for ecommerce stores, where a delay of just one second can cause a 20% drop in conversions.

Action Point

Use the mobile speed report to supplement the other speed reports. Aim to provide the fastest website amongst your competitors.

AMP Stories

Launched as an addition to the AMP Project on 13 February, AMP Stories provide publishers with a new format for delivering news and information.

Mobile-focused and visually rich, AMP Stories allow publishers to engage with users through easily consumable, visual storytelling.

The stories are built on the technical infrastructure of AMP, and a publisher can host an AMP story HTML page and link to it from anywhere else on their site.

Simple to implement and arriving with preset formats, AMP Stories enable smaller and less tech-savvy publishers to use them while remaining creative and flexible.

For those interested, a tutorial and supporting documentation have been published.

You can view AMP Stories on Google Search by searching for prominent publishers at g.co/ampstories within a mobile browser.

It is expected that Google will be expanding AMP stories throughout Google Search and across its products in the future.

Action Point

Follow the aforementioned tutorial published on the AMP Project to learn how to create stories.

Google Launches People Also Search For Box

Sometime in mid-February, Google quietly altered the design of the "people also search for" box.

The search engine has been testing and redesigning the box for a number of years, including the introduction of dynamic loading in 2015.

The new design displays a "people also search for" box below an organic search result when a user clicks on that result and then returns to the SERP.

Whether it appears may depend on the information available for the original search.

Action Point

No action required.

Google Chrome 64 (Mobile) Trims Parameters

On 4 February Android Police reported that Google Chrome 64 for mobile and tablet devices uses the canonical tag when the user shares a link by clicking:

Share Icon > Copy > Paste
Android Police Chrome Shorten URL

This means that users who copy a link via the share menu will paste URLs that are free of the unnecessary parameters that often cause bloat.

People are also able to copy URLs to the clipboard and share them to other apps, and if they highlight the URL bar and select the text manually, the full URL is still available to share.
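
For clarity, the URL Chrome copies via the share menu is whichever one the page declares in its canonical link element; the address below is a placeholder:

<link rel="canonical" href="https://www.example.com/product-page/">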

To get a better understanding of how it works, we ran some tests to see whether Chrome leaves a footprint in the server logs by making an additional request for the canonical URL; it does not.

Action Point

No action required.

Google Removes “View Image” Button

An ongoing dispute between Google and Getty Images (among others) came to a head when Google removed the "View Image" button from search results in early February.

The stock provider filed a complaint with the European Commission in 2016, accusing the search engine of “creating captivating galleries of high-resolution, copyrighted content” while failing to direct users to original sources.

The button enabled people to access images via a direct link to the image URL so that they did not have to visit the webpage on which they were hosted.

That said, users can still find the direct link within the hosting web page source code.

Action Point

Take note of the click-through rates for Image Search in Google Search Console.

Avid users of the View Image button have already created browser extensions for Chrome and Firefox that restore it. Ensure that all images are used in accordance with their copyright licences.

If your business sells imagery, ensure you clearly state copyright requirements.

Google Changes Request Crawl Limit

Google has revised the request recrawl limits that apply over a 30-day period and has updated its help documentation page to specify what has changed.

In short, the updated documentation explains the following with regard to recrawl limits and quotas:

The number of "Crawl only this URL" requests has been decreased:

  • Was: 500 requests every 30 days
  • Now: 10 per day x 30 days = 300 requests

The number of "Crawl this URL and its direct links" requests, however, has increased:

  • Was: 10 requests per 30-day window
  • Now: 2 per day x 30 days = 60 requests
Action Point

Use the recrawl functionality when adding new sections to a website. Create a temporary HTML sitemap and choose the “Crawl this URL and its direct links” option. We are not aware of any “direct links” limit.

Yandex Webmasters – Turbo Pages Update

It’s been three months since Yandex made adding Turbo pages (the equivalent of Accelerated Mobile Pages) available to any site, and on 20 February, the corporation gave an update on how the pages have been performing since release.

You can read the original announcement here, but non-Russian speakers will need to use Google Chrome to automatically translate the page; Firefox users can install a Google Translate extension.

It seems that much like the AMP Project, Turbo pages have been well received by webmasters, despite a minor complaint, wherein people noticed that the content within a Turbo Page might vary from the original version.

The company said that this happens because, although the content of the ordinary and light versions should fully correspond with each other, not all material is transferred to the RSS channel.

It has since added a new type of warning, which will state that, “The content of the Turbo page does not match the original version” if this is the case.

Since 5 March, poor-quality Turbo pages have not been appearing in Search, Zen, or News results. When this happens, the original site pages are served to users instead.

Prospective Turbo pages are submitted via an RSS feed with the content embedded, and you can view the requirements and examples here.

Yandex Turbo pages in Yandex Webmaster Tools
Action Point

Yandex has a great webmaster section with a variety of tools. Verify your website and discover what Yandex already knows about your site. You can add Turbo Pages to your site here.
If your business has Russian customers, then you may wish to add more information and analytics tracking.

Yandex Webmasters – Bypass Speed

On 15 February Yandex announced a new tool named "Bypass speed", which has been added to the Indexing section of Yandex.Webmaster.

This allows webmasters to state what the crawl rate of a website should be; by default, Yandex selects the maximum speed for a site based on the load its server can handle.

The announcement was made after Yandex admitted that incorrectly configured Crawl-delay directives in robots.txt files were causing documents to be downloaded slowly.

The error meant that site owners were entering misleading figures, which reduced the crawl speed of the site. It also affected how often the robot could access a site, which could lead to pages not being indexed.

Yandex has now abandoned the Crawl-delay directive entirely.

Yandex Crawl Rate Settings for implementing a crawl delay within Yandex Webmaster Tools

Bing Webmaster Tools, on the other hand, allows you to set both the time of day and the crawl speed.

Bing Crawl Control settings from within Bing webmaster tools
Action Point

If you would like to throttle the Yandex crawler, consider removing the Crawl-delay directive from your robots.txt file and using the web interface instead. We do not recommend restricting crawlers in general, as needing to do so may be a sign that the website cannot cope under load, which in turn means a bad user experience.
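
For reference, the directive to look for and remove takes the following form in robots.txt; the two-second value is purely illustrative:

User-agent: Yandex
Crawl-delay: 2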

Chrome Will Mark All HTTP as “Not Secure”

Google has long been pushing webmasters to migrate sites to HTTPS and has over the past four years offered slight ranking boosts for doing so.

As part of its campaign, the search engine announced last year that it would gradually mark certain HTTP pages as "not secure". It has since extended this warning: with the release of Chrome 68, the web browser will mark all HTTP sites this way.

Google is offering a helping hand, however, by providing mixed content audits in the latest version of Lighthouse (an automated tool for improving web pages) for developers wanting to migrate to HTTPS.

Chrome 68 will be released in July 2018.

Action Point

Google has been leading the march towards a secure web and it is time to move forward. All businesses still serving pages over HTTP should fix this before the Q3 2018 release of Chrome 68.

Chrome Ad-blocker

As partially discussed in the December 2017 Technical SEO roundup, Google has now enabled its ad blocker in Chrome to discourage intrusive advertisements that do not comply with the Coalition for Better Ads standards.

Some of the major offenders include prestitial ads (full-page advertisements that block viewers from seeing content), flashing animated ads, and advertisements that autoplay with both sound and video, although enforcement varies between devices.

On desktop, the tool will block:

  • Pop-up ads
  • Large sticky ads
  • Auto-play ads with sound
  • Countdown ads

On mobile, it is even more stringent and will block:

  • Pop-up ads
  • Ads displayed before content (with or without a countdown)
  • Auto-play video ads with sound
  • Large sticky ads
  • Flashing animated ads
  • Fullscreen scroll over ads
  • Dense advertisements

Aware that the majority of advertisements are controlled by site owners, Google is implementing a three-step process to deal with bad advertising practices.

Once a site is evaluated, the owner will be notified of any issues and given time to correct excessive advertising before a block is enforced.

Using the Ad Experience Report API, site owners can access Google’s evaluations, which are rated as “pass”, “warning”, and “failing”.

Once the infringing ads are corrected, the site owner can apply for a re-review.

Should Google discover a high number of violations and continued inaction however, Chrome will begin blocking ads after a period of 30 days. That said, users will still have the option to view blocked ads should they wish.

Google reports that 42% of sites that were initially failing the Better Ads standards have already resolved their issues.

Watch the video below for more information:

Action Point

Review your Ad Experience Report through Search Console.

Google Files Patent to Clear up Place Names Predicament

In an article about how Google understands place names, Bill Slawski (known for his insights into digesting Google patents) has explained how the search engine has evolved from a simplistic fact repository to a deep knowledge graph.

This evolution is particularly important for place names: if you search for "York" in Wikipedia, you are presented with multiple towns and cities spread across the world, which is not very convenient for users.

But which city or town should a webpage represent if it features a place name?

To clear up the confusion, Slawski explained that Google has filed a patent which should help solve the issue of place names by utilising a schema.org representation that includes the geographical coordinates of a town, city, or place.

You can read the full patent here.

Action Point

If your web pages are ambiguous and use simple place names without specifying the wider area or country, consider adding scoping information to show intent, e.g. by using "York, UK" or even "York, Alaska, U.S." within copy. If possible, add schema.org/Place markup with geo data, as sketched below.
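
As a sketch of that last suggestion, a JSON-LD schema.org/Place block might look like the following; the coordinates are approximate values for York, UK and should be replaced with your own:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Place",
  "name": "York, UK",
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": 53.959,
    "longitude": -1.081
  }
}
</script>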

Further Reading

Article Images Blocked?

@GlennGabe highlighted that some ad blockers are using regular expressions to block specific URLs, element classes, and element patterns. Make sure you check against this list.

Bing’s PubHub

Bing allows publishers to push their content to Cortana, Microsoft Office, Dynamics, and Bing.com (+App) using its PubHub. Make sure you read Bing’s Publisher Guidelines.

Cryptojacking Government Websites

In the January 2018 Technical SEO roundup, the additional reading section included “Lessons to be Learned Around Harvesting Personal Data”. A UK government website was recently affected after a JavaScript dependency was altered. Scott Helme has summarised the attack.

AMP for Email

AMP is becoming much more prominent, with Gmail now introducing interactive emails. Building email templates is already extremely difficult, and adding extra layers of complexity makes it worse.

Bing Provides Perspective Answers

Having only a one-sided story can be bad. Bing provides both perspectives to allow further debate and help clear away biased results.

Automatic Podcasts

Amazon launched a WordPress plugin called Polly to turn your blog articles into audio files. This will open the door to more information becoming available through podcasting.

Google Wants To Get Date Timestamps Accurate In Search Results

Timestamps can be problematic. Google relies on articles reporting the true time of publication, and this does not always end well. Websites should be penalised for not providing realistic timestamps for document publication and updates. Search Engine Land reports on some examples of how it can go wrong.

Search Console External Link

The new Search Console data view makes copying and pasting links difficult. Here is an open-source Chrome extension hosted on GitHub that you can build to inject an external link.