Flash Support Enters Final Stages
28 October saw the publication of what may be the final notification regarding Flash. It shouldn't come as a surprise that Google Search will stop supporting Flash before the end of the year.
While no firm date was provided, the search engine will begin ignoring Flash content within web pages and cease indexing standalone SWF files. By way of a eulogy, Google Engineering Manager Dong-Hwi Lee said that Flash had been "the answer to the boring static web".
As this change has been long in the making, it is expected that most websites and users will already have migrated away from using Flash assets. Flash is already disabled by default in Chrome, as well as in other browsers, including Microsoft Edge and Firefox.
Should your site still offer content via Flash, provide steps for your users to re-enable Flash safely in their browser. Expect Flash assets to disappear from Google Search before the end of the year. If you are creating new Flash-style content, consider a future-proof alternative such as HTML5.
Chrome Moves to Block Mixed Content By Default
Beginning in December 2019, the Chrome browser will gradually begin blocking mixed content on HTTPS pages, according to a Chromium Blog post published on 3 October.
With the release of Chrome 79, Google Chrome will respond to mixed content in two ways:
- The browser will automatically upgrade HTTP content to HTTPS if the resource is available.
- A toggle will be offered to users so they can override the default block and view the insecure content.
Google states that it will be rolling out the changes gradually to "avoid warnings and breakage". You can read the timeline of events leading up to full content blockage in the aforementioned Chromium post.
Ensure that, wherever possible, your HTTPS pages only load secure HTTPS resources. At the very least, be sure your resources can be accessed via the https:// protocol so they benefit from the auto-upgrade. You can check for mixed content using a variety of applications and tools, including Lighthouse's mixed content audit, the Screaming Frog software, and the JitBit SSL Checker.
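To illustrate what those tools are looking for, here is a minimal sketch of a mixed-content check in Python: it parses a page's HTML and reports subresources still referenced over plain http://. The sample page and URLs are placeholders; dedicated tools are far more thorough, and this only demonstrates the idea.

```python
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    """Collects subresource URLs that a browser would fetch over plain HTTP."""

    def __init__(self):
        super().__init__()
        self.insecure = []  # list of (tag, url) pairs found

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            insecure_ref = value is not None and value.startswith("http://")
            # href only counts for <link> (stylesheets etc.); a plain <a> link
            # is a navigation, not a subresource, so it is not mixed content.
            if insecure_ref and (name in ("src", "data", "poster")
                                 or (name == "href" and tag == "link")):
                self.insecure.append((tag, value))

def find_mixed_content(html):
    finder = MixedContentFinder()
    finder.feed(html)
    return finder.insecure

page = """
<html><head><link rel="stylesheet" href="https://example.com/site.css"></head>
<body><img src="http://example.com/logo.png"></body></html>
"""
print(find_mixed_content(page))  # the insecure <img> is flagged; the HTTPS stylesheet is not
```

Note that a check like this only covers static markup; resources injected by JavaScript at runtime are exactly why an in-browser audit such as Lighthouse's is the safer option.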
You can read more about preventing mixed content in this useful Web Fundamentals article.
Site Kit Released for All WordPress Sites
The Site Kit plugin allows webmasters to view a variety of Google-led insights and metrics within WordPress, including traffic sources, performance, and even individual page reports.
Site Kit takes information from multiple Google tools, including Google Search Console, Google Analytics, PageSpeed Insights, and AdSense.
As this is a plugin, it requires no changes to source code, which should be useful for webmasters with little development knowledge or few development resources.
Collating multiple reports and metrics into a single dashboard can be a great way to optimise your workflow. WordPress is a platform that many people are already familiar with, so Site Kit could be an excellent way of introducing team members or clients to the insights available via the whole range of Google tools.
Google has also set up a Site Kit forum where you can ask questions about the plugin.
New Video Reports Now Available in Search Console
On 7 October Google announced that it was adding two new video reports into Search Console. These are known as the Video Enhancement Report and the Video Appearances in Performance Report.
Video Enhancement Report
In this report, webmasters can view errors and warnings regarding any video structured data they have implemented on a verified site. The report also allows webmasters to validate fixes by requesting a re-crawl.
Video Appearances in Performance Report
The Video Appearances in Performance Report allows webmasters to see the performance of their videos, both within the primary Search tab and within Discover, using the new "videos" tab.
Your content will appear in this report if the web page has deployed VideoObject structured data or if Google manages to detect a video (or videos) on the page via other signals.
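For reference, VideoObject markup is ordinarily embedded as JSON-LD in the page's head. The sketch below builds a minimal example in Python; the title, URLs, and dates are placeholders for an imaginary video, and json.dumps produces the body you would place inside a script tag of type application/ld+json.

```python
import json

# Placeholder VideoObject structured data for an imaginary video.
# All property names are standard schema.org VideoObject properties.
video_object = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example product walkthrough",
    "description": "A short walkthrough of the product.",
    "thumbnailUrl": ["https://example.com/thumb.jpg"],
    "uploadDate": "2019-10-01",
    "duration": "PT2M30S",  # ISO 8601 duration: 2 minutes 30 seconds
    "contentUrl": "https://example.com/video.mp4",
}

# Serialise to the JSON-LD body for a <script type="application/ld+json"> tag.
print(json.dumps(video_object, indent=2))
```

Pages carrying markup like this should surface in the Video Enhancement Report once re-crawled.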
If your website produces video content, the new reports will be useful for understanding its performance across both Search and Discover. If you use structured data for your videos, be sure to use the Video Enhancement report to identify previously undiscovered issues. You can learn more about avoiding issues in this video best practice guide.
Googlebot Updates Announced for December
Writing in the Webmaster Central Blog on 2 October, Google announced that from December, the search engine will update Googlebot so that it reflects the latest version of the Chrome browser.
Instead of a fixed placeholder such as "W.X.Y.Z", the user agent string will now contain something similar to "76.0.3809.100", reflecting the Chrome version currently in use.
Google states that most sites will not be affected by the change and that those using feature detection and progressive enhancement should continue to work.
Sites looking for specific user agents, however, might be affected.
Google states that: "You should use feature detection instead of user agent sniffing. If you cannot use feature detection and need to detect Googlebot via the user agent, then look for 'Googlebot' within the user agent."
If you are unsure how this might affect your site, load one of its web pages using the new Googlebot user agent. Google recommends following these instructions to override a user agent in Chrome.
There is also an official list of user agents used by Google, which you can use to identify crawlers.
Google Introduces BERT
On 25 October Google announced that it was introducing the BERT algorithm, which stands for Bidirectional Encoder Representations from Transformers.
It is Google's neural network technique for natural language processing (NLP) pre-training. Its function is to better understand the language used by users.
Impacting one in ten searches, the algorithm rolled out throughout the final week of October for English search queries. It will also affect Featured Snippet results.
Open-sourced last year, the algorithm was written about in extensive detail on the Google AI blog.
Understanding the subtle nuances in language will help Google provide more accurate search results to users, and the search engine offers an example of how BERT could change a search result.
For instance, if a user had previously searched for "do estheticians stand a lot at work", the search engine would have matched the term "stand-alone" with "stand". However, with the new algorithm in place, the search engine can differentiate between the multiple functions of the word "stand".
Google states that BERT represents "the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search."
Danny Sullivan confirmed that webmasters cannot optimise for BERT — just as they can't for RankBrain. Check your site traffic through the last days of October and in early November to see if it was affected.
If you find that traffic or overall visibility has been impacted (whether positively or negatively), assessing where and how your content adheres to Google's quality guidelines would be a good first step. While this update's impact could be significant, it does not represent a change in Google's philosophy towards the kind of content it rewards with high positions.
Customisable Search Snippets Rolled Out
Tweeting on 16 October, the official Google Webmasters account announced the rollout of new markup that allows webmasters to customise how their featured snippets are displayed to users.
Previously discussed on the Webmaster Central Blog in September, the new markup allows webmasters to customise certain attributes of their snippets, including snippet length, video preview duration, and thumbnail size.
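The snippet controls from the September announcement are expressed as robots meta directives: max-snippet, max-video-preview, and max-image-preview. The sketch below assembles them into the meta tag you would place in a page's head; the specific limits chosen here are arbitrary examples, not recommendations.

```python
# Example snippet-control directives (values are arbitrary illustrations).
directives = {
    "max-snippet": 50,             # at most 50 characters of text snippet
    "max-video-preview": 30,       # at most 30 seconds of video preview
    "max-image-preview": "large",  # allow a large image thumbnail
}

# Assemble the robots meta tag to be placed in the page's <head>.
content = ", ".join(f"{key}:{value}" for key, value in directives.items())
meta_tag = f'<meta name="robots" content="{content}">'
print(meta_tag)
```

The same directives can also be served in an X-Robots-Tag HTTP header for non-HTML resources.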
Webmasters that began implementing the markup back in September should now see their efforts reflected within snippet results.
Optimising the way your previews appear to users should be a useful way to gain a competitive edge within featured snippet results. Check out our analysis of the September announcement in last month's Technical SEO Roundup.
Bingbot Adopts Microsoft Edge Rendering Engine
Microsoft has announced that Bingbot will adopt Microsoft Edge as the engine it uses to render web pages. Their statement reads: "Bingbot will now render all web pages using the same underlying web platform technology already used today by Googlebot, Google Chrome, and other Chromium-based browsers."
The search engine states that the adoption will reduce fragmentation of the internet, make life easier for SEOs, and "ease" rendering engine concerns for web developers. Microsoft is committed to keeping Bingbot evergreen, which will mean updating its renderer to the newest stable version of Microsoft Edge.
This transition will be made "under the hood" over an unspecified length of time.
Bing Tests Content Submission API
Speaking at Pubcon on 9 October, Fabrice Canel announced that Bing is piloting a content submission API. In theory, this means that the search engine would not be required to first discover a page before it can be considered for indexing, so long as a webmaster has already submitted it.
In the pilot, selected webmasters will be allowed to submit their URLs to the API, which will then deliver the payload of content, HTML, images, and other assets.
During a Webmaster Hangout on 29 October, John Mueller said that Google is neither working with Bing nor creating its own content submission API. He also said that Google has no plans to make one in the future.
At present there is no way to request access to this pilot project. Should that change we will be sure to report it in a future round-up. Should the pilot be a success we would expect the feature to be rolled out to a larger set of users.
New Patent By Trystan Upstill Granted
Writing for Search Engine Journal, Bill Slawski noted that Trystan Upstill, Google's Head of Core Web Ranking and Mobile Content Search, has been granted a new patent discussing adjusted search features that may change search engine scores.
Gary Illyes Confirms There is no E-A-T Score
Speaking at Pubcon on 10 October, Gary Illyes confirmed that there is no singular E-A-T score that Google uses to rate sites, as there are multiple algorithms related to the expertise, authoritativeness and trustworthiness concept. Check out the write-up by Search Engine Land.