Development News

Google Adds Movie Structured Data

On 5 September Google added movie structured data documentation to its Google Search reference guides. The new properties include:

  • aggregateRating: The average rating assigned to the movie, based on reviews.
  • dateCreated: The date that the movie was released.
  • director: The director of the movie.
  • review: A nested review of the movie.
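As an illustration, a minimal JSON-LD block for a movie page using the properties above might look like the following (the movie name, dates, and rating values are hypothetical placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Movie",
  "name": "Example Movie",
  "dateCreated": "2019-07-19",
  "director": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": 4.2,
    "bestRating": 5,
    "ratingCount": 187
  },
  "review": {
    "@type": "Review",
    "author": { "@type": "Person", "name": "John Smith" },
    "reviewRating": { "@type": "Rating", "ratingValue": 4 },
    "reviewBody": "A short, hypothetical review of the movie."
  }
}
</script>
```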

On the same day, writing in The Keyword, Google announced a new movie search interface for mobile devices that recommends movies to users based on their viewing preferences.


If your content includes movie reviews or features, consider adding the markup to your pages to take advantage of the new mobile search function. Google's structured data documentation also lists additional movie-related data that it might support in the future.

Google News

Google Announces September Core Update Rollout

Taking to its Search Liaison Twitter account, Google announced the rollout of a new core update on 24 September:

The September 2019 Update marks the second time that Google has given webmasters notice of a coming core update.


Should a core update negatively impact your site’s traffic or rankings, it’s important to look at the impact through the prism of Google’s quality guidelines, as well as any information released prior to or after the update. We’d recommend you assess your site holistically, rather than looking for a single “silver bullet” resolution.

A good place to start if your performance issues began recently would be Google’s core update guide, released in August 2019.

Google Search Quality Rater Guidelines Updated

SEM consultant Marie Haynes took to Twitter on 5 September to announce that Google had updated its Search Quality Rater Guidelines.

Previously updated on 16 May, the new version prioritises original news reporting and places greater emphasis on vetting YMYL (Your Money or Your Life) content.


Search Engine Land has compiled a comprehensive list of changes within this article, which we’d recommend reviewing even if your site hasn’t seen any downturn in performance after recent updates.

Auto DNS Verification Added to Google Search Console

Writing on the Google Webmaster Blog on 3 September, Google announced that it was adding a new method to verify websites in Google Search Console. Auto-DNS verification works in collaboration with selected domain name service providers to automate “part of the verification flow.”

Google states that: “The flow will guide you through the necessary steps needed to update your registrar configuration so that your DNS record includes the verification token we provide. This will make the verification process a lot easier.”


To verify your domain using the new flow, click “add property” from the property selector in the Search Console sidebar and choose the “Domain” option. The system will then guide you through the necessary steps. This will include a visit to the registrar site.
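Behind the scenes, domain verification relies on a TXT record in your DNS configuration that carries Google's verification token. As a hypothetical illustration (the domain and token below are placeholders, not real values), the finished record might look like:

```
example.com.  3600  IN  TXT  "google-site-verification=ABC123exampleTokenXYZ"
```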

You can also read up on other methods to verify your website in this Search Console Help guide.

Search Console now Reporting Fresh Data

In response to feedback, on 23 September, Google announced that Search Console would now report fresh data that is less than a day old.

This means that webmasters can monitor and track performance on key days and during events such as major shopping days or national holidays.

It also means that webmasters can quickly check site traffic after an important technical issue has been fixed.

Google Search Console Performance Report Screenshot

Google notes that neither the Search Analytics API nor the Discover performance report supports fresh data, although this is to be addressed in the future.


The ability to monitor and analyse fresh data is beneficial during busy periods and times when a webmaster has implemented technical changes to a website. Familiarise yourself with the fresh data and bear the new functionality in mind when analysing busy or important periods.

Google Introduces Two New Link Attributes

Writing on 10 September, Google announced that it was adding two new link attributes to provide webmasters with ways to identify the nature of a link. These are:

  • rel="sponsored": Use this attribute when a link is created as part of a sponsorship, advertisement, or other compensation agreement.
  • rel="ugc": The User Generated Content (UGC) attribute, to be used for links within content generated by users.

More than one attribute can be used at a time. For example, if a link appears within user-generated content and is also sponsored, rel="ugc sponsored" is a valid combination of attributes.
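In HTML, the attributes described above might be applied like this (the URLs and anchor text are placeholders for illustration):

```html
<!-- A paid or sponsored placement -->
<a href="https://example.com/product" rel="sponsored">Partner offer</a>

<!-- A link left by a user, e.g. in a blog comment -->
<a href="https://example.com/blog" rel="ugc">Commenter's site</a>

<!-- A sponsored link that appears inside user-generated content -->
<a href="https://example.com/deal" rel="ugc sponsored">Discount link</a>
```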

Accidentally using the wrong attribute is only a concern for sponsored links. For example, if a webmaster flags a non-ad link as rel="sponsored", the search engine might not credit the authority of the link to the recipient page.

Google states that webmasters do not need to update their current nofollow attributes.


Highlighting the function of a link is incredibly useful for sites that often display different types of links on their pages. Google has provided a short FAQ section at the bottom of its post, so be sure to read through it to get a full understanding of the new attributes.
Adapt your internal linking strategy to be mindful of the new attributes moving forward.

Google Hands Over Snippet Control to Webmasters

On 24 September Google announced that it was providing more control to webmasters so that they can dictate how their sites are previewed in listings.

Going live in the second half of October, webmasters can apply the settings through a list of robots meta tags or an HTML attribute.

These can be added to an HTML page or specified via the x-robots-tag HTTP header:

  • nosnippet: This option is unchanged.
  • max-snippet:[number]: Lets webmasters specify a maximum text length, in characters, for a snippet.
  • max-video-preview:[number]: Lets webmasters specify a maximum duration, in seconds, for an animated video preview.
  • max-image-preview:[setting]: Lets webmasters specify a maximum size for an image preview using the values "none", "standard", or "large".

The tags can also be combined if a webmaster wants to control the snippet and image preview together. For example:

<meta name="robots" content="max-snippet:50, max-image-preview:large">
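As noted above, the same directives can alternatively be delivered via the x-robots-tag HTTP response header instead of a meta tag. A sketch of an equivalent header for the example above:

```
X-Robots-Tag: max-snippet:50, max-image-preview:large
```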

Aside from the above meta tags, webmasters can also use an HTML attribute to prevent part of an HTML page from being shown within the textual snippet. For example:

<p><span data-nosnippet>Coca Cola</span> is probably the most popular carbonated drink in the world.</p>

Google states that the above instructions are firm directives, rather than hints that it may choose to ignore.


Once live, Google will announce the roll-out on its Google Webmasters Twitter account. Although the new tags won’t directly impact your rankings, they might affect how certain listings are shown in rich results or featured snippets.

The new settings give webmasters more flexibility over how their websites are shown. Consider implementing the new features on appropriate pages ahead of the roll-out.

Google Gives Preferential Treatment to Original Reporting

Writing in The Keyword on 12 September, Richard Gingras, Google’s VP of News, stated that over the past few months, Google has modified its algorithm to give preferential treatment to original reporting.

Gingras wrote: “Recently, we’ve made ranking updates and published changes to our search rater guidelines to help us better recognize original reporting, surface it more prominently in Search and ensure it stays there longer.” This means that original reporting should not only achieve higher ranking positions, but should also hold them for longer.

Earlier in September, Google updated its Search Quality Rater Guidelines and instructed raters to give the highest ratings to “very high quality” original news reporting.

The updated guidelines also ask raters to consider a publisher’s overall reputation for original reporting. These changes are not confined to Google News; they also affect Google Search and Google Discover.


If you are a news publisher, it might be harder for stories to perform if they are rewritten from other sources.

Google has not stated how it determines original reporting, but as the algorithm change has taken place over the past few months, it could explain any recent fluctuations in your rankings or visits.

Content Accuracy is not a Ranking Factor

Discussing the subject of content accuracy on Twitter, Danny Sullivan conceded that Google currently has no way of verifying the accuracy of content, which means that it cannot be a ranking factor.

Machines can’t tell the “accuracy” of content. Our systems rely instead on signals we find align with relevancy of topic and authority. — Danny Sullivan (@dannysullivan) September 9, 2019

As Sullivan states, Google relies on a system of signals derived from topic relevancy and site authority. In November 2017, Google teamed up with The Trust Project to bring greater transparency to news content.

The move allowed publishers to add up to eight trust indicators, which were:

  • Best practices
  • Author expertise
  • Type of work
  • Citations and references
  • Methods
  • Locally sourced
  • Diverse voices
  • Actionable feedback

You can learn more about The Trust Project in our November 2017 Technical SEO Roundup.


The debate on whether content veracity could act as a ranking factor for websites has raged on for some time. This statement reveals that, although important from a user perspective, topic relevancy and site authority are the signals that Google uses to rank content.

Bing News

You Can Now Import Your Sites from Google Search Console to Bing Webmaster Tools

On 13 September Bing announced that webmasters can now import their websites across to Bing Webmaster Tools directly from Google Search Console.

Bing states that imported sites will be auto-verified, eliminating the need for manual verification.

Importing a website takes only four steps:

  • Sign into Bing Webmaster Tools.
  • Navigate to My Sites and click “import”.
  • Sign in with your Google Search Console account and click “Allow”.
  • After authentication, Bing Webmaster Tools will display the list of verified sites present in Google Search Console. Select the sites you want to add and click “import”.

Using the above method, webmasters can add up to 100 websites at a time (although the overall limit of 1,000 sites still applies).

Note that Bing will periodically sync with your Google Search Console to validate your ownership of the imported sites. This means that if you lose access to your Google Search Console account, you will have to import the sites again or verify them using alternative methods.


The change comes as Bing admits that verification has been a “pain-point” for webmasters in the past. If you are yet to verify your websites with Bing Webmaster Tools, the new method offers a quick and simple way of doing so.

Additional Reading

Google Waves Goodbye to the Old Search Console

In a Webmaster Central Blog post on 9 September, Google announced that webmasters trying to reach the old homepage or dashboard would be redirected to the relevant Search Console pages.

Breadcrumb Reports Arrive in Search Console

On 19 September Google announced through its Google Webmasters Twitter account that breadcrumb structured data reports are now available in Google Search Console.

Google Launches “Key Moments” for Videos

Writing in The Keyword on 17 September, Google announced that it had launched “key moments”, which let content creators define key moments within a video so that the tagged segments are shown to users in Google Search.

Businesses Given the Ability to opt-out of Google’s Food Ordering System

An “opt-out” form is now available for companies that do not wish to be a part of Google’s online food delivery service.