Development News

Google Announces New SEO Mythbusting Video Series

On 6 June Google announced its new SEO Mythbusting video series on the Webmaster Central Blog.

Several episodes are already available on YouTube, with new episodes uploaded every week. They discuss SEO topics from the perspective of a developer, as well as the evolution of technical SEO itself.

So far, the series has covered:

  • Googlebot.
  • Microformats and structured data.
  • JavaScript and its relationship with SEO.
  • Web frameworks.
  • Web performance.
  • The future of the web.
Action Point

Presented by Martin Splitt, the videos serve as digestible lessons around topics that often generate a lot of misleading or inaccurate information. A playlist of the mythbusting sessions has been created so that you can navigate through them with ease.

Yoast and Google Developers Propose XML Sitemaps for WordPress Core

A group of Yoast and Google team members have proposed the inclusion of XML sitemaps as a WordPress Core feature. The proposal also introduces the idea that an XML sitemaps API would extend functionality for both developers and webmasters.

Enabled by default, the sitemaps would allow indexing for a variety of content types, including:

  • Core Post Types (Pages and Posts).
  • Custom Post Types.
  • Core Taxonomies.
  • Custom Taxonomies.
  • Users (Authors).

The proposal will cover the majority of WordPress content types and will meet the minimum requirements of search engines. However, it will not cover image, video, or news sitemaps.

Additionally, the feature will not cover the caching mechanisms of XML sitemaps or UI controls that exclude individual pages or posts from the sitemap.

The API would let developers manipulate XML sitemaps in several ways, including the ability to:

  • Provide a custom XML Stylesheet.
  • Add extra sitemaps and sitemap entries.
  • Add extra attributes to sitemap entries.
  • Exclude a specific post type from the sitemap.
  • Exclude a specific post from the sitemap.
  • Exclude a specific taxonomy from the sitemap.
  • Exclude a specific term from the sitemap.
  • Exclude a specific author from the sitemap.
  • Exclude specific authors with specific roles from the sitemap.
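For context, the underlying sitemap format such a feature would generate is straightforward. The sketch below is an illustration of the sitemap protocol using Python's standard library, not code from the proposal; the function name and inputs are hypothetical.

```python
# Minimal illustration of the sitemap protocol's <urlset> format.
# Hypothetical helper, not the proposal's actual implementation.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap document from a list of page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/about/"]))
```

A core implementation would assemble entries like these from posts, pages, taxonomies, and authors, which is what the proposed API hooks would let developers filter and extend.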
Action Point

Poorly optimised plugins can affect how sites run, and the proposal could mean that webmasters and developers can install fewer third-party plugins.

The proposal, therefore, has the potential to provide webmasters with more flexibility over the plugins that they install.

WordPress has appealed for suggestions and ideas to help the development of the proposal.

Wayback Machine Adds Changes Feature

On 26 June Cyrus Shepard tweeted that the Wayback Machine had added a beta feature designed to help users identify content changes on a URL over a given period. Using a colour-coded system, the feature indicates the degree of relative change from one date to another.

Action Point

The feature is an excellent addition for webmasters and SEOs that want to investigate changes to a page or domain over a period of time. If you’re already using the Wayback Machine to help with diagnosing site issues, this feature should offer a much more frictionless UI for comparing the evolution of a given page.

Guide to Mining SERPs for SEO, Content, and Customer Insights Published

On 17 June Rory Truesdale published a guide in Search Engine Journal on how to get the most out of SERPs when conducting audience research.

Truesdale notes that the first 100 results of a SERP contain, on average, around 3,000 words, which is a sizeable amount of content and information. The ability to analyse that information is therefore highly valuable to webmasters and SEOs conducting research to understand the needs and desires of customers.

In the article, Truesdale covers a variety of techniques for doing this, including:

  • Using Python.
  • Analysing data through DeepCrawl or Screaming Frog.
  • Conducting SERP data and linguistic analysis.
  • Cleaning text for analysis.
  • NGram analysis and co-occurrence.
  • Part of speech (PoS) tagging and analysis.
  • Topic modelling from SERP data.
Action Point

The article provides a place to start if you’re looking to improve your ability to conduct research directly from a SERP. Try implementing one or more of these strategies next time you’re looking for insight into the types of content that will resonate with your customers. Additionally, this kind of research can provide you with an additional layer of data for improving on-page factors such as page titles, meta descriptions and rich media elements.
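One of the techniques the article covers, n-gram analysis, can be sketched in a few lines. The snippet below counts the most common adjacent word pairs (bigrams) in a block of text; the sample text and function name are illustrative, not taken from Truesdale's guide.

```python
# Minimal n-gram (bigram) frequency analysis of scraped SERP text.
# Illustrative sketch; not code from the article.
import re
from collections import Counter

def top_bigrams(text, n=3):
    """Return the n most common adjacent word pairs in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    bigrams = zip(words, words[1:])
    return Counter(bigrams).most_common(n)

serp_text = (
    "best running shoes for beginners. best running shoes reviewed. "
    "running shoes buying guide"
)
print(top_bigrams(serp_text, 2))
# The most frequent pairs hint at the phrasing searchers respond to.
```

Running the same idea over titles and descriptions scraped from a SERP surfaces recurring phrases you can fold back into your own page titles and meta descriptions.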

Chrome’s Scroll to Text Feature Available for Testing in Canary

Back in February of this year it was reported that an upcoming feature addition to Chrome would allow for linking directly to a specific word or phrase on a page, similar to the way that YouTube allows users to link to a particular timestamp.

Using Chrome’s Canary version, you can now test out this feature and link to any given word, sentence, or paragraph on a webpage using Scroll to Text.

Google has often been seen (both openly and behind the scenes) to push the benefits of long form content and this feature allows for users to more effectively point others in the direction of the most pertinent piece of information when dealing with pages that can boast a large word count.
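Under the hood, the feature uses the Text Fragments URL syntax, in which a fragment directive beginning `#:~:text=` names the (URL-encoded) phrase the browser should scroll to and highlight. A hypothetical example link:

```
https://example.com/long-article.html#:~:text=the%20most%20pertinent%20phrase
```

Browsers without support simply ignore the directive, so such links degrade gracefully.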

Action Point

Read the full article to find out how to link to specific content on a page. The author, Disnan Francis, also wrote and published an earlier article that goes into more detail about how the feature works. If you have the Text Fragment Anchor set, you’ll be able to see it in action here.

New Features Described in Chrome 76 Beta

On 13 June the Chromium blog published a post detailing the new features within the Chrome 76 Beta channel release for Android, Chrome OS, Linux, macOS, and Windows.

The article highlights a detailed list of changes, including the new dark mode feature, improvements to the payment APIs, improvements for Progressive Web Apps, and updates to WebAPKs.

The new release has also removed certain features, including the lazyload feature policy, outputs from MediaStreamAudioDestinationNode in the web audio node, and more.

Action Point

The article provides detailed descriptions of a range of new features and updates within the latest Chrome Beta channel, often providing more comprehensive resources on some of the more significant changes. You can also find a full list of changes on Chromestatus.com.

Google News

June 2019 Core Update

On 2 June Google announced the rollout of a broad core algorithm update. Named the “June 2019 Core Update”, it was reportedly not as significant as some previous updates, yet sites such as the Daily Mail and CBS reported large traffic drops in its immediate aftermath.

Searchmetrics also reported a surge in the number of video carousels appearing in Google’s U.S. results, and stated that a number of trusted aggregator sites have also received boosts in visibility.

Tweeting on 17 June, Danny Sullivan stated that the update is not like the Panda Update, although he said that: “The old questions about improving after Panda are useful because at their core, they’re about improving content — not that they are Panda-specific.”

Action Point

Knowing when substantial updates occur is extremely useful for tracking traffic increases or decreases, so webmasters understand when specific changes in algorithms have impacted them.

If you have seen dramatic changes in your traffic from the update, or any other known update, find out as much information about it as you can so that you can determine what improvements you might need to make.

John Mueller Explains Why Google Can’t Offer Specific Advice Around Algorithm Updates

When it comes to core algorithm updates, Google and its employees have often refused to give specific advice about how to recover or “fix” sites once they have been impacted.

During a Google Webmaster Central office-hours hangout on 14 June, John Mueller said that the primary reason is that there are simply so many fundamental issues that a site would need to resolve.

That said, he did say to consider three factors that you should analyse if your site was impacted:

  • Does the site look outdated?
  • Do people recognise the authors of the site?
  • Are the author photos stock images rather than real images?
Action Point

Barry Schwartz has provided a full transcript. Mueller also referred to an article on the Webmasters Blog, which contains 23 questions that site owners should consider for their own websites.

If the June 2019 Core Update has impacted you, read through the questions and see if any apply to your site so that you can make the appropriate changes.

Google Rolls Out More Features for Small Businesses

Writing in The Keyword, Google announced the rollout of several new features for Google My Business on 20 June.

To attract more customers or clients, businesses now have the ability to:

  • Add welcome offers.
  • Claim short names and URLs.
  • Upload cover photos.
  • Benefit from more prominent logo and photo features.
  • Create offline materials.

Google also states that it is adding both physical and digital badges of honour for businesses later in the summer.

Action Point

Adding as much description and relatable content for users is critical to help them find and understand your business, so be sure to make use of the new features to stay ahead of competitors.

Google Launches New Site for Small Businesses

On 27 June Google announced the launch of a new website for small businesses to introduce them to relevant Google products.

Named Google for Small Business, the site was launched on International Small Business Day, and includes personalised plans to guide small businesses through the myriad Google products that can help them in contexts both on and offline.

Action Point

If you run a small business, or aspire to run one, we recommend taking a look through Google for Small Business so that you can learn how to get started with Google’s products for businesses, including branding tools, local services ads, and Smart Campaigns.

Genius Accuses Google of Stealing its Content

As reported by The Wall Street Journal, media website Genius has accused Google of stealing its content and serving lyrics to users within information panels.

Speaking to The Verge, Ben Gross, Chief Strategy Officer at Genius, said that the company has “shown Google irrefutable evidence again and again that they are displaying lyrics copied from Genius in their Lyrics OneBox. This is a serious issue, and Google needs to address it.”

The site had used a specific pattern of straight and curved apostrophes within its lyrics to form a type of watermark. When converted to Morse code, the message spells “Caught Red Handed.”

In response to the accusation, Google posted an article the next day, stating: “We do not crawl or scrape websites to source these lyrics.”

It continued: “The lyrics that you see in information boxes on Search come directly from lyrics content providers, and they are updated automatically as we receive new lyrics and corrections on a regular basis.”

Google has stated that it has asked its lyrics partner to investigate the issue to ensure that Google follows best practice procedures and that it will “soon include attribution to the third party providing the digital lyrics text.”

The story is particularly interesting to SEOs as Google has been accused of taking content from sites for rich results on a number of occasions. This is problematic for many reasons, especially when the search engine doesn’t attribute its sources. AJ Kohn, owner of the SEO site Blind Five Year Old, has also shown that click-through rates for search listings drop dramatically whenever a Lyrics OneBox appears in search results.

Action Point

Google remains hawkish about its potential for retaining visitors on the search engine’s own virtual real estate. The broadening scope and higher number of rich results and information boxes is undoubtedly a concern for publishers who rely on the pull of their content. It represents a topic worthy of continued interest from the SEO industry, and one that is unlikely to go away any time soon.

Google Penalising Unruly Favicons

In May 2019 Google announced that it would start showing site favicons in mobile search snippets.

As reported by Search Engine Roundtable, however, some sites have begun using favicons that do not meet Google’s guidelines, and are already being penalised for doing so. These guidelines are as follows:

  • Both the favicon file and the home page must be crawlable.
  • A favicon should be a visual representation of your website’s brand.
  • A favicon should be a multiple of 48px square.
  • A favicon URL should be stable.
  • Google will not show any favicon that it deems inappropriate, including pornography or hate symbols.
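Google discovers a favicon through a link tag on a site's home page. A typical declaration looks like the following (the path is illustrative):

```
<!-- The referenced file should be a square multiple of 48px, e.g. 48x48 or 96x96 -->
<link rel="shortcut icon" href="/favicon.ico">
```

Keeping this tag and the file it points to stable and crawlable covers the first and fourth guidelines above.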

So far, sites found to be violating these guidelines have had their favicons removed from mobile search.

Action Point

Favicons make up a part of a web entity’s branding and user experience. Be sure to stick to Google’s favicon guidelines so that your site does not lose the opportunity to display a favicon.

How Well Can Google Visually Crawl Videos?

Google crawls content and media so that it can understand their purpose, what information they provide to users, and their context within a website as a whole.

For video producers, the ability to let search engines know what your content is about is critical for exposure within search.

Justin Briggs therefore asked on Twitter how well Google’s crawler can visually analyse the content within a video file.

He also shared findings on how well Google can analyse videos depending on the kind of equipment used, based on machine learning via the Video Intelligence API.

Action Point

If you’re a photographer, filmmaker, advertising company or other entity that relies heavily on video content, knowing how well Google can visually crawl your videos should offer advantages. Read through Briggs’ tweets to see what kind of equipment is better suited for videos in the online space and read through Google’s video intelligence documentation.

Preferred Domain Setting Removal in Google Search Console

In an article entitled “Bye Bye Preferred Domain setting”, Google announced on 18 June that it was sunsetting the Preferred Domain Setting within Google Search Console. From now on, when a website appears to be offering the same content on multiple URLs or protocols (e.g. http vs https, www vs non-www, rogue subdomains), the search engine will pick one URL as the “canonical” for Search.

Webmasters can indicate to Google what the preferential URL is by:

  • Using a rel=”canonical” link tag on HTML pages.
  • Using a rel=”canonical” HTTP header.
  • Providing a sitemap.
  • Using 301 redirects for retired URLs.
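If you want to check which canonical a given page currently declares, the first of these signals can be read directly from the HTML. The sketch below (a hypothetical helper, not a Google tool) extracts the rel="canonical" href using only Python's standard library:

```python
# Extract the rel="canonical" URL a page declares, if any.
# Illustrative sketch using only the standard library.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

page = '<html><head><link rel="canonical" href="https://example.com/page/"></head></html>'
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # prints https://example.com/page/
```

A page that prints `None` here is leaving the choice to Google unless it sends one of the other signals (HTTP header, sitemap inclusion, or a 301 redirect).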

If a preferential URL is not implied via one of these methods, Google will “choose the best option” – which is to say, the option it believes to be best.

Action Point

Many sites suffer from this kind of duplicated content and, if that describes a site you own or work on, you should already be using the above methods to identify preferential URLs. However, as Google will begin ignoring any signals from your existing Preferred Domain configuration, it’s important to double check and deploy a strategy for dealing with any potential issues.

Leaving canonical URL selection up to Google is a risk not worth taking. We’d recommend providing as many positive signals to your preferred URLs as you can.

Social Knowledge Panel Markup Support Being Deprecated

Tweeting from the Google Webmasters account on 25 June, Google announced that it was deprecating support for social profile markup.

This means that the search engine will no longer pay heed to markup added to pages specifically for social media knowledge panels.

Google says that it will now automatically discover profiles, and if webmasters have already claimed a knowledge panel, they can suggest which profiles should not be included in this discovery.

Action Point

The search engine still recommends that you claim your knowledge panel through Google Search, and you can find out how to make changes with this guide.

Mobile-First Indexing Features Added to Google Search Console

Taking to Twitter on 26 June, Google announced that it was adding mobile-first indexing features to Google Search Console.

Under property settings, you can now see which Googlebot is indexing your site.

Furthermore, if it happens to be Googlebot smartphone, you will be able to see the exact date that crawling switched from desktop to mobile.

Action Point

Understanding when a site moved from desktop to mobile-first indexing is important for SEOs, webmasters, and developers alike. If you know the exact date the switch happened, this might help you understand and fix crawling or indexing issues – especially if you have experienced ranking fluctuations afterwards.

Google Search Console Performance Report Showing 90 Days’ Worth of Data

The performance report within Google Search Console now shows up to 90 days’ worth of data, a significant upgrade from the previous 28-day limit.

Action Point

This small update is incredibly useful for webmasters and SEOs as it allows us to see how sites are performing in Google Search and Google Discover over the same time period. Google added the report earlier in the year. Read more about it in our April Technical SEO Roundup.

Email Alerts Sent for Top Query Changes

Google Search Console is now sending out alerts for changes in top queries.

The email shows the changes in ranking positions according to Google Search Console data and will arrive with the subject line, “change in top queries for your site.” An example of the email can be seen in the following tweet by Dawn Anderson:

Action Point

Google has sent out alerts for changes in clicks and impressions before, so this is not totally new. However, it is extremely useful information to know, so if you see the above subject line appear in your inbox, be sure to read what it says so that you can make the necessary changes.

Sitemaps Indexed Counts Not Returning on Search Console API

John Mueller confirmed on Twitter that the Search Console API no longer returns indexed counts for sitemaps.

Writing on 11 June, John said: “As we removed that functionality from the UI a while back, we’re no longer returning sitemaps indexed counts in the API either.”

He also apologised, saying: “We should have communicated this better – sorry for the hassles.”

Action Point

This confirmation suggests that indexing figures from the Search Console’s Sitemaps report have been unavailable for some time. If this is a metric by which you currently measure indexed pages, we’d recommend retiring it from your reporting. The Index Coverage Status report is likely intended to provide this information instead.

Bing News

Bing Introduces Batch Mode for Adaptive URL Submission API

Writing on 12 June, Bing announced that it had launched the batch mode capability for its Adaptive URL Submission API.

This allows webmasters to submit URLs in batches to avoid excessive API calls when submissions are made on an individual basis.

An example JSON request for the Batch URL Submission API can be seen below:

POST /webmaster/api.svc/json/SubmitUrlbatch?apikey=sampleapikeyEDECC1EA4AE341CC8B6 HTTP/1.1
Content-Type: application/json; charset=utf-8
Host: ssl.bing.com

{
  "siteUrl": "http://yoursite.com",
  "urlList": [
    "http://yoursite.com/url1",
    "http://yoursite.com/url2",
    "http://yoursite.com/url3"
  ]
}

Successful submissions will result in an HTTP 200 response, and the URLs will be checked for compliance with the Bing Webmaster Guidelines.

If they pass, the URLs will be crawled and indexed within a matter of minutes.
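The request above can be reproduced from a script. The sketch below assembles the same JSON body and POSTs it with Python's standard library; the API key and site URLs are placeholders, and the function names are our own, not part of Bing's documentation.

```python
# Sketch of calling Bing's Batch URL Submission API, based on the
# JSON request shown above. API key and URLs are placeholders.
import json
import urllib.request

ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlbatch?apikey={key}"

def build_payload(site_url, urls):
    """Assemble the JSON body expected by SubmitUrlbatch."""
    if len(urls) > 500:
        raise ValueError("batch submissions are limited to 500 URLs per request")
    return {"siteUrl": site_url, "urlList": list(urls)}

def submit_batch(api_key, site_url, urls):
    """POST the batch; an HTTP 200 response means the submission was accepted."""
    body = json.dumps(build_payload(site_url, urls)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT.format(key=api_key),
        data=body,
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    return urllib.request.urlopen(req)  # live network call; needs a valid key
```

Guarding the batch size client-side, as above, avoids wasting requests against the per-request cap mentioned in the documentation.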

Action Point

Read through the API key and Batch Submission API documentation for more details, and bear in mind that batch submissions are capped at 500 URLs per request. The total limit for a 24-hour period still applies.

Yandex News

Turbo Page Information Now on Site List Page

The site list page is one of the most popular areas of Yandex.Webmaster, as it supplies a significant amount of data and information.

On 20 June Yandex announced that site owners would be able to view even more data, as information on Turbo pages is now present within the section. This includes:

  • Turbo page notifications.
  • Total number of Turbo pages present.
  • Detected Turbo page issues.
  • Autoparsing information.
Action Point

The new information allows webmasters to quickly identify and fix issues within Turbo pages, so be sure to familiarise yourself with the new section and keep an eye out for the new alerts.

Additional reading

GIFs are becoming an ever-popular way to communicate quickly and effectively. As a result, Google now allows users to share GIFs directly to apps from Google Images.

Google Explains How it Fights Fake Businesses in Maps

Scammers have long plagued Google Maps with fake businesses. On 20 June Google published an article to explain how it was combating the issue.

Google Rolls Out Q&A Auto Suggest Answers

Google rolled out a Knowledge Panel feature that automatically suggested answers to new Google Q&A questions in April. At the start of June, the search engine rolled the feature out to desktop browsers.

Yelp Offering More Paid Profile Upgrades for Business Owners

On 25 June it was reported that Yelp is offering a range of new profile upgrades for paying customers, only a week after Google announced its new promotional tools in Google My Business.

Image Search is in a state of constant evolution. If you’re looking for a comprehensive guide about best practices, this guide by ContentKing is currently one of the best.