Chromium Project Drops Support for FTP
It’s a topic that’s been under discussion and debate since 2014, but the axe has more or less swung for FTP support.
Although an app was originally considered as a replacement, the Chrome team has decided that “Not secure” warning labels will instead be added to FTP pages in Chrome 63, which has a December 2017 release date.
The team admits that this was not in the original project plan, but states that FTP’s security properties are marginally worse than HTTP’s, since it is delivered in plaintext without an HSTS-like upgrade.
This might affect open source projects that are currently using FTP to download packages.
Action: Public FTP servers can be moved to an HTTPS connection, while private FTP servers should be kept out of search indexes and accessed with a dedicated FTP client instead of a browser.
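As a quick way to spot pages that will trigger the new warning, you could scan your HTML for ftp:// references. A minimal sketch (the regex and the sample page are illustrative, not exhaustive):

```python
import re

def find_ftp_links(html):
    """Return the ftp:// URLs referenced in a page's HTML."""
    return re.findall(r'ftp://[^\s"\'<>]+', html)

page = '<a href="ftp://mirror.example.com/pkg.tar.gz">download</a>'
print(find_ftp_links(page))  # ['ftp://mirror.example.com/pkg.tar.gz']
```

Any URLs this turns up are candidates for migration to HTTPS mirrors.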
Google AMP URLs With usqp=mq331AQCCAE= Are A Test
It has been noticed that Google has been serving two variants of AMP URLs to users.
One is the regular URL and the other is the same URL with the usqp=mq331AQCCAE= parameter appended.
After being questioned, Malte Ubl at Google stated on Twitter that the second URL relates to a Google experiment for more advanced transformations on the AMP cache.
Google has already cut loading times, with bandwidth usage from images reduced by 50 per cent without any noticeable visual difference.
There are already a range of search engines, social sites, and bookmarking sites supporting the project, which now has 2 billion pages from 900,000 domains across the web.
Action: With many improvements and changes still to come, it’s likely that webmasters will notice these little experiments in the undercurrents.
ReactJS 16 Moves to MIT license
Facebook has announced that it will relicense several of its open source projects including React, Jest, Flow, and Immutable.js under the MIT license.
Admitting several weeks of “disappointment and uncertainty”, the social giant stated that React is, in fact, the “foundation of a broad ecosystem of open source software for the web”, and that it did not want to hinder such progress.
After failing to convince its tech community that the BSD + Patents license provides at least some benefits, the company stated that it wanted to leave the door open for the developers it had lost, although many of its other popular projects are keeping the BSD + Patents license for now.
Action: With the license having been updated during the last week of September, it is expected that businesses will now be more willing to adopt ReactJS.
Apple Switched From Bing to Google for Siri Web Search & Spotlight on Mac
With 60 per cent of searches now taking place on mobile devices, and with an increasing amount of people using their voice to conduct these searches, it is perhaps time marketers begin thinking more about optimising for voice search.
In a significant move, Apple has proven that it too is thinking about voice search by switching from Bing to Google for its Siri Web Search and Spotlight services.
Why? One of the core principles of Apple has always been consistency, and Safari on Mac and iOS already use Google search results as the default provider, so the switch means that there will be consistency across Apple devices and apps.
Despite this, web image search results from Siri will still come from Bing for the time being due to the fact that Bing provides superior results in this case, according to TechCrunch.
Action: For those that are already optimising and catering for voice search, it seems ever clearer that Google is still the search engine that you want to be prioritising.
Further reading: http://searchengineland.com/optimize-voice-search-273849
Google Kills off Fetch and Render for Apps
In an announcement made on Google+, Google has stated that it will remove some app indexing features from the Google Search Console.
With Fetch and Render being the most notable of these, the company said that it wanted to avoid unnecessary duplication with the Firebase help documentation.
Having survived a little over two years, it can be expected that all features for deploying app indexing will eventually be removed, with Google pouring its attention into Accelerated Mobile Pages, Progressive Web Apps and other technologies.
Action: Google advises smartphone app creators to check out Firebase App Indexing for more information.
Google Chrome Will Stop Trusting Old Symantec SSL Certificates
After discovering in 2015 that some Symantec certificates didn’t adhere to industry standards, and that the company had outsourced jobs to other businesses, Google has announced that Chrome will soon stop trusting Symantec certificates issued before June 2016.
Starting with version 66, webmasters need to switch to a Certificate Authority trusted by Google before the April release date (although it is worth noting that 66 will be released to Chrome Beta users in March).
When Chrome 70 arrives in October, Google expects a full shut-down of trust in Symantec’s pre-June ‘16 certificates, meaning that there is now a 12 month deadline to disavow any old SSL certificates.
Google notes that site operators who need to obtain certificates from Symantec’s existing root and intermediate certificates may do so from the old infrastructure until December 1.
These, however, will need to be replaced again before the release of Chrome 70.
Mozilla has also agreed to match the dates proposed by Google.
Action: If you have them, replace any old Symantec certificates with a Certificate Authority trusted by Google. It might also be worth noting that Symantec is actively helping webmasters switch to DigiCert certificates.
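A quick way to see whether a certificate falls into the distrusted window is to pull a site’s leaf certificate and compare its issuer and issuance date. A minimal sketch using Python’s standard library (the helper names and the cut-off check are our own; confirm affected certificates against Google’s official guidance):

```python
import ssl
import socket
from datetime import datetime

def fetch_leaf_cert(host, port=443):
    """Fetch a site's leaf certificate over TLS (requires network access)."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()

def needs_replacement(issuer_org, not_before):
    """Chrome 66 stops trusting Symantec certificates issued before 1 June 2016."""
    return "Symantec" in issuer_org and not_before < datetime(2016, 6, 1)

# Usage sketch (www.example.com is a placeholder):
# cert = fetch_leaf_cert("www.example.com")
# issuer = dict(x[0] for x in cert["issuer"])
# issued = datetime.strptime(cert["notBefore"], "%b %d %H:%M:%S %Y %Z")
# print(needs_replacement(issuer.get("organizationName", ""), issued))
```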
Bing Introduces Fact Checking Label
A year after Google introduced Fact Check labels, Bing has now announced its own Fact Check label in search.
Powered by the same ClaimReview Schema markup, this means that if a user conducts a search, the result might be accompanied with a statement that reads “fact checked by Snopes”, alongside a true or false statement indicating an article’s validity.
Bing notes that webmasters who want to add the schema markup should be transparent about sources, include citations, make claim checks easily identifiable, and provide a full summary of the fact check and the evaluation conducted.
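For illustration, a minimal ClaimReview object might look like the following. The property names come from schema.org/ClaimReview; the URL, claim text and rating values are placeholders, and the resulting JSON would normally be embedded in a script tag of type application/ld+json:

```python
import json

# Placeholder ClaimReview structured data; property names follow schema.org.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example.com/fact-check/some-claim",  # page hosting the fact check
    "claimReviewed": "The claim being evaluated",
    "author": {"@type": "Organization", "name": "Snopes"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,          # position on the scale below
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "False",  # the verdict shown next to the result
    },
}

json_ld = json.dumps(claim_review, indent=2)
```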
Recent Baidu Updates
As the Chinese search engine Baidu continues to grow, it is worth noting that it made a series of updates throughout September.
The first point of interest is that Mobile Instant Pages (MIP) has indexed a large quantity of AMP pages. This is significant because MIP is designed for Chinese users, whose browsers and browsing behaviours differ from those in the West.
With over a billion pages now on MIP, it is advised that websites deploy MIP instead of AMP if they are only serving the Chinese mainland.
Another worthy update is that there are now separate desktop and mobile crawling bots, after Baidu released new page-rendering capabilities in March.
The desktop version will show:
Mozilla/5.0 (compatible; Baiduspider-render/2.0; +http://www.baidu.com/search/spider.html)
And the mobile version:
Mozilla/5.0 (iPhone; CPU iPhone OS 9_1 like Mac OS X) AppleWebKit/601.1.46 (KHTML, like Gecko) Version/9.0 Mobile/13B143 Safari/601.1 (compatible; Baiduspider-render/2.0; +http://www.baidu.com/search/spider.html)
Action: You can check whether an IP address belongs to a real Baidu bot by running a reverse DNS lookup with the host command on Linux or nslookup on Windows.
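The same check can be sketched programmatically: reverse-resolve the IP, confirm the hostname sits under baidu.com or baidu.jp, then forward-confirm that the hostname resolves back to the same IP. The helper names are our own, and the full check naturally needs network access:

```python
import socket

BAIDU_SUFFIXES = (".baidu.com", ".baidu.jp")

def looks_like_baidu_host(hostname):
    """Pure check: genuine Baiduspider IPs reverse-resolve under baidu.com or baidu.jp."""
    return hostname.rstrip(".").endswith(BAIDU_SUFFIXES)

def is_baidu_spider(ip):
    """Reverse-DNS the IP, check the suffix, then forward-confirm the hostname."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except OSError:
        return False
    if not looks_like_baidu_host(hostname):
        return False
    try:
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False
```

The forward-confirmation step matters because anyone can configure reverse DNS for an IP they control to point at a baidu.com hostname; only Baidu can make that hostname resolve back.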
Further reading: http://searchengineland.com/seos-need-know-baidu-2017-281275
Solved: Google Search Console robots.txt Tester Inconsistencies
We conducted some experiments on Google Search Console robots.txt tester after Giuseppe Pastore and Liam Sharp (Screaming Frog) discovered some inconsistencies with the tool.
We found that the robots.txt rule checker was re-escaping percent-escaped URL paths entered into the testing tool, which could lead to an endless loop of escaping and re-checking.
Action: Use UTF-8 encoded URL paths when using Google’s robots.txt rule checker.
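To illustrate the difference, Python’s urllib.parse converts between a percent-escaped path and its UTF-8 form (the example path is hypothetical):

```python
from urllib.parse import quote, unquote

# A path containing a non-ASCII character
raw_path = "/caf\u00e9/menu"

# Percent-escaped form, which the tester reportedly escaped a second time
escaped = quote(raw_path)  # '/caf%C3%A9/menu'

# Decode percent-escaped paths back to UTF-8 before pasting them into the tester
decoded = unquote(escaped)
print(decoded)  # '/café/menu'
```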
Three Equifax breaches in one month
Difficulty: N/A
They say that bad luck comes in threes, and that really was the case for Equifax.
1. Firstly, its servers were hacked.
Over 143 million customers had their personal information exposed to hackers thanks to a web server vulnerability in the company’s open-source software, Apache Struts.
After admitting hacks between May and July, the credit check company saw its shares tumble as it became clear that the hacks were some of the largest and most serious ever recorded.
Forcibly retired Equifax chief executive Richard Smith was slammed by Congress earlier in the week and was told that “no law can fix stupid.”
2. The company used admin/admin credentials for securing an employee database.
After some initial guesswork, a cyber security firm was able to uncover personal employee information on an Equifax website based in Argentina.
The database, which included social security numbers, was secured with “admin” as both the username and the password.
Alex Holden, chief information security officer at Hold Security, said simply that “you don’t expect anything like that.”
This is especially infuriating when you consider that in Equifax’s own security advice article, it states that having, “long, unique combinations of numbers and letters, both upper and lower case, can help prevent [breaches] from happening.”
3. The company’s own staff were fooled
Equifax tried to help its 143 million customers see if they had been affected by the recent data breach by setting up a dedicated website: www.equifaxsecurity2017.com.
To anyone looking, it looked and acted like a spammy domain from the start. Equifax set up a redirect from www.equifax.com/security to www.equifaxsecurity2017.com, which still seems odd. Furthermore, it did not communicate this to staff, and customers needed to verify their identity and submit more personal details to the new website.
Getting the correct new security domain to rank would have been a challenging task on its own; Equifax could have solved the problem by using the www.equifax.com/security path and reverse proxying the underlying web application.
Meanwhile, security researcher Nick Sweeting set up a phishing (fake) website that cloned the original Equifax site at the new domain www.securityequifax2017.com (it now returns a forbidden error, and Google has classed it as a “dangerous site”).
The Equifax social media team tweeted eight times to Nick’s fake website.
Luckily, Nick abides by security ethics and later made an announcement, which led to further criticism of the flailing company. A malicious hacker could have collected even more data.
Google does a very good job of burying the majority of phishing sites, but the threat lies with fresh sites and bad actors who want to hit vulnerable people.
Nick confirmed to me on a Twitter Direct Message that zero SEO had been performed to get the URL ranking. SEMRush shows that the domain now ranks for ~1,200 searches, but this is most likely due to the publicity it received after the announcement.
Google’s algorithm may factor in a “freshness” signal, which enables even cloned websites to rank for a very short amount of time.
Unfortunately, we can expect that malicious hackers have learnt from Nick’s exploit and may use it for nefarious means in future hacks.
Action: If communicating important information with clients, do it through your main website rather than setting up microsites.