We are happy to share our notes from a Hangout discussion between Google Webmaster Trends Analyst John Mueller and Czech SEO specialists, held on the 26th of June, 2017. In this discussion, John revealed interesting new information about Google's behavior.
Note: I originally wrote this article on the Czech SEO blog Bloxxter.com.
Can we expect a rapid increase of featured snippets in the SERPs of other European countries?
The Google team is working on spreading featured snippets further and on finding a way to include them in the SERP so that they provide value. John thinks we will see more of them, but he can't promise anything.
How do you count the crawl budget for pages rendered on your site? Does the JavaScript-rendering Googlebot count RPCs, JS and CSS files against the crawl budget, or is the crawl budget only for HTML pages?
JavaScript, CSS files and images are all requests needed to render the page. Google does a lot of aggressive caching on their side, which means that if they have seen a JS file before and assume it hasn't changed, they won't fetch it again. The bot does not re-render the full page with every file every time: while crawling the page, it fetches the HTML and reuses cached images, CSS and JS files. If you make a significant change to a JS or CSS file, put a version right in the file's URL, so that the engine knows it's a new file and has to crawl it again.
Sometimes people put too much emphasis on the crawl budget. For almost all websites, it's typically not a problem.
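The file-versioning tip above can be as simple as putting a version string into the asset URL; the filenames below are made-up examples:

```html
<!-- Bump the version whenever the file changes, so the crawler sees a new URL
     and re-fetches it instead of reusing its cached copy -->
<link rel="stylesheet" href="/assets/main.css?v=20170626">
<script src="/assets/app.v42.js"></script>
```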
If the same content is available at both an http and an https URL (a redirect is missing for some reason), do incoming links count for each URL separately?
Since these URLs have the same content, Google folds them together and treats them as one URL. All of the links count toward that one canonical URL. John still recommends setting up a redirect or using a canonical; that is the best practice if you want to choose the URL yourself rather than use the one Google decides on. Then you also know where to find your data easily, under one URL, in the Search Console.
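As a minimal sketch, the canonical hint John mentions is a single tag in the head of both page variants; example.com is a placeholder:

```html
<!-- Served on both http://example.com/page and https://example.com/page,
     telling Google which URL you prefer to have indexed -->
<link rel="canonical" href="https://example.com/page">
```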
Google ignores my meta descriptions and creates its own descriptions for search results. How can they be adjusted so they are shown in the SERP? Given that Google chooses a different description for my page, has the importance of a unique meta description decreased?
Google tries to use meta descriptions as much as it can, but it also tries to adjust the snippet to what the user is searching for. So if the user is looking for something you don't mention in the description, Google may choose something from the content instead. It is all about the relevance of the content to the query.
For analysis, John uses the Search Console to check which queries have the highest number of impressions, and then he searches those queries himself to see what meta descriptions come back in Google.
If Google picks a description from the page content, the meta description is probably not similar enough to the query. In that case, you may consider expanding the meta description to cover those keywords.
Meta descriptions are not a ranking factor. You don’t need to stuff them with keywords.
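For reference, the meta description is an ordinary tag in the page head; the wording below is invented:

```html
<head>
  <!-- Google may still replace this with page content if it matches the query better -->
  <meta name="description" content="Notes from a Google Webmaster Hangout: crawl budget, canonical URLs, featured snippets and more.">
</head>
```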
Is it OK with Google to combine rel=next/prev pagination with index,follow on the first page and noindex,follow from the second page on?
In practice, Google has no problem with that. Google won't be able to show the noindexed pages in the SERP though, so keep that in mind while setting the pages up.
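The combination from the question could look like this; the URLs are hypothetical:

```html
<!-- Page 1: indexable, points to the next page -->
<link rel="next" href="https://example.com/articles/page/2">

<!-- Page 2 and onward: kept out of the index, but links are still followed -->
<meta name="robots" content="noindex, follow">
<link rel="prev" href="https://example.com/articles/">
<link rel="next" href="https://example.com/articles/page/3">
```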
Is the value of links in a dropdown menu decreased?
No, Google crawls and treats them like any other internal links on the website.
Are Sitewide Links still bad for Google ranking in 2017? Do they harm the link profile of a website?
Natural links are OK. Whether backlinks are sitewide or not, the same rules apply as for any other backlinks.
Do relative internal links, instead of absolute links, have any negative impact? Can relative paths be used in hreflang attributes and pagination attributes?
No, not at all. It is up to you which kind of links you want to use, and they can be used in hreflang and pagination attributes. John doesn't recommend relative paths in rel=canonical, though, because it's easy to get them wrong.
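To illustrate, a sketch with relative paths in hreflang and an absolute rel=canonical; the URLs are placeholders:

```html
<!-- Relative paths are tolerated here, per John's answer -->
<link rel="alternate" hreflang="cs" href="/cs/clanek">
<link rel="alternate" hreflang="en" href="/en/article">

<!-- For the canonical, an absolute URL is harder to get wrong -->
<link rel="canonical" href="https://example.com/en/article">
```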
Why does Google sometimes create featured snippets from a terrible website with terrible content?
This is one of many questions about Google that John can't answer clearly. Web search is not done and not complete. Google tries to correct these cases, and the engineers try to figure out why the results are sometimes stupid or tricky. They continue to work on it and improve it.
What do you think of broken link building? Is it natural in your eyes if a broken link is replaced by another one?
If you have an arrangement with another website to provide them with information, it is not problematic at all.
Rich cards should be global now. We implemented them on recipe sites about one month ago. They are implemented correctly according to the structured data tester, but we don't see them in the Rich Cards section of the Search Console. What is the problem?
John is not sure which of these structured data types are available in which locations.
Check locally to see which ones are shown and which are not shown yet. Maybe they are just not being shown in your specific country. Google takes it step by step and can't do everything globally all at once. Even internally it is difficult to know which feature is available in which country and to guide people to implement them.
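For context, recipe rich cards are driven by structured data on the page, such as this JSON-LD sketch; the values are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Recipe",
  "name": "Example goulash",
  "image": "https://example.com/goulash.jpg"
}
</script>
```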
Does Google index the content of an iframe?
These pages are rendered, so yes, it can happen. Google can take the content of the iframe and use it in a meta description in the SERP. It can also happen that the page with the iframe ranks higher than the original page. Google tries to show the original, cleanest source, although it is not wrong to also show the iframe version.
Is there a “brand” relationship between different domains of an international company? In other words, would there be a benefit if an international company had one single domain with language and product mutations?
Different country-code versions or just one version of the website – this is more of a marketing question than an SEO question. Whether the pages exist on one website or on multiple websites is pretty much the same thing.
From the SEO point of view, in some situations the single-domain version will rank higher, and in other cases the multiple-domain versions will, because Google can see the content is more specific to individual countries and languages. John can't say which is better though; it depends.
What about links in hidden content, such as "read more" sections, accordions or tabs? Are they treated exactly the same as links in the visible part of the page?
In general, if content is not visible, Google treats it as such. The links in it, however, are handled like any other links on the page.
When do you place the HTML after the JavaScript render into the Google cache?
Only the HTML version of the page is shown in the cached version. Sometimes JS can run in a cached version, but that depends on the security settings and on how you set up the JS, and it doesn't mean that this content is actually in the cached page. John doesn't see that changing at the moment.
Sitelinks – I know Google should be able to choose the right sitelinks, but what do we do when it doesn't?
Sitelinks are handled as normal web rankings, and sometimes they’re shown in the sitelink style.
If you don't want a page to be shown at all, use noindex. That can be tricky though, because sometimes you only want the page to stop appearing as a sitelink, not to drop out of the index entirely. For now, sitelinks can't be specified in any manual way. If a sitelink is very bad, you should post about it on the Webmaster Help Forum.
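If you do decide a page should never appear, the noindex tag goes in its head; note that this removes the page from regular results too, not just from sitelinks:

```html
<meta name="robots" content="noindex">
```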
Do you plan to report crawled but not indexed pages in the Search Console?
Google is working on it, but the whole issue is tricky, because technically you can sometimes call a noindexed page an error even though it is nothing to be concerned about immediately.
For instance, if you have a redirected URL in the sitemap, then from Google's point of view you should list the final destination instead. From the practical point of view it still works, and Google can still index the page. They can't call it a critical problem you have to fix, because it is only a small one. It's difficult to find that balance when programming the reports.
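As a sketch: if /old-page 301-redirects to /new-page, list only the final URL in the sitemap (the URLs are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- /old-page redirects to /new-page, so only the final destination is listed -->
  <url>
    <loc>https://example.com/new-page</loc>
  </url>
</urlset>
```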
Want to know more?
The next Google Webmaster Central office hours hangouts are announced regularly on the Google Webmasters page.
Watch the next hangout on 30th June 2017 at 9am CET here: https://www.youtube.com/watch?v=sQBPSxrbE8g
Watch previous hangouts here:
27th June 2017 – https://www.youtube.com/watch?v=sQBPSxrbE8g
16th June 2017 – https://www.youtube.com/watch?v=qMT0llp5sf4
12th June 2017 – https://www.youtube.com/watch?v=byDfpbOTSMI