Goossips SEO: HTTP(S), JavaScript & Anchor Links

Some unofficial tidbits about Google (and sometimes Bing) and its search engine, gathered here and there over the past few days. On this week's agenda, answers to these questions: What are the consequences of a hidden HTTP page? Why should you avoid displaying "unavailable" before content loads? And why should you prioritize visible anchor text for links?


Gossip #1

A hidden HTTP page can cause site name problems in Google.

Google's John Mueller revealed an unusual problem: an old, invisible HTTP homepage can cause malfunctions in the display of the site name and favicon in Google search results.

The context: a website was using HTTPS, but a default HTTP homepage remained accessible on the server. The catch? Chrome automatically upgrades HTTP requests to HTTPS, making this HTTP page invisible during normal browsing. However, Googlebot doesn't follow this behavior and indexes the wrong version. Google determines the site name and favicon from the homepage by reading structured data, title tags, heading elements, and other signals. If Googlebot reads a default HTTP page instead of the actual HTTPS page, it uses the wrong information.

John Mueller recommends two methods to see what Googlebot actually sees:

  • Run curl http://yourdomain.com in a terminal to display the raw HTTP response without Chrome's automatic upgrade.
  • Run a live test with the URL Inspection tool in Search Console.

If the response returns a default server page rather than your actual homepage, that's the problem.
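The check above can be sketched in a few lines of Python. This is a minimal illustration, not Google's actual logic: the HTML samples below are hypothetical (an "Apache2 Default Page" standing in for a stale default HTTP response, "Acme Widgets" for the real HTTPS homepage), and it compares only the &lt;title&gt; tag, one of the signals Google reads for the site name.

```python
# Minimal sketch: compare the <title> of the raw HTTP response with the
# real HTTPS homepage, to spot a stale default server page.
# (In practice you would fetch both responses, e.g. with curl or urllib;
# hypothetical HTML strings are used here to keep the example self-contained.)
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text content of the <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def page_title(html: str) -> str:
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()

# What a forgotten default HTTP page often looks like (hypothetical):
http_response = "<html><head><title>Apache2 Default Page</title></head></html>"
# What the real HTTPS homepage serves (hypothetical):
https_response = "<html><head><title>Acme Widgets - Home</title></head></html>"

if page_title(http_response) != page_title(https_response):
    print("Mismatch: Googlebot may be reading the wrong homepage")
```

If the two titles differ, Googlebot may well be deriving the site name from the wrong document, which is exactly the malfunction described above.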

Source: Search Engine Journal

Reliability rating: ⭐⭐⭐ I agree!

This case perfectly illustrates why a technical audit cannot always be limited to automated tools, even on an "all-HTTPS" site.


Gossip #2

Do not serve "unavailable" using JavaScript

John Mueller strongly advises against displaying "not available" via JavaScript before the actual content loads. This practice can trick Google into thinking the page doesn't exist, preventing it from being indexed and ranked in search results. He recommends instead loading the entire content block at once via JavaScript, without an intermediate placeholder.

If a client (like Googlebot) doesn't execute the JavaScript or only partially executes it, it will receive misleading information indicating that the content is unavailable. When Google crawls the page, it only sees the "unavailable" message and leaves, without waiting for another message to appear.
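A small sketch can make the crawler's perspective concrete. The markup below is hypothetical; the point is simply that a client that does not execute JavaScript only sees the initial HTML, so any placeholder text in it becomes the page's content.

```python
# Minimal sketch of what a non-JavaScript client extracts from a page.
# Script and style contents are skipped, just as they are not visible text.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, ignoring <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.skip = False
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False
    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

def visible_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

# Anti-pattern: a placeholder that JavaScript later replaces (hypothetical).
anti_pattern = """
<div id="content">Not available</div>
<script>/* replaces the placeholder with real content after load */</script>
"""

# Recommended: load the entire content block at once, no misleading placeholder.
recommended = """
<div id="content"></div>
<script>/* injects the full content block in one go */</script>
"""

print(visible_text(anti_pattern))   # the crawler reads "Not available"
print(visible_text(recommended))    # the crawler reads nothing misleading
```

With the anti-pattern, a crawler that stops at the initial HTML takes "Not available" at face value and moves on, which is precisely the indexing risk described above.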

John Mueller compares this situation to Google's recommendation regarding noindex tags in JavaScript. Google advises against using JavaScript to change a meta robots tag from "noindex" to something else (in fact, there is no "index" tag, only the absence of noindex).

Source: Search Engine Roundtable

Reliability rating: ⭐⭐⭐ I agree!

From an SEO perspective, John Mueller's recommendation makes perfect sense. However, it can be frustrating to see Google still penalizing sites for JavaScript rendering issues in 2026.


Gossip #3

Use visible anchor text for your links.

John Mueller recommends always prioritizing visible anchor text for links to provide more context to search engines. In other words, don't just use the title attribute in links; make sure the links in question contain actual, visible anchor text.
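The difference is easy to see from a parser's point of view. Here is a minimal sketch, using hypothetical links (a /pricing URL invented for illustration): one link carries only a title attribute, the other carries visible anchor text.

```python
# Minimal sketch: extract href, title attribute, and visible anchor text
# from links, to compare a title-only link with one using visible anchor text.
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects (href, title attribute, visible anchor text) for each <a>."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._current = None
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            self._current = [a.get("href", ""), a.get("title", ""), ""]
    def handle_data(self, data):
        if self._current is not None:
            self._current[2] += data
    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            self.links.append(tuple(self._current))
            self._current = None

# Hypothetical examples:
title_only = '<a href="/pricing" title="See our pricing plans"></a>'
visible_anchor = '<a href="/pricing">See our pricing plans</a>'

for name, html in (("title-only", title_only), ("visible anchor", visible_anchor)):
    parser = LinkParser()
    parser.feed(html)
    href, title, text = parser.links[0]
    print(f"{name}: href={href!r}, visible anchor text={text.strip()!r}")
```

The title-only link yields an empty anchor text, leaving search engines to guess the link's intent from a hidden attribute, which is exactly what the recommendation advises against.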

Source: Search Engine Roundtable

Reliability rating: ⭐⭐⭐ I agree!

This recommendation isn't new, but it reinforces a fundamental SEO principle: clarity and visibility above all. It's best not to make Google guess the intent of a link from hidden attributes (title, aria-label); it's safer to ensure the visible text explicitly states what the destination page is about.


