Someone asked Google’s John Mueller whether using an ad-blocker detection script would cause Google to treat the page as cloaking. Mueller explained what cloaking is and why ad-blocker detection scripts don’t fall under it.
What is cloaking?
Cloaking is an old black-hat technique in which a web page displays different content depending on whether the visitor is a search engine bot or a regular human user.
In the past, repeating keywords many times on a page could help it rank better. This was known as “keyword spamming.”
However, such a page looks spammy and untrustworthy, so visitors tend to leave instead of clicking affiliate links and earning referral fees for the site owner.
Therefore, what spammers did was show search engines a page stuffed with keywords to boost their rankings,
while human visitors saw a clean, normal-looking page that converted better because it didn’t appear spammy.
Google’s definition of cloaking
“Serving a page of HTML text to search engines, while showing a page of images to users

Inserting text or keywords into a page only when the user agent that’s requesting the page is a search engine, not a human visitor”

Source: https://youtu.be/BFUxzfGIuDY
Q1. Could ad-blocker detection count as cloaking?
The person who asked the question said they were considering adding an anti-ad-blocker script to their website. The script would prevent visitors who use an ad blocker from viewing the content.
Its purpose is to train visitors to whitelist the website so that they can view the content and advertisements.
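Scripts like this commonly detect an ad blocker by inserting a “bait” element styled with ad-like class names and checking whether the blocker collapsed it. The sketch below is a minimal illustration of that pattern, not anything Mueller described; the class names and function name are assumptions:

```javascript
// Ad blockers typically collapse elements whose class names look like ads
// (e.g. "adsbox") to zero height. Given the measured height of such a bait
// element, decide whether an ad blocker is likely active.
function isAdBlockerLikely(baitOffsetHeight) {
  return baitOffsetHeight === 0;
}

// In a browser, the check would look roughly like this:
// const bait = document.createElement("div");
// bait.className = "ad adsbox ad-banner"; // class names blockers target
// bait.style.height = "10px";
// document.body.appendChild(bait);
// if (isAdBlockerLikely(bait.offsetHeight)) {
//   // show the "please whitelist us" overlay
// }
// document.body.removeChild(bait);
```

The detection itself is just a height check; the site then decides whether to hide the content or show a whitelist prompt.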
What will be the impact?
Visitors without an ad blocker can read the content as normal.
Visitors with an ad blocker enabled are blocked from reading the content until they whitelist the site.
John Mueller’s explanation
“Probably not. I think in general that would be fine.
I would kind of see that as a way of recognizing that Googlebot doesn’t actually have an ad blocker installed.
So it’s kind of a unique setup that Googlebot has with regards to rendering pages and I think that would kind of be okay.”
Mueller doesn’t see this as showing different content to humans and to Googlebot. Because Googlebot doesn’t have an ad blocker installed, it simply passes the check and sees the content, just like any visitor without an ad blocker.
“In regards to cloaking, the cloaking team mostly tries to watch out for situations where you’re really showing something different to users as to Googlebot.
And with regards to.. ad blocking or …other kind of things where it’s like you have to be logged in to actually see the content and that’s kind of different.”
Mueller went on to say that he personally dislikes anti-ad-blocker setups, but acknowledged that if a site needs one, this is an acceptable way to do it.
Q2. Would an anti-ad-blocker overlay on top of the content create indexing problems?
“If it’s an HTML overlay on top of the existing page then I don’t see that as being problematic because we would still see the actual content in the HTML, kind of, behind that.
That’s similar to like if you have …a cookie banner or a cookie interstitial that you’re essentially showing just an HTML div on top of the page.
From our point of view if we can still index the actual content from the page then that’s fine.”
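Mueller’s distinction can be pictured with plain markup: the article text stays in the HTML, and the anti-ad-blocker message is just a div layered on top, the same way a cookie banner works. A hypothetical sketch (the class names, IDs, and wording are invented for illustration):

```html
<!-- The real content is present in the HTML, so Googlebot can still index it -->
<article id="content">
  <h1>Article headline</h1>
  <p>The full article text lives here in the normal HTML markup.</p>
</article>

<!-- The anti-ad-blocker message is just an overlay div on top of the page -->
<div class="adblock-overlay"
     style="position: fixed; inset: 0; background: rgba(0, 0, 0, 0.8);">
  <p>Please whitelist this site to continue reading.</p>
</div>
```

Because the overlay only sits visually above the content rather than replacing it in the HTML, Google can still index the page.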
Cloaking is something very specific: deliberately showing search engines different content than visitors in order to mislead them and achieve higher rankings. Asking ad-blocker users to whitelist a site, or layering an overlay on top of content, is completely different.