Google says the best way to hide a website from search results is to use a password, but there are other options you can consider as well.
This topic is explored in the latest installment of the Ask Googlebot video series on YouTube.
Google’s John Mueller replied to a question about how to keep content from being indexed in search and if that’s something websites are allowed to do.
“In short, yes, you can,” says Mueller.
There are three ways to hide a website from search results:
- Use a password
- Block crawling
- Block indexing
Websites are under no obligation to be indexed: they can opt out of indexing entirely, or they can hide content from Googlebot behind a password.
Blocking content from Googlebot does not violate webmaster guidelines as long as it is blocked from users at the same time.
For example, if a website is password-protected when Googlebot crawls it, it must also be password-protected for users.
Alternatively, the site must include directives that prevent Googlebot from crawling or indexing it.
Problems arise when a website serves Googlebot different content than it serves users.
This is called “cloaking” and is against Google’s guidelines.
With that distinction in mind, here are the options for hiding content from search engines.
3 ways to hide content from search engines
1. Password protection
Locking a website with a password is often the best approach if you want to keep your website private.
A password ensures that neither search engines nor random web users can see your content.
This is a common practice for websites in development. Publishing the site live makes it easy to share work in progress with clients, while the password keeps Google from accessing a website that isn’t ready to be seen.
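As a minimal sketch of how this can be done, assuming an Apache server, HTTP basic authentication can be enabled for the whole site in an .htaccess file (the file path and realm name below are illustrative):

```apacheconf
# .htaccess at the site root — require a login for every page
AuthType Basic
AuthName "Restricted staging site"
# Password file created beforehand with: htpasswd -c /var/www/.htpasswd username
AuthUserFile /var/www/.htpasswd
Require valid-user
```

With this in place, both Googlebot and human visitors receive a 401 response until they supply valid credentials, so search engines and the public are blocked in exactly the same way.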
2. Block crawling
Another way to prevent Googlebot from accessing your website is to block crawling. This is done with the robots.txt file.
This method allows people to reach your site via a direct link, but well-behaved search engines won’t crawl it.
This is not the best option, says Mueller, as search engines may still index the website’s address without accessing the content.
This rarely happens, but you should be aware of this possibility.
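As a sketch, a robots.txt file at the root of the domain that blocks all crawling of the entire site would look like this:

```
# robots.txt — disallow all well-behaved crawlers from the whole site
User-agent: *
Disallow: /
```

Note that this only blocks crawling, not indexing: as mentioned above, a URL that is disallowed in robots.txt can still appear in search results without its content if other sites link to it.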
3. Block indexing
The third and final option is to block your website from being indexed.
To do this, add a noindex robots meta tag to your pages.
A noindex tag tells search engines not to index that page; they discover the tag when they crawl it, which is why crawling must remain allowed for this method to work.
Users do not see the meta tag and can still access the page normally.
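The tag itself is a one-line addition to each page’s HTML head:

```html
<!-- Place in the <head> of every page that should stay out of search results -->
<meta name="robots" content="noindex">
```

For non-HTML files such as PDFs, the same directive can be sent as an X-Robots-Tag HTTP response header instead.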
Mueller’s final thoughts
Mueller concludes the video by recommending the password route:
“Overall, we recommend password protection for private content. It’s easy to check that it’s working, and it prevents anyone from accessing your content.
Blocking crawling or indexing are good options when the content isn’t private, or if there are only parts of a website that you want to keep out of search.”
Check out the full video below:
Featured image: Screenshot from YouTube.com/GoogleSearchCentral, November 2021.