A “subdomain” refers to the portion of site architecture that lives at any level below the root domain (the registrable domain name, like “example.com”).
So, all of the following:
- https://www.example.com
- https://blog.example.com
- https://news.example.com
- https://randomly-made-up-site.example.com
would be “subdomains” of “example.com.” Each features a protocol (HTTPS) and a subdomain before the root domain, “example.com.” (Yup, “www” is technically a subdomain!)
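To make the terminology concrete, here’s how one of those URLs breaks down (the labels are standard naming conventions, not anything specific to this example):

```
https://blog.example.com
protocol:      https
subdomain:     blog
root domain:   example.com
TLD:           .com
```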
Subdomains allow webmasters to organize their content, so they can maintain an ecommerce store on one domain, for example, and a blog on another.
However, Google treats subdomains as separate sites. While “blog.example.com” and “example.com” share a root domain, Googlebot views them as distinct sites and crawls and indexes them accordingly. This distinction causes plenty of confusion and leads us to the big question:
Does Google Index Subdomains?
The short answer is yes, Google can and will index and rank subdomains unless you explicitly take steps to ensure they’re excluded from its index.
Google’s entire business model is based on discovering content, and the same goes for every search engine. If a page has unique content and can be crawled and indexed, it most likely will be, since indexing it supports the search engine’s goal.
In fact, if Google didn’t index subdomains, there would be no “www” websites in its entire index. That probably seems surprising since “www” sites are generally viewed as the default online, but “www” is actually the most common subdomain.
Despite the millions of subdomains visibly indexed in Google’s results, questions linger about how Google crawls and indexes them. That’s because subdomains can complicate SEO efforts and affect performance.
We cover these considerations at length in our subdomain vs subdirectory blog. While you should weigh these factors when making decisions about your site architecture and its impact on your site’s SEO performance, they don’t change the answer to the fundamental question of whether subdomains can rank on Google.
When Will a Subdomain Not Be Indexed?
While the default behavior for Google is to index subdomains, there are certain situations where subdomains will not be indexed.
These include:
There Are No Links Pointing to Your Subdomain
Google discovers and prioritizes the indexation of URLs based on the links pointing to those URLs. These links can be from external domains or from other accessible subdomains on the same root domain.
If Google does not come across links to your subdomain when it’s crawling your site or another site, then it won’t be able to find your subdomain and thus won’t index it.
The only exceptions to this would be if:
- You submit the XML sitemap for your subdomain to Google via Google Search Console.
- Your XML sitemap index references a sitemap for your subdomain (see the sketch after this list). Those entries essentially function the way external links would, pointing Google to your subdomain’s URLs.
- Previous links to your subdomain allowed it to be discovered, but those links have since been removed. In a situation like this, the subdomain will likely remain “stuck” in the index.
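As a sketch of the second exception above, here’s what a sitemap index hosted on the main domain might look like when it references a subdomain’s sitemap. The file names are hypothetical, and for Google to honor cross-host entries like this, both properties generally need to be verified in Google Search Console:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index served at https://www.example.com/sitemap_index.xml -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Sitemap for the main site -->
  <sitemap>
    <loc>https://www.example.com/sitemap.xml</loc>
  </sitemap>
  <!-- Sitemap for the blog subdomain, giving Google a path to discover its URLs -->
  <sitemap>
    <loc>https://blog.example.com/sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```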
The Subdomain Uses Noindex Tags
You can explicitly block the indexation of specific URLs across your website by using noindex tags. To keep a subdomain out of Google’s index, use either noindex <meta> tags or noindex HTTP response headers. A noindex tag allows Google and other search engines to crawl a page and follow its links but asks them to refrain from indexing it.
Keep in mind that, for this method to work, every page on your subdomain needs the noindex tag in its HTML (or the equivalent HTTP response header). Applying a noindex tag to the subdomain’s homepage alone will only keep the homepage out of the index.
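As a quick sketch, the meta tag version looks like this when placed on each page of the subdomain:

```html
<!-- Add to the <head> of every page on the subdomain you want kept out of the index -->
<meta name="robots" content="noindex">
```

For non-HTML resources (PDFs, images, and so on), the equivalent is an “X-Robots-Tag: noindex” HTTP response header, which your web server can attach to every response from the subdomain.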
The Subdomain Is Blocked via Robots.txt
You can also block crawling of an entire subdomain (and, in most cases, keep it out of the index) by updating the robots.txt file for that specific subdomain. For example, on the Victorious site we used to use a “start” subdomain for our free consultation landing page: https://start.victoriousseo.com/seo/.
If we wanted to make sure this landing page (and every other landing page hosted on the “start” subdomain) was never crawled or included in Google’s index, we could implement a robots.txt disallow.
Assuming the file is served from the root of the “start” subdomain, the robots.txt would look something like this:
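```
# Hypothetical robots.txt served at https://start.victoriousseo.com/robots.txt
# Blocks all crawlers from crawling any URL on the "start" subdomain
User-agent: *
Disallow: /
```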
This method only works for keeping a subdomain out of Google’s index if the subdomain hasn’t already been indexed.
If a subdomain has already been indexed, blocking it via the robots.txt file will simply preserve the versions of its URLs already included in Google’s index. To remove a subdomain that’s already indexed, add noindex tags (and lift the robots.txt block so Google can recrawl the pages and see them) or use Google Search Console’s URL removal tool.
Note: This method may not work for subdomains with numerous internal and external links pointing to them. A robots.txt block stops Google from crawling the pages, but it isn’t a binding instruction not to index them: if enough links point to a blocked URL, Google may still list it in its index (usually without a description) despite the directive. If Google indexes your subdomain despite your robots.txt block, consider removing the block and using meta noindex tags across all subdomain pages.
How To Remove a Subdomain From Google’s Index
If you find that Google has indexed a subdomain you don’t want featured in search results, you can:
- Use noindex meta tags on the URLs you don’t want indexed (see the sketch after this list for one way to apply the tag across an entire subdomain). Then, use the URL removal tool in Google Search Console to request that Google stop serving those URLs. A removal request isn’t immediate or permanent; its effects last about six months, so make sure your tags are correct or the pages may end up back in Google’s search results.
- Update the robots.txt file for the subdomain you don’t want indexed so it disallows crawling. Be careful when editing robots.txt files, since you don’t want to accidentally block Googlebot from crawling your main site.
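If you go the noindex route, one way to apply the tag to every page at once is an X-Robots-Tag response header set at the server level. Here’s a sketch assuming the subdomain runs on Apache with mod_headers enabled (other servers, like NGINX, have equivalent settings):

```apache
# Hypothetical .htaccess (or virtual host config) at the subdomain's document root.
# Adds a noindex header to every response served from the subdomain,
# so you don't have to edit each page's HTML individually.
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex"
</IfModule>
```

As noted above, Google needs to be able to crawl the pages to see this header, so don’t pair it with a robots.txt disallow on the same subdomain.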
Looking for Site Architecture Help? Victorious Has Your Back
If you haven’t used a subdomain on your site because you weren’t sure whether Google would index it, rest easy. By creating an SEO strategy for each of your subdomains, you can rank well for your desired keywords and drive traffic to them.
And you don’t have to do it alone. SEO website architecture can be incredibly confusing if you don’t have a background in search engine optimization. Simplify the process with a skilled SEO agency and conquer Google SEO for subdomains. Contact us today for a free SEO consultation to have our team evaluate the best options for your website.