SEO has a lot of moving parts: meta tags, headers, keywords, and page load speed, to name a few. But while you may have a plan for getting these on-page search engine optimization elements ship-shape, you might need a web developer to implement broader technical changes.
A successful SEO campaign is a collaboration between marketing and web development.
But it can be frustrating to drive results if everyone isn’t on the same page. This SEO cheat sheet summarizes the essential on-page SEO practices that make it easier for search engines to read and index your site. Your web developers and SEO specialists can reference this guide and fold these practices into their cross-functional workflow to improve site performance and user experience. It also highlights which search engine optimization tactics to prioritize and why, so you can implement a strategy that supports each of your teams.
Web Dev SEO Cheat Sheet
Web developers usually receive SEO requests in bits and pieces as their colleagues monitor site performance and track metrics like organic search traffic, bounce rate, and page load time. But a web team’s responsibilities extend well beyond technical SEO, and tasks such as checking page speed or redirecting URLs are often assigned to web devs without an explanation of why they matter for Google rankings or company goals.
Thankfully, web development and SEO can go hand-in-hand. Focus on these seven on-page SEO strategies to increase your chances of ranking for your desired keywords.
1. Site Speed
According to Google, when a web page’s load time increases from one to three seconds, visitors are 32% more likely to bounce back to search results. Page speed is also a ranking factor, making it more than just a UX concern.
Use Google’s PageSpeed Insights to check your site’s performance on desktop and mobile. It provides feedback on which factors need improvement.
Learn more about Google PageSpeed Insights and Lighthouse and how to optimize page speed here.
2. Mobile-Friendly Web Design
Google rolled out mobile-first indexing in 2019, which means it primarily uses the mobile version of your pages for indexing and ranking, regardless of what device a searcher is using. Test how your site appears on mobile devices with Google’s Mobile-Friendly Test tool and discuss needed changes with your web designers. This tool provides a snapshot of how your page displays and flags usability problems such as the following (a common fix is sketched after the list):
- Incompatible plugins
- Unreadable text
- Content that’s wider than the screen
- Links that are too close together
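Several of these issues, particularly content that is wider than the screen, often come down to a missing viewport declaration. A minimal sketch (the values shown are the common responsive defaults, not a requirement for every site):

```
<!-- Tells mobile browsers to match the layout width to the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```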
3. HTTP Status Codes
Broken links and error messages create a poor user experience and make it harder for search engines to crawl and index your website. Diagnose and fix problems quickly by monitoring HTTP status codes. Otherwise, pages that return 4xx codes (except 429) will eventually be dropped from the index, and frequent 429, 500, and 503 responses can cause search engines to reduce their crawl rate.
Run an Index Coverage Report on Google Search Console as part of your technical SEO audit. Identify crawl errors and address the ones under your control. Roll this data into a site improvement plan.
| HTTP Status Code | What It Means |
|---|---|
| 200 | Page is loading normally. |
| 301 | Permanent redirect; page authority passes from the redirecting page to the new page, and the new page will eventually replace the old one in SERPs. |
| 302 | Temporary redirect; only use this for short-term redirects. |
| 404 | Page not found; remove the page from your sitemap and remove or redirect all internal links. Search engines will drop the page from the index. |
| 410 | Page permanently removed; remove the page from your sitemap and remove or redirect all internal links. Search engines will drop the page from the index. |
| 429 | Too many requests; the server is rate-limiting clients. If this happens regularly, search engines may reduce their crawl rate. |
| 500 | Internal server error. If this happens regularly, search engines may reduce their crawl rate. |
| 503 | Server unavailable. If this happens regularly, search engines may reduce their crawl rate. |
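For illustration, a permanent redirect looks like this at the HTTP level (the destination URL is hypothetical):

```
HTTP/1.1 301 Moved Permanently
Location: https://example.com/new-page/
```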
4. Important HTML Elements
HTML elements play an integral role in helping search crawlers understand your content and may impact search visibility. Audit HTML elements regularly and optimize for search as needed. Pay particular attention to the following.
Page Title Tags
These appear as the clickable headline of your listing on the SERP. They should be clear, enticing, and concise to encourage click-throughs.
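For example, a title tag sits in the page’s <head> (the wording here is purely illustrative):
- <title>SEO Cheat Sheet for Web Developers | Example Co.</title>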
Header Tags
Define the hierarchy of information on your page with header tags. H1 tags are the main title of an article and are the most important header tag for SEO. H2 through H6 tags are used to indicate topics and subtopics.
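For instance, a typical hierarchy uses a single H1 for the page title with H2 and H3 tags nested beneath it (the headings are illustrative):

```
<h1>SEO Cheat Sheet for Web Developers</h1>
<h2>Site Speed</h2>
<h2>Mobile-Friendly Web Design</h2>
<h3>Common Mobile Usability Problems</h3>
```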
Meta Descriptions
A meta description helps increase click-through rates by providing insight into what a page is about. It will appear under the page title in search engine result pages. A good meta description includes your keyword and a call-to-action that makes a searcher want to visit your page.
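For example (the copy is illustrative):
- <meta name="description" content="Use this on-page SEO cheat sheet to see which optimization tasks to prioritize and why. Get the checklist.">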
Images
Use image alt tags to describe the contents of an image. In addition to being an essential part of site accessibility, alt tags that contain a relevant keyword improve on-page search engine optimization. Images should be compressed and appropriately sized for quicker loading.
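For example (the file name and alt text are illustrative):
- <img src="/images/on-page-seo-checklist.png" alt="On-page SEO checklist for web developers" width="800" height="450">

Specifying width and height also lets the browser reserve space for the image and avoid layout shift while the page loads.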
Canonical Tags
Use canonical tags to solve duplicate content issues and point search engines to the master version of a page when necessary. Learn more about them below.
5. Site & Page Structure
To rank your site, Googlebot needs to understand how it’s organized and the relationship between different sections. Implement a site hierarchy that illustrates the logic of how information flows and how pages relate to each other. Focus on the elements below to help users and search bots locate the right content.
HTML Structure
Use HTML heading tags (H1 through H6) to organize each page and clearly show the information hierarchy. Further subdivide content into blocks using tags such as <div>, <section>, or <footer>.
Increase a page’s readability for both readers and search engines by including a table of contents at the beginning of an article or as a sidebar, with links to the appropriate section. This makes it easier for users to navigate the page and for search engines to understand the page’s content.
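A minimal sketch of this kind of structure, with an anchor-linked table of contents (the section names and IDs are illustrative):

```
<article>
  <h1>SEO Cheat Sheet</h1>
  <!-- Table of contents linking to section anchors below -->
  <nav>
    <ul>
      <li><a href="#site-speed">Site Speed</a></li>
      <li><a href="#mobile-design">Mobile-Friendly Web Design</a></li>
    </ul>
  </nav>
  <section id="site-speed">
    <h2>Site Speed</h2>
    <p>…</p>
  </section>
  <section id="mobile-design">
    <h2>Mobile-Friendly Web Design</h2>
    <p>…</p>
  </section>
</article>
```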
URL Structure
URLs can tell search bots what a page is about at a glance. For example, a URL that ends in /blog/on-page-seo/ clearly signals a blog category page about on-page SEO.
As a general rule, use a consistent structure for URLs. Include the keyword you want to rank for, and keep URLs short, simple, and descriptive. Avoid random numbers and jargon, and use hyphens to separate words for clarity.
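For example (both URLs are hypothetical):
- Descriptive: https://example.com/blog/on-page-seo/
- Hard to parse: https://example.com/p?id=83742&cat=7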
Redirection
Redirects help users and search engine crawlers find the content they’re looking for when pages are moved, removed, or combined. Otherwise, the missing page becomes a dead-end — a 404 error.
Use permanent and temporary redirects to indicate how to interpret the page’s new location. Permanent redirects (301) are taken as a strong signal that the new target page is canonical (more on this later). Temporary redirects (302) are more likely to keep the old URL in search results. Because of this, you should only use them for short periods.
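Implementation depends on your server; as one hedged example, on an Apache server that reads .htaccess files, a permanent redirect might be added like this (the paths are hypothetical):

```
# Permanently redirect the old URL to the new one (Apache mod_alias)
Redirect 301 /old-page/ https://example.com/new-page/
```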
Internal Linking
Internal links guide users between pages on a website. Your key pages should be easily accessible via your menu or a sidebar.
Internal links also help Google understand which pages matter most for ranking purposes: typically the pages linked to most frequently, such as product category pages or service pages.
Through strategic linking, you can share authority from your highest-ranking pages with other pages on your site (known as "link equity"). Thoughtful internal linking can improve the rankings of both the linking page and the pages it links to.
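For example, a contextual internal link with descriptive anchor text might look like this (the URL and anchor text are hypothetical):
- <a href="/services/technical-seo-audit/">technical SEO audit services</a>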
XML Sitemaps
Think of an XML sitemap as a roadmap that helps search bots crawl your site efficiently. A sitemap lists the important pages on your site along with the date each was last updated. Keep it updated regularly so new and changed pages get indexed promptly.
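A minimal sitemap with a single entry might look like this (the URL and date are illustrative):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/on-page-seo/</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
</urlset>
```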
6. Efficient Crawling
Google gives every website a crawl budget that determines how much time its bots spend crawling and indexing pages. Make the most of yours by indicating pages you don’t want bots to crawl or index, which also minimizes the likelihood of index bloat. You can use a robots.txt file and robots meta tags to manage how Googlebot navigates your site and ensure it’s crawled as efficiently as possible.
Robots.txt File
Robots.txt is a site-wide file that tells search bots which pages they can and can’t access. Use the "Disallow" directive in the file to specify URLs you don’t want crawled. These may include account pages, confirmation pages, and printer-friendly versions of your content.
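For example, a robots.txt file that blocks account and printer-friendly pages for all bots might look like this (the paths are illustrative):

```
# Applies to all crawlers
User-agent: *
Disallow: /account/
Disallow: /print/
```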
Keep in mind that robots.txt controls crawling, not indexing: not every bot obeys the file, and search engines may still index a disallowed URL if other pages link to it. To keep a page out of the index entirely, use the robots meta tag.
Robots Meta Tag
Using the robots meta tag in the <head> section of an HTML document directs search engine crawlers at a page level. The robots tag allows you to provide a directive for one page to a specific search engine bot or to all crawl bots.
For example, the following tag prevents Google from translating a page:
- <meta name="googlebot" content="notranslate">
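Similarly, the following tag (shown here as an illustration) asks all crawlers not to index a page or follow its links:
- <meta name="robots" content="noindex, nofollow">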
Find a list of the robots meta tags that Google recognizes in their Advanced SEO documentation.
X-Robots-Tag
Issue directives for non-HTML files with the X-Robots-Tag HTTP header, which lets you provide indexing instructions in the HTTP response for a page or file. For example, you can block the indexing of a specific image or PDF. To add X-Robots-Tag headers, edit your server configuration files or set the header from your application code (such as a .php file).
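For example, the following response header (shown here for an illustrative PDF) asks crawlers not to index the file:

```
HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex
```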
Canonicalization
Duplicate content can decrease your chances of ranking for a particular keyword since determining which page is the original is left up to the search engine. Canonical tags manage duplicate content by specifying which page is the master copy. This helps guide search engines when they’re determining which page should rank.
Canonical tags are placed in the head of a page’s HTML code and tell Google the URL of the canonical version of the page. For example:
- <link rel="canonical" href="https://test.com/blog/original">
Pagination
Sometimes, you need to break long content into several pages for readability. To ensure Google doesn’t see these as duplicate content or as individual unrelated pages, signal that a page is part of a multipage series.
- Create a master page with the full content and use a canonical tag to request that Google index this page.
- Use rel="prev" and rel="next" link tags for navigation between the pages in the series, as shown below. While Google no longer uses them as an indexing signal, these tags remain useful for other search engines and for web accessibility.
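For example, page two of a three-page article might include these tags in its <head> (the URLs are hypothetical):

```
<link rel="prev" href="https://example.com/guide?page=1">
<link rel="next" href="https://example.com/guide?page=3">
```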
User Agents
The User-Agent request header identifies the browser, device, and bot making a request, among other things. By detecting the user agent, you can deliver the correct version of a webpage based on that information.
For example, if you detect that a user (or crawler) is connecting with Safari on iOS, set your webserver to automatically serve the mobile version of the site so that information renders properly.
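If you serve different HTML for different user agents at the same URL (dynamic serving), it is generally wise to include a Vary response header so crawlers and caches know the response changes with the user agent; a sketch:

```
HTTP/1.1 200 OK
Content-Type: text/html
Vary: User-Agent
```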
7. Appearance of Content
How your content looks on external sites such as SERPs and social media platforms doesn’t directly impact search rankings, but it can drive more traffic to your site. To improve click-throughs, make sure shared posts surface an attention-grabbing title and a compelling image. This can indirectly boost conversions and even earn you backlinks.
Control how your content appears in social media and SERPs using the following tools.
Open Graph Tags
Open Graph tags control how a web page is displayed when it’s shared on social media platforms like Facebook. Many third-party platforms support the protocol, and some add their own tags. The Open Graph Protocol website has a list of the properties used in the protocol.
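A basic set of Open Graph tags might look like this (the content values are illustrative):

```
<meta property="og:title" content="SEO Cheat Sheet for Web Developers">
<meta property="og:description" content="Seven on-page SEO tasks to prioritize and why.">
<meta property="og:image" content="https://example.com/images/seo-cheat-sheet.png">
<meta property="og:url" content="https://example.com/blog/seo-cheat-sheet/">
```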
Twitter Card Markup
Twitter Cards are similar to Open Graph tags and control how your title, description, and image appear when someone shares your link in a tweet. There are different types of Twitter Card meta tags you can add to your page.
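For example (the values are illustrative):

```
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="SEO Cheat Sheet for Web Developers">
<meta name="twitter:description" content="Seven on-page SEO tasks to prioritize and why.">
<meta name="twitter:image" content="https://example.com/images/seo-cheat-sheet.png">
```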
Structured Data
To help search engines index content, SEO programmers can add structured data to a page using JSON-LD (JavaScript Object Notation for Linked Data). Structured data, also known as schema, is a standardized format for indicating key elements of content.
For example, use structured data to highlight ingredients, cooking time, temperature, and calories on a recipe page. These labels help search engines understand what search queries the content answers.
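A partial JSON-LD block for that recipe example might look like this (the values are illustrative and the markup is trimmed to a few of the fields mentioned above):

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Banana Bread",
  "recipeIngredient": ["3 ripe bananas", "2 cups flour", "2 eggs"],
  "cookTime": "PT1H",
  "nutrition": {
    "@type": "NutritionInformation",
    "calories": "240 calories"
  }
}
</script>
```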
Rich Snippets
It’s also important to optimize how search engines display your web page in search results. Enhance your SERP listing with rich snippets such as an image, video preview, or star ratings, to make it more attractive to searchers. Use schema to generate rich snippets and give searchers more information about your website right from the search engine result page.
Recommended Reading
Search-First: Unite the Pieces of Your Search Strategy
SEO web development requires the implementation of best practices, consistent performance review, and attention to detail. A trusted partner like Victorious can help manage some or all of these elements to supplement your internal resources and reduce strain on your web developers. We have a full suite of SEO and web maintenance services to match your needs and a team of experts to answer your questions. Request a free SEO consultation to learn more.