Over the years, seoClarity has conducted hundreds of site audits, and one recurring theme we've observed is the frustration clients feel when faced with the same technical SEO issues time and again.
These common challenges can be persistent and disruptive, impacting site performance and search visibility.
In this post, we’ll highlight the most frequent technical SEO problems we’ve encountered and provide actionable solutions to effectively address them.
When we talk about technical SEO, we’re referring to updates to a website and/or server that you have immediate control over and which have a direct (or sometimes indirect) impact on your web pages' crawlability, indexation, and ultimately, search rankings.
This includes components like title tags, HTTP header responses, XML sitemaps, 301 redirects, and metadata.
Technical SEO does not include analytics, keyword research, backlink profile development, or social media strategies.
In our Search Experience Optimization framework, technical SEO is the first step in creating a better search experience.
Other SEO projects should come after you've ensured your site has proper usability. But for an enterprise site, it can be hard to stay on top of potential SEO problems.
These common technical SEO issues are often overlooked, yet are straightforward to fix and crucial to boost your search visibility and SEO success.
Site security with HTTPS is more important than ever.
If your site is not secure, Google Chrome will flag it in the address bar with a gray "Not secure" label, or even worse, a red warning if the certificate is invalid.
This could cause users to immediately navigate away from your site and back to the SERP.
The first step for this quick fix is to check if your site is HTTPS. To do this, simply type your domain name into Google Chrome. If you see the padlock and "secure" indicator in the address bar, your site is secure.
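If your site is still served over HTTP, the standard fix is to install a TLS certificate and 301-redirect all HTTP traffic to HTTPS. As a sketch (assuming your host runs Apache with mod_rewrite enabled), the rule in your `.htaccess` file might look like this:

```apache
RewriteEngine On
# If the request arrived over plain HTTP, permanently redirect to HTTPS
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 status is important: it tells search engines the HTTPS version is the permanent home of the page, so link equity is consolidated there.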
When you search for your brand name in Google, does your website show up in the search results? If the answer is no, there might be an issue with your indexation.
As far as Google is concerned, if your pages aren't indexed, they don't exist, and they certainly won't be found in search results.
XML sitemaps help Google search bots understand more about your site pages, so they can effectively and intelligently crawl your site.
Type your domain name into your browser and add "/sitemap.xml" to the end.
This is usually where the sitemap lives. If your website has a sitemap, it will load at that URL.
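If no sitemap exists, you'll need to generate one. For reference, a minimal XML sitemap follows the sitemaps.org protocol and looks like this (the URL and date here are hypothetical placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per indexable page on your site -->
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
</urlset>
```

Once the sitemap is live, submit it to Google via Search Console so crawlers can discover it directly.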
A missing robots.txt file is a big red flag — but did you also know that an improperly configured robots.txt file can destroy your organic site traffic?
To determine if there are issues with the robots.txt file, type your website URL into your browser with a “/robots.txt” suffix. If you get a result that reads "User-agent: * Disallow: /" then you have an issue.
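The difference between blocking your entire site and a sensible default configuration can come down to a single line. For illustration (the `/admin/` path and sitemap URL here are hypothetical examples):

```text
# DANGEROUS: blocks ALL crawlers from the ENTIRE site
User-agent: *
Disallow: /

# A typical safe configuration: allow everything,
# block only private paths, and point to the sitemap
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```

If you find the blanket `Disallow: /` rule on a live site, removing it is an urgent fix.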
When the NOINDEX tag is appropriately configured, it signifies certain pages are of lesser importance to search bots. (For example, blog categories with multiple pages.)
However, when configured incorrectly, NOINDEX can immensely damage your search visibility by removing all pages with a specific configuration from Google’s index. This is a massive SEO issue.
It’s common to NOINDEX large numbers of pages while the website is in development, but once the website goes live, it’s imperative to remove the NOINDEX tag.
Do not blindly trust that it was removed — verify it, because a lingering NOINDEX tag will destroy your site's search visibility.
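The tag itself is easy to spot in a page's source. It lives in the `<head>` and looks like this:

```html
<!-- Tells search engines to drop this page from the index.
     Fine during development; must be removed before launch. -->
<meta name="robots" content="noindex">
```

Note that the same directive can also be sent as an `X-Robots-Tag` HTTP header, so check server responses as well as page source when auditing.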
Recommended Reading: The Best SEO Checklist to Boost Search Visibility and Rankings
If your site doesn’t load quickly (typically 3 seconds or less), your users will go elsewhere.
Page speed matters to the user experience and to Google's algorithm. In the summer of 2021, Google rolled out the page experience update (which includes the Core Web Vitals metrics) and added a new Page Experience report to Search Console.
Remember when you discovered “yourwebsite.com” and “www.yourwebsite.com” go to the same place? While this is convenient, it also means Google may be indexing multiple URL versions, which dilutes your site's visibility in search.
Even worse, multiple versions of a live page may confuse users and Google's indexing algorithm. As a result, your site might not get indexed properly.
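To consolidate, pick one preferred hostname and 301-redirect the others to it. As a sketch (assuming Apache with mod_rewrite, and that the `www` version of a hypothetical `example.com` is preferred):

```apache
RewriteEngine On
# Redirect the bare domain to the canonical www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```

Whichever version you choose, use it consistently in your sitemap, canonical tags, and internal links.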
Rel=canonical is particularly important for all sites with duplicate or very similar content (especially ecommerce sites). Dynamically rendered pages (like a category page of blog posts or products) can look duplicative to Google search bots.
The rel=canonical tag tells search engines which “original” page is of primary importance (hence: canonical) — similar to URL canonicalization.
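In practice, the tag goes in the `<head>` of every variant of a page, filtered, paginated, or parameter-laden, and points at the one version you want indexed (the URL below is a hypothetical example):

```html
<!-- Placed on duplicate or near-duplicate variants of a page,
     this points search engines at the preferred version -->
<link rel="canonical" href="https://www.example.com/products/widgets">
```

The canonical page should also reference itself, which guards against tracking parameters creating accidental duplicates.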
With more and more brands using dynamically created websites, content management systems, and practicing global SEO, the problem of duplicate content plagues many websites.
It may “confuse” search engine crawlers and prevent the correct content from being served to your target audience.
Unlike "thin" content issues, where a page simply doesn't have enough copy (roughly 300 words is a common minimum guideline), duplicate content can occur for many reasons:
Ecommerce site store items appear on multiple versions of the same URL.
Printer-only web pages repeat content on the main page.
The same content appears in multiple languages on an international site.
Each of these three issues can be resolved respectively with:
Proper rel=canonical (as noted above).
Proper NOINDEX configuration on printer-only versions (setup noted above).
Correct implementation of hreflang tags.
Google’s support page offers other ideas to help limit duplicate content including using 301 redirects, top-level domains, and limiting boilerplate content.
Broken images and images missing alt text are a missed SEO opportunity. The image alt attribute helps search engines index a page by telling the bot what the image is about.
It’s a simple way to boost the SEO value of your page via the image content that enhances your site's experience.
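The fix is straightforward: give every meaningful image descriptive alt text (the file path and product here are hypothetical examples):

```html
<!-- Descriptive alt text tells search bots (and screen readers)
     what the image shows; keep it specific, not keyword-stuffed -->
<img src="/images/red-running-shoes.jpg"
     alt="Red lightweight running shoes, side view">
```

Purely decorative images can use an empty `alt=""` so assistive technology skips them.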
Most SEO site audits will identify images that are broken and missing alt tags. Running regular site audits to monitor your image content as part of your SEO standard operating procedures makes it easier to manage and stay current with image alt tags across your website.
Good internal and external links show both users and search crawlers that you have high quality content. Over time, content changes and once-great links break.
Broken links interrupt the searcher's journey and reflect lower quality content, a factor that can affect page ranking.
While internal links should be confirmed every time a page is removed, changed, or a redirect is implemented, the value of external links requires regular monitoring. The best and most scalable way to address broken links is to run regular site audits.
An internal link analysis will help digital marketers and SEOs find the pages where these links exist so they can fix them by replacing the broken link with the correct/new page.
Use our backlinks feature to find all external links that are broken. From there, you can reach out to the sites with broken links and provide them with the correct link or new page.
Google defines structured data as:
… a standardized format for providing information about a page and classifying the page content …
Structured data is a simple way to help Google search crawlers understand the content and data on a page. For example, if your page contains a recipe, an ingredient list would be an ideal type of content to feature in a structured data format.
Address information, like this example from Google, is another type of data perfect for a structured data format:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "url": "http://www.example.com",
  "name": "Unlimited Ball Bearings Corp.",
  "contactPoint": {
    "@type": "ContactPoint",
    "telephone": "+1-401-555-1212",
    "contactType": "Customer service"
  }
}
</script>
This structured data can then surface on the SERPs in the form of a rich snippet, which gives your SERP listing visual appeal.
As you roll out new content, identify opportunities to include structured data on the page and coordinate the process between content creators and your SEO team. Better use of structured data may help improve CTR and possibly improve rank position in the SERP.
Once you implement structured data, review your GSC report regularly to make sure that Google is not reporting any issue with your structured data markup.
Hot Tip: Use Schema Builder to build, test, and deploy structured data with a simple point-and-click interface.
In December 2018, Google announced that mobile-first indexing covered more than half of the pages appearing in search results. Google would have sent you an email when (or if) your site was transitioned.
If you're not sure whether your site has undergone the transition, you can also use Google's URL Inspection Tool.
Whether Google has transitioned you to mobile-first indexing yet or not, you need to guarantee your site is mobile friendly to ensure an exceptional mobile user experience. Anyone using responsive website design is probably in good shape.
If you run a separate m-dot mobile site (e.g., m.example.com), you need to make sure the implementation is correct so you don't lose your search visibility in a mobile-first world.
As your mobile site will be the one indexed, you'll need to do the following for all m-dot web pages:
Ensure the hreflang code and links are correct.
Update all metadata on your mobile site. Meta descriptions should be equivalent on both mobile and desktop sites.
Add structured data to your mobile pages and make sure the URLs are updated to mobile URLs.
Meta descriptions are those short, up to 160-character, content blurbs that describe what the web page is about. These little snippets help the search engines index your page and a well-written meta description can stimulate audience interest in the page.
It's a simple SEO feature, but a lot of pages miss this important content. Meta descriptions don't appear on the page itself, but they help searchers decide whether or not to click your result after they make their query.
Like your page content, meta descriptions should be optimized to match what the user will read on the page, so try to include relevant keywords in the copy.
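The tag itself is a single line in the page's `<head>` (the copy below is a hypothetical example):

```html
<!-- Aim for under ~160 characters, unique per page,
     and aligned with what the user will read on the page -->
<meta name="description"
      content="Shop lightweight running shoes with free shipping and 30-day returns.">
```

Note that Google may rewrite descriptions it considers a poor match for the query, so a well-targeted description is more likely to be shown verbatim.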
There are a couple of ways to address this issue:
For pages missing meta descriptions: run an SEO site audit to find all pages that are missing meta descriptions. Determine the value of the page and prioritize accordingly.
For pages with meta descriptions: evaluate pages based on performance and value to the organization. An audit can identify any pages with meta description errors. High-value pages that are almost ranking where you want should be optimized first. Any page that undergoes an edit, update, or change should also have the meta description updated at the time of the change. It's important to make sure that meta descriptions are unique to a page.
In 2011, Google introduced the hreflang tag for brands engaged in global SEO to improve user experience. Hreflang tags signal to Google the correct web page to serve to a user based on language or location of search. It’s also called rel="alternate" hreflang="x".
The code looks like this:
<link rel="alternate" href="http://example.com" hreflang="en-us" />
Hreflang is one of several international SEO best practices including hosting sites on local IPs and connecting with local search engines. The benefits of serving locally-customized content to users in their native language, however, really cannot be overstated.
Using hreflang tags requires a fair amount of detail work to ensure all pages have the appropriate code and links, and errors are not uncommon.
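One of the most common errors is missing return links: every page in an hreflang set must list all of its alternates, including itself. A complete set for a hypothetical two-locale site might look like this:

```html
<!-- Each locale page carries the SAME full set of annotations -->
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
<link rel="alternate" hreflang="de-de" href="https://example.com/de-de/" />
<!-- x-default is the fallback for users matching no listed locale -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

If the German page omits the link back to the English page, Google may ignore the annotations entirely.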
Google provides a free International Targeting Tool, and there are a variety of third-party tools that digital marketers can use as well. For instance, with our site audits, you can run an in-depth hreflang audit and verify your implementation with cross-checks of referenced URLs.
seoClarity's hreflang audit capability displays the count of hreflang tags found and lists the errors with specific tags.
Fixing hreflang errors effectively involves two steps:
Ensuring the code is correct. A tool like Aleyda Solis' hreflang Tags Generator Tool can simplify the effort.
When updating a page or creating a redirect, updating the code on all pages that refer or link to it.
Addressing the top technical issues – and their respective solutions – in this blog post is the best way to quickly improve your SERP visibility, and it can have an extremely positive impact on the searcher's overall experience of your site.
To quickly uncover and prioritize the most impactful SEO fixes and optimizations for your site with a single view of ALL your SEO data (no matter the source), schedule a demo of Clarity 360 today!
Editor's Note: This post was originally published in October 2017 and has been updated for accuracy and comprehensiveness.