What is an SEO audit?

In simple terms, an SEO site audit is the process of evaluating how search-engine-friendly a website is: how well the site performs in search, which factors affect its health and performance, and which areas can be improved to increase its efficiency and visibility.

A simple analogy is going to a hospital for a full-body checkup. Once the reports arrive, you are referred to a doctor, who evaluates them and, if there is any sign of a problem, recommends the best possible way to treat it.

Why perform an SEO site audit?

The purpose of an SEO audit is to identify as many issues affecting organic search performance as possible. It analyzes your overall web presence and grades the site on its ability to appear in search engine results pages (SERPs), helping you fix the existing issues that hinder your site's SEO performance.

How often should you audit your site for SEO?

For a small business website, once every six months is usually enough. If you run a large, content-heavy website such as an e-commerce store, conducting an SEO audit every 3-4 months makes more sense: large websites update their content regularly, so it's worth reviewing your SEO efforts on a tighter schedule.

Pro Tip: If you are an online marketing agency offering SEO services, make sure to audit a client's site at the very beginning of the engagement. The SEO audit comes first; optimization and strategy follow.

What are some of the best SEO audit tools?

While there are many tools you can use to audit a website for SEO, the best approach is either to perform the audit yourself by following a guide (like the one you are reading now) or to hire a digital marketing agency that offers an SEO site audit service. An SEO audit is quite technical, so hiring an expert or an online marketing agency is often worthwhile.

Here are some of the best tools that you need for an effective SEO audit process:

Free SEO Audit Tools

An earlier version of Google Search Console (formerly called Google Webmaster Tools) reported on-site errors such as HTML improvements, but that feature is no longer available. Technical issues such as index coverage errors, mobile usability problems, and page speed can still be identified after a complete crawl.

Top Premium SEO Audit Tools

I have listed the 3 best SEO tools and will guide you through a solid process for auditing your website. So let's get started.

Why use premium SEO tools?

Premium SEO tools let you perform a comprehensive site audit in one place. Most of them use their own bots to crawl a website, which makes evaluating overall site performance faster and more efficient. Premium audit tools can also detect many more issues and guide you to fix them quickly.

Pro Tip: For newbie bloggers or small websites, evaluating a site using Google Search Console and Google Analytics is highly recommended. Not everyone can afford premium SEO tools, and auditing with free tools is often quite enough.

How to start an SEO Audit?

Once you have chosen your SEO audit tools, here are the things you should be aware of:

Step 1: Make sure only one canonical version of your URL is browsable

In a moment, I’ll show you how to begin crawling your site for SEO issues using various tools.

But first, you need to check that only one canonical version of your site is browsable. Having multiple URLs for the same pages leads to duplicate content issues and negatively impacts your SEO performance.

Consider all the ways someone might type your website address into a browser. For example:

http://yourdomain.com
http://www.yourdomain.com
https://yourdomain.com
https://www.yourdomain.com

Only ONE of these should be accessible in a browser. For instance, https://yourdomain.com could be the preferred version.

If you use WordPress, you will find the relevant configuration under Settings > General: set both the WordPress Address (URL) and the Site Address (URL) to your preferred version.

Pro Tip: Make sure to 301-redirect all other variants to the canonical version. You can verify this by typing the HTTP version into the address bar and checking that it automatically redirects to the HTTPS version.
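As a sketch, here is how those 301 redirects might look in an Apache `.htaccess` file, assuming your server runs Apache with mod_rewrite enabled, that `yourdomain.com` is a placeholder for your domain, and that HTTPS without `www` is your preferred version:

```apache
# Send any request that is plain HTTP or carries a www prefix
# to the single canonical https://yourdomain.com version (301 = permanent)
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^ https://yourdomain.com%{REQUEST_URI} [L,R=301]
```

On Nginx or other servers the equivalent is a server-level redirect; the idea is the same: every non-canonical variant answers with a 301 to the canonical URL.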

Step 2. Start a website crawl

I am going to demonstrate this using the 3 most popular SEO tools, walking through each step. Let's get started.

Using SEMrush

SEMrush is one of the world’s most popular SEO tools offering 40+ modules for enhancing online visibility and marketing analytics.

# Go to the Site Audit tool

# Go to the project section and fill in your domain and project name. Make sure to exclude the protocol (http:// or https://).

# After successfully creating the project, go to the project dashboard and click Setup.

# After the audit completes, you will see a report as shown below.

Image: Crawled Report Demo

Using Ahrefs

One of the top tools for auditing your website, exploring competitors, researching keywords and analyzing backlinks.

# Go to the Site Audit tool

(Site Audit > New Project > Enter domain in the ‘Scope & seeds’ section > Untick 301 redirects)

Image: Ahrefs ‘Scope & seeds’ settings
Gif source: Ahrefs

# Hit “Next.”

For the most part, you can leave the rest of the settings as they are.

But I do recommend toggling the Check HTTP status of external links and Execute JavaScript options (under Crawl settings) to “on.”

Image: Crawl settings changes

This will ensure that Site Audit also checks all external links on your website and any JS links.

When you’re done, hit “Create project.” Ahrefs’ Site Audit will start working away in the background while we continue with the audit.

Using Screaming Frog

The Screaming Frog SEO Spider is a website crawler that lets you crawl a website's URLs to analyse and audit technical and on-site SEO.

Pro Tip: Though Screaming Frog is a premium tool, its free version allows you to crawl up to 500 URLs. Whether you are a newbie blogger or a small business owner, it makes sense to leverage this powerful tool.

To use it, first download the Screaming Frog desktop application.

# After installing it on your desktop, you will see an interface as shown below.

# Fill in the URL (the preferred version of your URL) and hit Start.

# Once the crawl of your entire site is complete, you will see an interface as shown below.

Alright, you've learned how to run a full site audit using various tools.

Make sure to settle on one premium SEO tool for your site audits. I have been using SEMrush for years and am quite comfortable with it. Use whatever makes you comfortable.

Now take a quick look at the site audit report. This is the phase where optimization and implementation happen: after gathering the audit reports, prioritize the quick fixes and work through them accordingly.

You might be wondering about the best way to re-optimize your site for SEO. First, prioritize your optimization work by analyzing the site audit reports. Most premium SEO tools categorize audit findings as errors, warnings, and notices (or something similar).

How to fix website SEO Issues?

Interpreting Issues 

It’s time to interpret the site audit results. There are generally three levels of issues:

Errors (Max Priority)

The most important problems on the website; these need attention and a quick fix.

Warnings 

Issues that are slightly less important than errors, but still will make an impact when fixed properly

Notices

Issues that are not extremely important but could lead to issues down the line and are good to be aware of.

Pro Tip: The best way to start making changes that will improve your site's health and visibility is to work through all of the Errors first.

What are the key elements of a site audit?

In this article, I have divided the SEO audit into four parts:

  • On-Page SEO Audit 
  • Content Audit
  • Technical SEO Audit
  • Backlink/Off-Page Audit

Performing On-Page SEO Audit 

We have already run a full site audit; now it's time to examine the existing issues and fix them one by one. Let's go through each issue and how to resolve it quickly.

Check Missing Title Tags

A <title> tag is a key on-page SEO element. It appears in browsers and search results and helps both search engines and users understand what your page is about.

Warning: If a page is missing a title, or its <title> tag is empty, Google may consider the page low quality. If you are promoting this page in search results, you will miss the chance to rank highly and earn a higher click-through rate.

How to fix it?

Ensure that every page on your website has a unique and concise title containing your most important keywords. For information on how to create effective titles, please see this Google article.

Check Duplicate Title Tags

Duplicate or exact match <title> tags make it difficult for search engines to determine which of a website’s pages is relevant for a specific search query, and which one should be prioritized in search results. Pages with duplicate titles have a lower chance of ranking well. Moreover, identical <title> tags confuse users as to which webpage they should follow.

How to fix it?

Provide a unique and concise title for each of your pages, containing your most important keywords.

Check Duplicate Content

Image: Duplicate content example
Source: Moz

Webpages are considered duplicates if their content is 85% identical. Having duplicate content may significantly affect your SEO performance.
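As a rough illustration of that 85% rule of thumb, you can compare two pages' text with Python's standard difflib. This is only a sketch: real audit tools use their own, more sophisticated similarity measures, and the threshold here simply mirrors the figure above.

```python
from difflib import SequenceMatcher

def near_duplicates(text_a, text_b, threshold=0.85):
    """Return True when two pages' text is at least `threshold` similar."""
    return SequenceMatcher(None, text_a, text_b).ratio() >= threshold
```

Run it on the extracted body text of two pages, not on the raw HTML, since shared templates and navigation would inflate the score.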

First of all, Google will typically show only one duplicate page, filtering other instances out of its index and search results, and this page may not be the one you want to rank.

Warning: In some cases, search engines may consider duplicate pages as an attempt to manipulate search engine rankings and, as a result, your website may be downgraded or even banned from search results. Moreover, duplicate pages may dilute your link profile.

How to fix it?

Here are a few ways to fix duplicate content issues:

  • Add a rel=”canonical” link to one of your duplicate pages to tell search engines which page to show in search results.
  • Consolidate content with the original: use a 301 redirect from the duplicate page to the original one.
  • Use rel=”next” and rel=”prev” link attributes to handle pagination duplicates.
  • Instruct Googlebot to handle URL parameters differently using Google Search Console.
  • Provide some unique content on the webpage.
  • Try a duplicate content checker such as Copyscape to ensure the quality of the content.
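For instance, the canonical hint from the first bullet is just a link element in the page's <head> (the URL below is a placeholder):

```html
<head>
  <!-- Tell search engines which version of this page to index -->
  <link rel="canonical" href="https://yourdomain.com/original-page/" />
</head>
```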

For more information, please read these articles: “Duplicate content” and “Consolidate duplicate URLs”.

Check Broken Internal Links

Broken internal links point users and crawlers to non-existent pages on your own site. Multiple broken links negatively affect user experience and may worsen your search engine rankings, because crawlers may conclude that your website is poorly maintained or coded.

How to fix it?

Follow every link reported as broken. If the target page returns an error, remove the link leading to it or point it to another resource.
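If you want to spot-check a page by hand, a minimal sketch using only Python's standard library can collect every href so each one can then be requested and its status code inspected (the status-checking step is left as a comment, since it needs network access):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html_text):
    parser = LinkExtractor()
    parser.feed(html_text)
    return parser.links

# Next step (needs network access): request each URL, e.g. with
# urllib.request, and flag any link whose response status is 4xx.
```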

Check Broken Internal Images

An internal broken image is an image that can’t be displayed because it no longer exists, its URL is misspelt, or because the file path is not valid. Broken images may negatively affect your search rankings because they provide a poor user experience and signal to search engines that your page is low quality.

How to fix it?

To fix a broken internal image, perform one of the following:

  • If an image is no longer located in the same location, change its URL
  • If an image was deleted or damaged, replace it with a new one
  • If an image is no longer needed, simply remove it from your page’s code

Check Duplicate Meta Description

Image: Duplicate meta description example
Source: Moz

A meta description (the content of the <meta name=”description”> tag) is a short summary of a webpage’s content that helps search engines understand what the page is about and can be shown to users in search results.

Duplicate or near-identical meta descriptions on different pages mean a lost opportunity to use more relevant keywords.

Also, duplicate meta descriptions make it difficult for search engines and users to differentiate between web pages. It is better to have no meta description at all than a duplicate one.

How to fix it?

Provide a unique, relevant meta description for each of your webpages. For information on how to create effective meta descriptions, please see this Google article.

You can also view tips on writing a unique and compelling meta description.

Check Short Title Element

Concise titles are generally a good practice. However, keep in mind that titles of 10 characters or fewer do not provide enough information about what your webpage covers and limit your page's potential to show up in search results for different keywords.

How to fix it?

Add more descriptive text inside your page’s <title> tag.

Check Long title element

Most search engines truncate titles containing more than 70 characters. Incomplete titles look unappealing to users and won’t entice them to click on your page.

How to fix it?

Try to rewrite your page titles to be 70 characters or less.
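The short- and long-title checks above can be sketched as a simple classifier. The 10- and 70-character cut-offs are the rough guidelines from this section, not hard limits imposed by any search engine:

```python
def classify_title(title):
    """Bucket a <title> by the rough length guidelines discussed above."""
    length = len(title.strip())
    if length == 0:
        return "missing"
    if length <= 10:
        return "too short"   # not enough information for users or crawlers
    if length > 70:
        return "too long"    # likely truncated in search results
    return "ok"
```

Run it over every title from your crawl export to get a quick priority list.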

Check Missing H1

While less important than the <title> tag, <h1> headings still help define your page's topic for search engines and users. If the <h1> tag is empty or missing, search engines may rank your page lower than they otherwise would. A missing <h1> also breaks your page's heading hierarchy, which is not SEO friendly.

How to fix it?

Provide a concise, relevant <h1> heading for each of your pages.

Check Missing Meta Description

Though meta descriptions don’t have a direct influence on rankings, they are used by search engines to display your page’s description in search results. 

A good description helps users know what your page is about and encourages them to click on it. If your page’s meta description tag is missing, search engines will usually display its first sentence, which may be irrelevant and unappealing to users.

How to fix it?

In order to gain a higher click-through rate, you should ensure that all of your web pages have meta descriptions that contain relevant keywords.

Check Too many on-page links

Too many on-page links looks unusual and spammy. Cramming links onto a webpage can make it look low quality to search engines, which may cause the page to drop in rankings or not show up in search results at all. Having too many on-page links is also bad for user experience.

How to fix it?

Review all pages that contain an unusual number of links and delete the unnecessary ones.

Check Missing Image ALT attributes

Alt attributes within <img> tags are used by search engines to understand the contents of your images. If you neglect alt attributes, you may miss the chance to get a better placement in search results because alt attributes allow you to rank in image search results.

Warning: Not using alt attributes also negatively affects the experience of visually impaired users and those who have disabled images in their browsers.

For more information, please see these articles: Using ALT attributes smartly and Google Image Publishing Guidelines.

How to fix it?

Use descriptive image alt tags. 

Pro Tip: Use SEO-friendly image filenames for better optimization.

Check Too many URL parameters

Using too many URL parameters is not an SEO-friendly approach. Multiple parameters make URLs less enticing for users to click and may cause search engines to fail to index some of your most important pages.

How to fix it?

  • Try to use no more than four parameters in your URLs
  • Avoid using underscores (_) in URLs
  • Use hyphens (-) instead; this is best practice
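Counting parameters is easy to script; a small helper built on Python's standard urllib shows the idea (the four-parameter ceiling is the rule of thumb from the list above):

```python
from urllib.parse import urlparse, parse_qsl

def too_many_parameters(url, limit=4):
    """Flag URLs whose query string carries more than `limit` parameters."""
    return len(parse_qsl(urlparse(url).query)) > limit
```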

Low word count

Source: Backlinko Case Study

Low word count often correlates with lower rankings. This issue is triggered when a page contains fewer than 200 words. The amount of text on a page is a quality signal to search engines.

Search engines prefer to provide as much information to users as possible, so pages with longer content tend to be placed higher in search results, as opposed to those with lower word counts.

For more information, please view this video.

How to fix it?

Improve your on-page content and be sure to include more than 200 meaningful words.

Check Nofollow attribute in outgoing internal links

The rel=”nofollow” attribute is an element in an <a> tag that tells crawlers not to follow the link (e.g., “<a href=”http://example.com/link” rel=”nofollow”>Nofollow link example</a>”). Nofollow links don’t pass link equity to referred webpages. 

That’s why it is not recommended that you use nofollow attributes in internal links. You should let link juice flow freely throughout your website. Moreover, unintentional use of nofollow attributes may result in your webpage being ignored by search engine crawlers even if it contains valuable content.

How to fix it?

Make sure not to use nofollow attributes by mistake. Remove them from <a> tags, if necessary.

Check Sitemap.xml not specified in robots.txt 

If you have both a sitemap.xml and a robots.txt file on your website, it is a good practice to place a link to your sitemap.xml in your robots.txt, which will allow search engines to better understand what content they should crawl.

How to fix it?

Specify the location of your sitemap.xml in your robots.txt. To check whether Googlebot can read your sitemap.xml file, use the Sitemaps report in Google Search Console.
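A minimal robots.txt with the sitemap reference might look like this (the domain is a placeholder; adapt the crawl rules to your own site):

```
# Allow all crawlers, and point them at the sitemap
User-agent: *
Disallow:

Sitemap: https://yourdomain.com/sitemap.xml
```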

Check Multiple H1 Tags

Although multiple <h1> tags are allowed in HTML5, it’s not recommended to use more than one <h1> tag per page. Including multiple <h1> tags may confuse users.

How to fix it?

Use <h2>–<h6> tags for subheadings and keep a single <h1> per page.

Check orphaned pages

Image: Orphaned pages example
Source: Search3W

A webpage that is not linked to internally is called an orphaned page. It is very important to check your website for such pages. If a page has valuable content but is not linked to by another page on your website, it can miss out on the opportunity to receive enough link juice. 

Warning: Orphaned pages that no longer serve their purpose confuse your users and, as a result, negatively affect their experience.

How to fix it?

Review all orphaned pages on your website and do either of the following:

  • If a page is no longer needed, remove it
  • If a page has valuable content and brings traffic to your website, link to it from another page on your website
  • If a page serves a specific need and requires no internal linking, leave it as is

Check page crawl depth

A page’s crawl depth is the number of clicks required for users and search engine crawlers to reach it via its corresponding homepage. From an SEO perspective, an excessive crawl depth may pose a great threat to your optimization efforts, as both crawlers and users are less likely to reach deep pages.

Pro Tip: For this reason, pages that contain important content should be no more than 3 clicks away from your homepage. You can use tools like Screaming Frog to determine crawl depth easily.

How to fix it?

Make sure that pages with important content can be reached within a few clicks. If any of them are buried too deep in your site, consider changing your internal link site architecture.
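Crawl depth is essentially a shortest-path computation over your internal links. A breadth-first sketch in Python (the link graph here is a hypothetical example; in practice it would come from a crawler export):

```python
from collections import deque

def crawl_depth(links, homepage):
    """Return clicks-from-homepage for every page reachable via internal links.

    `links` maps each page to the list of pages it links to.
    """
    depth = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:          # first visit = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth
```

Pages whose depth exceeds 3 (or that never appear in the result at all, i.e. orphans) are the ones to surface with better internal linking.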

Performing a Technical SEO Audit

We have already run a full site audit; now it's time to examine the existing technical issues and how to fix them. Let's go through each issue and how to resolve it quickly:

Check 5XX Errors

5xx errors occur when a server is unable to fulfill a request from a user or a crawler. They prevent users and search engine robots from accessing your webpages, and can negatively affect user experience and crawlability. This in turn leads to a drop in traffic driven to your website.

How to fix it?

Investigate the causes of these errors and fix them.

Check 4XX Errors

A 4xx error means that a webpage cannot be accessed. This is usually the result of broken links. These errors prevent users and search engine robots from accessing your webpages, and can negatively affect both user experience and search engine crawlability.

This will in turn lead to a drop in traffic driven to your website.

How to fix it?

If a web page returns an error, remove all links leading to that error page or replace them with links to another resource.

To identify all pages on your website that contain links to a 4xx page, click “View broken links” next to the error page if you are using SEMrush. 

You can also investigate these in Google Search Console, under the Index Coverage report.

Check for Page Not Crawled

This issue indicates that the crawler couldn't access the webpage. There are two possible reasons:

  • Your site’s server response time is more than 5 seconds
  • Your server refused access to your webpages

How to fix it?

Please contact your web hosting technical support team and ask them to fix the issue.

Check for Invalid robots.txt file

If your robots.txt file is poorly configured, it can cause you a lot of problems.

Web pages that you want promoted in search results may not be indexed by search engines, while some of your private content may be exposed to users.

So, one configuration mistake can damage your search rankings, ruining all your search engine optimization efforts.

How to fix it?

Review your robots.txt file and fix any errors. You can check your file using Google's robots.txt Tester.

For information on how to configure your robots.txt, please see this article.

Check for Invalid sitemap.xml format

If your sitemap.xml file has any errors, search engines will not be able to process the data it contains, and they will ignore it.

How to fix it?

Review your sitemap.xml file and fix all errors. You can check your file using the Sitemaps report in Google Search Console.

For information on how to configure your sitemap.xml, please see this article.
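One quick sanity check is simply whether the file parses as XML at all. A sketch using Python's standard library (the namespace is the standard one from the sitemap protocol):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Parse a sitemap.xml; raises ET.ParseError if the XML is malformed."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", SITEMAP_NS)]
```

If this raises a parse error, or returns no URLs for a sitemap you know is populated, the file format needs fixing.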

Check www resolve issues

Normally, a webpage can be accessed with or without adding www to its domain name. If you haven’t specified which version should be prioritized, search engines will crawl both versions, and the link juice will be split between them. Therefore, none of your page versions will get high positions in search results.

How to fix it?

Set up a 301 redirect from the non-preferred version to the preferred one. (Note that Google Search Console's old "preferred domain" setting has been removed, so redirects and canonical tags are now the way to signal your choice.) For details, please see this article.

Large HTML page size

A webpage’s HTML size is the size of all HTML code contained on it. A page size that is too large (i.e., exceeding 2 MB) leads to a slower page load time, resulting in a poor user experience and a lower search engine ranking.

How to fix it?

Review your page’s HTML code and consider optimizing its structure and/or removing inline scripts and styles.

Check Issues with mixed content

If your website contains any elements that are not secured with HTTPS, this may lead to security issues. Moreover, browsers will warn users about loading insecure content, and this may negatively affect user experience and reduce their confidence in your website.

How to fix it?

Only embed HTTPS content on HTTPS pages. Replace all HTTP links with the new HTTPS versions. If there are any external links leading to a page that has no HTTPS version, remove those links.

Check For Neither canonical URL nor 301 redirect from HTTP homepage

If you’re running both HTTP and HTTPS versions of your homepage, it is very important to make sure that their coexistence doesn’t obstruct your SEO. Search engines are not able to figure out which page to index and which one to prioritize in search results. 

As a result, you may experience a lot of problems, including pages competing with each other, traffic loss and poor placement in search results. To avoid these issues, you must instruct search engines to only index the HTTPS version.

How to fix it?

Do either of the following:

  • Redirect your HTTP pages to their HTTPS versions via a 301 redirect
  • Mark your HTTPS version as the preferred one by adding a rel=”canonical” to your HTTP pages

Check for Redirect Chains and Loops

Redirecting one URL to another is appropriate in many situations. However, if redirects are done incorrectly, it can lead to disastrous results. Two common examples of improper redirect usage are redirect chains and loops.

Long redirect chains and infinite loops lead to a number of problems that can damage your SEO efforts. They make it difficult for search engines to crawl your site, which affects your crawl budget usage and how well your web pages are indexed, slows down your site’s load speed, and, as a result, may have a negative impact on your rankings and user experience.

How to fix it?

The best way to avoid any issues is to follow one general rule: do not use more than three redirects in a chain.

If you are already experiencing issues with long redirect chains or loops, we recommend that you redirect each URL in the chain to your final destination page.

We do not recommend that you simply remove redirects for intermediate pages as there can be other links pointing to your removed URLs, and, as a result, you may end up with 404 errors.
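Given a map of known redirects (for example, from a crawler export), a short Python sketch can flag both loops and overly long chains; the three-hop limit matches the rule above:

```python
def resolve_chain(redirects, url, limit=3):
    """Follow `url` through a redirect map; report loops and long chains.

    Returns (pages_visited, verdict), where verdict is "ok",
    "loop", or "too long".
    """
    visited = [url]
    while url in redirects:
        url = redirects[url]
        if url in visited:
            return visited, "loop"
        visited.append(url)
        if len(visited) - 1 > limit:     # more than `limit` hops
            return visited, "too long"
    return visited, "ok"
```

Any URL flagged "loop" or "too long" should be re-pointed directly at its final destination page.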

Check Broken Canonical URLs

By setting a rel=”canonical” element on your page, you can inform search engines of which version of a page you want to show up in search results. When using canonical tags, it is important to make sure that the URL you include in your rel=”canonical” element leads to a page that actually exists.

Canonical links that lead to non-existent webpages complicate the process of crawling and indexing your content and, as a result, decrease crawling efficiency and lead to unnecessary crawl budget waste.

How to fix it?

Review all broken canonical links. If a canonical URL applies to a non-existent webpage, remove it or replace it with another resource.

Check Multiple Canonical URLs

Multiple rel=”canonical” tags with different URLs specified for the same page confuse search engines and make it almost impossible for them to identify which URL is the actual canonical page. As a result, search engines will likely ignore all the canonical elements or pick the wrong one. That’s why it is recommended that you specify no more than one rel=”canonical” for a page.

How to fix it?

Remove all canonical URLs except the one that you’d like to serve as the actual canonical page.

Check For Too large sitemap.xml 

This issue is triggered if the size of your sitemap.xml file (uncompressed) exceeds 50 MB or it contains more than 50,000 URLs. Sitemap files that are too large will put your site at risk of being ineffectively crawled or even ignored by search engines.

How to fix it?

Break up your sitemap into smaller files. You will also need to create a sitemap index file to list all your sitemaps and submit it to Google.

Don’t forget to specify the location of your new sitemap.xml files in your robots.txt.

For more details, see this Google article
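A sitemap index file is itself a small XML document; a sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per split file, each under 50 MB / 50,000 URLs -->
  <sitemap>
    <loc>https://yourdomain.com/sitemap-posts.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://yourdomain.com/sitemap-pages.xml</loc>
  </sitemap>
</sitemapindex>
```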

Check Slow page (HTML) load speed

Page (HTML) load speed is one of the most important ranking factors. The quicker your page loads, the higher the rankings it can receive. Moreover, fast-loading pages positively affect user experience and may increase your conversion rates.

Please note that “page load speed” usually refers to the amount of time it takes for a webpage to be fully rendered by a browser. However, the crawler only measures the time it takes to load a webpage’s HTML code – load times for images, JavaScript and CSS are not factored in.

How to fix it?

The main factors that negatively affect your HTML page generation time are your server’s performance and the density of your webpage’s HTML code.

So, try to clean up your webpage’s HTML code. If the problem is with your web server, you should think about moving to a better hosting service with more resources.

Broken External Links

Broken external links lead users from one website to another and bring them to non-existent webpages. Multiple broken links negatively affect user experience and may worsen your search engine rankings because crawlers may think that your website is poorly maintained or coded.

How to fix it?

Please follow all links reported as broken. If a target web page returns an error, remove the link leading to the error page or replace it with another resource.

Performing a Content Audit 

In a simple sense, a content audit is the process of evaluating all of the content on your website. The main purposes of a content audit are to answer:

  • How well every piece of content is performing in a search engine?
  • What are the factors affecting its performance?
  • What areas can be improved in order to stand out among competing content?

How to perform a solid content audit?

First, you should be familiar with the different tools, whether you are using premium or free ones.

Tools required for content audit

There are a lot of content audit tools, but it often makes more sense to analyze manually. Premium tools like SEMrush, Ahrefs, and Screaming Frog are great, but in this section I suggest analyzing with Google's free tools:

  • Google Search Console
  • Google Analytics

First, list all your posts/pages and categorize them accordingly. You can export those pages from Google Search Console in spreadsheet format.

Great Content

These are top-performing articles that might need just a couple of edits. If they are pillar posts, they can anchor content clusters and support better internal linking. It makes more sense to optimize great articles first: optimizing underperforming articles can consume a huge amount of your resources.

How to optimize more?
  • If possible, make them more compelling by adding infographics, statistics, complete guides, and so on
  • If topically relevant, internally link from these posts to other top-performing posts
  • Target different keyword-rich anchor text (your intended keywords) to enhance rankings

Salvageable(Editable)

These are mediocre, or perhaps decent, articles that can be updated into something better (and kept as part of the existing content strategy).

How to optimize more?
  • If possible, update or rework them for further optimization
  • If topically relevant, internally link to them from top-performing posts
  • When internal linking, target different keyword-rich anchor text (your intended keywords) to enhance rankings

Unsalvageable(Non-editable)

These are underperforming or bad pieces of content with poor visibility in the SERPs, sometimes referred to as zombie pages or thin content.

How to optimize more?
  • For thin content, either rewrite the pages or delete them, and set up 301 redirects from the deleted pages to better pieces of content.
  • If the site has a number of thin posts of a similar kind, content consolidation works best. After consolidating, 301-redirect the old URLs to the better content. This helps avoid keyword cannibalization and optimizes crawl budget.
  • It is very important to check your website for underperforming pages. If a page has valuable content but is not linked to from another page on your website, it misses the opportunity to receive link juice and rank. Make sure to link to it internally from top-performing or topically relevant pages.

Performing Backlink Audit 

Backlinks (also known as “inbound links”, “incoming links” or “one way links”) are links from one website to a page on another website.

Backlinks are basically votes from other websites. Each of these votes tells search engines: “This content is valuable, credible and useful”. That’s why backlinks are regarded as one of Google’s top ranking factors.

Why do a backlink audit?

The goals of a backlink audit are:

  • To remove toxic backlinks from your site and improve its SEO health
  • To find new backlink opportunities
  • To fix broken backlinks

Suggested Tools :

In this section, I will walk through a backlink audit at a high level using Google Search Console. Let's get started:

# Open Google Search Console and select your site property

# Go to the Links tab

# Export the links as a CSV

# Analyze all backlinks using either Moz Link Explorer or SEMrush
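Once exported, the CSV can be loaded with a few lines of Python. Note that the "Site" column name is an assumption for illustration; check the header row of your actual export:

```python
import csv

def linking_sites(csv_lines):
    """Read linking sites from a GSC links export (header name assumed)."""
    return [row["Site"] for row in csv.DictReader(csv_lines)]
```

Feed it an open file object (or any iterable of lines) and you get the list of referring domains to analyze.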

Analyze your links:

  • Live links
  • New Links
  • Lost Links

Link Type / Attribute:

  • follow (dofollow) – valuable backlinks
  • nofollow – mainly social media links
  • sponsored – sponsored content
  • UGC – user-generated content

What to do if you are getting a link from a spammy website?

To get rid of toxic domains, simply upload a list of them using Google's Disavow tool. Disavowing links is the process of requesting that Google ignore your toxic backlinks.

How to use Google's Disavow tool?

Step 1: List your SPAMMY/TOXIC backlinks and prepare a disavow .txt file. Tools like SEMrush and Moz Pro can calculate a toxicity score to help you decide.

Step 2: Go to Google's Disavow tool. You will see the screen shown below.

Step 3: After selecting your site, you will be taken to an upload section where you upload a .txt file containing the toxic domains. A single site can have many backlinks, and sometimes a site has two URL versions (HTTP and HTTPS); a single domain entry in your disavow file covers both.
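A disavow file is plain text with one entry per line; a `domain:` line covers every protocol, subdomain, and page on that host (the domains below are placeholders):

```
# Disavow an entire spammy host, regardless of http/https or www
domain:spammy-site.example
# Or disavow a single page
http://another-bad-site.example/bad-page.html
```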

Final Thought

By following these steps, you should have learned a simple yet effective way to perform a complete SEO audit and fix the issues it uncovers.

If you are looking for a complete audit service, Orka Socials is here to help. If you have any questions about the audit process, please leave a comment below. 🙂
