
Your Google ranking dropped dramatically? Here's what to do.

An in-depth article on the 10 most common reasons for a significant drop in your Google rankings (and how to recover them!)

TL;DR:

Has your Google ranking dropped dramatically? Here's what to do:

  • Find out why it happened using tools like Google Search Console or Semrush.
  • Verify your website's content quality, backlinks, and technical health.
  • Research the impact of the drop.
  • Fix the common causes behind the drop.
  • Ensure your website is mobile-friendly, uses HTTPS, and is aligned with UX and search intent.
  • If nothing else works: hire a professional SEO team to restore your rankings.

Let’s be honest, experiencing a dramatic drop in your Google ranking sucks.

You’ve worked hard to get your website to rank. You probably wrote articles, optimized the technical elements where needed, and even did some keyword research. Yet, Google decided that you no longer need to rank for certain keywords. Ouch.

But it’s not the end of the world, it’s solvable! In this article, we’re sharing all the issues that could cause your problem and how to solve them.

First things first: Don’t panic!

Drops in Google's organic rankings happen to almost every website. It's the natural ebb and flow of changing positions. Sometimes you drop a few positions, only to see your rankings climb back up shortly after. So the first rule is: don't stress!

Did you lose rankings for multiple weeks in a row or did you completely disappear from the SERPs? Then here’s the second rule: Trust that you will figure it out and that you can solve it.

Besides that: don’t start fixing the issue without being certain what the problem is.

If you're not sure what you’re doing, we recommend you find an SEO expert to assist you.

But if you want to do it yourself, go through the following process to find out what is bothering your website's visibility in the Google search engine.

How to be sure you really dropped in the Google rankings?

Did your ranking really drop, or is it just a natural change in positions?

Let’s see what is going on!

You can verify this using your Google Search Console. If you see a steep drop in website traffic, the problem is real and needs an immediate fix to prevent any further traffic loss.

Here’s a screenshot from Google Search Console showing a drastic drop.

significant drop in clicks and views visible in Google Search Console

You can also check this out on SEMrush.

dramatic drop in google ranking visible in semrush

When you see either of these graphs, you can be sure you dropped significantly in the search results due to something other than natural position changes.

So your next step is to figure out which pages are affected and why.

Research the impact of the traffic drop with GA4

Numbers talk!

Dive into your data and plan accordingly to get those rankings back.

Follow the next steps in Google Analytics 4:

1. Go to Reports > Engagement > Pages and screens: Page path and screen class

screenshot of pages & screens GA4

2. Add filters to choose the timeframe, regions, or other patterns you found when scoping the impact of the rankings drop.

add filters to your GA4 pages and screens

3. Compare the period in which your rankings dropped to the period in which they were still good.

compare dates in GA4

4. Review the overview for suspicious changes

review the overview in GA4

Now that you have this overview, it should be easy to tell which pages have dropped in organic traffic.

If ALL your pages have dropped in the number of views, it’s very likely that you’ve been hit by a manual or algorithmic penalty (more on that later in this article). If the drop only concerns a few pages, then it’s more likely that your competitors are creating better content or that the algorithm has changed.

Still not sure? Book a meeting with one of our SEO consultants to point you in the right direction:

10 most common reasons why your Google rankings dropped:

Understanding why your website's rankings took a dip is essential. We've compiled a list of the 10 most common reasons we hear behind significant ranking drops, offering insights into each issue and providing solutions to get your website back on track.

1. Manual Penalty

When your website does not follow Google's Webmaster Guidelines, Google reviewers can hit you with a manual penalty. This means that someone at Google looked at your website and concluded that it conflicts with their guidelines. Though this is not the most common cause of a drop in Google rankings, it is one of the easiest diagnoses to start with.

You can check for a Google manual penalty in your Google Search Console account by following the steps below.

Step 1

Go to Google Search Console > Expand the “Security and Manual Actions” > Click “Manual Actions”.

where to find if you've been penalized manually

Step 2

If you see the message that says "No issues detected", you have not received a manual penalty.

no manual penalty applied

If you see something similar to the message below, saying "Issue detected", you have been penalized.

you have been penalized manually by google

A Google manual penalty is a serious problem, and you should resolve it completely, as it can remove your web pages or even your entire website from the Google index. The main reason for your penalty will be mentioned in the description, as in the image above.

There are a couple of different types of manual penalties:

An "Unnatural Links" penalty, also known as a Google Penguin penalty, is an action taken by Google against websites that it believes have engaged in manipulative link-building practices. These practices are for example having an excessive number of links that are not relevant to the content, or paid links that pass PageRank. Both examples violate Google's Webmaster Guidelines.

The main goal of this penalty is to discourage black-hat SEO tactics—such as buying links or obtaining them through networks designed primarily to boost Google rankings—and promote a more natural link ecosystem.

When your site is penalized for unnatural links, you may experience a significant drop in search engine rankings for your pages or even removal from search results entirely.

Recovering from an unnatural links penalty:

  1. Identify and remove as many of the spammy or low-quality links as possible.
  2. Disavow links that cannot be removed manually (see the example disavow file below).
  3. Submit a reconsideration request to Google, detailing the cleanup effort.
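If you need to disavow (step 2), Google's disavow tool accepts a plain-text file with one URL or domain per line. A minimal sketch, with placeholder domains rather than real spam sources:

```
# Disavow file sketch: lines starting with "#" are comments

# Disavow every link coming from an entire spammy domain
domain:spammy-link-network.example

# Disavow one specific linking URL
https://low-quality-directory.example/links/page-17.html
```

Upload the file via Google's disavow links tool for the affected property; it can take a while before Google reprocesses the links.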

Hidden Text or Keyword Stuffing

A "Hidden Text or Keyword Stuffing Penalty" is applied when a website is found to be using deceptive SEO practices intended to manipulate search rankings.

These practices include:

  1. Hidden Text: This involves placing text on a webpage that is invisible to visitors but readable by search engine crawlers. For example: using text in the same color as the background, positioning text off-screen, setting the font size to zero, or hiding it behind images. The hidden text is usually stuffed with keywords in an attempt to boost the page's relevance for those terms (see the sketch below).
  2. Keyword Stuffing: This is the practice of overloading a webpage's content or meta tags with excessive, irrelevant, or repetitive keywords. The content would be horrible for a real visitor to read, but robots and crawling spiders would eat it right up.
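To make the hidden-text example concrete, here is a sketch of what it can look like in a page's HTML. This illustrates what to find and remove, not something to copy:

```html
<!-- Hidden, keyword-stuffed text: a guideline violation, shown only so you can recognize it -->
<div style="color:#ffffff; background-color:#ffffff; font-size:0;">
  cheap running shoes buy cheap running shoes best cheap running shoes online
</div>
```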

Both hidden text and keyword stuffing are considered violations of Google's Webmaster Guidelines because they create a poor user experience and result in unfair search advantages. Imagine if everyone were keyword stuffing their online content: no one would want to use Google anymore. So Google is not only protecting its users, it is also protecting its product.

So when Google detects such practices, it may issue a penalty that can lead to a significant drop in the site's search rankings or complete removal from search results.

Recovering from a penalty like this requires removing the content, making the site compliant with Google's guidelines, and then submitting a reconsideration request to Google.

User Generated Spam

The “User Generated Spam” penalty is issued against websites that allow spammy content to be posted by users: for example, spam comments in forums, blog comment sections, user profiles, or any other area of a site where external users can freely post content.

We have seen a lot of small business owners dealing with this specific penalty. If you’re hit with this penalty, and you are a small business website, it’s very likely that you were hit by spam bots that left a bunch of spammy comments on your blog articles. If this is the case, we recommend you turn off the comment section and search for a way to protect your blog before switching it back on!

This penalty is issued when you fail to monitor, moderate, or control the content being posted by your users, leading to lots of spammy or low-quality content that can affect the overall quality and credibility of your website. Think about comments with links to low-quality sites, posts made purely for promotional purposes without meaningful contribution, or the use of excessive keywords.

When Google detects significant levels of user-generated spam, it may penalize your site by reducing your visibility in search results, or in severe cases, removing the affected pages from its index altogether. To recover from this penalty, site owners need to:

  1. Thoroughly review and clean up the spammy content.
  2. Implement stricter moderation processes to prevent future spam.
  3. Possibly enhance the site's registration and commenting systems to deter automated spam posts, for example with a CAPTCHA or a simple honeypot field (see the sketch after this list).
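As a sketch of point 3: a honeypot is a hidden form field that real visitors never fill in, so any submission containing a value there can be rejected as bot spam. The field name below is hypothetical, and a CAPTCHA service achieves the same goal:

```html
<form method="post" action="/comments">
  <textarea name="comment" required></textarea>

  <!-- Honeypot: hidden from humans, typically auto-filled by spam bots.
       Server-side, discard any submission where this field is not empty. -->
  <input type="text" name="website_url_hp" style="display:none" tabindex="-1" autocomplete="off">

  <button type="submit">Post comment</button>
</form>
```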

After addressing these issues, you can submit a reconsideration request to Google, explaining the measures taken to resolve the problem and prevent its recurrence.

Note: you do not get penalized for a couple of spammy comments. This penalty only hits when you fail to moderate the user-generated content and the amount of spam gets out of hand. Keep your website clean and you should be fine.

Structured Data Markup Spam

The Structured Data Markup Spam penalty is imposed when you misuse structured data markup (also known as schema markup) to deceive search engines, particularly in how content is displayed to users in search results.

By using structured data markup, you help search engines understand what information is on your website. It can earn you rich snippet positions, like star ratings for reviews, prices for products, or event information.
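For reference, legitimate structured data looks something like this JSON-LD sketch for a product page with real customer ratings (the name and numbers are placeholders). Only publish markup like this when the reviews actually exist on the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>
```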

This penalty can occur in various scenarios, including:

  1. Falsifying Data: You’re using structured data to present false or misleading information in search results. For example, adding review markup to a page that contains no genuine user reviews or fabricating review scores.
  2. Irrelevant or Misleading Markup: You’re applying markup that is irrelevant to the content of the page, such as marking up non-event content as an event or non-product content as a product.
  3. Markup Overuse: You’re excessively using markup on a site in ways that are not supported by Google’s guidelines.

When Google detects that you are misusing structured data to manipulate search engine results, it may issue a manual action against your site, which can result in the loss of rich snippets, a lower ranking in search results, or removal from the search results altogether.

How to recover from this penalty:

  1. Correcting or removing any misleading, incorrect, or excessive structured data markup.
  2. Ensuring that all markup is in strict compliance with Google’s structured data guidelines.
  3. Submitting a reconsideration request to Google once the issues are resolved, detailing the changes made to comply with Google's guidelines.

So, no matter what manual penalty you got: read the Google Webmaster Guidelines.

2. Google Algorithm Update

Google updates its algorithm almost every day; you just don't know about it. And you don't have to, either: most of the time these are very minor changes.

But every now and then, Google rolls out a Core Algorithm Update.

This is a big update to their search engine to improve the relevance, quality, and accuracy of the search results.

Some well-known Google algorithm updates:

  1. Panda (2011) - Targeted sites with low-quality content, such as thin content, content farms, or sites with high ad-to-content ratios.
  2. Penguin (2012) - Aimed to decrease the ranking of sites that violated Google’s Webmaster Guidelines by using black-hat SEO techniques, such as keyword stuffing and link schemes.
  3. Hummingbird (2013) - Focused on improving the understanding of the intent behind users' queries, making the search engine more capable of handling conversational search queries.
  4. Mobilegeddon (2015) - Prioritized mobile-friendly websites in Google’s mobile search results.
  5. RankBrain (2015) - Introduced machine learning into the algorithm to better understand the context of queries and deliver more relevant search results.
  6. BERT (2019) - Enhanced Google's understanding of natural language to better comprehend the nuances and context of words in search queries.
  7. December 2020 Core Update - This update continued the trend of refining the algorithms to prioritize high-quality content and better user experience. Websites with thin content or those that were overly optimized without providing real value saw fluctuations in their rankings.
  8. June and July 2021 Core Updates - Google rolled out two major updates consecutively in the summer of 2021. These updates were initially planned as a single update but were split due to the size and scope of the changes. They impacted a wide range of factors including page experience, site speed, and content quality.
  9. November 2021 Core Update - This update further emphasized the importance of well-researched, in-depth content, and overall site usability. Many sites experienced significant ranking changes, which prompted webmasters to enhance content quality and user engagement metrics.
  10. May 2022 Core Update - This update had a broad impact across many types of online content, with Google again emphasizing the importance of delivering a strong user experience and high-quality content. It also continued to refine how it evaluates and understands page content.
  11. September 2022 Core Update - This was another significant update that affected many websites across different sectors. Google provided guidance on focusing on excellent content creation practices and enhancing overall site performance.
  12. March 2024 Core Update - This update aimed at improving the quality of search results by reducing low-quality, unoriginal content by 40%. It also introduced new spam policies targeting various forms of manipulative practices, such as scaled content abuse, site reputation abuse, and expired domain abuse. This update had a considerable impact, leading to the deindexing of hundreds of websites that relied heavily on AI-generated content or other low-quality content strategies.

So, for example, if your website dropped significantly in the rankings in March 2024, or maybe completely disappeared from the SERPs, it's very likely that you were publishing AI-generated content. Not a little, but heavily.

How to stay up to date on Google's algorithm changes

Google keeps updating its algorithm without sharing much information about those updates. They often announce that the algorithm has changed, but without disclosing the specific changes.

You could follow them on X:

Google Search Liaison on Twitter X

You can keep an eye on all Google updates via the following resources:

Moz's Google Algorithm Update History page provides a comprehensive overview of all Google Algorithm Updates for your reference.

MOZ google update history

Google doesn't pre-announce all updates, but official channels like the Google Webmaster Central Blog or Google SearchLiaison Twitter account keep you informed about significant changes affecting website rankings and performance.

web master central blog screenshot

3. Competitors have created better content and are outranking you.

Not every dramatic drop in Google rankings has to do with something you did. Sometimes, competitors just produce higher-quality content than yours.

Google values quality content, and its algorithm is built to keep websites with the highest-quality content at the top of the search results. In the end, it's all about the readers, of course.

When your content aligns with what users are searching for and keeps them engaged, search engines like Google take notice. This engagement tells Google that your content is valuable and relevant, which can lead to higher rankings in the search results.

So, your high-quality content not only attracts the right target audience but also earns high search engine rankings, resulting in better traffic.

How to find out if your competitors are beating you in the SERPs:

Go to Semrush and enter your website's domain in Organic Research.

semrush to check organic traffic drop

In the overview of Position Changes, you can see which keywords you’re ranking for, the average position in the search engine result pages (SERPs), and if that position has changed in the last period.

check your domain in semrush

Source: Semrush

Scroll down to the keyword overview and filter the following:

  • Declined
  • Top 10 positions
  • Order by either Traffic or Difference

order your results on traffic or difference

For the keywords where you previously held a top 10 position but are now ranking between 11 and 30, your competition is beating you to it.

If they created higher quality content or offered more recent and updated content, they might outrank you and take your position in the SERPs.

To get those dropped keyword rankings back, you need to optimize the quality of your content. Look at what your competitor is doing and optimize your own pages accordingly.

You can make your content better by studying what the top-ranking pages do well and updating your own pages accordingly.

While optimizing your content, your goal should always be to update content for your target audience and their search intent, NOT for robots and algorithms.

4. Poor content (outdated, duplicate, AI-generated, or plagiarized?)

This builds on our previous point. A website with outdated, duplicate, or AI-generated content often faces a drop in its Google ranking, especially after Google's latest core update (March 2024).

You can check for these issues with tools like Ahrefs and Semrush's Content Audit tool.

The first thing to check is whether your content is crawlable, as Google cannot index pages it can't crawl. You can verify this in your Google Search Console by following these steps:

  1. Click on “URL Inspection”
  2. Enter your URL
  3. Click on Page Indexing
google search console url inspection

The next step is to check the basic SEO practices for high-quality content. Though it does not guarantee top positions, not meeting them can result in lower Google rankings.

Ask these questions for this process:

1. Is my Title Tag optimized?

Your title tags are an opportunity to tell Google bots what your website or webpage is about. It should provide a description or a summary of the content.

You can optimize it using plugins like All in One SEO or Rankmath for your WordPress website or customize it within the heading section of your website’s HTML.

2. Does my homepage have optimized H Tags?

Heading tags define the structure and purpose of your content. Your H1 tag tells Google bots what the page is mainly about and separates the main heading from the rest of the content. Be aware that you should only use the H1 tag once per page and that you use H2s, H3s, and H4s in the correct hierarchy.

3. Meta Description

While meta descriptions do not directly affect your ranking, they do influence the click-through rate (CTR), which directly impacts your Google rankings.
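To tie questions 1 to 3 together, here is a minimal sketch of an optimized head section and heading structure; the title, description, and headings are placeholder examples, not prescriptions:

```html
<head>
  <!-- Title tag: a concise summary of the page, main keyword near the front -->
  <title>Google Ranking Dropped Dramatically? 10 Reasons and How to Recover</title>

  <!-- Meta description: no direct ranking effect, but it influences click-through rate -->
  <meta name="description" content="Find out why your Google rankings dropped and follow practical steps to recover your organic traffic.">
</head>
<body>
  <h1>Your Google Ranking Dropped Dramatically? Here's What to Do</h1> <!-- one H1 per page -->
  <h2>1. Manual Penalty</h2>
  <h3>Unnatural Links</h3>
</body>
```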

4. Body Content

It's essential to keep your content fresh and relevant.

Here's the correct approach to keep your content up-to-date.

Start with the four main types of posts that need to be updated:

  1. Posts ranking on page two or lower
  2. Posts causing keyword cannibalization
  3. Posts not aligned with your current content strategy
  4. Posts not matching user search intent

Follow these five steps to update your content:

  1. Analyze top-ranking articles for target keywords.
  2. Add unique and missing elements like quotes, resources, and FAQs.
  3. Remove outdated or irrelevant content.
  4. Consider content format and structural changes, visuals, and SEO keywords.
  5. For keyword cannibalization, consolidate the content and redirect the URLs.

5. Does my website have any duplicate content? Check with tools like Copyscape.

Duplicate content refers to content that appears either identical or very similar on the same website or across different websites. This includes content that has been rewritten, rephrased, or directly copied.

Having duplicate content can have a negative effect on your website's Google rankings.

When search engines crawl your site to find the best results for a query, having multiple pages with similar content can confuse them. As a result, they might choose not to rank any of those pages.

Since Google doesn't like ranking duplicate content, it can lead to a decrease in organic traffic over time.

You can check duplicate content on your website using tools like Copyscape.

copyscape screenshot
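One standard way to consolidate duplicates that survive a cleanup, not covered in detail above but widely recommended, is a canonical tag on the duplicate page pointing at the version you want Google to rank. A minimal sketch, assuming example.com stands in for your domain:

```html
<!-- Placed in the <head> of the duplicate page, pointing to the preferred URL -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```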

6. Does my website have AI-generated content?

Though various AI content detector tools are available on the market, like Copyleaks, Sapling, and more, they often generate different results. You can also use a tool like Undetectable.ai, which combines results from all these tools.

ai-generated text detector screenshot

Now, using AI-generated content is not a big no-no. You can use it; heck, we even recommend using AI to help you write. BUT… do not let AI write your whole page for you!

See AI as an assistant that checks your grammar and spelling, gives you ideas for topics, or rephrases a sentence you wrote.

I repeat: DO NOT LET AI WRITE A WHOLE PAGE FOR YOU.

It will kick your *ss later on.

5. Spammy or low-quality backlinks

Backlinks are one of the more important factors for your website's rankings.

When your backlink profile is full of links from high-authority websites, Google views your website as a more reliable source.

What does that mean?

It means that there are many other websites out there that link to you, aka backlinks. You can consider a backlink to be a recommendation from someone else. And Google’s algorithm likes those a lot.

Now you would think, the more backlinks the better, right? Not true. Gaining a link from other (preferably) high-quality websites is great, especially if they’re also in your niche, but receiving a bunch of backlinks from spammy websites can hurt your backlink profile.

If you do not have powerful backlinks, or only have a few links from non-authority websites, it's not a big deal. It will be harder to rank, but it's okay.

If you have a lot of backlinks from spammy websites, then your Google rankings might drop.

How to check if you have spammy backlinks?

Whenever your website's rankings drop, you should audit your backlink profile using a tool like Semrush. Such a tool will show you all the web pages linking to your website, the pages they link to, and the anchor text used for the backlinks.

You should check the following to detect spammy and unnatural backlinks:

  1. Are there any incoming links that appear unnatural or come from the same IP address?
  2. What is your Toxicity Score in Semrush's Backlink Audit?

If spam links are impacting your Google rankings, use Google Search Console’s disavow tool to fix the issue.

Disavowing links should be your last resort; it is an advanced SEO action. Be sure you know what you're doing, or ask an SEO expert to assist you.

6. Technical changes on your website (robots.txt, CMS, sitemap, hosting, SSL, or videos that slow your website down?)

If the Google bot can’t access your website, it will never rank you or will drastically drop your website’s traffic. If it has issues accessing your website, you most likely have technical SEO issues to solve. Check if Google can properly crawl and index your website.

You can use Semrush's Site Audit tool to check crawling and indexing.

check which pages are crawled in semrush

Additionally, you can assess your website's overall health using the same tool.

website health check in semrush

After this, remove all the issues impacting your website’s rankings.

Robots.txt

Any changes to your robots.txt file can impact your website's ranking, as an incorrect file can block the crawling and indexing of your web pages. Check your robots.txt file using Google's free robots.txt testing tool.

Alternatively, you can use a tool like Ahrefs' Site Audit to keep track of your robots.txt file.

robots.txt check in ahrefs
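The most common robots.txt problem after a relaunch is a blanket disallow left over from a staging environment. A sketch of the bad pattern next to a more typical production file (the paths and sitemap URL are placeholders):

```
# BAD: blocks the entire site for all crawlers (often left over from staging)
User-agent: *
Disallow: /

# More typical: allow crawling, block only non-public areas
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml
```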

CMS or Website Migration and Redirects

If you migrated your website or changed your Content Management System (CMS), it is possible that some pages were migrated incompletely or that URLs changed along the way.

When you change a page's URL, Google sees it as a new page and resets its ranking. If you're changing URLs, it's best to set up 301 redirects from the old pages to the new ones to maintain your rankings. This ensures your web presence remains consistent.

Or you can hire an expert to do this part for you.

For WordPress websites, you can use the Rankmath plugin.
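If you manage the server yourself, 301 redirects can also be set at the web server level. A minimal sketch for an Apache .htaccess file with hypothetical paths; the syntax differs on Nginx and other servers:

```
# Permanently redirect a single moved page
Redirect 301 /old-page/ https://www.example.com/new-page/

# Permanently redirect an entire renamed section
RedirectMatch 301 ^/blog/(.*)$ https://www.example.com/articles/$1
```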

Domain/hosting issues

Make sure you have an active domain name and hosting plan; otherwise, your web pages will disappear from the Google rankings.

If you have active plans and are still facing the issue, contact your hosting company to check for any downtime on their end or a distributed denial-of-service (DDoS) attack.

4XX Pages

4xx errors result in a traffic drop. These are error pages such as the "404 page not found" error. Make sure your website is free from 4xx errors at all times.

404 error that needs to be fixed to get your google rankings back

Sitemaps

When migrating your website, overlooking the generation and submission of sitemaps can have adverse effects. This oversight can significantly impact your site's visibility to search engines, potentially leading to a decline in rankings and organic traffic.

So, it's crucial to ensure that sitemaps are properly implemented during the migration process to maintain your website's performance.

You can generate sitemaps using various tools and then submit them in your Google Search Console account.
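For reference, an XML sitemap is simply a list of the canonical URLs you want indexed. A minimal sketch following the sitemaps.org protocol, with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-03-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/google-ranking-dropped/</loc>
    <lastmod>2024-03-15</lastmod>
  </url>
</urlset>
```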

7. Mobile-first indexing

Google Search started focusing more and more on mobile devices in 2015 with the mobile-friendly update. Then, in 2016, it announced mobile-first crawling and indexing. Nowadays, your website cannot survive in the SERPs if it is not mobile-friendly.

You can check if your website is mobile-friendly with Google’s Mobile-Friendly Test.

Another option is EXPERTE.com's browser-based Mobile-Friendly Checker, which automatically crawls any website for up to 60 seconds or 500 pages. This free tool then provides a snapshot of each page, letting you know how mobile-friendly it is, along with suggestions for how to improve its score.
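Whichever checker you use, the baseline requirement for a mobile-friendly page is a responsive viewport declaration in every page's head:

```html
<!-- Tells mobile browsers to render at device width instead of a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```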

8. User behavior

Google has specifically mentioned a few factors that can harm your website's ranking, such as slow page load speeds, poor mobile-friendliness, and intrusive interstitials.

If you have recently redesigned your website or added heavy elements like videos, it may take more time to load. You can check your website's page loading speed using Google's free PageSpeed Insights tool.

google page speed is ranking factor
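If PageSpeed Insights flags heavy images or render-blocking scripts, two common low-effort fixes are lazy-loading below-the-fold images and deferring non-critical JavaScript. A sketch with placeholder file names; re-test after each change:

```html
<!-- Lazy-load images below the fold; explicit width/height prevents layout shifts -->
<img src="case-study-chart.png" alt="Organic traffic chart" width="1200" height="630" loading="lazy">

<!-- Defer non-critical scripts so they do not block rendering -->
<script src="analytics.js" defer></script>
```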

9. PPC is stealing your traffic

Websites often start losing organic traffic when competitors bid on their branded keywords in paid search campaigns.

You can use Semrush's Keyword Gap tool to check whether your keywords are being targeted in paid campaigns.

To do this, click on “Keyword Gap” under the SEO Dashboard.

did your competitor steal your ranking in google?

Check for the paid keywords.

paid keywords overview in semrush

10. Security - HTTPS vs HTTP

Google values HTTPS websites.

HTTPS adds encryption through SSL/TLS, securing the connection between browsers and servers, whereas HTTP transmits data without encryption.

https vs http - get your google ranking back after a drop
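If parts of your site still load over plain HTTP, most hosts let you force HTTPS with a server-level 301 redirect once a valid SSL/TLS certificate is installed. A minimal sketch for an Apache .htaccess file, assuming mod_rewrite is enabled:

```
RewriteEngine On
# Send any request arriving over plain HTTP to its HTTPS equivalent
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```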

You can verify your website’s HTTPS status using Semrush’s Site Audit tool.

semrush site audit to find lost traffic due to dramatic drop in ranking google

Click on the ‘HTTPS view details’ button to review the report and identify any potential issues with your HTTPS.

https check in semrush

In case of any errors, you can click on these blocks to get more detailed information.

is this the reason you dropped dramatically in the google ranking?

Conclusion

Stay positive!

Every setback is a setup for a greater comeback.

It's a tough task to find out why your website's rankings dropped dramatically, and it can require in-depth technical analysis and expertise to fix the underlying issues.

Although a drop in Google rankings is a serious issue, it can be fixed. You can trust an SEO agency like Operation Nation to do the hard work for you and provide the best solution for these issues. Let us audit your website and find the vulnerabilities for you (and fix them if you let us!).

You can book a consultation call here!


Josien Nation | Co-Founder & Head of Marketing

Josien Nation is a co-founder and partner at Operation Nation, where she leads all things SEO. She has 6+ years of experience helping businesses grow their audience and get found on search engines.
