Recent drop in search rank - Possible negative SEO?

Domkey Kong

Hi. New to the forums, but I've been in SEO/marketing for around 5 years. I manage an SME website which until recently ranked very highly in search (positions 1-3). Our primary "target" keyword has a keyword difficulty of 34 (Moz). Over the past few months, however, we have noticed a drop in our rank, and after researching this further with Webmaster Tools, Moz, Ahrefs etc. I can identify a gradual decline in search rank since late August.

I've been racking my brain to understand why. I can't see any significant lost links, no major changes to page content, no manual actions, no dodgy link building on our part. The ONLY thing I can find which might have something to do with it is the emergence of a load of directory links to our site over the same period. All the directories have the same owner, so on face value it does seem like it might be connected. We definitely didn't create the directory links, so I have to wonder whether it is a negative SEO attack. The links were only identified in Ahrefs. I've already contacted the owner (no response) and have since uploaded a disavow file to Google.

SEO is just one part of my job. I'm not used to this kind of problem and certainly have no experience dealing with negative SEO, so I have a few questions:

1. Can you think of any reasons other than the dodgy links for the drop in rank? Happy to provide more details if required.
2. If not a competitor, who else might be behind the negative SEO? I might be wrong, but I think it would be too petty for a competitor to bother with.
3. Has anyone had any similar experiences?

Thanks in advance
 
Hi, I work on a high-DA website and we had a similar thing... it felt like a power loss. How often do you update the website? Have you implemented any new tech on the site? Have you checked 404s in Search Console?
 
It's updated very regularly. It has a Twitter ticker, and we publish fresh news/blog posts weekly. The only significant new tech we've implemented is CAPTCHA on some contact forms, as we were heavily spammed a few weeks ago (which may or may not be related to the negative SEO theory?!). As for 404s, we had a few, but nothing substantial. I redirected them all anyway, as I'm trying everything I can.

One other thing... our non-www domain was not being 301'd to our www domain (which I understand is bad housekeeping). We've fixed that now too.
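In case it's useful to anyone checking the same thing, a quick sanity check along these lines is how I confirmed the non-www hostname now 301s to the www version. This is a rough Python sketch using the requests library; example.com is just a placeholder for our domain.

```python
# Rough check that the non-www hostname returns a 301 pointing at the www
# version. Assumes the `requests` library; example.com is a placeholder.
import requests

urls_to_check = [
    "http://example.com/",
    "http://example.com/contact/",  # spot-check a deeper page as well
]

for url in urls_to_check:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    redirect_ok = resp.status_code == 301 and "://www.example.com" in location
    print(f"{url} -> {resp.status_code} {location} "
          f"{'OK' if redirect_ok else 'NEEDS A LOOK'}")
```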
 
I don't think CAPTCHA would have any effect. Sounds like you are on the right track.

You may want to go for a link detox... with some sites it works wonders, but I would consider that a last option.

I think there was an August update mentioned on one of the blogs, seojournal or similar... bear in mind this could possibly be something external. In the event of a similar change we always check our site to see if anything changed, but it could also be something you haven't done on the site which your competitors have carried out... I don't know, let's say AMP...
 
Thanks. I saw the August update but I would expect to see a more sudden dip than the long decline we're experiencing. Can I ask what you mean by "AMP"?

Other than that, thanks for the reassurance, hopefully we start seeing some positive effects of all the efforts soon.
 
I agree, but knowing that G has a learning algo, I would assume they stay away from big hits to prevent SEOs from understanding what works, and to keep things looking natural...

What I meant by AMP was, for example, if you are a news website and implemented AMP (Accelerated Mobile Pages), then it is possible that you see a dip before a rise, etc... basically, a new technology that would change how your site looks or behaves in the eyes of G.

I think updating the site every day with new content and measuring the results is our best bet to see what works with big sites; otherwise everything else is a myth and would get hit someday...
 
  1. I would start by using the Panguin tool to overlay known updates with GA traffic: https://barracuda.digital/panguin-tool/
  2. Then check Google Index - Index Status in GSC and make sure the "total indexed" figure equals the number of actual pages you have. If the number is greater, then you have navigation and duplicate content problems.
  3. Next go to Search Traffic - Links to Your Site in GSC and find out who links to you, how they link to you, and what they are linking to.
  4. Next go to Crawl - Crawl Errors in GSC and look for 500 and 404 errors.
So what each check might mean:

Check 1 would identify known changes at Google that might affect rankings.
Check 2 tells you whether you have indexation problems (a quick counting sketch follows right after this list).
Check 3 tells you whether someone is linking to you in a way you would not link to yourself.
Check 4: 500 errors can signify web application problems OR denial-of-service attacks (both can affect rankings). 404 errors can signify orphaned backlinks or toxic domains being redirected to your website.
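For check 2, a rough counting sketch like this makes the gap obvious. Python, standard library only; the sitemap URL and the indexed figure are placeholders you would fill in from your own site and GSC.

```python
# Compare the number of URLs declared in the XML sitemap against the
# "total indexed" figure copied manually from the GSC Index Status report.
# Both the sitemap URL and the indexed count are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
TOTAL_INDEXED_FROM_GSC = 1000  # read this off Google Index - Index Status

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", ns)]

print(f"URLs in sitemap:          {len(sitemap_urls)}")
print(f"Total indexed per GSC:    {TOTAL_INDEXED_FROM_GSC}")
print(f"Indexed-to-sitemap ratio: {TOTAL_INDEXED_FROM_GSC / len(sitemap_urls):.1f}x")
```

If that ratio is much above 1, Google is finding URLs you aren't deliberately publishing.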

The other possibility is that nothing has changed with your website, and with the new, softer Penguin your competitors have simply come back into play, which has the effect of pushing your rankings down. Definitely take factor measurements on the top pages for your keywords and make sure you achieve competitive parity with those factor measurements.
 
How long ago did you fix the 301?
Are we talking days or weeks?
 
Not everything is down to negative SEO... sometimes it is just competitors' websites surpassing you, or adjustments to the algorithm.

Cheers,
Commoner
 
Thanks validseo. Some observations:

1. Can pretty much rule this one out, as there's no correlation (according to Panguin, anyway).
2. "total indexed = 1,198" but I used Seo Spider to generate a new sitemap of 251 pages. So some disparity there. However I remember seeing this a long time ago and thinking nothing of it... So it's nothing new, it's always been way over our actual page count. Is there a simple answer as to how to fix this? I guess a start would be understanding what the 1,198 are - are you able to get that info?
3. Quite a few. Most are on the disavow list, but I'll review this one as it's different from the Ahrefs list.
4. I've fixed all of these. I do need to mark them as fixed though; there are about 90 to work through.

How long ago did you fix the 301?
Are we talking days or weeks?

Days. Earlier this week.

Not everything is down to negative SEO... sometimes it is just competitors' websites surpassing you, or adjustments to the algorithm.

Yes, that's true. But I think the number of competitors moving above us, and the quality of their sites/SEO, is suspicious.
 
2. "total indexed = 1,198" but I used Seo Spider to generate a new sitemap of 251 pages. So some disparity there. However I remember seeing this a long time ago and thinking nothing of it... So it's nothing new, it's always been way over our actual page count. Is there a simple answer as to how to fix this? I guess a start would be understanding what the 1,198 are - are you able to get that info?

This is a big issue. You are saying that Google is indexing each page about 4 or 5 times on average under different URLs... Odds are you are duplicating a few pages a whole bunch of times. Pagination, keyword search, click tracking, sorts and filters are common culprits.

Check Search Appearance - HTML Improvements in GSC and look at the duplicate titles and meta descriptions for clues about which pages are being duplicated and why.

Definitely make sure ALL pages have a canonical URL and that it is set properly. Pagination should use rel=next/prev.

The reason this is such a big deal is that when Google sees the same content over and over again with different URLs, it thinks either it is in an infinite loop and quits, OR it thinks it has found all it is going to find and quits... you might only be getting 100 of your 251 pages indexed even though Google says it indexed over 1,000. It is a really big deal.

You may want to check your sitemaps in GSC and see what percentage are indexed.

Keep in mind that if Google isn't completely crawling your whole site then you have disrupted the flow of PageRank to and from the parts of the website that aren't being reached.

Also: make sure your sitemap URLs, canonical URLs, and the URLs in your navigation are all in agreement and use the same strict formatting. Check your RSS feed URLs too. HTTP/S, www, trailing slash, and params must all agree as well.
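If it helps, this is the kind of agreement check I mean: a rough Python sketch (using the requests library; the sitemap URL is a placeholder) that fetches every URL in the sitemap and flags any page whose rel=canonical doesn't match the sitemap URL exactly, scheme, www, and trailing slash included.

```python
# Rough canonical-consistency check: every URL in the sitemap should serve a
# <link rel="canonical"> pointing back at exactly that URL. Assumes the
# `requests` library; the sitemap URL is a placeholder.
import re
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in sitemap.findall(".//sm:loc", NS)]

def find_canonical(html):
    # Scan <link> tags and pull the href from the one marked rel="canonical".
    for tag in re.findall(r"<link\b[^>]*>", html, re.I):
        if re.search(r'rel=["\']canonical["\']', tag, re.I):
            href = re.search(r'href=["\']([^"\']+)["\']', tag, re.I)
            return href.group(1) if href else None
    return None

for url in urls:
    canonical = find_canonical(requests.get(url, timeout=10).text)
    if canonical != url:
        print(f"MISMATCH  sitemap: {url}  canonical: {canonical}")
```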
 
^^^ What validseo said above

A lot of your decline could be down to new competitors in your space.

Make sure your technical SEO is in order. Disavow those shitty links. You're already doing fresh content, so that is good. Try getting some fresh backlinks as well. Try to improve your site speed. Get some rich snippets in the SERP to increase CTR.

How mobile friendly is your site? There was a Mobilegeddon 2 update; so, as mentioned above, use Panguin to identify any declines that happened near any algo updates.

Also, compare your landing pages in Analytics: YoY, QoQ, and MoM... try to identify exactly where you've fallen behind.
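On the rich snippets point, the usual route is structured data in the page. Here's a minimal sketch of Article markup emitted as JSON-LD, built in Python just to show the shape; every value is a made-up placeholder.

```python
# Minimal Article structured data for rich snippets, serialized as JSON-LD.
# Every value below is a made-up placeholder; validate the real markup with
# Google's structured data testing tool before relying on it.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example news post title",
    "datePublished": "2016-10-06",
    "author": {"@type": "Person", "name": "Staff Writer"},
    "publisher": {
        "@type": "Organization",
        "name": "Example Co",
        "logo": {"@type": "ImageObject",
                 "url": "https://www.example.com/logo.png"},
    },
}

print(f'<script type="application/ld+json">\n{json.dumps(article, indent=2)}\n</script>')
```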
 
All noted. I can see the duplication problem, and it involves news pages: when they are created they produce 2 or 3 URLs depending on which news categories they are listed under. I have a developer looking into this now, so I should be able to fix it in a few days. Once fixed, what is the best way of letting Google know? Just re-submit the sitemap, I guess?
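Once the developer pushes the fix, I'll probably verify it with something quick like this before re-submitting the sitemap. A rough Python sketch with the requests library; the URLs are made-up stand-ins for one post that currently lives under multiple categories.

```python
# Check that every category variant of a news post now declares the same
# preferred canonical URL. All URLs below are made-up stand-ins.
import re
import requests

variant_urls = [
    "https://www.example.com/news/company-news/some-post/",
    "https://www.example.com/news/industry-news/some-post/",
]
expected_canonical = "https://www.example.com/news/some-post/"

for url in variant_urls:
    html = requests.get(url, timeout=10).text
    # Crude regex: assumes rel appears before href in the tag (the common case).
    tag = re.search(
        r'<link\b[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html, re.I)
    canonical = tag.group(1) if tag else "(no canonical found)"
    verdict = "OK" if canonical == expected_canonical else "CHECK"
    print(f"{verdict}  {url} -> {canonical}")
```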
 
You have done the right thing. If all else fails, just noindex the duplicated pages or simply change the theme.
 
I've had a major drop on some keywords after the last Google update a few weeks back. I was ranking on page 2 for 3 keywords with 500k monthly searches, and now I'm nowhere, so... it's just the Google algo, I guess.
 
Is the site secure? HTTPS could help.
My tests showed that once you switch to HTTPS your rankings drop; yes, redirects pass juice, but from my experience I would not recommend it if you have already done solid link building.
 
Thanks for all the advice. The site is mobile friendly (95+ on Google's page test), so I don't think that's the problem. The landing pages don't really hold any insight either.
 