Google Algorithm Updates - 2024 Ongoing Discussion

I can share my progress with disavow so far.
I have 3 sites, all with only a few Fiverr links, and I got all the links sent to me in an Excel file, so it was easy to submit them to the disavow tool.

It's been around 30 days since I did the disavow and no site has recovered yet; they're all different ages, niches, article counts, etc.
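For anyone in the same spot, here's a minimal sketch of turning that kind of spreadsheet export into the plain-text format the disavow tool accepts (one full URL or one domain:example.com entry per line, lines starting with # are comments). The fiverr_links.csv filename and the url column are placeholders for whatever the Excel sheet looks like once saved as CSV.

```python
import csv
from urllib.parse import urlparse

# Placeholder filename/column: save the Excel export as CSV with a "url" column first.
domains = set()
with open("fiverr_links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row["url"].strip()
        if not url.startswith(("http://", "https://")):
            url = "http://" + url  # urlparse only finds the host when a scheme is present
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)

with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Spammy link cleanup generated from the spreadsheet export\n")
    for domain in sorted(domains):
        out.write(f"domain:{domain}\n")

print(f"{len(domains)} domains written to disavow.txt")
```

Disavowing at the domain: level catches every URL on the spammy domain, which is usually what you want for this kind of cleanup; individual URL lines only make sense if the rest of the domain is fine.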
 


You should be careful with your words there.
Why?

I can share my progress with disavow so far.
I have 3 sites, all with only a few Fiverr links, and I got all the links sent to me in an Excel file, so it was easy to submit them to the disavow tool.

It's been around 30 days since I did the disavow and no site has recovered yet; they're all different ages, niches, article counts, etc.
Same. Knowing how Google's ranking systems use hundreds of data points, I really doubt we could have tricked it by doing "one simple trick."

The only winner in this whole spiel was the guy who made LRT. $2000 for a link aggregator. Come on. He doesn't even run the spiders or database. lol.
 
Backlinks are the first tier of fixing things. Just because your process (or the process you bought, or disavowing everything a tool told you to) didn't work doesn't mean it won't work in other situations.

I spent hours working through lists and the single biggest thing I caught was hacked sites.

Sites where, if you use a Googlebot user agent, the page looks like spun shit content (an old-school manipulation tactic). When you use a normal browser user agent, the page either 404s or redirects to a phishing site. If you visit the homepage, it's a legitimate business with real traffic, ranking for real keywords. The site has been hacked and a new subfolder has been added that spins up a crazy amount of content linking to more garbage pages and other hacked sites within the network.
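If you want to spot-check suspect URLs for this kind of user-agent cloaking yourself, a rough sketch like the one below is enough; the Googlebot string is the genuine one Google publishes, but the URL list, browser string, and the "looks different" threshold are just placeholders to tune.

```python
import requests

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
              "(KHTML, like Gecko) Chrome/124.0 Safari/537.36")

def fetch(url, user_agent):
    # allow_redirects=False so a redirect to a phishing domain shows up in the Location header
    resp = requests.get(url, headers={"User-Agent": user_agent},
                        timeout=15, allow_redirects=False)
    return resp.status_code, resp.headers.get("Location", ""), len(resp.content)

suspects = ["https://example.com/hacked-subfolder/page.html"]  # placeholder list

for url in suspects:
    g_status, g_redirect, g_bytes = fetch(url, GOOGLEBOT_UA)
    b_status, b_redirect, b_bytes = fetch(url, BROWSER_UA)
    print(url)
    print(f"  Googlebot UA: {g_status} redirect={g_redirect!r} bytes={g_bytes}")
    print(f"  Browser UA:   {b_status} redirect={b_redirect!r} bytes={b_bytes}")
    if g_status != b_status or abs(g_bytes - b_bytes) > 5000:
        print("  -> responses differ by user agent; likely cloaked/hacked, worth a manual look")
```

Keep in mind Google verifies the real Googlebot by IP as well, so a hacked site can also cloak by IP range; a user-agent check like this only catches the lazier setups.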

So, no, I don't believe Google can catch everything, because they crawl with specific user agents, which has been pretty public knowledge for a while. If they can account for all the smokescreen bullshit that I've found, then good for them. I don't fully trust that they can tell the difference from an algorithmic standpoint, so I'm fine with helping them by submitting a disavow file.

If your site isn't big enough to get caught up in this, cool. That likely means a disavow file may not help. So start looking at the next tier and fix your content and internal linking. This wasn't a silver bullet to fix all problems; it was a solution to one sub-problem in one tier of issues to check.

If the disavow file upload didn't change things after a few weeks/a month then move on to the next thing and stop waiting and hoping for things to change.

I'll gladly share what I'm finding with anyone who's looking for ways to fix their site, but this thread is starting to become stupid with the bitching about "the disavow doesn't work." That's 1/100th of the equation. Figure out the next part.
 
Spend a few days combing through Ahrefs/SEMrush/LRT and Search Console.
Disavowed 6k domains.
This amounted to around 20,000 URLs that I submitted to indexing platforms and sent blog spam to.

9 days and still no movement. I still have hope though
 
TIL people still flock to the internet and "gurus" on Reddit/Twitter for their answers and advice when they get in an SEO bind, while claiming that they themselves are SEOs.

This is why the doers of the world end up with all the gold nuggets, and everyone else is left empty-handed with no course of action, waiting on the poisoned crumbs the social media gurus of the world hand out.
 
It's all about patterns as I see it, of which a shitty link profile is one, but not necessarily the only thing.

The HCU and similar updates are binary: you are either good or you are bad. But the factors that go into it are most likely not binary, because Google has access to user data (pogo-sticking), backlink data, contextual language analysis, and technical on-site data. Each of these can raise flags, and a machine learning model would be set loose to build some kind of regression model in which each factor is weighted according to current "bad site" metrics.

The core of this model's training is the manual spam review: the quality rater guidelines. Tens of thousands of sites get marked spam or not-spam manually, and the machine learning model is trained on that. The mistake some people make is to think that only what the reviewers see is fed into it. No, of course the model also looks at the backlink profiles of the sites marked spam. They were manually identified as spam without looking at backlink or user metrics, but the machine learning algo likely has access to both.
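To make that mental model concrete, here's a purely illustrative toy of the idea: a handful of hand-labeled sites, a few invented per-site signals (backlink spam score, pogo-stick rate, thin-content flag), and a logistic regression learning how much weight each signal gets. The feature names and numbers are made up; nothing here claims to reflect what Google actually trains on.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented per-site signals: [backlink_spam_score, pogo_stick_rate, thin_content_flag]
X = np.array([
    [0.9, 0.8, 1],   # rater-labelled spam
    [0.7, 0.6, 1],   # rater-labelled spam
    [0.8, 0.3, 0],   # rater-labelled spam
    [0.1, 0.2, 0],   # rater-labelled not-spam
    [0.2, 0.1, 0],   # rater-labelled not-spam
    [0.3, 0.4, 0],   # rater-labelled not-spam
])
y = np.array([1, 1, 1, 0, 0, 0])  # 1 = spam according to the manual raters

model = LogisticRegression().fit(X, y)
print("learned weight per signal:", model.coef_[0])

# A site the raters never saw still gets scored on every signal at once,
# which is why fixing only one of them may not flip the overall verdict.
new_site = np.array([[0.6, 0.7, 0]])
print("p(spam):", model.predict_proba(new_site)[0, 1])
```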

This is why you likely won't see any results from any single fix, and you might be in a situation where, if you do nothing, you bounce back in a few years, when current "spammy" methods are no longer used and a new massive manual spam review dataset gives the machine learning algo other things to train on.
 
Disavow only works if you have good links and got nuked by spam farms....
If you don't have any good links........
If you haven't hit the threshold for getting dinged by the spam farms......

Also, there's some much nastier neg SEO than link spam in large-scale use, so ugh......
 
Disavow only works if you have good links and got nuked by spam farms....
If you don't have any good links........
If you haven't hit the threshold for getting dinged by the spam farms......

Also, there's some much nastier neg SEO than link spam in large-scale use, so ugh......

I have to agree, but my angle is slightly different.

The largest lesson I learned with SEO was to be a pig, instead of a hog.

Why?

Hogs get slaughtered, pigs get fat.

I have a ton of little sites. Too many for me to keep track of, too many for me to remember the logins to.

They have ranked top of Google, without a single hit from an algo since 2006. And no, they aren't all old ancient sites and domains. Some I made within the last 3 years.

Checks still roll in every month from affiliate networks, Nozzle still reports about 95% of their terms in the top 1-4 placements.

I think it's largely because I didn't try.

I didn't try to grow them, I didn't try to be number 1 (even though I am, in the SERPs, for a lot of them). I didn't try to grow the link base, etc. I write an article for them and put it on the sites, like once or twice a year.

I don't even interlink the pages on the sites.

I didn't want to be in a "hog" niche. I didn't want to be a "hog" in even a small niche. I do have some that could be considered hoggish, like the stuff I have in the financial space, but the majority isn't.

I literally pick shit no one else would, that just happens to have an affiliate program.. and I build that and leave it alone. I might be the only active affiliate some of these programs have.

And a ton of them have AI content on them now, and that shit ranks too.

Don't try so hard. Treat it like a hobby.
 
the Ole' "Hogging trick" lol, eliquid is over here like "oink oink motherfuckers!" lol, funny shit.
 
I have to agree, but my angle is slightly different.

The largest lesson I learned with SEO was to be a pig, instead of a hog.

Why?

Hogs get slaughtered, pigs get fat.

I have a ton of little sites. Too many for me to keep track of, too many for me to remember the logins to.

They have ranked top of Google, without a single hit from an algo since 2006. And no, they aren't all old ancient sites and domains. Some I made within the last 3 years.

Checks still roll in every month from affiliate networks, Nozzle still reports about 95% of their terms in the top 1-4 placements.

I think it's largely because I didn't try.

I didn't try to grow them, I didn't try to be number 1 (even though I am, in the SERPs, for a lot of them). I didn't try to grow the link base, etc. I write an article for them and put it on the sites, like once or twice a year.

I don't even interlink the pages on the sites.

I didn't want to be in a "hog" niche. I didn't want to be a "hog" in even a small niche. I do have some that could be considered hoggish, like the stuff I have in the financial space, but the majority isn't.

I literally pick shit no one else would, that just happens to have an affiliate program.. and I build that and leave it alone. I might be the only active affiliate some of these programs have.

And a ton of them have AI content on them now, and that shit ranks too.

Don't try so hard. Treat it like a hobby.
This is so fucking true. If you try to double your income at a job, they want you to work 4x more. It's just easier to please your boss, do the bare minimum to leave a good impression, and get another remote job. I'm at 2 and might go to 3.

I know PPC agencies who charge $600/month for PPC. It's for restaurants and shit. They just start the campaign, turn Performance Max on, and send reports every month. They haven't touched the campaigns since they started, but the customer got a cheap PPC channel taken care of! The agency assigns 30 clients per employee. Sounds like a lot, but the employee mostly produces reports rather than actually doing PPC. Again, you're so right.

Much respect, sir.
 
Maybe some light at the end of the tunnel for those playing the "waiting" game: Google Hints At Improving Site Rankings In Next Update

In a series of tweets, Mueller acknowledged the concerns, stating:

“I imagine for most sites strongly affected, the effects will be site-wide for the time being, and it will take until the next update to see similar strong effects (assuming the new state of the site is significantly better than before).”

He added:

“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”

...

“...some things take much longer to be reassessed (sometimes months, at the moment), and some bigger effects require another update cycle.”

"months" - good luck with that bro.

 
Yep, everything and anything coming out of Google mouthpieces is complete bullshit. All you need to do is look at the SERPs.
 
I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.

Of course, there was always going to be some level of "reversal". That HCU was too brutal and unrealistic to be sustained for long. What's uncertain is just how much reversal we're going to see, and whether things will ever fully go back to normal.

When the time comes, the reversal is going to be because Google is done with whatever internal goal they had going on (think the Reddit scraping thing and the IPO happening at the same time, SGE/AI answers, and other stuff we have no idea about) AND/OR because they've finally been able to understand and fix their broken algo.

It's not going to be because Google suddenly wants to reward folks that have taken "helpfulness to heart". You must be HIGHLY delusional to think HCU was really about "rewarding helpful content" in the first place.
 
What about INP? (The new metric replacing FID.)

EU legal requirements have forced site owners to use an extensive system under the so-called Consent Mode v2. This is usually a pop-up window, which gives users the option to consent to various kinds of behavior tracking.

Recently I noticed on one client's site that the implementation of this nuisance coincides almost perfectly with a decrease in traffic on the site.

Looking at the PageSpeed Insights report, the site loads twice as fast without it; that's one thing. The other thing is how INP works:

“Interaction to Next Paint (INP) is a web performance metric that measures user interface responsiveness - how quickly a website responds to user interactions like clicks or key presses.”

Do I understand correctly that a pop-up window with Consent Mode effectively delays user interaction with the site, so the INP score is poor, and thus the site gets a worse score and loses positions and traffic?

Does it seem realistic that such a temporary block on user interaction with the site can have such a significant impact on positions, and ultimately traffic?

What is noteworthy is that this Consent Mode pop-up must be shown immediately; it cannot be delayed, due to legal requirements.

One of the most popular solutions for serving consents in the context of Consent Mode is Cookiebot. They know that this can create problems.

The Cookiebot article just outlines the situation. Has anyone here run into a similar problem?

Of course, GSC reports a problem with INP, and at the same time the site is experiencing drops.
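If you want a number to compare before and after deferring or removing the consent layer, the PageSpeed Insights API returns the CrUX field data alongside the lab report; a rough sketch is below. I'm assuming the field-data key is INTERACTION_TO_NEXT_PAINT, so double-check the metric names against the actual JSON the API returns for your site.

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def inp_field_data(url, api_key=None, strategy="mobile"):
    params = {"url": url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()
    # Metric key name assumed; inspect data["loadingExperience"]["metrics"] to confirm
    metric = (data.get("loadingExperience", {})
                  .get("metrics", {})
                  .get("INTERACTION_TO_NEXT_PAINT", {}))
    return metric.get("percentile"), metric.get("category")

p75, bucket = inp_field_data("https://example-client-site.com/")  # placeholder URL
print(f"INP p75: {p75} ms, bucket: {bucket}")
```

Note that the field data is a rolling ~28-day window, so any gain from pulling the consent script out of the critical path will take weeks to show up there; the lab part of the same response reacts immediately, though it approximates responsiveness with Total Blocking Time rather than INP.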

Any ideas?

P.S. If this topic does not belong here, then I apologize and ask the mods to move it where it belongs.
 
What made you decide to cancel the disavows?
I wasn't confident in my disavow list and I don't have the money to pay for Ahrefs or SEMrush right now to dig deeper into it. Plus, I don't really know what I'm doing and thus feel that my time is better spent focusing on other things.

I saw continued decline and 0 improvement since implementing it 2 months ago, so I figured canceling it couldn't hurt me any more than anything else.
 
G search quality has become equally as bad.

Pinterest image search is 1000000000000x better.

G has just become so awful.
 
Anyone here love a conspiracy theory? :D:evil::evil:

More proof that HCU has absolutely nothing to do with "Rewarding Helpful Content"?

Check this out:
Then.....

March 2024, BOOM! HCU IS NO MORE!:
https://www.seroundtable.com/google-helpful-content-update-gone-37196.html

Do you see the pattern here?

HCU was a thing for about 18 months, then it was suddenly baked into the core algo after brutally smashing tens of thousands of small businesses.

So, what was the thought process behind this? Why was HCU baked into the core algo when it's clear that something isn't "right" with the last HCU? And why were there no recorded recoveries despite it being rolled into the March core update?

My hypothesis on this:
  • This is exactly how Google wants it to be (at least for now). HCU wasn't a mistake or a broken algo. "HCU" is likely a code name for some internal Google project/goals.
  • The first two HCUs (August and December 2022) were likely a pilot test to see the impact on Google's revenue.
  • The HCU wasn't part of the March core update. That's just some bullshit PR stuff they told you. Historically, there's always a core update in Q1 or the first half of the year, which serves as the perfect time/excuse to "retire"/"end" the "helpful content update". It explains why we've seen very few recoveries even with the core update. If HCU was truly about content quality/helpful content, then lots of people should have seen varying degrees of recovery with the March core update.



Finally, there are some recent comments from Google folks on Twitter that seem confident about HCU site recoveries in the next core update... and I agree that this will likely happen. WHY?:

  • Historically speaking, the next core update should happen in August/September 2024. This will mark the 1-year anniversary of the September HCU.
  • The Google x Reddit deal likely happened around the September 2023 HCU, but was only announced in February this year... It was probably a requirement for Reddit's IPO listing (the IPO listing was announced shortly after the API deal announcement: https://searchengineland.com/reddit-google-ai-content-licensing-deal-437782)
  • If the above is true, then the deal will expire in September... which means Google will demote Reddit, and "small blogs" will automatically get higher visibility???:D It kinda aligns with the "next core update".

Technically, it's a win-win for Google. They get a year's worth of Reddit data, and by demoting Reddit, they also get fresh insights from "small blogs" that significantly improved their content.
 