New SEO Insight Into Why Some New Sites Seem Blessed While Others Are Doomed

Ryuzaki

Okay... now we're getting somewhere!

Backstory & Advice
I've complained at length about something I've seen in the SERPs that drives me nuts. The basic explanation is that some sites seem to be "blessed" and absolutely rock the space that they exist in. They'll rank for everything and anything as long as they produce content for it. The worst part is, these sites blow. They're low grade, low usefulness, crappy content, low effort trash.

The same happens with good sites and we don't complain because it's justified. The sites are nice.

Because I've never been able to figure out why or how they're blessed, my advice (which I don't myself take due to the workload and infrastructure required) would be to create 5 versions of any authority site you want to build and see which one Google randomly "favors" as being better, even though they're all pretty much the same. Then roll with that one.

Webmaster Hangouts
In the most recent Google Webmaster Hangout, someone asked John Mueller:

“Theoretically, could a website that’s only one to two weeks old rank for top positions on Google for ultra-competitive head keywords say for example for shoes with significantly better content only, considering it’s the most important part of the core algorithm? If not then clearly time is a factor when it comes to ranking pages for highly competitive areas no matter how good they are unless new pages are created on already established websites.”

It's a nice question because it's phrased in such a way as to place Mueller (the new Matt Cutts) in a corner live on air. To paraphrase, it's basically: "You guys insist content is the most important ranking factor, but we only see new content ranking for big terms if it's published on old and powerful domains. That means time is a ranking factor. Change my mind." They don't want to admit too much about time being how they fight spammers, because it means they can't actually handle it, and it'll also incentivize spammers to just scale harder and wait.

John buys some time at the start and throws in some plausible deniability by saying "this is all theoretical and a lot of things could be at play." Then he breaks character and says "But in practice things aren't so theoretical." Then he's right back to the game, "So I don't really know if there's a good answer that would be useful to give for something like this."

I think Mueller and Cutts and any of the other psyop spokespeople for Google are good people and want to be helpful. Mueller basically proves this by dropping some new hot fire information on us.

How Google Ranks New Websites
Mueller needs to explain away time as a ranking factor. He ends up doing so by being generous to those of us paying attention. Explaining away time means explaining that brand new sites can rank for competitive terms nearly immediately. This circles us back around to my huge complaint about some sites being blessed. He ends up saying:

“We use lots of different factors when it comes to crawling, indexing and ranking. And sometimes that means that completely new websites show up very visibly in search. Sometimes it also means that it can take a bit of time for things to settle down.”

If you read between the lines there and do some deductive reasoning: if a site is brand new, time is definitely not a factor in the initial considerations of 1) crawling, 2) indexing, and 3) ranking. That means it has to boil down to two things: 1) on-page SEO, and 2) technical SEO (so speed, code efficiency, bandwidth needs, information architecture, etc.).

So basically there's something going on with brand new sites where something matches up in terms of on-page SEO and tech SEO that makes Google think the site will be a good performer in the future, and they then go ahead and rank that site accordingly. If it doesn't perform, the site loses rankings. If it does perform, it sticks pretty highly.

Back in the day we called this the honeymoon period. We knew what it was. You index high, they get some data, and then you slip back (hopefully not very far). But now we're getting to exactly which pages get chosen for a honeymoon period. It doesn't seem like as many get it as used to, because if they did, with the exponentially growing size of the web, the SERPs would be trash. You can't test every page out.

Mueller says exactly that. He says they make estimates and test pages and sites out based on those estimates, giving them more visibility. They can then win or lose. Losing can mean being doomed, winning can mean being blessed. This lines up perfectly with my observation. Some sites get doomed or blessed, but not all sites, just some. This seems to be why.

What it ultimately means is you can become blessed if you can figure out which on-page and technical SEO factors matter, compare them against data from similar sites in your niche and how those sites went on to perform, and then choose the right search terms to optimize for so you perform well once given extra visibility.
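
To make that concrete, here's a minimal sketch of the kind of comparison you could run yourself: score your own new builds against sites in the niche whose outcomes you already know, using only on-page and technical features you can measure. The feature set, the numbers, and the labels are all invented for illustration; nobody outside Google knows what their actual model looks at.

```python
# Hypothetical sketch: score brand-new test builds against sites in the niche
# whose outcomes are already known, using only on-page / technical features we
# can measure ourselves. Features, numbers, and labels are invented; Google's
# actual model is unknown.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [largest_contentful_paint_s, html_kb, words_per_page,
#            internal_links_per_page, pages_in_sitemap]
niche_sites = np.array([
    [1.2,  60, 1800, 25, 400],   # sites that went on to perform well
    [1.5,  75, 2200, 30, 650],
    [3.8, 240,  600,  8,  90],   # sites that never took off
    [4.2, 310,  450,  5,  40],
])
performed_well = np.array([1, 1, 0, 0])

model = LogisticRegression().fit(niche_sites, performed_well)

# Our own brand-new builds, measured the same way.
new_builds = np.array([
    [1.4,  70, 2000, 28, 120],   # candidate A
    [3.5, 200,  700, 10,  60],   # candidate B
])
for name, p in zip(["A", "B"], model.predict_proba(new_builds)[:, 1]):
    print(f"candidate {name}: {p:.2f} 'looks like a performer' estimate")
```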

Mueller also goes on to say:

"And it can also be that maybe you’re shown less visibly in the beginning and as we understand your website and how it fits in with the rest of the web then we can kind of adjust that."
That sounds like an example of how the majority of new sites are treated. It's not that they understand how the website fits in with the others. It's simply that they don't test them out and don't have any signals on them yet, so you have to do the slow crawl up the rankings (because time is, in fact, a ranking factor). We've always called that the sandbox.

The question then becomes, for the majority of sites, how can you give Google some signals to work from if you can't rank? I'd guess that having Google Analytics is one way, using Google Fonts might be a way, having visitors that use Chrome is one way. Asking people to search specifically for your site through Google is one way. Doing marketing and traffic leaks and PPC definitely always kick-starts things.

They can't only be using data they get through their search engine, or nobody new would ever get exposure and they'd rarely ever get a link, while those with existing exposure would keep getting all the natural links.

Anyways, back to the honeymoon period. We know why they do it and what the results are, but how do they actually, mechanically, do it? There's a patent called "Ranking Search Results" that explains they have a score modification engine (which I believe is how Panda and Penguin work, as side-algorithms or layers on top of the core algo). They generate an initial score and then tweak the scores with respect to the query and based on a "plurality of resources."

[Screenshot: excerpt from Google's "Ranking Search Results" patent describing the score modification engine]
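
As a toy illustration of that initial-score-plus-modification idea, here's the rough shape of the mechanism the patent describes. All factor names, thresholds, and multipliers below are invented; this is not Google's implementation.

```python
# Toy shape of the patent's "initial score + score modification" idea.
# All factor names, thresholds, and multipliers are invented for illustration;
# this is not Google's implementation.
def initial_score(resource, query):
    """Relevance score from the core algorithm (placeholder lookup)."""
    return resource["relevance"].get(query, 0.0)

def modification_factor(resource, query, niche_baseline):
    """A layer on top of the core score - the kind of place a Panda/Penguin
    style adjustment, or a temporary new-site test boost, could live."""
    factor = 1.0
    if resource["age_days"] < 30 and resource["tech_score"] >= niche_baseline:
        factor *= 1.3   # hypothetical "honeymoon" boost while Google gathers data
    if resource.get("user_metrics_poor"):
        factor *= 0.7   # hypothetical demotion once real user data disappoints
    return factor

def final_score(resource, query, niche_baseline=0.8):
    return initial_score(resource, query) * modification_factor(resource, query, niche_baseline)

new_site = {"relevance": {"running shoes": 0.62}, "age_days": 14,
            "tech_score": 0.9, "user_metrics_poor": False}
print(final_score(new_site, "running shoes"))   # 0.62 * 1.3 ≈ 0.81
```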

This is food for thought, made more difficult by my rambling. But if anyone was so inclined, they could, given the time and resources, probably determine which sites in their niche are the baselines by rolling out sites based on all of the successful ones and seeing which get blessed or doomed, then repeat to confirm. And then go to town setting up winners.
 
Brand search volume from clean real users.

That is the current hole in the Google boost network system, and it speaks volumes about what the related factors are.


Prove me wrong.
 
That's interesting.

I've seen different cases.

One of my first sites in the health niche ranked high almost right away, top 3 for a one-word keyword, and stayed on the first page for 6 months. This was basically a new site up against very old, very established, huge health sites.

What was that about? Well, I've always considered this "honeymoon period" to be about testing with a live audience. If you fit some kind of criteria, you get some real-intent visitors to check how they respond to your site. Could there be a situation where your feedback is just so much better than the competition's that Google lets you stay? Say you have some groundbreaking material. You break some news from Hollywood. Everyone googles for it to find the original source, not being satisfied with the many copycats ranking on authority alone. I think that's possible. If people open a lot of results for a query, but keep coming back to yours. Or in my case, the established sites simply didn't respond to the "URGENCY" of the intent. I fulfilled the urgency, stopped them from opening more tabs.

On the other hand, some niches and some sites seem to have zero honeymoon. My fitness equipment site is like this. Just won't budge. I attribute that to a "satisfied SERP". Intent gets fulfilled and then some.

Like secretagentdad wrote, if people type in your brand name and a keyword, maybe that's one of those secret sauce ingredients. Like "TMZ celeb scandals" for keyword "celeb scandals".
 
Welp. I'm pretty much going all in on theories resembling this one. I think it works. Public-ish case study, I guess. Lol
I think user experience and brand power trump all.
 
@CCarter and I have pushed this for a long time: that time is a ranking factor. Years, actually, we have pushed it.

Everyone thought we were stupid and crazy.

Data doesn't lie though.

We are probably among the few who even have such data sets AND the will to dive into them to help isolate and show it was a factor. Kinda the same as with NOINDEX links that everyone thought we were stupid about.

You can attach sub-items to TIME such as brand exposure and link build-up, but without TIME you won't achieve such things. Many experienced webmasters and devs have the resources to push thousands of dollars into links in the 1st and 2nd month of a new site out of the gate, but without TIME those sites don't rank in the first 60 days....
 
When you are in an industry or anything for so long, you start to instinctively know what to look for. When I'm examining websites, within the first couple of minutes I can tell whether the site performs well in the search engines or not.

It's a combination of experience and knowing small subtleties to look for.

For example, if I can't find an eCommerce website's address or contact info, they are most likely not going to rank for local SEO terms. Sounds logical, right? But a lot of people miss these small things.

Another example is blogs that remove dates - you are essentially eliminating yourself from freshness factors altogether. I understand the logic of removing dates from comments, but removing dates from the blog posts themselves reduces your chances of showing up in the organic results.

I can tell you this because when I search for stuff, and sometimes obscure coding stuff, 95%+ of the results have dates. Some state "2019", others go back to "2013". All those pieces of data are important for end users to know whether the article they are reading is relevant or not.

For example, if I have an Ubuntu and MySQL issue, reading an article from 2013 doesn't help when the versions of the software they were using are way behind. So I have to find a fresher solution. All the results ALWAYS have dates in the search results or on the page - yet SEOs are doing the exact opposite and removing dates. Then they can't figure out why they aren't ranking or why their site drops in the next algo update - it's not an individual factor; it might be that your site simply is not "keeping up" with the standards.
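
If the concern is "competing dates" (a publish date that makes the piece look stale next to an update date), one way to surface both without hiding either is schema.org Article markup. A minimal sketch, with placeholder URL, headline, and dates, that just prints the JSON-LD block you'd drop into the page:

```python
# Minimal sketch: emit schema.org Article markup carrying both the original
# publish date and the last-updated date, so neither has to be hidden.
# The URL, headline, and dates are placeholders.
import json

article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Fixing a MySQL 8 upgrade error on Ubuntu 20.04",
    "datePublished": "2019-03-14",
    "dateModified": "2020-02-01",
    "mainEntityOfPage": "https://example.com/mysql-upgrade-error/",
}

print('<script type="application/ld+json">')
print(json.dumps(article_jsonld, indent=2))
print("</script>")
```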

Like @secretagentdad said, the biggest factor that I see that can overcome these time limits are brand mentions and Google getting "signals" that this brand is more than just a weak website. It's why I push for social media mentions, it's why I push to get YouTube videos going, cause there are literally customized organic results for the YouTube Carousel, and Twitter Feeds, and other aspects showing up in Google. ALL of those are based off of freshness factors meaning - wait for it... Dates!

A lot of this isn't rocket surgery. You can't just "publish a post" and wait for Google, or send some links and again wait for Google. You have to promote the post throughout the internet, and Google will notice that and connect it to your brand. And then when Google knows you as being "someone" - you'll be able to rank your future content easier.

And guess what - the only way to get "Brand search volume from clean real users" is by... "MARKETING" and "PROMOTING" your brand outside the search results.

[insert obligatory CCarter gif]
 
ughhh ^^^^^^
I hate when you're right about something I've fought to avoid doing.
I'm gonna start using dates this year.


Also get in here, plebs. The best way to get good at SEO is to OP. OPs are OP, get it.....
Post some theories and get some holes shot in them. Or shit on my shitty approaches.
Criticism and text wall theories work better when we beat up each other's approaches.
Actually doing the work is expensive so there is real value in trying to actually participate in the discussion.
We've got a working framework being outed here, with a few holes you can drive a truck through.
Make some fucking money and stop being so poor.
Who knows, maybe you'll get invited to the good meet ups and stuff if you post a bit.


More importantly,
Smaller search engines don't have the manual intervention component Google does, and they're starting to gain traction.

The signals they have to work with are what people type into their box, and the internet backlink graphs.
You can run types of content that are currently difficult to rank in Google just fine on a lot of them, since their index cleanup and templating procedures are more simplistic out of necessity.
 
the sandbox.

LOL. I feel like this OP was an extension of our conversation the other day.

But, yes, I 100% agree with what @Ryuzaki is saying here.

I forget where I read it, but there was a data-based case study put out sometime over the past year that showed the average domain age across ALL SERPs is 3.3 years. I have found this to be true.

Using my main site that I've been working on for 2 years now, here's what I've experienced:
  • There are new sites that appear to be outliers and get blessed almost immediately. Some of those sites have stuck, and some have dwindled. I'm assuming the sites that have stuck have proven that they deserve to stick. Some of these blessed sites are less than DR10, have 3x fewer links than us, and continue to rank hard for short tails in the industry.
  • It appears that all other new sites that didn't trip the "blessed" filter hit the sandbox.
  • The sandbox appears to be a spectrum, with the main determinant being time.
    • Example: When the site first launched, new posts would index in the 50's, pull back even further, and then slowly creep up over time. After a year, new posts would index in the 20's, pull back to the 50's, and then creep over time. After two years, new posts would index in spots 11-15, pull back to the 30's, and creep up over time.
    • Example: We've experienced several "big" algorithm updates that have caused rankings/traffic to fluctuate 5%-10%. At worst, we had a 30% swing after one update. With that being said, we did not change our strategy in any way during the existence of the site, and did not change our strategy as a result of any single update. We've come back from every update stronger than before without changing anything we were doing SEO wise. The only difference is that we are now an older site.
    • Example: More links does not compensate for less time. We have significantly more links (at a higher quality) than some older sites in our niche, and continue to rank behind them.
  • Slamming a page with quality links to "force" it to rank faster only seems to work if the SERP is lacking quality pages (aka low keyword difficulty). Slamming a page that targets a high difficulty keyword seems to cause rankings to DECREASE for a period (6+ months). Only after a period of time do those high-difficulty keyword targets break through their previous ceiling.
  • Site freshness and individual page freshness continue to pack the biggest punch in ranking improvements. The more often we post on the site, the bigger ranking and traffic gains we see. And, the more often we update existing content, the faster we see that content rank well.
  • Being first to the punch in regards to ranking for new/trending keywords is a gold mine. We run Google Alerts to surface new companies in our industry that we can review (a minimal alert-polling sketch follows this list). When we are successful in posting the FIRST review article for a company, it seems like it's almost impossible to beat. Using this strategy, we ranked one review article that represents 30% of our existing revenue, and another one that represents about 5% of our existing revenue. And, to date, the keywords for these new review opportunities continue to show less than 100 for search volume on industry-standard tools.
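
A hedged sketch of that alerts workflow: Google Alerts can deliver to an RSS/Atom feed instead of email, so a small script can flag brand-new mentions as soon as they appear. The feed URL and filename below are placeholders.

```python
# Hedged sketch: poll a Google Alerts feed (Alerts can deliver to RSS/Atom
# instead of email) and flag entries not seen before, so the first review
# article can be drafted quickly. The feed URL and filename are placeholders.
import feedparser

ALERT_FEED = "https://www.google.com/alerts/feeds/EXAMPLE_USER_ID/EXAMPLE_ALERT_ID"
SEEN_FILE = "seen_alert_ids.txt"

try:
    with open(SEEN_FILE) as f:
        seen = set(f.read().split())
except FileNotFoundError:
    seen = set()

feed = feedparser.parse(ALERT_FEED)
new_entries = [e for e in feed.entries if e.id not in seen]

for entry in new_entries:
    print(f"NEW: {entry.title} -> {entry.link}")

with open(SEEN_FILE, "a") as f:
    for entry in new_entries:
        f.write(entry.id + "\n")
```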
it's not an individual factor; it might be that your site simply is not "keeping up" with the standards.

This sounds like what all these new correlational tools like Surfer and POP are trying to resolve.

Taking an 80/20 approach here, I feel like these tools can get you 80% of the way there in regards to keeping up with the standards. The rest of the way will need to be covered via manual analysis... like you did in finding the dates issue.
 
Opening Post Summary
I feel like I need to summarize the opening post for the benefit of the TL;DR crowd.

I'm not talking about certain posts doing well on an existing site, or domains doing well or bad after time, or even how to accelerate out of the sandbox.

I'm talking specifically about brand new domains that get "blessed or doomed." Blessed ones not only bypass the sandbox but bypass everyone in the SERPs and slam right to the top and stick. Some sites get doomed and never have any chance of ranking, even when they've done nothing wrong.

The point was that Mueller has admitted that Google (through seemingly random sampling) estimates what brand new sites user metrics might be by comparing them to other successful sites. The only thing they have to go off of is on-page SEO and technical SEO, and I'm leaning more towards tech SEO being the main factor in this.

Figuring out exactly what it is they like for a niche means easy and fast domination, saving time waiting and money on links. You can dominate through pure content development.

Some blessed sites don't last because they don't satisfy the users and their user metrics cause them to tank. Doomed sites never even get a chance to prove their worth.

It's a two-pronged approach. If you can figure out how to get blessed, you win. If you can figure out that your site is doomed without waiting 12 months, you win. Actually testing these things isn't really pragmatic or worth the time.

I think the best approach would be to develop 5 or even 10 versions of a site. Different themes, same keywords being targeted, fairly similar content. One or two are going to pop, most will do the normal slow-grow, and one might get the doom stamp. From there you can focus on the winners and use the others as a PBN or simply get rid of them.

___________


Other Stuff
Brand search volume from clean real users.

I knew a guy that used to build flash game portals on 4 letter domains. He didn't care what they were as long as they were 4 letter dot coms. Then what he'd do is head over to Microworkers and pay a zillion people a nickel each to go type his 4 letter brand name into Google (XGRZ.com for example). He was influencing the auto-suggest, but he felt that cascaded out into SEO benefits.

It would seem to be key to have it sustained, and the only way to pull that off is to be a real brand through marketing and superior everything. You have to get in front of millions of eyeballs and get people actually searching your brand and get an actual search volume on the books. (As well as doing other obvious brand footprint things across the web).

Well, I've always considered this "honeymoon period" to be about testing with live audience.

Yeah that's exactly what I'm getting at, which Mueller has admitted to now. But back in the day everyone got the honeymoon period and it was on a page-by-page basis. It seems now, since the internet is exponentially larger, they have to dole it out randomly and not to pages but to domains. Which explains the "blessed or doomed" observations I've been crying about the past year here. It's nice to get confirmation again that I'm not imagining things.

There were two other notable pieces of confirmation. One was when I swore they inverted the index one time. I thought it was some crafty ploy to get a baseline of user metrics for bad sites. Matt Cutts said some doofus just accidentally replaced a > with a < once. The other was Mueller, at a private dinner, confirming my sitewide Panda Quality Score theory, which finally leaked out years later.

Which all supports this idea...

Post some theories and get some holes shot in them

There are tidbits I see people drop on here all the time that are worth millions of bucks to the person who will apply them. One in particular I always remember is how @CCarter discovered that like 90% of Sears Canada's emails were going to spam, during the start of the mad holiday rush. If they had read it, it would have probably been a billion dollar revenue difference at least, a problem any number of us could solve for them.

Stuff like our page speed talks can add an unbelievable amount of money to an e-commerce store's revenue overnight if they're already hitting big numbers. The on-page discussion, the freshness factor thread, the index bloat talk, lots of this stuff can change businesses overnight. Let alone the Crash Course for newbies. Go look at the list of threads there and look how many reads the early days get and how it peters out towards the end. Free money sitting there for the taking and people are too lazy to read, let alone type and ask questions.
 
Ok, I’m going to be honest, I haven’t been deeply involved in SEO for a while now. If you know me, you know that I am working for a fast growing startup. One of the things that really sets us apart from competitors is that we have a CEO that is very active on Twitter. While the company only started in the middle of 2017, that has caused a massive amount of social mentions, brand searches, etc.

In 2018 I did a small test of transcribing a couple of our training videos and slapping them up on our website to see what kind of traction we could get from an organic perspective. Keep in mind these posts are not optimized, and were something that took me about 5 minutes of cleaning up the transcript to get posted.

I went and checked today and for some queries we’re pushing out sites like Stack Overflow, which is a behemoth in the space. We’re also young compared to a lot of these companies/competitors. One of our main competitors (that was acquired) was started in 2012, but we’re ranking decently for queries where they should be crushing us, and we haven’t even looked at SEO. (Which is going to change very soon now that I’ve looked at this data.)

Takeaway is that brand signals seem to be assisting in ranking. Hard to say 100% because the Twitter activity has resulted in a number of posts being written about us on various large blogs/websites. So we’ve gotten the brand stuff, that has pushed some nice links.
 
Cora says they test against 2040 ranking factors, and recently put out a list of the top ones:

[Image: Cora's list of top ranking factors]

Magically there is data to prove that I'm right about dates. Yet all these blogs have gone out of their way to remove publishing dates from their content. Or better yet, people do the lazy thing of just updating the date and not the actual content, even though Google's threshold for duplicate content is 40% unique.


I think I should come out with an SEO blog or something.
 
people do the lazy thing of just updating the date and not the actual content, even though Google's threshold for duplicate content is 40% unique.

Are you suggesting that you need to add/edit/remove 40% of an article for Google to consider it to be fresh?

I use my CMS’s updated date for inserting the date. I don’t use published date so that there aren’t competing dates. So the updated date changes if I fix a typo, add a clarifying graphic, or rework some text. Given the evergreen nature of the site, there isn’t need to change much, but I don’t want my 2012 article next to someone’s 2020 article so I make small edits here and there.

I don’t think I’ve ever been penalized for small changes with an updated date. Maybe google knows the difference between “reference” topics and “news” topics.
 
Are you suggesting that you need to add/edit/remove 40% of an article for Google to consider it to be fresh?

To clarify, if you have a page, then duplicate the page and then update at least 40% of the content, Google will mark this page as unique.

An example in the wild: Think of all those recipe sites that have similar recipes, they figured it out and started putting their life stories above the recipe cause otherwise it would be considered duplicate content.
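
Whether or not the exact 40% figure holds (it's a forum number, not something Google has published), you can at least sanity-check how much two versions of a page actually overlap before calling an update "fresh". A rough sketch using 5-word shingles and Jaccard overlap, with made-up page text:

```python
# Rough sketch: estimate how much two versions of a page overlap using
# 5-word shingles and Jaccard similarity. The page text below is made up;
# the "40% unique" threshold quoted above is the poster's figure, not a
# published Google number.
def shingles(text, n=5):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def overlap_ratio(old_text, new_text):
    a, b = shingles(old_text), shingles(new_text)
    return len(a & b) / len(a | b) if a and b else 0.0

old_page = """My grandmother brought this marinara recipe over from Sicily.
Simmer crushed tomatoes with garlic, basil and olive oil for forty minutes."""
new_page = """This quick weeknight marinara needs only pantry staples.
Simmer crushed tomatoes with garlic, basil and olive oil for forty minutes."""

ratio = overlap_ratio(old_page, new_page)
print(f"shingle overlap: {ratio:.0%}  (unique share roughly {1 - ratio:.0%})")
```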
 
Here's my current voodoo recipe.

Search on Google, click Tools -> Any time, and set a custom date range.
Looking at your distribution of rankings under various different timelines can be really enlightening.
I try to do it every week in my bigger niches.
Once you get that blessed status you pop when you publish.
You can identify when it's time to go ham on the content cuz you pop at page one whenever you write something within your short tail keyword tree.
When you get a good description and document combination you work your way up with time instead of down. Double down on the winners and just delete the stuff that's sinking.
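
If you want to script that weekly check rather than clicking through the Tools menu, the custom date range appears to map onto the tbs=cdr URL parameters (an assumption worth verifying in your own browser; Google changes these things). A small helper that builds the URLs for a few windows:

```python
# Small helper, assuming Google's custom date-range search still accepts the
# tbs=cdr URL parameters (worth verifying in a browser; this mirrors clicking
# Tools -> Any time -> Custom range). Handy for repeating the same checks weekly.
from datetime import date, timedelta
from urllib.parse import urlencode

def date_range_search_url(query, start, end):
    params = {
        "q": query,
        "tbs": f"cdr:1,cd_min:{start:%m/%d/%Y},cd_max:{end:%m/%d/%Y}",
    }
    return "https://www.google.com/search?" + urlencode(params)

today = date.today()
for days in (7, 30, 90, 365):
    window_start = today - timedelta(days=days)
    print(f"last {days:>3} days:",
          date_range_search_url("best trail running shoes", window_start, today))
```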

If you watch the time series, you can quickly spot which competitors' content is being taken seriously for testing. Then you can blog about their brands with their BRAND or product NAME vs your brand or product name and try to get into their branded auto-suggest.

When you delete content regularly you seem to get some kinda crawl priority boost.
This appears to be less true now than a year back but I think it still works.
I'm speculating that it might not be a real time perk any more.
A lot of stuff appears to have moved behind some kinda QA index these days.
 
To clarify, if you have a page, then duplicate the page and then update at least 40% of the content, Google will mark this page as unique.

An example in the wild: Think of all those recipe sites that have similar recipes, they figured it out and started putting their life stories above the recipe cause otherwise it would be considered duplicate content.
I fucking hate that. I want to cook some sauce, and they tell me their whole life story and how their grandmother used to make it when she came as an immigrant from ... Just give me the recipe.

On the topic of dates though: a lot of the dev articles have old publish dates, but keep getting updated, and also provide an 'updated date'.
 
I think the best approach would be to develop 5 or even 10 versions of a site. Different themes, same keywords being targeted, fairly similar content.
A practical question... in my country you need an imprint on your site (your personal or business name, address, tax number), and I wonder if it would harm my sites if multiple sites in the same niche with similar content popped up at the same time. I guess it would be very obvious to Google that I'm trying to hit an algo winner. The question is whether this will have negative effects or not, hm.

It would seem to be key to have it sustained, and the only way to pull that off is to be a real brand through marketing and superior everything. You have to get in front of millions of eyeballs and get people actually searching your brand and get an actual search volume on the books. (As well as doing other obvious brand footprint things across the web).
If you are in a smaller niche without any strong competing brands wouldn't it be enough to use the brand searches mainly at the beginning to increase the odds of getting a blessed site?
 
I wonder if it would harm my sites if multiple sites in the same niche with similiar content would pop up at the same time.

I doubt they're running any checks like that. Most giant publishing brands have tons of sites in the same niches. Not just verticals, but niches. Scroll down through these brands: https://www.futureplc.com/brands/ They all have the same business details at the bottom like you're describing.

I'd probably host them on separate cheap servers until I know which one is the best, then move the winner over to my main server, though. And drop the others into the aether. This process could take a long time. You'll still have to contend with the sandbox if one doesn't magically pop through. One will still be more favored than the others, and waiting to find out which it is could be a year in duration even.

Unless you're the kind of guy already pumping out boatloads of sites, I'm not sure how practical any of this is. We have to find a way to stack the deck in our favor or we're just pissing in the wind and hoping, and potentially hoping for a long time in order to get nothing out of it. That's the danger of SEO altogether.

Maybe one thing to do would be to identify a blessed site in the niche you want to go into, and then find out what theme they're using for their CMS and use the same one. You can change colors and whatever to make it different, but the point would be that you mimic their HTML layout for the most part. You can even identify what plugins they're using and how they're setting them up, what areas of the site are indexable, what's in the robots.txt, etc. You can look at their sitemap, and so forth, and mimic them as much as possible on the technical side, while masking what you're doing on the design side superficially.
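
A quick reconnaissance sketch along those lines: pull the blessed site's robots.txt, count what its sitemaps expose, and guess the WordPress theme from asset paths. example.com is a placeholder, and the theme check obviously only applies if the site runs WordPress.

```python
# Reconnaissance sketch for mimicking a "blessed" site's technical setup:
# fetch robots.txt, count what its sitemaps expose, and guess the WordPress
# theme from asset paths. example.com is a placeholder target.
import re
import requests

site = "https://example.com"

robots = requests.get(f"{site}/robots.txt", timeout=10).text
print("--- robots.txt ---")
print(robots)

sitemaps = re.findall(r"(?im)^sitemap:\s*(\S+)", robots) or [f"{site}/sitemap.xml"]
for sm in sitemaps:
    xml = requests.get(sm, timeout=10).text
    urls = re.findall(r"<loc>(.*?)</loc>", xml)
    print(f"{sm}: {len(urls)} URLs listed")

homepage = requests.get(site, timeout=10).text
themes = set(re.findall(r"/wp-content/themes/([^/\"']+)/", homepage))
print("theme hints:", themes or "none found (may not be WordPress)")
```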
 
Some of you will remember that I've been advocating for a form of this for a very long time. The buzzwords may change, but SEP is still the undefeated, undisputed Champion of the SERPs.

It's been many years, but I've finally been blessed to experience it again on a new project, but I don't want to take it for granted so I'm making daily offerings of content.

People who get it, people who believe... they'll win again and again. Those who reject the light? Well, they're still spinning their wheels in circles.
 
Apologies if this is obvious, but what is SEP?

Just like after USA comes USB, after SEO comes SEP.

SEP = Search Engine Pray.

It's where you throw up content and pray to the Search Engines that your content magically outranks pages that have more backlinks, more social engagement, and faster load time than your page.

Instead of actually doing technical SEO to make sure the underlying foundation of your website is fast and optimized, and doing outreach and creating quality content - content that's interesting and entertaining enough that it will get talked about and naturally earn backlinks - one employs SEP by publishing $2-5 articles and praying the search engine algorithms somehow think it's 2007 when it comes to your particular content piece versus the rest of the top 100 competitors.


I think I'm done here. My time is up, we've come full circle.
 
SEP, oh joy. @CCarter, remember when I misled Rand Fishkin on the live Moz stream with some long-winded question about the effectiveness and efficiency, blah blah blah, of Search Engine Prayer? :D I also got him to answer my question about whether or not it requires a chisel to wash his hair. We were so young and carefree back then. Now that we've ascended the throne, it's all bloodshed, bones of our enemies, coronavirus, etc. What I wouldn't give to go back.
 
Sounds like a hot new strategy I should be testing.

Will report back on results.

Are there any particular saints or deities that are known to work well other than Bill Lambert?
 
I've started experimenting and theory crafting with this a lot.
My latest isolation tests have focused on outbound link "virtue signalling" in the context of time series.

Specifically I've been focused on finding explosive new trends / brands and trying to be the "first industry influencer" to link to sites or news items that are naturally acquiring links. Even if they compete blatantly with me.
This sorta goes with how people in other threads have been talking about how you can get your site "classed" in different serps.

It appears that being first on the dogpile when there's real news or growing brands in your space does some cool things. I would almost be ready to say that I'm sure order of acknowledgement is being used in some major ways as a db classing variable.

At the very least it appears to be a powerful factor in whatever scoring indexes are used to acquire what we're calling "blessed" status. Still got a lot of testing to do.


To get a bit further out into theory-crafting land: it makes intuitive sense that Google would want to reward quality signal sending.
I'm testing some other rabbit holes that are useful from the perspective of how can I help other search engines make the best possible results in my vertical.
If anyone's got any other line items that fit with this line of time reasoning, I'm looking for more to test.
 
I've complained at length about something I've seen in the SERPs that drives me nuts. The basic explanation is that some sites seem to be "blessed" and absolutely rock the space that they exist in. They'll rank for everything and anything as long as they produce content for it. ...

So basically there's something going on with brand new sites where something matches up in terms of on-page SEO and tech SEO that makes Google think the site will be a good performer in the future, and they then go ahead and rank that site accordingly. ...

I can confirm that new sites can rank highly for competitive keywords, even if they're 12 pages. We're a CF66 domain with about 100 different products. We have a competitor who makes 12 page websites, ranks them for head terms, with enough backlinks to get CF20 or CF30. It's about 50 RD total, if I recall correctly.

So, can one rank quickly for head terms if their site is optimized for on-page SEO? Yes. However, I've been in this industry for 4 years now and know that their sites fall off the SERPs after a few months. Therefore, their business involves having a team create new sites to replace the old ones constantly. We actually interviewed one of their ex-employees. She confirmed this. They mostly do on-page SEO and technical SEO. They don't do link building on the scale we do.

We wanted to ask her more, but she signed a non-disclosure agreement with the last company. They have a team of 12 working on those sites.

So, yes, it is possible but the business plan is totally different than a business plan for a big brand company.

All the results ALWAYS have dates in the search results or on the page - yet SEOs are doing the exact opposite and removing dates. ...

IMO the post date would only matter when the query deserves freshness. Other than that, not really. Google even has an acronym for it in their Search Rater Guidelines: QDF (query deserves freshness).

Post some theories and get some holes shot in them. Or shit on my shitty approaches. ...

Smaller search engines don't have the manual intervention component Google does, and they're starting to gain traction.

I totally agree that people need to post their theories here to get them harassed. That's the only way they can receive feedback quickly and in an efficient manner.

As for the smaller search engines, which ones are you talking about? I've been trying to rank on Yahoo.jp, Naver, Yandex, and Bing for ages. IMO, those sites have much slower crawl rates, which is a reason why my site isn't ranking. If there were a way for me to get their spiders to crawl my backlinks, I'd be set.

I think the best approach would be to develop 5 or even 10 versions of a site. Different themes, same keywords being targeted, fairly similar content. ...

I knew a guy that used to build flash game portals on 4 letter domains. ... Then what he'd do is head over to Microworkers and pay a zillion people a nickel each to go type his 4 letter brand name into Google. He was influencing the auto-suggest, but he felt that cascaded out into SEO benefits.

If you want to do "rank and bank" go ahead but, IMO, that's a flimsy business strategy. I'd rather have a CF66 domain that'll stick around for years than many domains that only last for a few months.

Here's why: you can sell the CF66 domain. You can't with a rank and bank. Also, if you have employees and they have kids and stuff, it'll be pretty unethical for you to use blackhat strategies when their livelihood and their children's livelihoods are in your hands. Just my opinion.

As far as getting branded searches, my "brand" gets 14,800 searches a month! How did we do that? With great customer service! Who else do you think is googling your name? Customers and potential customers :smile:

To clarify, if you have a page, then duplicate the page and then update at least 40% of the content, Google will mark this page as unique.

An example in the wild: Think of all those recipe sites that have similar recipes, they figured it out and started putting their life stories above the recipe cause otherwise it would be considered duplicate content.

They're not doing that to optimize for unique content. I doubt they even know about that. They're doing that to create personal relationships with their audience. They're food influencers. They need the personal relationship with their audience in order to have them continue reading their blog.

It's not about the information, the recipe itself, but about the relationship. They're women too. It's a different world view than a man's.
 
As for the smaller search engines, which ones are you talking about? I've been trying to rank on Yahoo.jp, Naver, Yandex, and Bing for ages. IMO, those sites have much slower crawl rates, which is a reason why my site isn't ranking. If there were a way for me to get their spiders to crawl my backlinks, I'd be set.
They tend to not bother with extremely huge sites unless you are abusing engine specific indexing tricks.
Bing and co use a much smaller list of sites and factors for their base whitelist.



Biggish new observation: I think stuff behind login walls is getting used as part of the blessed calculations.
Been getting credit on my home pages with some tools sites for stuff that required sign up.
Google just goes ahead and signs up if you make tools and don't block them.

Looks to me like they're probably using the data in the live index now.

Been adding some unique long tails in my tools, and Google has either been picking them up from some other blogger talking about them or from the actual crawling itself.

Added some more easily fingerprintable items to my higher crawl rate tools to test.
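
A tiny sketch of that fingerprinting idea: plant a token that can't plausibly exist anywhere else on the web, then check later whether a quoted search for it surfaces the gated page. The token format is arbitrary, and manual checking (or Search Console's URL inspection) is more reliable than anything automated.

```python
# Tiny sketch of the fingerprinting idea: plant a token that can't plausibly
# exist anywhere else on the web, then check later whether a quoted search for
# it surfaces the gated page. Token format is arbitrary; manual checking (or
# Search Console's URL inspection) is more reliable than scraping results.
import uuid
from urllib.parse import quote_plus

def make_fingerprint():
    """Nonsense token unlikely to collide with anything already indexed."""
    return f"zq-{uuid.uuid4().hex[:12]}-probe"

token = make_fingerprint()
query = f'"{token}"'

print("embed this string in the gated page or tool output:", token)
print("re-check in a week or two with a quoted search:")
print("https://www.google.com/search?q=" + quote_plus(query))
```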
 