Is this true for Google SEO? Linkjuicer Quote

raygun3001

From the theLinkjuicer.com FAQ, about articles:

"How does Google know it is quality? By the amount of time readers spend on the page. If it is machine generated garbage the average visit time will be one second, and you won't get any juice out of the link."

Is this correct?


P.S. I have nothing to do with Linkjuicer and I have not tried it.
 
It is one of the factors Google considers. Just because a surfer stays on a page for a long time doesn't mean the quality is good, so Google checks a number of factors, including backlinks and page content, along with surfer behavior.
 
It is one of the factors Google considers. Just because a surfer stays on a page for a long time doesn't mean the quality is good, so Google checks a number of factors, including backlinks and page content, along with surfer behavior.

Really interesting. I was under the impression that Google didn't take traffic into consideration at all.
 
Well, there's that... and about a gazillion other factors in the algorithm.
 
Google says they don't. I'm not sure whether this is true or not. It's important to remember that for Google to do this, the site would need to use Google Analytics.

I wouldn't rate this as a very big factor in assigning quality to a page.
 
I've noticed that the CTR to your site from the organic listings plays a role in Google's rankings. Correct me if I'm wrong.
 
Well, there's that... and about a gazillion other factors in the algorithm.

Yep, I agree with Forevernever here.
If you get the chance, load up Google Analytics on a site and then look at how much data they collect. It's a crazy amount of info they pass on to us, let alone what they keep to themselves.
 
I think Google looks at bounce rates, not traffic. The traffic model would put the same 100 websites at the top of all searches, and Google doesn't want that. Bounce rate as a percentage is a pretty good apples-to-apples indicator of the value of the content in relation to the search terms.

I know it feels like traffic is a factor, but it really can't be without destroying the usefulness of the content search. The bounce interval on a site is believed to be around 4 seconds. The bounce time Google Analytics reports is lower because Google doesn't want people to worry about it and thus tune their websites for it artificially.

Google sees clickstream patterns by session, like this:

google home
submit search
exit link 1
exit link 4
page 2
exit link 13

In this example, exit link 13 gets the big bonus because the person didn't come back to Google before their session expired. If they return too quickly, you get a bounce, which is bad. If they return after the bounce interval, it is neutral (no penalty). But only the last link gets boosted for the search term.

They measure the time from an exit link to the next action as the time on site. They figure that if you kept searching the results, you didn't find what you hoped to find with those search terms.
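To make that model concrete, here is a toy scoring function for the session above, written in JavaScript. This is speculation dressed up as code: the event shape, the 4-second window, and the score values are all my own assumptions, not anything Google has published.

// Toy model of the session scoring described above. Pure speculation:
// the event shape, the 4-second window, and the score values are all
// assumptions, not anything Google has published.
var BOUNCE_WINDOW = 4; // seconds; the rumored bounce interval

function scoreSession(events) {
  // events: [{url: '...', clickedAt: 12, returnedAt: 14}, ...]
  // returnedAt is null when the session expired before a return.
  var scores = {};
  for (var i = 0; i < events.length; i++) {
    var e = events[i];
    var isLast = (i === events.length - 1);
    if (isLast && e.returnedAt === null) {
      scores[e.url] = 1;   // big bonus: never came back to Google
    } else if (e.returnedAt - e.clickedAt < BOUNCE_WINDOW) {
      scores[e.url] = -1;  // quick return = bounce = bad
    } else {
      scores[e.url] = 0;   // returned after the interval = neutral
    }
  }
  return scores;
}

// Example session matching the one above:
// scoreSession([
//   {url: 'exit link 1',  clickedAt: 0,  returnedAt: 2},   // bounce
//   {url: 'exit link 4',  clickedAt: 5,  returnedAt: 40},  // neutral
//   {url: 'exit link 13', clickedAt: 60, returnedAt: null} // bonus
// ]);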

They aren't looking at the referrer URL, so schemes that redirect to a different page won't help. The most popular scheme for adjusting bounce rates is to show a nice, fast progress bar for 5 seconds that says the page is loading. Do this in JavaScript, then unhide your main content div.

The nicer your loading screen looks and the faster its animation runs, the less likely people are to mind it. Only display it if the referrer is from Google or another search engine, and use a cipher like (ASCII character value - 1) on your strings so Googlebot and the others don't see their names in the JavaScript source.
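For what it's worth, here is a minimal sketch of that loading-bar scheme in JavaScript. The element IDs ('loader', 'content'), the engine list, and the exact delay are my own assumptions for illustration; adapt them to your own markup.

// Minimal sketch of the delay-overlay scheme described above.
// Engine names are stored with every character code shifted down by
// one ('fnnfkd' decodes to 'google') so they never appear literally
// in the page source.
var shifted = ['fnnfkd', 'x`gnn', 'ahmf']; // google, yahoo, bing

function decode(s) {
  var out = '';
  for (var i = 0; i < s.length; i++) {
    out += String.fromCharCode(s.charCodeAt(i) + 1);
  }
  return out;
}

function fromSearchEngine() {
  var ref = document.referrer.toLowerCase();
  for (var i = 0; i < shifted.length; i++) {
    if (ref.indexOf(decode(shifted[i])) !== -1) return true;
  }
  return false;
}

window.onload = function () {
  var loader = document.getElementById('loader');   // the progress bar div
  var content = document.getElementById('content'); // the hidden main div
  if (!fromSearchEngine()) {
    content.style.display = 'block'; // direct visitors see the page at once
    return;
  }
  loader.style.display = 'block';
  setTimeout(function () {
    loader.style.display = 'none';
    content.style.display = 'block';
  }, 5000); // the 5 seconds mentioned above
};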
 
So let's say Google does consider the bounce rate and/or the time people spend on your site. Does this mean that sending thousands of redirected visitors to our sites (say, via old domain redirects) will adversely affect our SERPs?

And yes, I know there are many other factors :) But in my humble opinion, just a little advantage can mean a significant jump.

EDIT:
IPOPBB - JUST RE-READ YOUR ANSWER AND I SEE WHAT YOU MEAN NOW:

They aren't looking at the referrer URL, so schemes that redirect to a different page won't help. The most popular scheme for adjusting bounce rates is to show a nice, fast progress bar for 5 seconds that says the page is loading. Do this in JavaScript, then unhide your main content div.

The nicer your loading screen looks and the faster its animation runs, the less likely people are to mind it. Only display it if the referrer is from Google or another search engine, and use a cipher like (ASCII character value - 1) on your strings so Googlebot and the others don't see their names in the JavaScript source.
 
It is POSSIBLE that Google considers bounce rate as one of the many factors that determine your ranking in the SERPs. They are all for user experience, so it is highly possible that they design their algo to drop sites with a high bounce rate from the SERPs (a high bounce rate tells them a site is garbage and that the majority of users searching for that particular keyword are not finding the content they are looking for on it).

They could easily compare the bounce rate across the top 10 listings after X amount of visitors to those sites and adjust their rankings on the fly, pushing the sites with low bounce rates higher in the SERPs and, likewise, penalizing those with high BR. Hence the Google Dance, along with many other unknown factors? God knows.
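As a thought experiment, that kind of on-the-fly re-ranking could look like the sketch below. Everything in it is an assumption for illustration (the field names, the 0.5 penalty weight, the idea itself); nothing here reflects Google's actual algorithm.

// Toy model of the bounce-rate re-ranking speculated above. The field
// names and the 0.5 penalty weight are arbitrary illustrations.
function rerank(results) {
  // results: [{url: '...', baseScore: 0.9, bounceRate: 0.8}, ...]
  return results.slice().sort(function (a, b) {
    var scoreA = a.baseScore - 0.5 * a.bounceRate;
    var scoreB = b.baseScore - 0.5 * b.bounceRate;
    return scoreB - scoreA; // highest adjusted score first
  });
}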

Their #1 concern has always been user experience. They would do anything to give a better user experience. The way I look at it, there is no reason why they wouldn't implement this in their SE, considering that they have plenty of data on our websites.

Is it possible for them to know the bounce rate if we didn't install GA? Maybe.

However, I have plenty of thin MFA websites with very high bounce rates (80%+), but my rankings have stayed strong at the top spots for months. Maybe the algo hasn't caught up with these websites since they target non-competitive terms, and it might require more time to gather statistically significant data? Who knows.

I'm pushing my luck with these low-quality MFA sites, but I can easily make another one and rank it for another term that can last me a good 6 months. If you're concerned with ranking your site well and maintaining its top spots for the long haul, I'd install GA on the site and make amendments according to the statistics, just as much as Google would. Either you make the changes to your sites for the users, or Google will.

Of course, these are all speculations. There are no right or wrong answers to the question. Just go along with Google, provide a better user experience, and you will be fine.
 
Put up some decent content in a clean layout and it's not an issue.

Google does check to see if you click on a search result link and then come right back to the search results. How much this determines rank is unknown, but there are over 200 "signals" Google uses to determine rank. This is also a factor in AdSense quality score, so if people only spend a second on your page, use the browser's back button, and click on another result, Google will take that into consideration.
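Worth noting: tracking that click-and-return pattern doesn't require Analytics, because the results page itself is under Google's control. As a generic illustration of the pattern (not Google's actual markup or code), a results page can rewrite a link's target on mousedown so the click passes through a logging redirect; the /log endpoint and the data-* attributes below are hypothetical.

// Illustration of result-click tracking: rewrite the link on mousedown
// so the click routes through a logging redirect. The /log endpoint and
// the data-* attributes are hypothetical, not Google's actual markup.
var links = document.getElementsByClassName('result');
for (var i = 0; i < links.length; i++) {
  (function (a) {
    a.onmousedown = function () {
      a.href = '/log?rank=' + a.getAttribute('data-rank') +
               '&url=' + encodeURIComponent(a.getAttribute('data-url'));
    };
  })(links[i]);
}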
 
Really interesting. I was under the impression that Google didn't take traffic into consideration at all.

I wasn't talking about traffic. My guess is that traffic also has a minor effect on ranking. (That's why you will see sites with good Alexa rankings rated higher.)

What I was talking about is surfer behavior. That is not a guess; Google has already mentioned it in their patent.
Page selection rate (CTR), time spent on a page, and the number of users bookmarking your page (in their browser, not on bookmarking sites) are considered when deciding rankings.

Do a search for the Google patent and read it.
 
All the major search engines have to use universally spiderable values to determine search engine position; bounce rate is not one of those, because it can't be remotely spidered.

I'm one of those people now gradually removing my sites from Webmaster Tools and Analytics, specifically because I don't want Google to know too much. There are other ways of getting the same info that remain entirely under my control.
 
All the major search engines have to use universally spiderable values to determine search engine position; bounce rate is not one of those, because it can't be remotely spidered.

I'm one of those people now gradually removing my sites from Webmaster Tools and Analytics, specifically because I don't want Google to know too much. There are other ways of getting the same info that remain entirely under my control.

The engines can't monitor bounce rates on your site, but they do know if someone clicks on a search result link and then comes right back to the search results using the back button, and this can be easily tracked.
 
So let's say Google does consider the bounce rate and/or the time people spend on your site. Does this mean that sending thousands of redirected visitors to our sites (say, via old domain redirects) will adversely affect our SERPs?

EDIT: I was going to say that it did, but Blackhat soda is right... they wouldn't have a way to monitor it unless you loaded their monitoring software on your server or pointed Google Analytics at your site.
 
Interesting... my SERP position actually got higher after I bought redirected (low-quality) traffic to my site, which obviously increased my bounce percentage. I am using Analytics tracking. Of course, at the same time I was doing directory submissions, social bookmarking, etc., so I was getting a nice amount of backlinks as well.

What I am trying to say is that even if this were a factor, building up relevant backlinks is still much more effective than the counter-effect of low-quality traffic.

Of course, this was based around the bounce rate factor. I consider my content to be good quality.
 
I think Google looks at bounce rates, not traffic. [...] Only display it if the referrer is from Google or another search engine, and use a cipher like (ASCII character value - 1) on your strings so Googlebot and the others don't see their names in the JavaScript source.

I don't know what you are talking about. Google, Google, Google, blah, blah, blah. Do you guys have all your sites registered with Google Analytics? While doing blackhat? Very intelligent.
 