Discussion in 'AI - Artificial Intelligence in Digital Marketing' started by yitaoz, Feb 7, 2017.
You can't A/B test a single URL: Google will only index one version of the content at a URL, which means you would have to run an "A then B" test and compare the results across different time intervals and potentially different visitor groups. On top of that, Google's ability to recrawl a page, detect the changes, and process them to the point of altering the SERP can take up to 22 days (the longest I've ever measured). It's usually faster than that, but there is no service-level agreement on how long Google can take to process your A vs. B changes, and even if you publish them at the same time there is no guarantee they will be processed at the same time.
I like the idea of automated testing, but I think A/B is exactly the wrong tool for SEO. So I would classify this as a bold claim that is poorly thought out.
Distilled have launched a tool for this.
You don't split test the same URL: you take 1,000 URLs and make one SEO change to 500 of them, then compare the results against the other 500.
If the change has a positive effect, apply it to all 1,000 URLs and try something else.
Not only that, but Tom Anthony at SearchLove showed us a bot he built that predicts which of two URLs will rank higher based on 12 simple on-page metrics. It was correct 75% of the time.
It used machine learning to get to that point.
He then fed it real pages versus hypothetical changes to the same page, to see whether the bot would rank the new page higher or lower than the current one.
When the bot said the new version would rank higher, he published those hypothetical pages, and he was ranking pages higher in SERPs on the advice of his algorithm.
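The details of that bot weren't published here, but a toy version of the same idea (a pairwise ranker: given two pages' metrics, predict which outranks the other) can be sketched with logistic regression on the feature difference. Everything below is hypothetical: the 12 metrics are anonymous numbers, and the "training data" is simulated from a hidden linear scoring function.

```python
import math
import random

N_FEATURES = 12  # e.g. 12 simple on-page metrics (names hypothetical)

def predict(weights, feat_a, feat_b):
    """Probability that page A outranks page B."""
    z = sum(w * (a - b) for w, a, b in zip(weights, feat_a, feat_b))
    z = max(-30.0, min(30.0, z))  # clamp to avoid overflow in exp()
    return 1.0 / (1.0 + math.exp(-z))

def train(pairs, labels, epochs=200, lr=0.1):
    """pairs: list of (feat_a, feat_b); labels: 1 if A ranked higher.
    Plain SGD on the logistic loss over the feature differences."""
    weights = [0.0] * N_FEATURES
    for _ in range(epochs):
        for (fa, fb), y in zip(pairs, labels):
            grad = predict(weights, fa, fb) - y
            for i in range(N_FEATURES):
                weights[i] -= lr * grad * (fa[i] - fb[i])
    return weights

# Simulated data: each page's "true" rank score is a hidden linear
# function of its metrics, plus noise.
rng = random.Random(0)
true_w = [rng.uniform(-1, 1) for _ in range(N_FEATURES)]
def make_page():
    return [rng.uniform(0, 1) for _ in range(N_FEATURES)]
def score(f):
    return sum(w * x for w, x in zip(true_w, f)) + rng.gauss(0, 0.1)

pairs, labels = [], []
for _ in range(500):
    a, b = make_page(), make_page()
    pairs.append((a, b))
    labels.append(1 if score(a) > score(b) else 0)

w = train(pairs, labels)
accuracy = sum(
    (predict(w, a, b) > 0.5) == bool(y) for (a, b), y in zip(pairs, labels)
) / len(labels)
```

On data like this, a linear pairwise model recovers the hidden weights well; a real system would of course need real metrics, real SERP outcomes, and held-out evaluation.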
The future of on-page is 100% algo/robot based. Maybe even technical SEO too...
You will probably see humans move more into content marketing/outreach than on-site changes.
This was super informative. Thanks!
It does seem like A/B testing for on-page changes would only work on sites with a lot of pages (otherwise you won't get statistical significance between your test buckets). I'm guessing there's no real way to A/B test SEO if you only have a landing page; at most you could A/B test conversion with Optimizely/VWO.
You're correct: if you only have a few pages then you're going to have issues. But if you launch a service like the one you linked, with machine-learning algorithms fed data from thousands of websites (customers), it won't be long before it's able to spit out SEO recommendations for single-page sites based on the data it's gathered elsewhere.
I think the split-testing product they have launched will eventually turn into this; I would be surprised if they are not already using the data for it.
This is an interesting article on Machine Learning:
A computer was better at diagnosing cancer than doctors after indexing data from a tiny data set.
If a computer can do something as complicated as cancer diagnosis, SEO is a piece of cake.
Interesting; it looks like a ton of NLP (which I suppose Watson is amazing at after winning Jeopardy!) coupled with lots of regression.
SEO would be a piece of cake, and probably catching stuff like PBNs too. Once NLP is mature enough, Google could just write an algorithm that examines all the referring domains (RDs) for a site and checks for subject-matter coherence across those referring sites. P(ublic)BNs posting all kinds of articles depending on who's paying would probably be really obvious.
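The coherence check described above can be sketched very crudely: represent each of a referring site's articles as a bag of words and take the average pairwise cosine similarity. A topically focused site scores high; a pay-to-post site writing about anything scores low. This is a toy illustration (real NLP would use far better representations); the sample articles are made up.

```python
import math
from collections import Counter

def cosine(c1, c2):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(c1[w] * c2[w] for w in set(c1) & set(c2))
    norm1 = math.sqrt(sum(v * v for v in c1.values()))
    norm2 = math.sqrt(sum(v * v for v in c2.values()))
    if norm1 == 0 or norm2 == 0:
        return 0.0
    return dot / (norm1 * norm2)

def coherence(articles):
    """Average pairwise similarity of a site's articles."""
    vecs = [Counter(a.lower().split()) for a in articles]
    sims = [cosine(vecs[i], vecs[j])
            for i in range(len(vecs)) for j in range(i + 1, len(vecs))]
    return sum(sims) / len(sims) if sims else 0.0

# Hypothetical article titles from two referring sites.
focused = ["golf swing tips for beginners",
           "best golf clubs for a slow swing",
           "how to fix your golf slice"]
scattered = ["best golf clubs for beginners",
             "payday loans with no credit check",
             "cheap hotels in las vegas"]
```

A site-level score like `coherence(scattered)` falling far below `coherence(focused)` is exactly the kind of signal that would make a link network stand out.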
And if on-page goes to the machines, then perhaps the only real way to compete would be to build some really quality PBNs haha
*cough* Rank Brain *cough*
Can you please teach me how to do SEO?
I thought RankBrain was more for interpreting the "meaning" behind ambiguous search queries through better language processing.
But for sure they could turn it on ferreting out PBNs :/
All Google needs to do is blacklist sites using their DNS. That would effectively put this company out of business.
RankScience is good for large enterprise or commerce sites, where little A/B changes applied to tens of thousands of individual pages add up to a huge impact. For affiliates looking to generate XX,XXX per month: 1. you can A/B test things more or less yourself; 2. there are probably many more effective SEO tasks you should be focused on than simply running split tests.
Very interesting. SEO is all about data, logic, and analytics, and all three can be automated. But it's a bold claim that they can boost performance by 37%! I dare them to try that with a professional ecommerce organization; I bet the results, if any, won't be anywhere near that percentage.
Our upcoming SaaS runs on machine learning and BILLIONS of data points across over 15,000 sites. We document and measure, in real time, changes made to millions of web pages and how those changes correlate with changes in ranking. This data then drives our "on-page" optimization plugin.
That effectively allows us to make changes in real time based on what is actually working in the SERPs and adjust on the fly for any algo updates (if necessary). The cool thing is we can do this no matter how many pages a client has, so we can update thousands of pages that would otherwise take the site owner hundreds of hours to change manually.
P.S. We'll likely have a special launch for BHW users and a free/lite version as well