Facebook Cloaking Methodology

innovativecr

Newbie
May 19, 2014
Hi all,

I am fairly familiar with device, IP/proxy, user-agent, and browser detection techniques, and I'm curious how people maintain a list of Facebook footprints.

Is it a matter of churning out a bunch of ads, monitoring the first one or two visitors to the landing environment, and then storing those footprints as non-pass-through?

Thanks in advance,

inno
 
What you describe will help you collect FB bots, but collecting FB bots is easy; they advertise their IP ranges on their website here:
https://developers.facebook.com/docs/opengraph/howtos/maximizing-distribution-media-content
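
For the crawler ranges themselves, a minimal sketch of a range check might look like the following. It assumes the `whois` CLI is installed and that Facebook's crawler traffic originates from its ASN (AS32934), which is what their crawler documentation points to; the example address is illustrative only.

```python
import ipaddress
import subprocess

def fetch_facebook_ranges():
    """Pull the route prefixes that RADb lists for Facebook's ASN (AS32934)."""
    out = subprocess.run(
        ["whois", "-h", "whois.radb.net", "--", "-i origin AS32934"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [
        ipaddress.ip_network(line.split()[1])
        for line in out.splitlines()
        if line.startswith(("route:", "route6:"))
    ]

def is_facebook_ip(ip, ranges):
    """True if the visitor IP falls inside any published Facebook prefix."""
    addr = ipaddress.ip_address(ip)
    return any(addr.version == net.version and addr in net for net in ranges)

ranges = fetch_facebook_ranges()
print(is_facebook_ip("31.13.64.1", ranges))  # example address, expected to sit inside a Facebook prefix
```

This only identifies Facebook's own infrastructure, which is exactly the point being made below: that is the easy part.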

The IPs you really need to collect are those of the third-party services they work with (security companies, etc.) and the manual reviewers' IPs. The only way to collect them is to gather a huge data set from numerous active campaigns, across different periods and from all over the world, and cross-reference the data between them. So creating one ad and waiting for a manual reviewer or some third-party crawler to visit isn't really going to work.

Besides, gathering IPs is the least of your worries: IPs change, and they also use dynamic IPs, which makes them hard to track or block (legitimate users are on those same IPs, so blocking them means losing traffic). There are other ways to differentiate good traffic from bad, and it isn't at the IP level (or the UA string, host, referrer, or any of the other headers you collect).
 
One company, when reviewing the quality of its paid Facebook ad traffic, simply looked at what percentage of incoming visitors had JavaScript enabled. Normally, less than about 5% of the company's overall traffic had JavaScript disabled, but upwards of 70% of its paid Facebook traffic had it disabled. Considering how JavaScript-heavy Facebook itself is, it's safe to say these were likely not real users but bots.
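
One rough way to measure that JS-enabled share yourself is to log a server-side pageview on every hit and fire a separate beacon from JavaScript; the gap between the two counts approximates the JS-disabled share. A minimal sketch using Flask (the route names and in-memory counters are assumptions, not anyone's actual setup):

```python
from flask import Flask

app = Flask(__name__)
counts = {"pageviews": 0, "js_beacons": 0}  # in-memory for the sketch; use real storage in practice

@app.route("/landing")
def landing():
    # Every hit, bot or human, increments the server-side pageview count.
    counts["pageviews"] += 1
    # The inline script only runs if the client actually executes JavaScript.
    return '<html><body>...<script>fetch("/beacon");</script></body></html>'

@app.route("/beacon")
def beacon():
    # Only JS-capable clients ever reach this endpoint.
    counts["js_beacons"] += 1
    return "", 204

@app.route("/stats")
def stats():
    total = counts["pageviews"] or 1
    return {
        "pageviews": counts["pageviews"],
        "js_enabled": counts["js_beacons"],
        "js_disabled_share": round(1 - counts["js_beacons"] / total, 3),
    }
```

Comparing the `/stats` figures for Facebook-referred traffic against your baseline traffic reproduces the comparison described above.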
 
Again, this method will only differentiate bots from users, which is the easy part (and some bots, such as Google's, do have JS enabled). Manual reviewers obviously have JS enabled too, so it isn't going to solve the problem of detecting them.
 
It's very difficult to predict what Facebook will do. When you submit an ad, FB of course hits the link to see what you're submitting and to generate a preview. But they also check the link later on, from simple IP ranges and/or with specific user agents, browsers, or internet providers. If you cloak too many internet providers, you'll end up cloaking too many real people. It's difficult to stay cloaked because, as mentioned above, they use proxies and/or dynamic IPs.
 