
[Guide] The Bottom Line of Analysis of Links Data

Discussion in 'Black Hat SEO' started by Lilacor, May 22, 2013.

  1. Lilacor

    Lilacor Regular Member

    Joined:
    Feb 22, 2013
    Messages:
    206
    Likes Received:
    91
    Occupation:
    SEO expert at SeoFen.com
    Home Page:
    Hi all,

    I just wrote an article on link analysis and decided to share it with you. In the post-Penguin 3 and pre-Penguin 2.0 world, with all the unnatural link warnings on the way, it's more and more important to know how to analyze your site's backlink profile yourself. The article below is my attempt to put all the info together and give everyone here some (hopefully) useful tips on how to analyze link data. I hope you will find it an interesting read and will build good backlinks and remove the bad ones much more easily. Let's hope that Penguin 2.0 will fix everything that's left to fix out there :)

    ---

    Link data reigns supreme in the online space: backlinks are precious for online findability, and a list of backlinks generated for a given website paints a clear picture of the website's online profile, so webmasters can figure out what more has to be done. Webmaster tools are indispensable for drawing up charts and making comparisons with the competition. For those who have yet to plunge into the link data pool, this article should be useful: it deals with tips on how to obtain such data, how to perform the analysis, and how to identify existing problems.
    The Manner in Which Data Can Be Obtained

    Here is a list of tools which can be used to obtain data, in a simple and straightforward manner:

    The first two tools help give an idea of how a website's backlinks look in the eyes of the major search engines. For analysis purposes, more robust tools should be used, to help derive all the information needed to analyze link profiles.
    The Manner in Which Data Should Be Analyzed

    When choosing a link analysis tool, make certain that the tool selected provides the areas of information listed below:

    • The URL linking to the website.
    • The page on the website that the URL links to.
    • The anchor text used.
    • The type of link (follow or nofollow).

    It can also be helpful if the tool reports one more link aspect, whether the link is an image link or a text link; but it is not a problem if the tool does not provide that information.
    Below are the steps to pursue when examining a backlink profile.
    1. Checking the Anchor Percentages, Such as Brand/URL, Compound, Money, etc.

    The data to search for comprise:

    • Brand or URL anchors. These use brand names or URLs as anchor text (example: "Search Engine Watch" and "searchenginewatch.com").
    • Money anchors. These are the terms that a webmaster or their client wants to rank for (example: "blue widgets").
    • Compound anchors. These are anchor combinations (example: "blue widgets from Search Engine Watch", which combines a money anchor and a brand anchor).
    • Other anchors.
    To perform the check, webmasters can use handy tools such as Link Research Tools, which identifies each anchor by category and also shows the percentages. Webmasters who decide to do the work on their own may face a whopping amount of it with large profiles.
    Anchors need to be matched to categories so that the percentage for each anchor type can be obtained. Majestic is a tool that displays the results as a pie chart, with the anchors and percentages shown. Before selecting a tool for the check, webmasters should find out whether the data that the tool yields is enough for their purposes. If there are no more than 100 backlinks, and most of them are similar, it may not be necessary to assign each anchor to a category.
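    For webmasters who prefer to script the categorization themselves, here is a minimal Python sketch. The brand and money term lists are placeholders to swap for your own, and the substring matching is deliberately naive:

    ```python
    from collections import Counter

    # Placeholder term lists, substitute your own brand and money phrases.
    BRAND_TERMS = {"search engine watch", "searchenginewatch.com"}
    MONEY_TERMS = {"blue widgets"}

    def categorize(anchor):
        """Assign a single anchor text to brand/url, money, compound, or other."""
        a = anchor.lower()
        is_brand = any(term in a for term in BRAND_TERMS)
        is_money = any(term in a for term in MONEY_TERMS)
        if is_brand and is_money:
            return "compound"
        if is_brand:
            return "brand/url"
        if is_money:
            return "money"
        return "other"

    def anchor_percentages(anchors):
        """Return the percentage of anchors falling into each category."""
        counts = Counter(categorize(a) for a in anchors)
        return {cat: round(100.0 * n / len(anchors), 1) for cat, n in counts.items()}

    anchors = ["Search Engine Watch", "blue widgets",
               "blue widgets from Search Engine Watch", "click here"]
    print(anchor_percentages(anchors))
    # each category appears once here, so every share is 25.0
    ```

    With a real export you would feed in the full anchor column from your tool's CSV instead of the sample list.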
    The information obtained from this check helps identify any suspicious areas that should be inspected in more detail. Here the usefulness of competitive analysis is clear. For example, a website with 65 percent money anchors, 25 percent brand/URL anchors, and 10 percent other anchors will appear to rely heavily on money anchors. Competitive analysis may reveal that competitors have even larger money anchor percentages, so the website being inspected is not in a bad position compared to the competition. Vice versa, if the website has as little as 10 percent money anchors and the entire competition has a much larger share, then the site should strive to increase its money anchor percentage accordingly.
    It should be noted, however, that there is no hard and fast standard for the percentages discussed above; the specific industry can play an important part, and brands can also influence the needed percentage. The percentages discussed here should only be used as estimates.
    2. Checking the Ratio between Home Page Links and Deep Page Links

    As discussed above, competitive analysis is helpful for seeing how the industry standard is set. For example, a website's home page may have a scant number of links while its subpages enjoy enormous numbers of quality links, and the competitors may display the same pattern. There can also be a home page gathering almost all of the links, with hardly any left for the subpages. Whether a specific linking pattern is natural or not depends on the picture in that branch of industry, so the ratios for the website should be compared to those of the other websites in the niche, to see if they conform to similar patterns.
    Checking such patterns is easy: compare the list of all linked-to pages of the website to the list of links pointing at the home page. A tool such as Link Research Tools can yield the deep link ratio, but webmasters can also calculate it themselves.
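    Doing the calculation by hand can be as simple as counting how many backlinks point anywhere other than the home page. A rough Python sketch, assuming you have exported your backlinks as (source, target) pairs with made-up example URLs:

    ```python
    from urllib.parse import urlparse

    def deep_link_ratio(backlinks):
        """backlinks: list of (source_url, target_url) pairs from a backlink export.
        A target counts as 'deep' when its path goes beyond the home page."""
        deep = sum(1 for _, target in backlinks
                   if urlparse(target).path not in ("", "/"))
        return deep / len(backlinks)

    links = [("http://blog-a.com/roundup", "http://example.com/"),
             ("http://blog-b.com/review", "http://example.com/widgets/blue")]
    print(deep_link_ratio(links))  # 0.5, half the links go to deep pages
    ```

    Compare the resulting ratio against the same calculation run on competitors' backlink exports.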
    After the results have been obtained, it is time to think about what should be done for improvement. If a website has fewer quality deep links than the other websites in the niche, that is where hard work should start. Vice versa, if the analysis shows that the website follows a pattern similar to that of competitors, the route pursued is the right one.
    3. Performance of Geolocation Analysis

    Geolocation analysis is also essential for identifying potential hassles. It may happen that a website ships products to customers in its own country only, while the predominant share of its links comes from websites outside the country, in other languages. The linking websites may even have a decent reputation, but the links are still spammy. So geolocation can help show where spammy links are being built to achieve artificial rankings. The tools used can provide information on the top-level domains, the hosting location, the popularity by country, etc. Webmasters can check the capabilities of Majestic, which offers a great map.
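    A very rough do-it-yourself proxy for this is to break the linking pages down by top-level domain. The sketch below only looks at TLDs, so unlike the tools mentioned it says nothing about hosting location, and the example URLs are made up:

    ```python
    from collections import Counter
    from urllib.parse import urlparse

    def tld_breakdown(linking_urls):
        """Count linking pages by the top-level domain of their host."""
        tlds = Counter(urlparse(u).hostname.rsplit(".", 1)[-1] for u in linking_urls)
        return dict(tlds.most_common())

    urls = ["http://blog.beispiel.de/post", "http://primer.ru/",
            "http://shop.beispiel.de/"]
    print(tld_breakdown(urls))  # {'de': 2, 'ru': 1}
    ```

    If a site that only serves, say, UK customers sees most of its link profile under unrelated country TLDs, that is exactly the kind of suspicious area the article describes.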
    The information yielded by the tool can show up any suspicious areas. If a website hosted in one country gets the majority of its links from sites in other countries, that is worth delving into more deeply. It is fine to get links from a country different from the hosting country, provided the website targets customers there too.
    To sum up, any indication of unnatural developments should raise a red flag. The geolocation information is especially precious for websites targeting users from specific countries.
    4. Checking the Sitewide Links

    Sitewide links should be checked. They can be fine, such as links in relevant blogrolls, but there may be spammy sitewide links too. Common examples are footers stuffed with barely relevant links, and blogrolls with large numbers of links to payday loans and other sources unrelated to the specific niche.
    What can pose difficulties is that the relevance of sitewide links is hard to establish with a tool or an algorithm. People may be happy to have a whopping link profile with zillions of links, whereas in fact those links may not be relevant quality links, since they may come from only a handful of domains rather than hundreds of different ones. Here is a pie chart for illustration.
    No matter what tool webmasters use, they can calculate the numbers on their own by taking the number of referring domains and dividing it by the number of backlinks the site has. If the calculated share of sitewide links turns out to be too high, it is time to take action and start aiming for non-sitewide links, to establish some balance.
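    That division is easy to script. In the sketch below (made-up URLs), a ratio close to 1.0 means almost every link comes from its own domain, while a very low ratio hints that sitewides dominate:

    ```python
    from urllib.parse import urlparse

    def domains_to_backlinks_ratio(backlink_sources):
        """Number of unique referring domains divided by total backlinks."""
        domains = {urlparse(u).hostname for u in backlink_sources}
        return len(domains) / len(backlink_sources)

    sources = ["http://blogroll-site.com/page1", "http://blogroll-site.com/page2",
               "http://blogroll-site.com/page3", "http://other-blog.com/post"]
    print(domains_to_backlinks_ratio(sources))  # 0.5, two domains over four links
    ```

    As with the anchor percentages, there is no magic threshold; the number is most useful when compared against competitors in the same niche.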
    There are no hard and fast standards for the sitewide links percentage either, but a common rule of thumb is that if sitewides account for the largest share of a website's links, a problem may be waiting to happen in the near future. A vast number of sites have suffered penalties owing to spammy amounts of sitewide links acquired through different blogs.
    No machine can ascertain whether links are good or bad, but websites that have most of their links as sitewides, and get punished, are going to struggle with problems similar to those faced by site owners who got their links through blog networks.
    Webmasters who use two or more tools for their analysis should keep in mind that each tool may use its own database or obtain information from different sources, so it should not be surprising that the numbers yielded by different tools do not tally exactly. In all cases it is best to choose one tool and keep using it, to get a consistent picture of the trend.
    Pinpointing Problems

    When setting out to identify problems, note that tools such as Link Detox or the Link Risk tool can efficiently display a list of potentially hazardous links, but because the data yielded is not a hundred percent dependable, it is worth examining those links before removing them. In Link Risk, for example, webmasters can upload a list of links they wish to examine; they can get a CSV file from another tool, such as Majestic. As for Link Detox, all that is needed is to enter the URL.
    The potential hazards to be on the lookout for belong to the following groups:

    • An excessively high anchor percentage for money-related phrases.
    • An excessively high percentage of sitewide links.
    • An excessive number of links from a geographic area far outside the website's country range, or links from websites in foreign languages.
    • An excessive number of links from sites that are not indexed by Google.
    • An excessive number of links from networked sites.
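    Those checks can be combined into a simple first-pass filter. The thresholds in this sketch are illustrative guesses only; as stressed above, there are no hard standards, so treat anything flagged as a candidate for manual review, not automatic removal:

    ```python
    def flag_risks(profile):
        """profile: dict of aggregate percentages for a site's link profile.
        Returns human-readable warnings; thresholds are illustrative only."""
        flags = []
        if profile.get("money_anchor_pct", 0) > 50:
            flags.append("money anchor percentage looks excessive")
        if profile.get("sitewide_pct", 0) > 50:
            flags.append("sitewide links dominate the profile")
        if profile.get("foreign_link_pct", 0) > 50:
            flags.append("most links come from outside the target country")
        if profile.get("unindexed_link_pct", 0) > 20:
            flags.append("many links come from pages not indexed by Google")
        return flags

    print(flag_risks({"money_anchor_pct": 65, "sitewide_pct": 10,
                      "foreign_link_pct": 5, "unindexed_link_pct": 2}))
    # ['money anchor percentage looks excessive']
    ```

    The inputs here are the same aggregate numbers the earlier steps produce (anchor percentages, sitewide share, geolocation breakdown), so the filter slots in after those checks.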
    According to some webmasters, the information supplied by research tools is not wholly reliable. It greatly helps if webmasters have an exact idea of what they plan to do, and if they think over the information they get. Hasty removal of links without scrutiny is not a wise policy.
    Both tools mentioned above are truly efficient at narrowing the scope of possible problem areas, but afterwards it is recommended to manually review all the suspicious links. Sometimes a tool can flag a perfectly good link as suspicious, even though in reality it does a great job attracting traffic. So tools cannot be entirely depended on to judge how beneficial a link is: they rely on the metrics available and base their conclusions on them. These conclusions are generally very accurate, but sometimes they are not reliable enough.
    So when a research tool identifies links as hazardous, and a manual check confirms that these links are indeed of no benefit, it is worth going on to remove them. Cleaning up is a boring task, but even when a site has few links, if most of them are suspicious and worthless, it is truly better to remove them.
    Nowadays Link Analysis is Easy to Perform

    Although the amount of data on website links is considerable, the load can be handled much more easily with the tools available. These tools keep improving, and although the amount of reporting they yield is often overwhelming, staying calm helps you handle all link analysis tasks.

    Source: http://seofen.com/the-bottom-line-of-analysis-of-links-data/
     
  2. Natxo

    Natxo Newbie

    Joined:
    Mar 24, 2011
    Messages:
    14
    Likes Received:
    7
    There's definitely some critical info here that we should all be aware of in these days of G's never-ending animal updates. Thanks!!
     
  3. Lilacor

    Lilacor Regular Member

    Yeah, lots of things to consider, hope you'll like the suggestions given :)
     
  4. judif414

    judif414 Regular Member

    Joined:
    Feb 25, 2013
    Messages:
    488
    Likes Received:
    438
    Damn, that must've taken a while to type! This is great information, thanks!
     
  5. Lilacor

    Lilacor Regular Member

    Thanks judi :)

    It's even more important with Penguin 2.0 on the way!

    Cheers ;)