
Several SEO questions from a newbie

Discussion in 'White Hat SEO' started by RoyCooper, Mar 2, 2017.

  1. RoyCooper

    RoyCooper Registered Member

    Joined:
    Mar 2, 2017
    Messages:
    76
    Likes Received:
    4
    Hi everyone!

    I have several questions and I was hoping you could help me out with them.
    Right now I’m focusing on website building in WordPress, keyword research, and on-page optimization.

    1) Regarding domain names, which is better for SEO: www or a naked domain?

    2) Are there any good SEO toolbars for Firefox? The Ahrefs one seems outdated and no longer working, the Moz one has been removed, and I’ve read that SEOquake isn’t accurate.

    3) Is link-assistant software any good?

    4) How can you start a website on HTTPS right away without going through an HTTP-to-HTTPS redirect? That redirect adds a second or so to page loading time. Can we bypass HTTP altogether and just go for HTTPS?

    5) Do you always have to tell Google Search Console that you have HTTP, HTTPS, www and non-www versions and then specify which one you prefer, to avoid duplicate content?

    6) I use Keywords Everywhere and Ubersuggest for keyword search volumes. Any other good free options? Please don’t include Google Keyword Planner.

    7) For keyword difficulty I’m using the free tiers of KWfinder, Moz Keyword Explorer, Keysearch dot co and Keyword Revealer. They all seem to give the same sort of keyword difficulty scores, so I’m guessing they use similar algorithms? I also use SEMRush, but its results are very different and always rated more difficult. Any other suggestions? Or maybe these tools aren’t the way to go at all, as I’ve already seen written, and I need to do it manually? Whatever that means exactly...

    8) In Yoast SEO for WordPress - can you add more than one keyword to check density, or is it always manual, one at a time?

    Thanks!
     
  2. validseo

    validseo Jr. VIP Premium Member

    Joined:
    Jul 17, 2013
    Messages:
    906
    Likes Received:
    521
    Occupation:
    Professional SEO
    Location:
    Seattle, Wa
    No correlation with rank position either way. Shorter is better simply because visitors have less to type and less to read in your URLs.


    I never needed them to rank so I very rarely used them.

    It is decent if you only have a small number of sites. I manage over 30 sites, many of which overlap on the same keywords, so you have to use multiple projects and query the same keywords multiple times. Their rank tracker is limited to 10 competitors, so within one project you can only track 11 sites' worth of rankings for the same keyword list... that is a very restrictive limitation.

    Their proxy support and captcha handling are very good, so if your needs are small enough it is a very nice tool.

    I also discovered years ago that if you track SEO revenue monthly in Google Analytics, you hardly ever need to track rankings. If revenue is holding strong or growing, there isn't much need to deep dive. I only have to investigate downturns, which are very infrequent... maybe once or twice a year. This frees up a lot of my time to focus on doing SEO instead of spending so much time measuring SEO. Revenue is a much better KPI and has better bragging rights than rankings anyway.

    SEO revenue is a better gauge of the overall health of your SEO too... rankings alone will fail to alert you to many serious SEO issues.

    You can do both. You only have to redirect if you want to force HTTP visitors over to HTTPS. The way you start is to verify the HTTPS version in GSC and set your HTTPS preference there. Additionally, canonical to your HTTPS version and force your navigation to use HTTPS, then update your sitemaps, RSS feeds, and product feeds to use HTTPS and resubmit them.

    Even with all that, someone can still type in the non-secure URL, so you will still see it sometimes.
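
    For illustration, here is a rough Python sketch that checks both pieces: that the plain-HTTP URL returns a permanent redirect to HTTPS, and that the HTTPS page declares an HTTPS canonical. It assumes the requests library is installed, example.com is just a placeholder, and the regex is a crude stand-in for a proper HTML parser, so treat it as a sanity check rather than anything definitive.

        import re
        import requests

        DOMAIN = "example.com"  # placeholder domain, swap in your own

        # 1) Does the plain-HTTP URL answer with a permanent redirect to HTTPS?
        resp = requests.get(f"http://{DOMAIN}/", allow_redirects=False, timeout=10)
        print("HTTP status:  ", resp.status_code)              # ideally 301 or 308
        print("Redirects to: ", resp.headers.get("Location"))  # ideally https://...

        # 2) Does the HTTPS page declare an HTTPS canonical?
        page = requests.get(f"https://{DOMAIN}/", timeout=10)
        match = re.search(
            r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)',
            page.text,
            re.IGNORECASE,
        )
        print("Canonical tag:", match.group(1) if match else "none found")

    If both checks come back clean, HTTP visitors get pushed onto HTTPS and Google gets a consistent canonical signal.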

    You should if you want Google to present it properly, but it isn't required. It also has nothing to do with duplicate content. Setting your canonical URL should solve the duplicate content issue for both the www and HTTPS variants.

    Google Trends is the most practical for me... it doesn't suggest but the information is very valuable.
    AnswerThePublic is cool too.

    Meh... Using metrics that Google doesn't use... meh... weird way to make decisions in my opinion. I've never needed them to rank. I find them interesting but they are by no means required. You can often assess difficulty by simply looking at the competition.

    I think the best way to approach tools is to manually rank a few pages and create your "method" then pick or build tools that help your method.

    I don't know about Yoast's ability to do it, but to calculate density properly you need to find all the words Google considers relevant matches for a search. Google makes them bold in the search results. A search for "DUI lawyer LA" is a good example:
    [screenshot: Google search results for "DUI lawyer LA" with the matching terms shown in bold]

    The reason is that TF-IDF is the algorithm used for search relevance, and the term frequency part depends entirely on what Google considers a match for a search term.

    "Variations of the tf–idf weighting scheme are often used by search engines as a central tool in scoring and ranking a document's relevance given a user query"

    https://en.wikipedia.org/wiki/Tf–idf
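
    To make that concrete, here is a minimal Python sketch of the TF-IDF idea. The tiny corpus and the list of "variations" are made up purely for illustration (Google's real match list and corpus are obviously far larger), but it shows why density checks should count every variation Google bolds, not just the exact keyword you typed into Yoast.

        import math

        # Toy corpus of page copy (made-up examples, for illustration only).
        corpus = [
            "dui lawyer in los angeles offering a free consultation",
            "criminal defense attorney for drunk driving cases in los angeles",
            "best pizza places in los angeles",
        ]

        # Words Google might bold for "DUI lawyer LA" (an illustrative guess, not Google's real list).
        variations = {"dui", "lawyer", "attorney", "drunk", "driving", "la", "los", "angeles"}

        def tf(term, doc_words):
            # Term frequency: how often the term appears relative to document length.
            return doc_words.count(term) / len(doc_words)

        def idf(term, docs):
            # Inverse document frequency: terms that are rare across the corpus weigh more.
            containing = sum(1 for d in docs if term in d)
            return math.log(len(docs) / (1 + containing)) + 1

        tokenized = [doc.lower().split() for doc in corpus]

        for doc, words in zip(corpus, tokenized):
            # Score each document by summing tf-idf over every matching variation.
            score = sum(tf(t, words) * idf(t, tokenized) for t in variations if t in words)
            print(f"{score:.3f}  {doc}")

    The page about drunk driving defense scores well even though it never uses the literal phrase "DUI lawyer", which is exactly the point about counting all the bolded variations.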
     
    Last edited: Mar 2, 2017
  3. RoyCooper

    RoyCooper Registered Member

    Joined:
    Mar 2, 2017
    Messages:
    76
    Likes Received:
    4
    Thanks for the useful info.
    I felt a bit lost with what you mentioned in point 7, though... Could you please tell me what "manually rank pages" means and what methods there are to do it? Or recommend where to read about it if it's too time-consuming for you.
    Thanks!