Hi, I'm relatively new to GSA SER and ScrapeBox. What I'm doing now is:

1. Extract all the footprints for every engine in SER (2012 footprints, to be exact).
2. Scrape niche-related keywords 4 levels deep with ScrapeBox, ending up with a big list of ~200k keywords, from short tail to long tail.
3. Split the keywords into chunks of 500 per text file, since ScrapeBox doesn't handle lists of over 1 million keywords very well.
4. Load the first chunk of 500 keywords and merge it with the footprints, which results in about 1 million footprint+keyword combinations (500 x 2012). There's a rough script sketch of this split-and-merge step at the bottom of this post.
5. Harvest 1000 results per keyword from Google, Yahoo and Bing.
6. Repeat steps 4 and 5 until I'm out of keywords, then process (filter) all the harvested URLs and load them into SER so it can identify the engine for each URL.

I know I might not use all of SER's engines when link building, but wouldn't it be good to have them for the future? Say I want to do churn and burn: all the engines I wouldn't use in a normal link-building project would come in handy, so I wouldn't have to harvest again.

Is this a good approach? Anybody got some tips for me?
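In case it helps to see what I mean by the split-and-merge in steps 3 and 4, here's a rough Python sketch of that part. The file names and the exact output format are just placeholders, not my actual setup, and you could do the same thing with ScrapeBox's own merge feature:

```python
# Rough sketch of steps 3-4: split the keyword list into chunks of 500 and
# merge each chunk with the SER footprints. File names are placeholders.
from itertools import islice

CHUNK_SIZE = 500

def read_lines(path):
    # Load non-empty lines, de-duplicated while keeping their original order.
    with open(path, encoding="utf-8") as f:
        return list(dict.fromkeys(line.strip() for line in f if line.strip()))

footprints = read_lines("footprints.txt")  # ~2012 engine footprints
keywords = read_lines("keywords.txt")      # ~200k scraped keywords

it = iter(keywords)
chunk_no = 0
while True:
    chunk = list(islice(it, CHUNK_SIZE))
    if not chunk:
        break
    chunk_no += 1
    # Each output file holds footprint + keyword combos (500 x ~2012, so about
    # 1 million lines), ready to load into the harvester one file at a time.
    with open(f"merged_chunk_{chunk_no:03d}.txt", "w", encoding="utf-8") as out:
        for kw in chunk:
            for fp in footprints:
                out.write(f"{fp} {kw}\n")
```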