
Trim to root with SB good idea?

edited September 2012 in Need Help
So I have some 1,000,000+ URLs in various .txt files. I intend to import them into ScrapeBox, trim them to the root domain, remove duplicates, and check whether they are indexed (I don't care much about PR), then import the indexed, de-duped root domains into Search Engine Ranker.

Does that sound like a good idea, or would trimming someones.com/2010/some-old-post down to someones.com be a bad move? Could SER still find where to post a comment?
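For anyone unsure what the trim-to-root plus de-dupe step actually does to a list, here is a minimal Python sketch of the same operation outside ScrapeBox (the function names are my own, not ScrapeBox features):

```python
from urllib.parse import urlsplit

def trim_to_root(url):
    """Reduce a full URL to its root-domain form, e.g.
    http://someones.com/2010/some-old-post -> http://someones.com/"""
    # Assume http:// when the scheme is missing, as in a raw scraped list.
    parts = urlsplit(url if "://" in url else "http://" + url)
    return f"{parts.scheme}://{parts.netloc}/"

def dedupe_roots(urls):
    """Trim every URL to its root and drop duplicates, keeping order."""
    seen = set()
    out = []
    for u in urls:
        root = trim_to_root(u)
        if root not in seen:
            seen.add(root)
            out.append(root)
    return out
```

So a million deep URLs from the same few domains collapses to one entry per domain, which is why the list shrinks so much after this step.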

Comments

  • s4nt0s Houston, Texas
    You might want to run them through "identify and sort platforms". Running a list that big through it would probably take a long time, so you don't have to, but I know that if GSA SER can detect the platform, then technically it can post to it, provided the platform is open to receiving links.

    To see how to do it, check out the video tutorials; it's covered under the global site list and advanced options.

    You can always run a test of 1,000 with the root domain trimmed and see how it detects them and then run a test with the same links with the long URL and see which one it detects better.


  • From my testing, it seems that trimming to the last folder gives the best detection rate ;)
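For comparison with trimming to root, here is a hedged sketch of what "trim to last folder" means: keep everything up to the deepest directory and drop only the final path segment (again, my own illustrative function, not a ScrapeBox internal):

```python
from urllib.parse import urlsplit

def trim_to_last_folder(url):
    """Drop the final path segment but keep the deepest folder, e.g.
    http://someones.com/2010/some-old-post -> http://someones.com/2010/"""
    # Assume http:// when the scheme is missing, as in a raw scraped list.
    parts = urlsplit(url if "://" in url else "http://" + url)
    # rsplit removes only the last segment; a bare domain stays at "/".
    path = parts.path.rsplit("/", 1)[0] + "/"
    return f"{parts.scheme}://{parts.netloc}{path}"
```

Keeping the folder preserves hints like /blog/ or /2010/ that platform detection can use, which is presumably why it detects better than trimming all the way to the root.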