
i bought a scrapebox list... what is the best way to use them for GSA SER?

Username
edited November 2012 in Need Help
Do I:

a) Sort the lists first and let GSA try to identify what platforms they are?

-OR-

b) Do I just go ahead and import them as target URLs for my project?

Comments

  • It depends on the list, but assuming it's an AA list (for comments), you can just feed it to any project that has blog commenting enabled and let it run.
  • Username
    It's a bunch of different lists - AA blogs, article directories, wikis, etc.
  • dmtaylor247
    edited November 2012
    Don't bother...

    Here are a few tips if you want 30k+ comments per day.

    Go to these gigs on fiverr:

    http://fiverr.com/seo_dude/give-50000-instant-live-seo-backlinks-to-your-web-sites-and-web-properties
    http://fiverr.com/moonwolf/create-50000-instant-verified-absolute-quality-backlinks-from-10000-unique-domains-to-your-site

    There are a few more gigs like these; some provide reports and some don't.

    Buy the gig and point it at a dummy URL, then collect the report. Most of the targets are general blogs, around 20% dofollow. Import them all into Scrapebox and remove the duplicates, then paste them into SER (beware: there will be around 150k URLs). Note that opening a file this size (e.g. to check imported URLs) while your VPS is running at full speed can overload it - and running these comments will put your VPS at full speed.
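The dedupe step above doesn't strictly need Scrapebox - removing exact duplicate URLs from a report file is a few lines of Python. A hypothetical sketch (the function name and file handling are my own, not part of Scrapebox or SER):

```python
def dedupe_urls(lines):
    """Strip whitespace and drop exact duplicate URLs, preserving order."""
    seen = set()
    unique = []
    for line in lines:
        url = line.strip()
        if url and url not in seen:
            seen.add(url)
            unique.append(url)
    return unique
```

You'd feed it the lines of the gig's report file and write the result out as the target list you paste into SER. Note this only removes exact duplicates; Scrapebox can also dedupe by domain, which this sketch doesn't do.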

    They use the software Solid Botworks ($39); buy it if you want to blast hard, increase server performance, and save your VPS for more important tasks. The best thing about it is that it collects URLs that sometimes have only around 10/200 OBL (outbound links on the page); some are really good quality links.
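    If you want to spot-check a page's OBL yourself rather than trust a report, a rough approach is to parse the page's HTML and count anchors pointing off-domain. A minimal sketch using only Python's standard library (the class name and the domains in the usage example are made up for illustration):

    ```python
    from html.parser import HTMLParser
    from urllib.parse import urlparse

    class LinkCounter(HTMLParser):
        """Count <a href> links whose host differs from the page's own domain."""

        def __init__(self, page_domain):
            super().__init__()
            self.page_domain = page_domain
            self.outbound = 0

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            href = dict(attrs).get("href", "")
            host = urlparse(href).netloc
            # Relative links have no netloc, so only absolute off-domain
            # links are counted as outbound.
            if host and host != self.page_domain:
                self.outbound += 1

    def count_obl(html, page_domain):
        parser = LinkCounter(page_domain)
        parser.feed(html)
        return parser.outbound
    ```

    This is only an approximation of what such tools measure (it ignores subdomains, nofollow attributes, and redirects), but it's enough to filter a scraped list by outbound-link count.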