
GSA Best Practice (Part 1 - GSA & Other Services Setup)

AlexR Cape Town
edited March 2013 in Other / Mixed

I have read every post in the new forum and almost all the posts in the old BHW forum. So many tips have been mentioned that I thought I would try to consolidate them into one place. There are many ways to use this tool, but I think what I have outlined below is a best practice based on the various threads. I would love to get feedback and update any aspect that you feel is incorrect.

There are so many users who have helped me and I hope this adds back to the GSA community. Thanks to all for their advice.

  • VPS – 2GB RAM, unlimited data. Berman is the more affordable option.
  • Proxies – Private proxies are a must. At least 10 private proxies, but 20 are preferable. The number of proxies you require depends on the number of search engines you use; for most uses 20 will do. GSA SER is hereafter known as SER.
  • SEO Indexer (hereafter known as SI). Once-off fee of around $20. (To get all your links indexed so the SEs recognise and see them!)
  • Captcha Breaker (hereafter known as CB). Once-off fee.
  • ShaniBPO captcha service. (Only to be used for important links.)
SI Settings
GSA SER Options
SE Interval = 3s if you have more than 20 search engines enabled; otherwise set it to 10s.
(Proxy Settings)

Set primary captcha retries to 3. When you test the service, it should show your balance; this means it is working.
Set the maximum website size to download to 3MB. There is no point wasting resources and time downloading large websites. (It also prevents you posting to sites with thousands of spam comments that have made the page very big!)

GSA Project Structure

You will require a number of projects and sub projects using different platforms. 
Step 1 is to create a project called “Sample Data”.  We will use this repeatedly to create sub projects later.
  • In all fields, ensure you have replaced the default spintax. Too many people use it, and it is generating some badly worded comments, about-me texts, etc. Using the defaults provided will LOWER your successful link rate, so make sure you use your own!
  • Make sure you complete ALL the fields. For any non-generic ones, like Article Title, email, or YouTube ID, enter “TEST” (we will insert the correct value later), and complete all the generic ones (e.g. blogs, about, forum, etc.).
Project "SampleData" Platforms
Select all platforms. 

Find any foreign fields (e.g. Polish), and right click and select “Disable Engines that use this”.
Project "SampleData" Data Settings

Under Categories, ensure you also include the general categories, in the format *Category*.
Project "SampleData" Option Settings

"Sample" Project SE's to Use
Create a file with the SEs you like to use and import it here.

Blacklist Settings
  • Clear the current blacklist.
  • Enable “Skip sites with following URL in domain” & “Skip sites where following words appear” and import a bad words list.
SampleData Email Settings

Use hotmail as it has the best success rate.

Once you have the “Sample Data” project set, right-click and add it to the group “Test”.

Now to setup the projects!

Right-click on project “Sample Data” --> Modify --> Duplicate.
Always keep the options from the “Sample Data” project unless specified to change them!

Create the following projects (by right-clicking and duplicating the project “SiteXXX-SampleData”):

Group 1 - Project 1 (Website Data Sample)

Give it the name “SiteXXX-SampleData”. Right-click on “where to submit” and select “Check All”. Insert your email (and test it under the “Email Verification” tab). Insert at least 1000 keywords! Insert your website URL. Insert anchor text in the format {keyword1|keyword2|etc}.

The SiteXXX-SampleData project has all the data we now need for the various other site projects, so we will just duplicate it in future. Make sure the original SampleData project is set to inactive!
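The anchor text format above is standard spintax: at post time, one option inside each {…|…} group is picked at random. As a rough illustration of how such a string expands (this helper is my own sketch, not part of SER):

```python
import random
import re

def spin(text):
    """Expand {a|b|c} spintax by picking one option at random per group.

    Replaces the innermost brace group first, so nested spintax
    like {x|{y|z}} also works.
    """
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = random.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

# Each call yields one of the anchor-text variants:
print(spin("{best widgets|cheap widgets|widget store}"))
```

Each project submission gets a different random variant, which is why replacing the overused default spintax matters so much.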

Group 1 - Project 2 (Directory Submissions)
  • Set PR to 3. (We do not want to submit to poor-quality directories, hence the PR3+, and that is why it’s a separate project.)
  • Set to 25 submitted links per day. (Do not use verified, as it can take weeks for these links to get verified. If you set it to verified, the program will keep submitting and submitting, as it obviously won’t reach 25 VERIFIED links in a day.)
  • Enable use global site lists. (Since directories contain many categories, and there is no point wasting resources searching for every directory project.)
Go to Data:
Disable “Use generic anchor texts”, as we don’t want a “click here” under our website in a directory submission!

Group 1 - Project 3 (Directory Submissions Tier 2)
Give it the name “SiteXXX-DirectorySubmissions-Tier2”. The reason we build tier 2 links for directory submissions is that you often get an unlisted/new page, and with a few backlinks pointed at your listing you can get that page to around 50% of the domain PR. So if the directory is a PR6, a few links can get your listing page to a PR3!



Part 2 to Follow!



  • AlexR Cape Town
    Part 2 will look at the various projects and settings you can use including the different tiers and platforms to use for each. 
  • Sven
    Thanks for this :)
  • AlexR Cape Town
    edited September 2012
    @Ozz, @Erbsensuppe, @m3ownz, @bytefaker, @S4nt0s, @Hunar, @Cherub - is there anything else I should add to this list? (Or change!)
  • AlexR Cape Town
    Here is the start of Part 2 - just need some feedback from the super senior users. :-)

  • AlexR Cape Town
    Just seen this thread updated (I've been travelling).

    So is it decided that 6 is the optimum CS retries? 

    Does anyone know which of the platforms only offer 3 retries?

  • edited September 2012
    Thanks for putting this together :) I hope that people understand that the best settings also depend a lot on the individual goals and projects.

    I have in mind that someone said Recaptcha only allows 3 tries. If that is true, it should be only 1 CS retry in the settings. If we have a recaptcha, it will work like CS->CS->ShaniBPO/DBC. Can somebody confirm? I'm not sure on that, I just read it somewhere.
  • AlexR Cape Town
    edited September 2012
    @erbsensuppe - "I hope that people understand that the best settings also depend a lot on the individual goals and projects." - like getting to number 1 in Google :-)
  • I meant more like getting there very fast but not staying for long ;)
    Also differences between an MNS and an authority site, a fun site and a business site, etc.
  • This is GREAT, quite a few things I can make use of - thanks for taking the time to put it together so nicely.
  • AlexR Cape Town
    @bradcma - my pleasure. Working on the next phase...

    If we can get a best practice guide, then we can solve many of the similar questions on the forum as if everyone has the correct settings, it won't cause any issues. 
  • Thanks for this Global. This should be stickied as well, so that the redundant questions that are asked over and over can be answered by simply viewing this thread. If there was a rep button or anything of the sort, I'd surely give it to ya. :)
  • Ozz
    edited September 2012
    First of all, great work! I think we should create a sticky "Important Threads & Posts"-thread to keep the very good threads together.

    I hope that people who read this realise that some of the settings like "thread count", "use proxy... everywhere" or "custom search time" depend on what machine and what/how many proxies you have, and are not for just copying. But that's not your fault :)

    I personally don't have "Collect keywords..." and "Use keywords..." checked in my projects. For tier 2-3 projects it could be fine, but the keywords I collected often weren't niche-related, so I skip that for my tier 1 projects and scrape some keywords with Scrapebox, for example.
    I suggested to Sven a while ago to collect the keywords from the SEs ("Searches related to %keyword%"). If he adds that to SER some day I will give it another try.

    Regarding CS retries: if someone only uses CS, then 5-6 retries should be fine. If my math is correct (!!?) and we assume that our worst solving rate for a particular captcha is 25%, then with 5 retries (= 6 solving attempts) we have a probability of ~82% that one attempt was successful. With 6 retries we get ~87%.

    I can't tell which particular platforms will ban you after 3 attempts; to me it was common sense, as I read this at BHW a while ago. In the meantime I think differently, as there should not be too many sites that ban you for retrying.
    I read today at BHW that the default setting for phpBB is 3 retries max, but I don't know of any other platform for sure. I can't confirm that regarding "ReCaptcha", because I've scripted many engines for reCaptcha sites by now and never got banned while testing.

    However, we should ask Sven if it is possible to not send reCaptchas to CS but directly to the 2nd service, or to skip them.

    PS: mediafire will block your link because of the "anonymiser". People can copy/paste "" into their browser though, or you can just add them to your post. "*Other*" is another category I've added to my category list.
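Ozz's retry probabilities above are easy to verify. Assuming each solving attempt is independent with his worst-case 25% solve rate, the chance that at least one of n attempts succeeds is 1 - 0.75^n. A quick sketch to check the numbers (the helper name is my own, not from CS or SER):

```python
def p_at_least_one_success(attempts, solve_rate=0.25):
    """Probability that at least one of `attempts` independent
    captcha-solving tries succeeds at the given per-try rate."""
    return 1 - (1 - solve_rate) ** attempts

# 5 retries = 6 attempts, 6 retries = 7 attempts
print(round(p_at_least_one_success(6), 2))  # 0.82
print(round(p_at_least_one_success(7), 2))  # 0.87
```

So the quoted ~82% and ~87% figures check out, and each extra retry buys progressively less.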
  • @Global: Just wanted to say thank you!

    @Ozz: Great idea! I think if the platform isn't supported anyway it doesn't make sense to use resources or to waste time with trying to solve them (-> send it directly to the backup captcha solving service).
  • I dunno,

    Ever since I read Lee's post on BHW about saving all the unknowns and then using the destruction kit to figure them out for GSA, and how his submitted/verifieds are blowing up, I decided to give it a go. I literally spent all day/night just doing unknown captchas. What I found, for the most part, is that even without me doing anything at all, CS still got pretty darn close to solving the captcha on its own, so the only thing I had to do was test/activate it and that was it.

    So it would be nice if there was an option in CS so that not everything comes back as unknown; maybe it could just try to solve it.

    Since I've been doing this I've seen my success/verified rate go up quite a bit. Please let CS give this a try before you decide to automatically not send unknowns to GSA, because I can guarantee you'll see your verifieds increase a lot from this alone.
  • @Hunar: recaptchas always use the same form field on every site ("recaptcha_response_field" or something like that). I think it is pretty safe to redirect them to the 2nd service or skip them, if that's possible somehow.
  • Thanks for this... I always wondered what the best options to choose for campaigns would be...

    BTW, none of the links below work... please update, and if you have the files, please upload them. Thanks.

    Import this list of SE's. SE to Import (Thanks to OZZ for this research!)

    Blacklist Settings
    • Clear the current blacklist.
    • Enable “Skip sites where following words appear” import Bad Words List.
    • Enable “Skip sites with following URL in domain” import Bad Words List.
    • Disable “Skip Sites from following countries.”
  • Good job, thank you for sharing. I've been avoiding the hard work of setting this thing up properly. I need to bite the bullet and get everything configured correctly.
  • AlexR Cape Town
    1) Will add "*Other*" to the general category list.
    2) I didn't follow why mediafire is not working. Is it a setting I need to edit?
    3) Proxies, timeout, and threads are based on an average VPS that can utilise the 2GB of RAM that GSA can max out at (as it's a 32-bit program) with CS and the indexer running. Too many people expect amazing results with a laptop and a slow connection and wonder why it's not happening. But you're right, I should add a section for a different machine setup (I've just maxed out the post chars, so I'll have to remove another section to fit that in!). Any setting suggestions for 10 private proxies and a laptop? :-)

    @Everyone else - what is your consensus about the "collect keywords" setting?

  • AlexR Cape Town
    @Ozz - will edit the Captcha settings to 6.
  • AlexR Cape Town
    CS settings updated. 
  • AlexR Cape Town
    @Everyone - the other issue that we need to decide on is whether it's worth building tier 2 links to nofollow pages. I would say so, as we only build tier 1 links that are going to stay, and it's more natural to have a good mix.
  • Excellent Work! Thanks  :D
  • Ozz
    edited September 2012
    @GlobalGoogler: mediafire-links don't work because they block links with "hxxp://". Because of that you should post mediafire links like this: 
 (without "www.")

    People can copy and paste that line into their browser.

    Regarding 3), it was in no way meant to criticize your work here! But you know people, they are lazy and just copy whatever they see in images. That's not your fault, it's theirs :)
  • Question regarding the preferred search engines - I uploaded them and there are only 37... why is this preferred over the potential 156 English-language SEs?
  • Ozz
    edited September 2012
    @jamesruhle: that's because many of them won't give you any (or any good) results. Just read this thread to know more about it:

    It's no problem to check all the other English "google, msn, yahoo" SEs though.
  • Thanks for this. It would really help people like me. There's one part I don't understand: "Enable use global site lists. (Since directories contain many categories, and there is no point wasting resources searching for every directory project.)"

    Don't we need to have a global site list first? For a new account, it wouldn't have any, right?
  • AlexR Cape Town
    @jonathanjon - if you don't have a site list, it will use the SEs to find some. Then over time, it will divert this time/resource to posting to existing directories rather than searching.
  • AlexR Cape Town
    @Ozz - Still not sure about mediafire! I have placed " <a href="">SE to Import</a>" in the code and it's still giving an error.

    @S4nt0s, @Bytefaker, @Hunar, @m3ownz - what's your consensus on "collect keyword setting"?
  • Ozz
    edited September 2012
    You can't use a complete Mediafire link on this forum because it will be blocked by MF.
    Just post "" without "hxxp://www.". People have to copy/paste the URL directly into the browser.