My results with GSA, need your inputs

I ran one campaign for the last 3 days to build contextual links for my site. These are Tier 1 links for me.

My config:

1) GSA + CB + 20 private proxies
2) One campaign with Article, Wiki, Directory, Social Network, Social Bookmark, Document Sharing, Microblog, and Web 2.0 engines selected
3) I have 27 keywords related to my site
4) Manually spun text for everything using the SpinFolderAndDelete macro (wherever supported); usernames/passwords randomized and default. (A spintax sketch follows this list.)
5) Indexed with GSA SEO Indexer in fast mode
6) Search engines selected by language: English
7) Skip sites with PR below 1; OBL filter at 50
8) Always use keywords to find target sites
9) Stop after 60 ± 10 submissions per day
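
Side note for anyone new to spun content (re: item 4): a minimal sketch (Python; the sample text and the spin() helper are made up for illustration, this is not the SER macro itself) of how nested {a|b|c} spintax resolves to one random variant per submission:

import random
import re

def spin(text):
    # Repeatedly replace the innermost {a|b|c} group with one random option,
    # so nested spintax like "{Hi {there|all}|Hello}" also resolves correctly.
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = random.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

print(spin("{Hello|Hi|Hey} readers, this is a {test|sample} article about {SEO|link building}."))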

Results (after 3 days, as currently shown in the project stats):
Submitted 261, Verified 42, Indexed 5

I also built a list of 22K article directory URLs using GSA footprints + my keywords in Scrapebox. Ran this and got another 9 verified URLs from 50-odd submissions. None indexed.
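
For reference, the footprint + keyword merge can be scripted; here is a rough sketch (Python; the footprints and file names are placeholders, not GSA's actual footprint list) that writes one Scrapebox-ready query per footprint/keyword pair:

from itertools import product

# Placeholder footprints; in practice, export the real ones from GSA SER.
footprints = [
    '"powered by articlems"',
    'inurl:"submit-article"',
]

with open("keywords.txt") as f:        # one keyword per line
    keywords = [line.strip() for line in f if line.strip()]

with open("queries.txt", "w") as out:  # import this file into Scrapebox
    for footprint, keyword in product(footprints, keywords):
        out.write(f'{footprint} "{keyword}"\n')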

Questions:
1) Will the remaining submissions convert at a later date, or should I assume they are lost?
2) Any suggestions to improve the number of links built?

Comments

  • More questions/comments:

    3) A couple of links were built on French article sites, even though I selected only English search engines.
    4) Threads were set to 200 and HTML timeout to 120, but I never saw the thread count get near that. Mostly 10-30 threads were running, sometimes even fewer.
    5) When I scraped, I had a list of 12K URLs with PR between 1 and 10. I got only 9 verified links out of this; is this normal?
    6) Even though the submission limit was 60 ± 10 per day, it took all day to make those submissions, so I am sure the limit itself was not the bottleneck.
    7) My server is very powerful, with a 100 Mbps network, so I don't think there were any infrastructure issues.
    8) I used only one email account (outlook.com).


  • - Why a 50 OBL filter for the platforms you've chosen? Skip the OBL filter; it's only needed for blog-comment-type backlinks.
    - 27 keywords is not enough for scraping in the long run.

    Please be aware that the more you filter, the fewer links you get from already limited platforms like article directories. Also keep in mind that CB can't solve every captcha (ReCaptcha, for example).

    Did you read through the "compiled list of tips..." thread yet? Do that first.


  • @Ozz
    My site is a WordPress site with a lot of categories (think product types). I wanted to run one campaign per week per category and build 50-odd links per day. I thought 27 keywords would be enough for the first run, especially given that I had a 12K unique scraped list from Scrapebox. Next run, I will add more keywords.

    The 50 OBL filter was to avoid sites with lots of outbound links. Frankly, I was just checking how many links I could get with the filter on. I didn't see anything in the logs about sites being skipped because of OBL (I don't know if this is logged).

    I have already read through a lot of threads, including the one you mentioned. This is my first campaign, and obviously there is much scope for improvement.

    What do you recommend for captchas? I was thinking that since there are just a few types of article engines, captchas should not be a big problem.


  • Update:

    I made the following changes and now have higher submitted (297) and verified (21) counts over the last day.

    1) Pause trigger changed to verifications instead of submissions
    2) Selected all engines
    3) Enabled analysing and posting to competitors' backlinks
    4) Changed the OBL filter to 100
    5) Enabled "collect keywords" and "use collected keywords to find target sites"
    6) Threads set to 30

    With a higher thread count (100), my proxies got banned quickly and the project stopped (since I have enabled stopping when no working proxies are found). With 30 threads it is working well.

    For some reason, proxies are shown as bad, but if I run the SER proxy checker immediately afterwards, it says they are fine. I finally did what another user mentioned: selected "automatically find and test proxies", deleted all proxy-sourcing sites, and set the testing interval to 10 minutes. Now my proxies get disabled but are re-enabled after 10 minutes (see the sketch below). I am using Newipnow proxies. @Sven, any suggestions?
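
    In case it helps someone reproduce the re-test behaviour outside SER, a quick sketch (Python with the requests library; the proxy addresses and the Google test URL are assumptions on my part, not what SER does internally):

    import time
    import requests

    PROXIES = ["1.2.3.4:8080", "5.6.7.8:8080"]  # placeholder proxy addresses

    def is_alive(proxy, timeout=10):
        # A proxy banned by the search engine usually times out or returns non-200.
        try:
            r = requests.get(
                "https://www.google.com/search?q=test",
                proxies={"http": f"http://{proxy}", "https": f"http://{proxy}"},
                timeout=timeout,
            )
            return r.status_code == 200
        except requests.RequestException:
            return False

    while True:
        for proxy in PROXIES:
            print(proxy, "OK" if is_alive(proxy) else "bad (possibly banned)")
        time.sleep(600)  # re-test every 10 minutes, like the SER setting above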

    There is still room for improvement. I see a lot of "download failed" messages, and the verification failure rate is pretty high. Any suggestions?
  • I had the best luck with buyproxies. You might want to try them.
  • SvenSven www.GSA-Online.de
    There is nothing we can do to improve the "verification not successful" rate, but the proxy disabling maybe. Will try to improve that somehow.
  • I posted this in response to another question on the forum; re-posting here since it is my current status:

    --------------
    Had the same problem. I removed the OBL filter and the PR filter, and it shot up to 5 LPM (links per minute). An OBL of 70 should be OK, since I see that most sites my links were posted to had fewer than 70 outbound links. PR can be a problem, but in your case I don't see that set.

    The next thing I would recommend is adding more keywords. GSA will either use global lists or scrape for targets: either you import links into GSA, or you give it a lot of keywords and search engines to scrape with. I did both and hit 20-25 LPM the next day.

    I used Scrapebox to scrape article footprints with a 100K-keyword file (Google it and you will find one). I now have 500K non-duplicate URLs in my identified list.

    For GSA scraping, I gave it a lot of keywords (about 1K), selected "find and use keywords from target sites", and selected all search engines.
    --------------
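
    The de-duplication step mentioned in the quote above can be reproduced in a few lines; a minimal sketch (Python; de-duplicating by root domain is my assumption of what "non-duplicate" means here, and the file names are placeholders):

    from urllib.parse import urlparse

    seen = set()
    with open("scraped_urls.txt") as f, open("deduped_urls.txt", "w") as out:
        for line in f:
            url = line.strip()          # expects full URLs with a scheme
            if not url:
                continue
            domain = urlparse(url).netloc.lower()
            if domain and domain not in seen:
                seen.add(domain)        # keep only the first URL per domain
                out.write(url + "\n")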

    The above status was with only my T1 campaign.

    I have now added a T2 campaign, which includes Blog Comments, Image Comments, Trackbacks, URL Shorteners, and Guestbooks. With the new engines added, I am now doing 60 LPM. I also fed GSA the full 100K keyword list for scraping.