
Identify low performance factor

edited January 2013 in Need Help
Hi. First I would like to introduce myself to the forums. I've bought GSA SER a couple of days back and so far I love it <3

I'm not new to IM tools, and I have experience with other tools like Xrumer, Scrapebox, etc. I've done my research and I have read the FAQ/tutorial threads in the forum, and I learnt a lot.

I've started 3 test projects but I'm getting not so great results, or at least not as great as I was expecting.
These are the specs of my server

Type = Dedicated server
CPUs = 4 HT (8 virtual) (Xeon)
Memory = 32 GB
HD = 2 TB RAID 1
Net = 100 Mbps
Threads = 230+
40 Semi-private Proxies

And these are my projects:
[screenshot of the three projects]

  • 1st project: running for 2 hours (1 URL)
  • 2nd project: running for 10 hours (13 URLs)
  • 3rd project: running for 23 hours (1 URL)
I've taken all the suggestions from the previous threads.
  • Set threads to 230 with HTML timeout of 150.
  • Using Captcha Sniper with the latest definition files.
  • Using a huge list of keywords (250,000+)
  • Using only the needed search engines (15 - 20 depending on the project)
  • I'm submitting links to all platforms except doc sharing sites.
If it helps, this is how I'm setting up my projects

[screenshots of the project settings]

In the category list I'm using a list of 500 categories that I use on AMR.

What could be the reason for my poor performance? I'm in it for the volume as these are low tiers. I've read that some members like @leeg are getting thousands of submissions per day. Am I right to assume that the lists are a key factor here? I'm scraping my own lists with Scrapebox and I'll try again later today.

I'd be thankful if you could help me out with this.
I don't have too much to share yet, but I'm willing to help here so if I can do anything for the community, let me know :)

Thank you very much.

Comments

  • SvenSven www.GSA-Online.de

    Welcome to the forum.

    The option "always use keywords to find targets" is probably the cause of the low results. Didn't you get a warning from the program when using it? 

  • OzzOzz
    edited January 2013
    things i've noticed first:

    - "always use keywords to find target sites" --> this option is only useful for niche related platforms like 'Blog Comments'. i recommend to either uncheck this or separate niche related platforms into another project, because you won't find many niche related directories for example

    - "put URL in places where its clearly seen as Spam" --> this scares the shit out of everyone, hehe, but if you want to post articles to wikis this needs to be checked

    - "collect keywords from target sites" --> i would uncheck that as you will get many unrelated keywords or loooooong tail keywords that won't help to find new targets. especially since you are using tons of keywords already

    - "use keywords as anchor text" --> i would uncheck that and define anchor text according to your projects. spin your anchor text if you want to have tons of variations.

    - "use secondary anchor text" --> define your secondary anchor text as well in the field next to that option

    - regarding search engine selection --> if you are targeting only US links then test the search engines for results and compare the results with each other. you can use 'options' --> 'advanced' --> 'tools' --> 'search online for URLs' to do this test (save results to file) and compare it with scrapebox.
    if you just want mass backlinking then it might be best to stick to SEs that are independent from each other like Google, Bing, Baidu (cn), Yandex (ru), because most other SEs are clones of Google and Bing and give you many duplicates more often than not. i'm testing this also with japanese and korean SEs plus the other russian and chinese SEs atm and getting way fewer duplicates (thanks to LeeG btw who brought this up).

    welcome on board!
  • AlexRAlexR Cape Town
    Welcome! 

    This should be stickied as a model post for all posts "I need help...."

    There are some good suggestions from Sven and Ozz. Also with 40 proxies, you could reduce your custom time between SE's to around 5 s, rather than 10. (Found in global settings). 
  • @sven Thank you, I changed that setting and it worked. By the way, I'm loving GSA SER and I also have Website submitter. Cheers :)

    @Ozz Amazing reply, thanks a lot for your thorough explanation. I really improved the performance of the app thanks to your tips, and thank you for the welcome.

    @GlobalGoogler Thanks for your reply :) I was looking how to change the delay time between queries but I can't change it as I only use proxies for posting. I'm going to test scraping/posting with proxies to see if I can improve my success rate.

    Thanks to these legendary posters who replied. I love this community, seriously. Thanks again dudes.
  • Where can I get a keyword list?
  • - "always use keywords to find target sites" --> this option is only
    usefull for niche related platforms like 'Blog Comments'. i recommend to
    either uncheck this or seperate niche related platforms in another
    project because you won't find many niche related directories for
    example

    Real newb question coming up; if this above is unchecked how does GSA SER find target URLs? I'm worried I'll run out of target sites.
  • it uses the footprints in the *.ini files to search for specific platforms.

    just double click an engine in your project options and the .ini file will open. there you'll see a line with a "search term=" command containing the footprints used for searching.
  • Trevor_BanduraTrevor_Bandura 267,647 NEW GSA SER Verified List
    When GSA searches for new target sites using the footprints in the .ini files, does it also include our keywords in the search?

    Because I have noticed that using those footprints in the .ini file, and combining them with different keywords, I do get different results.

    I'm just a little unsure about editing the .ini files because I don't want to mess something up so that it doesn't work at all.