
Need help with GSA (Screenshots attached)

Hello GSA heads, I hope you can help me solve my problem...

I have been getting really low output from GSA lately (I've been using it for the last 6 months or so). Until about a month ago I used to get around 60-70 LpM and 20-30 VpM, but since last month I am getting 8-10 LpM and 2-4 VpM. I tried changing proxies (BuyProxies dedicated), the VPS IP, lists... nothing helped...

Ron suggested that I submit some screenshots of my settings, so here they are:

First the specs:

VPS - https://www.solidseovps.com/ssdvps.php - I'm using the double time plan

Proxies - 50 dedicated from BuyProxies (I read that their IPs might be burned out, so I suspect this is causing the problem - any opinions about that?)

GSA SER, CB + captchatronix

Screenshot links:

8 - http://i.imgur.com/i3PBSUs.png - this is the log; I am getting a lot of those "SockError" messages

Hope you can help

Thanks


Comments

  • I am using loopline's box service; that's why I grab the links from the submitted folder
  • Your site list is bad - you have a lot of dead sites

    e.g. from your screenshot:
    wiki.andygo.org
  • ronron SERLists.com
    edited October 2014
    I would definitely reduce GSA captcha retries to 2-3, not 5. I think you need more screenshots of the Project Options - we need to see the full page - that is the critical part.

    It would be better if you displayed the pictures inline. Use the image insert icon just to the left of the Link icon.
  • Hi Ron

    Here are two screenshots of a typical Project Options section, split into two parts:

    image
    image
  • ronron SERLists.com
    I've looked at all your settings and can't see anything wrong. The only thing I can think of is the list. 
  • You say you are using CB + Captchatronix, but I see in one of the screenshots that it's set to "Ask the 1st Service/User to Fill Captchas"

    Thus bypassing Captchatronix.

    Did you mention how you are getting targets? Are you using a list, making your own list, buying a list? Using SER to scrape?

  • Hi there

    So how can I force it to use Captchatronix?

    Right now I mainly use loopline's Blue Box service, which grabs freshly scraped links on a daily basis from a cloud server
  • edited October 2014
    Things to try for your performance problem:

    1. Maximum size of a website to download: 20 --> 1 MB (20 is overkill).
    2. Open your list folder, sort the .txt files by size, and untick all engines with fewer than 200 target URLs total, for starters (i.e. all files under ~20 KB in size).
    3. (Optional) HTML timeout: 180 --> 120.

    Can't comment on Captchatronix, sorry.
  • Hi Nikodim

    I didn't understand what to do on #2
  • edited October 2014
    Optimize your engines by disabling the ones that have few target URLs. As I outlined above, you can do this by checking the sizes of your engine files, which are located in your C:\Users\<Username>\AppData\Roaming\GSA Search Engine Ranker\site_list-success folder (or wherever your list is)
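    If you have a lot of engine files, a short script can list the candidates for you instead of eyeballing Explorer. This is a minimal sketch of the size-based filter described above, assuming the ~20 KB cutoff mentioned; `small_engine_files` is a hypothetical helper name, not part of SER, and you would point it at your own site-list folder:

    ```python
    import os

    def small_engine_files(folder, threshold_kb=20):
        """List engine .txt files under `threshold_kb` KB, smallest first.

        Per the tip above, files this small usually hold few target
        URLs and are candidates for unticking in SER.
        """
        small = []
        for name in os.listdir(folder):
            if not name.lower().endswith(".txt"):
                continue  # skip anything that isn't an engine list file
            size = os.path.getsize(os.path.join(folder, name))
            if size < threshold_kb * 1024:
                small.append((size, name))
        # sort by size so the emptiest engines come first
        return [name for _, name in sorted(small)]

    # Hypothetical usage - point this at your actual site-list folder:
    # site_list = r"C:\Users\<Username>\AppData\Roaming\GSA Search Engine Ranker\site_list-success"
    # for name in small_engine_files(site_list):
    #     print(name)
    ```

    The 20 KB / 200-URL numbers are just the starting point suggested above; raise the threshold if you want to prune more aggressively.
    
    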
  • OK, got you, thanks for the tips