GSA Captcha Breaker

Hi guys,

I seem to be getting a really low LPM, and looking at GSA Captcha Breaker, it really isn't going too well. I notice there are sometimes many minutes between successful captchas, and I'm wondering if my settings are wrong.

The list I am using is great and working well on my other computer, so I know it isn't the list. The settings all seem okay, but I'm unsure which are the best options to tick within Captcha Breaker.

Here is my current set up: http://prnt.sc/d1fbkh

All help appreciated :)

Comments

  • SvenSven www.GSA-Online.de
    The settings are the default ones and should be OK.
  • shaunshaun https://www.youtube.com/ShaunMarrs
    It's probably something on the SER side; put up a few screenshots of your settings and project options.
  • shaunshaun https://www.youtube.com/ShaunMarrs
    You will need to post the options tab of a project too.

    A few things though....

    1. If you are using a list then there is no need to use proxies for your search engines, or to have any of those options enabled.
    2. No need to waste bandwidth on your proxies for PR checking or email checking.
    3. You have 2 requests set up for CB, which is probably why your % is so low on it; captchas it can't solve are just being sent right back to it, slowing everything down. I have it at 0, give it a try.
    4. I have never tried Blazing so I can't comment on whether it is worth your time.
    5. Nothing to do with speed, but are you tracking your index rate with Indexification? You are probably wasting money.
    6. Not sure of your goals, but using Yandex TIC as PR would slow things down.
    7. Again, not sure of your goals, but personally I don't care about the PR of verified links.
    8. SEREngines will slow everything down if you have active projects with it, as it has to do much more than the normal CMS submissions SER does, so that could be a reason for the speed.
    9. You are pulling directly from the list folders, so nofollow links will be used, and in my opinion this wastes resources.
    10. Make sure your projects are pulling from the identified and submitted folders, as that's where your list is being held.
    11. You are using a ReCaptcha list; this will slow everything down slightly and bring your CB success rate down even more, especially with ReCaptchas being sent to CB three times in total and then three times to your third-party service.

  • Hi Shaun,

    Really good to read the advice you've set out here, many thanks.

    I tried point 3 as you suggested, but still no change. Blazing OCR solves quite a lot of captchas, and so far they are doing a good job.

    Can you tell me how to track the index rate with Indexification? Is there a specific tool? I have an upgraded package for 100k links per day and, like you say, I could save a good amount of money if it isn't doing the job properly.

    Here are some screenshots of a project:


    Thanks for your help!
  • shaunshaun https://www.youtube.com/ShaunMarrs
    I use Scrapebox to track the index rate of links.
    1. Having the web 2.0 platform ticked slows SER down a fair bit.
    2. How many money site URLs are in that project?
    3. "Ask all services/users to complete captcha" is probably bad; I would change it to just ask all services, and not have the user bit on unless you are babysitting SER while it runs.
    4. GSA Indexer is a waste of time, it hogs resources and does nothing, so I recommend you untick it.
    5. Right-click on your search engine field and select none. You have 4 selected, so SER will try to go out and process targets, which is totally pointless when you have a list and slows you right down.
    6. Untick the failed option on your folders, as you only need to pull from Identified and Submitted; that's where your list is.
    7. I would untick the vast majority of the "Filter URLs" options; this is most likely where your problem is. You are paying xx for a list and then cutting most of the targets out because of these settings. In addition, when the search engine scraping you currently have set up finds a target, it gets rejected.



  • Great, I will give that a go and get back to you with the results. While I'm here, do you know an easy way to check the Indexification indexing rate?

    Thanks again.
  • Oh, and on average there could be anywhere from 100 to 250 URLs.
  • shaunshaun https://www.youtube.com/ShaunMarrs
    Do you have Scrapebox and semi-dedicated or dedicated proxies?

    There is a way to do it in SER, but it works differently to Scrapebox and your proxies end up banned crazy fast.
  • Yes, I have Scrapebox and I'm using semi-dedicated proxies.
  • shaunshaun https://www.youtube.com/ShaunMarrs
    1. Open Scrapebox.
    2. Click the Settings button at the top.
    3. Click Connections, Timeout and other services.
    4. Set the Index Check thread count to 1 and click OK.
    5. Load your proxies into Scrapebox.
    6. Tick the use proxies tickbox just above the proxy field.
    7. Paste the URLs you want to index check into the URL's Harvested field.
    8. Click the "Check Indexed" button to the right.
    9. Select Google Indexed.
    10. As they are semi-dedicated, work out 60 / number of proxies, round that number up to the nearest whole number, and put it in the "Delay in seconds after each request" box (see the sketch after this list).
    11. Click Start/Redo.
    It takes a while because of how quickly Google soft-bans proxies these days, but that timeout should be enough to let you bulk index check indefinitely.
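
    If you would rather script that delay calculation than work it out by hand, here is a minimal Python sketch; the proxy count is just an example value, so swap in your own:

        import math

        # 60 seconds divided by the number of semi-dedicated proxies,
        # rounded UP to a whole number, is what goes in Scrapebox's
        # "Delay in seconds after each request" box.
        num_proxies = 50  # example value, use your own count
        delay = math.ceil(60 / num_proxies)
        print(delay)  # -> 2 for 50 proxies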

    Once the results are in, any chance you could post the total number of URLs you checked along with how many of them were actually indexed? I am going to take a wild guess and say around the 30% mark will be indexed.

    BTW, not sure if you are aware, but you should only really send article, social network, social bookmark, video and wiki links to indexers; everything else should either already be indexed, be penalised or be useless, so only index check those platform types.
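
    For anyone who wants to script that filtering and the rate check, here is a rough Python sketch. The tab-separated "platform<TAB>url" input format and the sample URLs are purely assumptions for illustration, not an actual SER export format:

        # Platform types worth sending to an indexer / index checking.
        INDEXABLE = {"article", "social network", "social bookmark", "video", "wiki"}

        def filter_for_indexer(lines):
            # Keep only URLs whose platform type is in the whitelist.
            # Assumes hypothetical "platform<TAB>url" lines, one per link.
            for line in lines:
                platform, url = line.rstrip("\n").split("\t", 1)
                if platform.lower() in INDEXABLE:
                    yield url

        def index_rate(indexed, total_checked):
            # Percentage of checked URLs that Google reports as indexed.
            return 100.0 * indexed / total_checked

        sample = ["article\thttp://example.com/a", "blog comment\thttp://example.com/b"]
        print(list(filter_for_indexer(sample)))  # -> ['http://example.com/a']
        print(f"{index_rate(4000, 100000):.1f}% indexed")  # -> 4.0% indexed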
  • Okay great, I will get this done and let you know the results. So if I use 50 proxies, should I divide this by 60 to get the delay time?

    Thanks.
  • By the way, my LPM has gone up to 65 in the hour since I made the changes you suggested.

    Brill, nice one!!!
  • shaunshaun https://www.youtube.com/ShaunMarrs
    @monitorr yeah mate, it's 60 divided by the proxy count, so 60 / 50 = 1.2, which rounds up to 2; your time delay should be 2 seconds with 50 proxies and you should be able to index check without interruption.

    Good to hear your LPM has increased :).
  • Hi Shaun,

    Yes, it seems that Indexification deletes the links after 24 hours, so I have only been able to test the links sent to them within the last 24 hours, and the indexed amount is around 4%.

    What are your views on this, and on getting links indexed in general?

    I have 4 different VPSs running separate projects, so the number of links coming through is over 100k per day.

    My current LPM is in excess of 100 now, so hats off to you for the great advice, mate.

  • shaunshaun https://www.youtube.com/ShaunMarrs
    edited November 2016
    Man, that's much worse than I thought! I would ditch it and save yourself the cash. I was tracking 50-60% with Elite Link Indexer; it's expensive for what you get for burner links, but they are the best I tested by a good 30-40%, and their owner/the guy who replies to emails is pretty sound too. I have heard excellent things about Authority Indexer, but it's useless for me as I need more than 1,000 links per day.

    In my opinion the other indexing service providers are sticking their heads in the sand and knowingly ripping people off, hoping their users don't check their indexing rates manually and realise they are being scammed. A Russian guy inboxed me on BHW to test his method; it was getting 80% indexed after 24-48 hours and it held, but Google patched it out just before he released it :(.

    I have tried a crap ton of stuff over the past few months since this year's indexing patch, and the best/most cost-efficient method I have come up with is just to increase your link count on T1/T2 and blast them with dofollow non-contextuals.

    Good to hear it's up to treble figures :).

  • Okay, that's cool, thanks, I will check out Elite Link Indexer. I also use backlinksindexer.com; I checked their indexing rate and noticed around 40% there as well.

    When you say increase the link count to T1 and T2, what do I need to do exactly?
  • shaunshaun https://www.youtube.com/ShaunMarrs
    Although I change it per project, the last batch I was building before all this was around 25 submissions per day for contextuals; now it's 100 minimum.
  • Okay cool. I have it at T1: 30, T2: 40 and T3: 50.

    So you say T1 should be 100 per day?
  • shaunshaun https://www.youtube.com/ShaunMarrs
    That is what I am using as a minimum. I have two projects on higher-competition terms with 300 per day, but I'm not sure how they will pan out.
  • Okay, I have started with this:

    T1: 100
    T2: 100
    T3: 50

    LPM is booming on all my VPSs now, mate, thanks again for that.

    Let me know any updates on your T1's mate.

    Cheers...
  • ....

    P.S. I just signed up to Elite Link Indexer to give them a try. When submitting links, what is your view on drip feeding?
  • shaunshaun https://www.youtube.com/ShaunMarrs
    I just push them through as and when they are ready; their daily count is the drip feed.
  • Okay great, thanks for the advice.