@backlinkaddict another big thank you to you for providing detailed suggestions and summarising your experience with XEvil. Much appreciated, mate. My PC is from the 2017-18 era, I believe. It's a desktop PC, so I was able to extend the RAM easily. It currently manages GSA WC easily, but I plan to install SER on it (which I have owned since 2015) and haven't used much. This is where I may get in touch with you, but I will keep it separate, as I feel we have hijacked this thread. Let's keep testing capv2v3, and hopefully it will turn out to be a great product in GSA's suite of amazing products.
dp001 No problem at all really; that's one of the reasons I said feel free to PM me. I was thinking I don't know of any older i3 laptops that support 32GB of RAM, so I was a bit confused. Even one of my i5 9th gens that's 3 years old maxes out at 16GB.
I'm with you there: GSA is a great suite of tools, especially SER in the right hands. I feel like if Linux had a fork/version full of preinstalled tools for digital marketing, like Kali does for cybersecurity/hacking, it would be full of GSA products.
Anyway, yes, let's hope this project becomes fruitful and is possibly incorporated into the GSA suite.
As I promised, 1 hour results of CAPV2V3 (Setup v40) with free plan of 10 ROTATING PROXIES - https://www.webshare.io/ :
>>> ONE SUCCESS EVERY 2 MINUTES (5 SIMULTANEOUS THREADS) >>> Please note that there is sometimes a problem with the URL sent by SER, and that the problem is not coming from capv2v3!
That's promising. Is that V3, or V2 only? I'm yet to see success on V3.
I found that even after ordering 20 paid proxies from Webshare and adding the API into SER, it usually will only test 4 OK against Google; occasionally I see 8, as it rechecks them frequently.
I wonder if changing the proxies from free to another type, or even using another service, could increase the success rate?
I do see that in XEvil it can take 100 seconds on some solves, so that 2-minute solve is not necessarily a bad thing.
YOU CAN HAVE 3 PROBLEMS AND ONLY 3 : (SETUP v51)
1) PROXY PROBLEM : 'NEW FAILED PROXY RECAPTCHA' ==> MUST BE 'NEW WORKED PROXY RECAPTCHA'
2) NOT CONNECTED TO INTERNET or NOT AUTHORIZED TO CONNECT TO INTERNET (FIREWALL?)
3) NOT RECEIVING CAPTCHA TO SOLVE : SEE PORT CONFIGURATION 8181 (BY DEFAULT)
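For problem 3, one quick way to verify that the solver is actually accepting connections on its port (8181 is just the default mentioned above; host and port here are assumptions) is a plain TCP probe. A minimal sketch:

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the default capv2v3 port on the local machine:
print(is_port_open("127.0.0.1", 8181))
```

If this prints False while the solver is running, the problem is most likely the firewall or a changed port, not the captcha feed itself.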
INFO: 127.0.0.1:58329 - "POST
/res.php?action=abort&id=18603b98-f26c-4a53-8a43-1c012cb5cdb4
HTTP/1.1" 405 Method Not Allowed << that's a bug in the software (one I'm going to fix in the next update)
I was wondering, are you using this to solve RC on websites you use for registration/posting only?
Is there a way it would be able to solve on a search request that throws RC 2-3 times, or no?
It can be that, maybe due to a VPN, too many requests, or a specific search "string", a captcha is thrown and the search is not allowed.
I'm wondering how this could possibly help there if it gets finished and added to GSA for the "searching" workflow, which could be very helpful. Use capv2v3 for "searching"...
I guess a "Google verified proxy" today just means it is not banned on and off when tested against "G"? Not that it is guaranteed to work on "G", since you wouldn't know until the "string" is added and the search is clicked; that, or it simply shows nothing.
Must wonder why some trending searches are "FP" "FP" "FP" "FP" "KW"... Just saying.
There are times RC is simply thrown due to a search string "known to be used by bots". Of course no one here is doing anything like that, but how would you go about this other than using another search engine (which can have similar behavior), or changing all footprints and updating scripts for optimization?
Yeah, that's pretty normal. What I would suggest is to add the folder and python.exe to Windows Defender's or your antivirus's exception list. This way it's not getting scanned constantly, and the Windows Antimalware Service will use less CPU.
OK, so a quick update: as soon as I changed all sleep() parameters in the main Python file to 15, my success rates went WAY up! In fact, the only errors I get are on the recipient sites' end.
Thanks @sven as well as @CL67F for making this possible! It's at the very least twice, if not three times, as accurate as XEvil, and it's at least four times faster.
Ok so just a quick thing: it does try to repeat tasks that are unable to be solved.
Example: one of the websites I'm trying to post to keeps giving me an SSL_CERT error, which, by the way, I have tried telling Python to stop retrying on:
if e == 'SSL_ERROR_BAD_CERT_DOMAIN' or 'SEC_ERROR_UNKNOWN_ISSUER' or 'SEC_ERROR_UNKNOWN' or 'SEC_ERROR_EXPIRED_CERTIFICATE' or 'Timeout 30000ms exceeded.':
try_connection = 0
force_new_proxy = False
However, it's still trying new proxies when in fact it's just going to fail every time. What should I do to tell the code to skip the site entirely if those errors occur? I can press Ctrl+C, but that's just not practical if I'm running this overnight...
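One thing worth checking in that snippet: in Python, `e == 'A' or 'B'` parses as `(e == 'A') or 'B'`, and a non-empty string literal is always truthy, so the chained condition matches every error string, not just the listed ones. A membership test against a set expresses the intent; a minimal sketch, where `FATAL_ERRORS` and `should_skip_site` are hypothetical names and the error strings are the ones from the post:

```python
FATAL_ERRORS = {
    'SSL_ERROR_BAD_CERT_DOMAIN',
    'SEC_ERROR_UNKNOWN_ISSUER',
    'SEC_ERROR_UNKNOWN',
    'SEC_ERROR_EXPIRED_CERTIFICATE',
    'Timeout 30000ms exceeded.',
}

def should_skip_site(error: str) -> bool:
    """True when the error means retrying with a new proxy cannot help."""
    return error in FATAL_ERRORS

# With `in`, only a genuine match skips the site:
print(should_skip_site('SSL_ERROR_BAD_CERT_DOMAIN'))  # True
print(should_skip_site('SOME_OTHER_ERROR'))           # False
```

Inside the main loop, something like `if should_skip_site(e): continue` (or marking the URL as dead) would then skip the site instead of requesting a fresh proxy.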
On Windows 10, please pay attention to Windows Defender's CPU usage! I didn't manage to find a solution, so I changed antivirus!!! Ahhh, this Windows Defender Antimalware & SmartScreen... got a headache!!!
backlinkaddict Forgive me, as I sure wish I had statistical proof! I just notice that XEvil takes at least 60-90 seconds per query to solve, and its solve success/fail rate really is only about 50/50.
With this solver, on the other hand, it's a solve every 10-12 seconds! I can see it in my Web Contact dashboard as "sent" as well (if it were an error, like an "unsolvable" in XEvil, Web Contact would tell me).
Speaking of which, a quick update and good news simultaneously: it turns out I just needed to modify my aforementioned code as such:
if e == 'SSL_ERROR_BAD_CERT_DOMAIN' or 'SEC_ERROR_UNKNOWN_ISSUER' or 'SEC_ERROR_UNKNOWN' or 'SEC_ERROR_EXPIRED_CERTIFICATE' or 'Timeout 30000ms exceeded.':
    resetTempo(page_url)
By the way @CL67F and @backlinkaddict, my script is still hanging after some time (sometimes after 5 or 10 minutes of running, the script just stops indefinitely) - have you had the same problem? If so, how did y'all fix it?
daviddig1 Though I am very excited and thrilled about capv2v3, I have not invested enough time into it yet to give any good advice, currently.
I kinda feel this thread has become a Webshare promotion, though.
My experience with Webshare does not match what's being said above either.
This is confusing.
I've spoken with their helpful support, tried a few different smaller packages, residential, had them refreshed, and even tried the pay-by-bandwidth option (7 bucks to run a test of less than an hour), which is just straight silly/expensive for anyone using GSA software.
Maybe I'll refresh and test again.
When I try to pick "Premium working proxies" they didn't seem to have enough, like under 10, so maybe for "Premium" USA/UK they are not a great option, I guess?
So, you have 500 Webshare proxies you're paying for? If you put those into a tester like GSA's, how many actually test good: no duplicates or redirects, just working proxies, and, if using them for RC, tested against Google rather than Bing, since that's what you'd need?
Are you getting just the same few working? The API is going to keep testing the 500, but how many of that pool are actually working and available for RC solving?
For me, I have never seen XEvil below 66 (with OK proxies), and usually 80, as far as GSA is concerned. I've run them through GSA Captcha Breaker to double-check too, since it logs better.
The solve rate for both these RC solvers should highly depend on threads and proxies.
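To make that dependence concrete, expected throughput is roughly threads × 3600 ÷ average solve time × success rate. A hypothetical back-of-envelope helper (the numbers below are illustrative, not measurements):

```python
def solves_per_hour(threads: int, avg_solve_seconds: float, success_rate: float) -> float:
    """Rough expected successful solves per hour across all threads."""
    return threads * 3600.0 / avg_solve_seconds * success_rate

# 5 threads at ~120 s per attempt and 100% success is 150 attempts/hour:
print(solves_per_hour(5, 120, 1.0))   # 150.0
# At a ~15% success rate that drops to roughly 22 successes/hour:
print(solves_per_hour(5, 120, 0.15))
```

Comparing two setups only makes sense once threads, solve time, and success rate are all reported together.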
Regular XEvil gets barely any of the captchas I try to solve wrong, but the RC rate is not 50/50 either, so I'm confused where you're seeing this?
@CL67F recently said he was getting a solve every 2 minutes and showed something like 15 percent success rate.
You're getting a solve every 10 seconds? Or are you seeing "requests"?
This is getting very confusing to follow in terms of "services" used, "results gotten", and "success rate".
@daviddig1 what's your PC config for XEvil? I've recently started using it and am averaging 15-20s per solve with 200 threads, using reproxy, and their proxies are working great. I tried this script initially but gave up pretty soon because of constant errors and issues. I really hope this script becomes a part of GSA CB, as recaptcha solving is much needed these days.
Ok @rastarr or @CL67F I'm afraid I'm still having to press Ctrl+c or outright restart the bat file - any ideas as to how to make sure the script doesn't stop all of a sudden?
When I was using my original version, this was a daily process, unfortunately. The framework doesn't always clean up unused previous version of chrome/firefox, resulting in ever increasing memory usage. That was one of the reasons I eventually stopped using it. Frameworks like Selenium, Playwright and the like are for low volume web scraping duties. They were never meant to be used in such high-volume activities like captcha solving within GSA like products. They only have rudimentary memory management, at best.
Oh OK, no problem, thanks for letting me know - would you recommend I turn down the number of threads in the .py file?
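Given the memory growth described above, one blunt workaround besides lowering threads is to supervise the solver externally and relaunch it when it exits or runs past a deadline. A hypothetical sketch, not part of capv2v3; the command, lifetime, and run count are all assumptions to adjust:

```python
import subprocess
import sys
import time

def supervise(cmd, max_lifetime_s: float = 1800, max_runs: int = 3) -> int:
    """Launch cmd up to max_runs times; kill a run that exceeds max_lifetime_s.
    Returns how many times the command was launched."""
    launched = 0
    for _ in range(max_runs):
        proc = subprocess.Popen(cmd)
        launched += 1
        try:
            proc.wait(timeout=max_lifetime_s)  # exited (or crashed) on its own
        except subprocess.TimeoutExpired:
            proc.kill()                        # assume a hang: force a restart
            proc.wait()
        time.sleep(1)                          # brief pause before relaunching
    return launched

# Stand-in command instead of the real solver script:
print(supervise([sys.executable, "-c", "print('solver run')"],
                max_lifetime_s=10, max_runs=2))  # 2
```

This periodically throws away whatever memory the browser framework has leaked, at the cost of losing any in-flight solves when a run is killed.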
Comments
Keep up the good work mate!
Not working for me.
https://www.webshare.io/ (Free plan of 10 ROTATING PROXIES / plan of 100 ROTATING PROXIES for €3)
1) SINGLE LINE (YOUR IP MUST BE DECLARED ON WEBSITE)
p.webshare.io:9999
2) SINGLE LINE
p.webshare.io:80:bhcpnasz-rotate:p3l4ckr25xxx
3) LIST OF PROXIES
64.137.48.208:6415:bhcpnasz:p3l4ckr25xxx
64.137.48.185:6392:bhcpnasz:p3l4ckr25xxx
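For anyone wiring the `ip:port:user:pass` list format above into Python (for example, for the `requests` library), each line maps to a standard authenticated proxy URL. A minimal sketch with placeholder credentials:

```python
def proxy_line_to_url(line: str) -> str:
    """Convert an 'ip:port:user:pass' proxy line to a proxy URL."""
    ip, port, user, password = line.strip().split(':')
    return f"http://{user}:{password}@{ip}:{port}"

# Placeholder credentials, not a real account:
print(proxy_line_to_url("64.137.48.208:6415:username:password"))
# http://username:password@64.137.48.208:6415
```

The result can then be passed as `proxies={"http": url, "https": url}` to `requests.get`.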
This happens with all proxies tested
I am, however, still getting this error on most of my submissions:
https://prnt.sc/WMhXfbgHcz3i
All my GSA software is working great! Using the API!
By the way, yes, I'm using 500 proxies from Webshare.