[HELP Needed] General Blog Comments Captcha Error.
iamzahidali
United States
in Need Help
I have a list of general blog comment targets. The list is auto-approve and has around 300k links, but most of the time the captchas passed from GSA SER to XEvil come back with "Incorrect Parameter v2". I opened some of the blogs and solved the captchas manually, and they worked fine. I added my site list as CUSTOM in the global site list, and after a few hours GSA SER shows "No Target Left to Post" even though it has only submitted to about 20k links. What might be the issue, GSA SER or XEvil? Just look at the screenshot below and you will get a clear picture.
My Setup:
GSA SER + XEvil 6 Beta + ISP Proxies (AT&T) + 300k List + 100 WebShare Proxies in XEvil
Comments
I do not have an answer to your problem, but I can help you make it less of a problem.
If you look at the list, you will notice that it is not all unique domains; the same domains, and even the same links, appear several times and give the error. Below are the domains in your list causing the error:
academy.patricia.bg
annecy-ophtalmo.fr
istanbulwheelchairrental.com
quick.fuji-pt.com
seoclck.kg
supergirosnortesantander.com.co
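If you want to verify this on your own exported list, a quick script like the sketch below will count how often each domain appears, so the repeat offenders stand out. This is just a Python helper I am assuming you run yourself; "targets.txt" is a placeholder for whatever file you export your target URLs to.

```
# Minimal sketch, not part of GSA SER: count how often each domain appears
# in an exported URL list. "targets.txt" is a placeholder filename.
from collections import Counter
from urllib.parse import urlparse

def domain_of(url: str) -> str:
    # Tolerate lines saved with or without the http:// prefix.
    if "://" not in url:
        url = "http://" + url
    return urlparse(url).netloc.lower()

with open("targets.txt", encoding="utf-8", errors="ignore") as f:
    domains = [domain_of(line.strip()) for line in f if line.strip()]

# Print the 20 most frequent domains so the repeat offenders are obvious.
for domain, count in Counter(d for d in domains if d).most_common(20):
    print(f"{count:6d}  {domain}")
```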
I suggest you go to the Options tab in your project and uncheck the option: retry to submit previously failed submitted sites.
This will reduce the load on resources and also speed things up for both XEvil and GSA SER.
Another, more advanced, option would be to remove the domains that fail in XEvil from your site list.
To find the list of URLs that keep failing in XEvil, head to your XEvil installation folder. On my machine it is installed in the C:\ root, so the path is C:\Xevil\Modules\ReCaptcha\Reports. In there you will see 2 files: good.txt and bad.txt.
The file we need is the one called bad.txt, but the problem is that it will be full of duplicates, so we need to remove the duplicate domains (yes, duplicate domains, because if there is an issue with a domain's reCAPTCHA key, then every URL and subfolder on that domain will have the same problem).
In GSA settings, go to Advanced \ MISC \ Tools \ Remove Duplicates from file:
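If you prefer to do that dedupe outside of GSA, here is a minimal sketch in Python under the same assumptions as above (XEvil installed in the C:\ root). It writes the unique domains to a new file instead of touching bad.txt itself; "Bad_domains.txt" is just a name I made up for the output.

```
# Minimal sketch: reduce XEvil's bad.txt to one entry per domain.
# Paths follow the example install location from this post; adjust as needed.
from urllib.parse import urlparse

src = r"C:\Xevil\Modules\ReCaptcha\Reports\Bad.txt"
dst = r"C:\Xevil\Modules\ReCaptcha\Reports\Bad_domains.txt"  # hypothetical output name

seen = set()
with open(src, encoding="utf-8", errors="ignore") as f:
    for line in f:
        url = line.strip()
        if not url:
            continue
        # Tolerate lines with or without the http:// prefix.
        domain = urlparse(url if "://" in url else "http://" + url).netloc.lower()
        if domain:
            seen.add(domain)

with open(dst, "w", encoding="utf-8") as f:
    f.write("\n".join(sorted(seen)) + "\n")

print(f"{len(seen)} unique domains written to {dst}")
```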
Once you have removed the duplicate domains from the bad.txt file, you want to remove all those domains from your site list so GSA doesn't keep trying to build links on sites whose captchas we cannot solve...
There are 2 ways we can go about doing this: one is to remove them manually, and the other is to use the blacklist filter:
OPTION 1- USE BLACKLIST FILTER
Go to GSA Settings and then to the FILTER tab. On the right side, select ADD, and then in the pop-up, paste the path to that bad.txt file, as in the image below.
This option preserves your site list, as GSA will now check every time whether the URL it wants to submit to is in the filter list; if it is, it skips it. This option is a bit more resource-heavy than the 2nd option below.
If you use this option, you should dedupe that bad.txt file at least once a day, or even more often.
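That daily cleanup can be a tiny script run from Windows Task Scheduler. The sketch below is only one way to do it, not something built into XEvil or GSA: it dedupes bad.txt in place, so it is best run while XEvil is idle, since XEvil keeps appending to that file.

```
# Sketch of the daily maintenance step: dedupe bad.txt in place so the
# filter file GSA checks against stays small. Path is the example location
# from this post.
path = r"C:\Xevil\Modules\ReCaptcha\Reports\Bad.txt"

with open(path, encoding="utf-8", errors="ignore") as f:
    lines = [line.strip() for line in f if line.strip()]

# dict.fromkeys keeps the first occurrence of each entry and preserves order.
unique = list(dict.fromkeys(lines))

with open(path, "w", encoding="utf-8") as f:
    f.write("\n".join(unique) + "\n")

print(f"Reduced {len(lines)} entries to {len(unique)}")
```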
OPTION 2- MANUALLY REMOVE IT WITH GSA SER
To remove the domains from your site list, do this:
Select the C:\Xevil\Modules\ReCaptcha\Reports\Bad.txt file that you cleaned before. It will then ask whether to remove domains, sub-domains, or URLs; select Domains.
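If you would rather do that removal outside of GSA, or double-check the result, something along these lines works. This is only a sketch of the same idea, not what GSA does internally, and both file names are placeholders for your own cleaned bad-domain list and your custom site-list file.

```
# Hedged sketch: drop every URL from a site-list file whose domain appears
# in the cleaned bad-domain list. Both paths below are placeholders.
from urllib.parse import urlparse

bad_domains_file = r"C:\Xevil\Modules\ReCaptcha\Reports\Bad_domains.txt"  # cleaned list from earlier
sitelist_file = r"C:\GSA\sitelist_custom.txt"  # placeholder for your custom site list file

with open(bad_domains_file, encoding="utf-8", errors="ignore") as f:
    bad = {line.strip().lower() for line in f if line.strip()}

def domain_of(url: str) -> str:
    if "://" not in url:
        url = "http://" + url
    return urlparse(url).netloc.lower()

with open(sitelist_file, encoding="utf-8", errors="ignore") as f:
    urls = [line.strip() for line in f if line.strip()]

# Note: this is an exact-domain match; it does not treat sub.example.com as
# blocked when only example.com is in the bad-domain list.
kept = [u for u in urls if domain_of(u) not in bad]

with open(sitelist_file, "w", encoding="utf-8") as f:
    f.write("\n".join(kept) + "\n")

print(f"Removed {len(urls) - len(kept)} URLs, kept {len(kept)}")
```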
That's it. You will not only have a much higher success rate, but your LPM and VPM will also go up, and you will use fewer resources.
=========
In conclusion, this is not a permanent fix, but it will at least stop you from submitting to the same failing sites over and over, which does not work.
Hope it helps