GSA 9.50 - Performance & Feedback
@Sven
Could you please explain the following:
[10:47:47] 5922-XXXXX - Tier 1: [ ] E-Mail Verification finished.
[10:47:47] 5922-XXXXX - Tier 1: [ ] Verifying finished (2879 checks done)
.........
login failed (invalid username)
login failed (you have entered an invalid username or password.)
login failed (identifiant et/ou mot de passe incorrect) [French: incorrect username and/or password]
1. How does GSA SER parse the URLs during email verification?
2. Exactly when is a URL actually verified, AFTER email verification?
3. During email verification, several submitted links are marked as Not Successful. How come?
4. Are you actually 100% sure that GSA SER uses the correct USERNAME and PASSWORD during login, AFTER email verification?
I ask because I see a huge number of failures all the time...
Comments
1. It does that by parsing the email messages: it checks whether the email's domain matches one of the submitted sites, whether the submitted URL's domain appears in the email, whether the login/password used on the submission appears in the email, and so on (see the sketch after this list).
2. No URL is verified at this stage. SER just "clicks" the URL in the email and extracts any data required for the next step (e.g. a password); later, once verification is done (status: active), it starts submitting.
3. This can have several causes. Usually the expected email never arrived, or a later submission or login failed, and the entry has since been removed.
4. Yes.
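Here is a minimal sketch, in Python, of the matching and link-extraction logic Sven describes in answers 1 and 2. SER is closed source, so every name here (PendingSubmission, matches_submission, find_confirmation_link) is a hypothetical illustration of the idea, not SER's actual implementation.

```python
import re
from dataclasses import dataclass
from urllib.parse import urlparse

@dataclass
class PendingSubmission:
    url: str       # the site we registered on
    login: str
    password: str

def matches_submission(email_sender: str, email_body: str,
                       sub: PendingSubmission) -> bool:
    """Heuristics described above: the sender's domain matches the submitted
    site, the submitted URL's domain appears in the body, or the
    login/password used on submission appears in the body."""
    site_domain = urlparse(sub.url).netloc.lower()
    sender_domain = email_sender.rsplit("@", 1)[-1].lower()
    body = email_body.lower()
    return (sender_domain.endswith(site_domain)
            or site_domain.endswith(sender_domain)
            or site_domain in body
            or sub.login.lower() in body
            or sub.password in email_body)

def find_confirmation_link(email_body: str) -> str | None:
    """SER then "clicks" a link like this one and scrapes any data
    (e.g. an assigned password) needed for the next step."""
    m = re.search(r"https?://[^\s<>\"]+", email_body)
    return m.group(0) if m else None
```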
Just because a site successfully breaks the captcha and registers doesn't mean it's ACTUALLY postable with SER. I've noticed a lot of sites that remove or rename required forms SER needs, like the "url" field, and/or add extra ones on the login page. On top of that, SER sometimes wrongly detects target sites when trying to register and/or post after registration.
Man, if I had even half as many verified links as registered accounts I'd be killing it. I'm not saying there aren't things wrong with SER, but your sample size isn't even close to big enough to draw any conclusions, IMO. I'm currently using v9.46 myself, except that I merge the engine *.ini updates Sven has made since then into my own modifications.
I run scraped lists too and see a hell of a lot more wrongly identified engines and "required variable etc. etc. not used in form" errors once I log in and try to place a link. A bunch of scraped sites will have the URL field removed, which SER cannot possibly know until after registering and logging into the account's settings, and of course other custom/renamed fields are added to thwart spammers (see the sketch below).
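To make the form problem concrete, here is a rough sketch, assuming the requests and BeautifulSoup libraries, of how one could check after login whether a settings page still exposes the fields an engine expects. The "url" field name comes from the comments above; EXPECTED_FIELDS and missing_fields are made-up names for illustration, not part of SER.

```python
import requests
from bs4 import BeautifulSoup

# Fields an engine definition would require; "url" is the one named above.
EXPECTED_FIELDS = {"url", "description"}

def missing_fields(settings_page_url: str, session: requests.Session) -> set:
    """Fetch the logged-in settings page and report which expected form
    fields are absent (i.e. removed or renamed by the site)."""
    html = session.get(settings_page_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    present = {tag.get("name", "").lower()
               for tag in soup.find_all(["input", "textarea", "select"])}
    return EXPECTED_FIELDS - present  # non-empty set => link placement will fail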
Using DBC on scraped lists is just a waste of money IMO, for these reasons. If you're going to spend more on services like that, then just get a second ReCaptcha or other OCR-type solver where you can really push the number of captchas you can send per month. Get more proxies for better IP reputation with ReCaptcha and the ability to use more than one retry. Up your retries with CB. Get CS as a final backup solver. Etc., etc.
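The tiered-solver idea in that last paragraph, sketched in Python. The solver functions and their call signatures are placeholders; the real CB (CaptchaBreaker), CS (CaptchaSniper), and OCR-service APIs look different.

```python
from typing import Callable, Optional

Solver = Callable[[bytes], Optional[str]]  # returns answer text or None

def solve_captcha(image: bytes,
                  tiers: list[tuple[str, Solver, int]]) -> Optional[str]:
    """Try each (name, solver, retries) tier in priority order: cheap local
    solvers with several retries first, paid services only as later tiers."""
    for name, solver, retries in tiers:
        for _ in range(retries):
            answer = solver(image)
            if answer:
                return answer
    return None

# Example wiring (cb_solve, ocr_solve, cs_solve are stand-ins):
# answer = solve_captcha(img, [("CB", cb_solve, 3),        # local, fast
#                              ("OCR service", ocr_solve, 1),
#                              ("CS", cs_solve, 1)])       # final backup
```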