
Verifications Extremely Low - Help to Resolve

edited January 2013 in Need Help
I realize that verification issues can stem from a myriad of different causes; however, the last time I reported issues like these (a few months back), there was a legitimate bug in the software, and it was fixed. I am also experienced enough to look for and test the usual culprits (proxies, emails, etc.). With that said, here is my problem:

About 2 weeks ago, I started using more SEs (English and non-English-speaking) with SER (92 total) and also "finally" switched over to catchall emails (yes, they are working and receiving all verification emails). I also started a completely new project (new keywords, new spun articles, etc.). On the initial run, I noticed that verifications were very low compared to what I was used to receiving (20-30%). I stopped, made a few adjustments, and still received low verifications (5-10%). I wasn't able to test much because I had to leave for a 2-week holiday vacation. Now that I'm back, here are the tests I have conducted.

- With private proxies (80 threads, 30 sec timeout): 437 submitted / 29 verified = 6.6% verification rate.

- Without proxies (100 threads, 30 sec timeout): 362 submitted / 29 verified = 8% verification rate. The percentage would have been lower if I had let it run longer.

- New Hotmail acct (switched over from catchall; left on overnight): 834 submitted / 28 verified = 3.3% verification rate.
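
For reference, the percentage is just verified divided by submitted; a throwaway check (Python, with the three runs above hard-coded):

```python
# Recompute the verification percentages from the three test runs above.
runs = [
    ("private proxies, 80 threads", 437, 29),
    ("no proxies, 100 threads", 362, 29),
    ("new Hotmail acct", 834, 28),
]

for name, submitted, verified in runs:
    pct = 100.0 * verified / submitted
    print(f"{name}: {verified}/{submitted} = {pct:.1f}% verified")
```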

Sources I am using (pretty broad, but I got great verifications with them in the past) - Articles, Forum, Microblog, Guestbook, Images, Blog Comment, Social Bookmark (SB), Social Network (SN), Web 2.0.

Now, I have kept all settings consistent over a long period of time, other than the 2 latest adjustments mentioned above (catchall/more SEs). At this point, there are only 2 conclusions I can come up with for the extremely low verifications:

1. Addition of more SEs, including ones from countries that are not native English-speaking.
2. Bug(s) within SER that need to be smoothed out.

NOTE: If anyone else has been experiencing this as well, please post here, as it may help eliminate 1 of the 2 factors mentioned.

Comments

  • LeeG Eating your first bourne

    Your HTML timeout is set too low. Max it, then work down to your sweet spot.

    Depending on the speed of your private proxies, it's normally in the 130-140 area.

  • edited January 2013
    @Lee - The setting is actually high compared to what other members have it set at (many at 10 sec or less). I've read a lot of feedback about 5-10 second timeouts with private proxies working for most users. Not saying I would do it, but that is the general consensus.

    I know my settings are more than adequate for US SEs, HOWEVER, this is a great point for those who, like myself, are using SEs outside the US. Will put it to the test. Thanks!
  • AlexR Cape Town
    @grafx77 - are you referring to the HTML timeout or the custom time to wait? LeeG is referring to the HTML timeout. My understanding is that the HTML timeout should be slightly higher than your number of threads... so I'm not sure who is setting it at 10s or less.

    Also - using a ratio of submitted to verified is quite inaccurate. I requested a feature that shows this ratio for the last 24 hours/week (as an extra column) so we could see which projects need attention, and Sven mentioned that it wasn't a good measure. Some sort of success ratio as an extra column in the GUI would allow us all to quickly see which projects are giving issues.
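
    The math behind such a column is trivial. A rough sketch (Python; the project names, counts, and the 15% "needs attention" cutoff are all made up for illustration):

```python
# Per-project success ratio over some window (e.g. the last 24 hours),
# sorted worst-first so problem projects surface at the top.
# (submitted, verified) counts and names are hypothetical.
projects = {
    "project-a": (740, 77),
    "project-b": (196, 25),
    "project-c": (457, 17),
}

THRESHOLD = 0.15  # hypothetical "needs attention" cutoff

for name, (submitted, verified) in sorted(
        projects.items(), key=lambda kv: kv[1][1] / kv[1][0]):
    ratio = verified / submitted
    flag = "  <-- needs attention" if ratio < THRESHOLD else ""
    print(f"{name}: {verified}/{submitted} = {ratio:.1%}{flag}")
```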

    Also - are there any changes to your filters? The best way to test submission rates properly would be with a big pool of results across different platforms AND with different filter settings.
  • @Global - Thanks for the feedback. I'm glad you brought up this point, as I was thinking of the custom wait time rather than the HTML timeout. I will test this out now. I have left the HTML timeout setting the same since my initial purchase and received great verification rates; however, since I am utilizing more international SEs, I believe this could legitimately affect the results.

    What is so inaccurate about the submitted/verified results? The calculation is accurate. I do realize that I could split up each "source" and test the rate of each, but I'm not ready to divide everything up like that yet (time consuming); if needed, I might. You have to realize I am testing against a "constant" (the combined sources). I have used the sources mentioned above (forum, SB, SN, etc.) with other projects and received admirable verification rates. I don't want to start splitting things up and re-testing, as that won't point me to the true issue. You always have to test against a stable constant in order to start deducing the problem.

    I totally agree with you on the verification ratio GUI. That would be a VERY nice feature. I'm not sure why Sven wouldn't implement it, as it's nothing but simple math for us, and I would think it would be fairly easy to implement.

    Filters? I don't think you mean the "filters" option under the MAIN options button; I assume you are speaking of the "Project Criteria" options. No, I haven't changed those either. I am fairly sure this is not the issue. If it were, it wouldn't affect the verification rate, only the number of submissions that SER would be able to make.
  • AlexR Cape Town
    @grafx77 - The Project Criteria filters will affect the rate. Higher filters = higher quality. E.g., pages with an OBL of 1000 are MORE likely to be accepting links than a page with an OBL of 10.

    I think you need to break your "constant" down to a per-platform basis. It's too variable with multiple platforms selected, as some of the platforms seem to get many more links in GSA. This would skew the results.

    It seems that the main issue is the HTML and custom search time variables. Let me know how it goes after tweaking these.
  • @Global - The Project Criteria will not affect the verification rate, as it filters out websites before submission. Please note this distinct difference. As stated above, it will only affect the number of websites submitted to, not the number verified.

    Based on your example: SER will find more pages with an OBL of 1000 compared to an OBL of 10, which equals more submissions, not verifications. Now you could make the argument that sites with an OBL of 1000 will have more auto approves, thus leading to more verifications, however this is not the case. My OBL filter is set to 125 and has been tested around the 100-150 mark, all leading to the same verifications. Again, this is a constant in my tests as well.


    I could break the constant down to a per-platform basis, but that defeats the purpose at this particular moment. ;-) Understand that the only reason I have a constant is that I have used these settings for a long time and know what verifications to expect from them in daily use. I already know that verifications have dropped with my constant (settings the same throughout); now I have to find out why. The only variables changed are the SEs used and the switch to catchall emails. I've tested all the other variables, and at the moment it seems that selecting more SEs from non-English-speaking countries pulls in results with lower verifications.

    Additionally, I have many projects to run and don't intend to split each project into 3-6 separate "micro projects"... maybe 2 at most. I can also view which sources have been producing more verifications by simply looking at the Show Diagram Chart figures. It seems Forums are the winner by a landslide, followed by guestbooks and social networks. Also note: I have been running GSA at least 8+ hours a day for 3-4 days (this includes stoppages to change settings). At only 208 verifications, that's pretty pathetic.
    [screenshot: Show Diagram Chart]


    TEST RESULTS

    As for the test results, here is what I came up with over the last few days.

    - 60 threads / 120 custom timeout: 740 submitted / 77 verified = 10.4% (better)

    - 80 threads / 140 custom timeout: 196 submitted / 25 verified = 12.7% (better, but too small a sample to jump to any conclusions)

    - 80 threads / 140 custom timeout WITH 120 HTML timeout: 457 submitted / 17 verified = 3.7% (this ran for 10+ hours; not only did I receive far fewer submissions, which was to be expected, but the verifications are horrible). Conclusion: going back to the default 30-40 sec HTML timeouts.

    I'd still like to hear from others to see what their percentages have been over the last few weeks. I am merely trying to achieve 20% or higher verifications with the sources I have. I remember, a few months ago, receiving 40%+ verification rates on average with my settings and standards of use.
    The last variable to be tested is the # of SEs used. It was my hope that broadening the scope of SEs would bring in more results, but it may be working against me instead of for me!! :-(
  • AlexR Cape Town
    @grafx77

    1) "Now you could make the argument that sites with OBL of 1000 will have more auto approves, thus leading to more verifications, however this is not the case. " Exactly what I was trying to say, just more succinct. Point noted though.

    2) I have also noted that when a project starts it goes quickly. Could it be that your niche is limited? I.e. there are limited target sites, and these get used up very quickly? 

    3) Do your keywords overlap? I.e., you take 1 keyword, it gets lots of results, and all your others are variations on that keyword, meaning you have already parsed those sites? (SE type will also cause this; I'm starting some testing on this aspect.)

    I know you're going to tell me to focus on the verification aspect, but just want to be certain that the above is sorted. 

    Finally, maybe you have the issue of the catchall domain getting blacklisted? I haven't had this/tested for it yet, but it could be the case for you?
  • 2. No, the niche isn't limited; plenty of keywords. Plus, I also mentioned that I am using more SEs now, so even if the niche were limited, the new SEs would bring in more results. This also wouldn't have any bearing on verifications.

    3. Who knows if they overlap? That is something you will never be able to view or test.

    How would a catchall get blacklisted if you're using variations of emails on every single submission? The only part that could be blacklisted is the domain, which I'm not sure how to check. CheckForumSpam doesn't allow you to check domains, just email addresses. I highly doubt this is the case, since I just started using this domain.

    You could really help me out, Global, by giving me the latest verification rates you are receiving with SER on your particular campaigns. If your verifications have been adequate, let me know what settings you are using (like # of threads, proxies, custom timeout, and types of SEs selected).

    So far I'm running SER with a low thread count and a high custom timeout, and it seems to be doing well; however, it just started. 34 submitted / 17 verified is the current stat.


  • AlexR Cape Town
    @grafx77 - over the last few weeks I have been restructuring everything, so I don't have any reliable data to share. I've only been targeting blog comments, to be honest, while I restructure.

    Basically, I want to focus on efficiency:
    1) Improved Verification to Submission Ratio.
    2) Less Resource Usage
    3) Higher Quality Links
    4) Less Spam

    Please check out:

    I'm currently working out a test to measure keyword overlap, as well as SE overlap. (See the discussion/logic above.)
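
    One form the test could take (a sketch, assuming the scraped target URLs can be exported per keyword; the keywords and URL sets below are dummies):

```python
# Keyword overlap measured as the Jaccard similarity of the target-URL
# sets each keyword scraped; high overlap means the keywords are mostly
# re-parsing the same sites.
from itertools import combinations

scraped = {
    "keyword one":   {"site-a.com", "site-b.com", "site-c.com"},
    "keyword two":   {"site-b.com", "site-c.com", "site-d.com"},
    "keyword three": {"site-e.com", "site-f.com"},
}

for (kw1, urls1), (kw2, urls2) in combinations(scraped.items(), 2):
    jaccard = len(urls1 & urls2) / len(urls1 | urls2)
    print(f"{kw1} vs {kw2}: {jaccard:.0%} overlap")
```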

    For your catchall domain being blacklisted, take a look at http://mxtoolbox.com/blacklists.aspx and see what shows up. I would be interested to know.
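
    If you'd rather check from a script, here's a minimal sketch (Python, assuming the dnspython package and the Spamhaus DBL zone; mxtoolbox queries many more lists than this one):

```python
# Query the Spamhaus DBL for a domain: NXDOMAIN means "not listed",
# any A record in the reply means "listed".
import dns.resolver

def dbl_listed(domain: str) -> bool:
    try:
        dns.resolver.resolve(f"{domain}.dbl.spamhaus.org", "A")
        return True
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return False

print(dbl_listed("example.com"))  # put your catchall domain here
```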
  • edited January 2013
    Ahhh yes, I've used mxtoolbox before. I even have it in my favorites, among a ton of other resources I constantly forget about... lol. I just checked the domain and it's clear. No problems.

    I'll check out the other links you gave me tomorrow. Thanks for your input Global!

    BTW, I'm going to request a "submissions" tracking feature for the Show Diagram Chart. I always enjoyed looking over my verification stats on that chart, but it would be so much better if it had the submission stats as well.
    This way we could view the verification rate of each source over a set range of dates, and determine which sources offer better verifications without splitting projects into so many separate "micro projects" for each source.

    Here is the link: https://forum.gsa-online.de/discussion/1450/show-diagramchart-submissions ....I'd appreciate a +1 on the post, if you agree with the feature. ;-)
  • My latest test results from last night seem to have increased the verification rates substantially!! :-) Hopefully SER will continue at this rate with the new settings I have put in place.

    Proxies - 40 threads with 20 private proxies / 120 custom timeout: 200 submitted / 70 verified = 35% verification rate :-) I think I hit that sweet spot.
  • I have just started my first campaign, following as much as possible from you guys.

    My trouble is, I have had no benchmark, so I may have burned my first site.

    Out of 1521 submissions, I have only 40 verified.

    It could be my proxies. I am using Proxy Rack (similar to errsy); they were fine when I bought them but have not been testing well. The trouble with these types of proxies is the difficulty of testing: GSA can only manually test 1 at a time, and I don't want to do an online test, which may well rip them all.

    Grafx77, by custom timeout do you mean the custom time between search engines?

    I had mine set to 6 :-L

    I am now running 50 threads with custom proxies, a 120 custom timeout, and a 60 HTML timeout - test running now.

    Thanks for the thread

  • edited January 2013
    I need to know how many private proxies you're using.

    Yeah... custom timeout = your proxy timeout under proxy configure > options. I would set this to 30-40 seconds (some recommend 5-10 seconds). I would use an HTML timeout of 120. I can't say whether your thread count is right because, again, I don't know how many proxies you're using. Also, uncheck the "custom wait time between search engines".

    1521/40 is very low, about a 2.6% verification rate. You should be aiming for around 20%+.
  • edited January 2013
    These are the results from yesterday. I let GSA run for about 12 hours and I got:
    • 7399 submissions / 1922 verified links ≈ 26%
    • using 40 private proxies (140 HTML timeout).
  • ron SERLists.com
    edited January 2013

    @hyde - That's an abnormally high verification rate. Most tend to be at 15% - 20% - but that's for just about all platforms together. Are you focusing on just a few platforms? Or are you importing lists? Because both of those can skew the verification rate upwards.

  • @ron
    I guess it depends on the lists.
    I'm using both GSA's scraper and my own lists (different projects). I guess that's the reason.
  • Ozz
    edited January 2013
    @grafx77: "your proxy timeout under proxy configure > options. I would set this to 30-40 seconds (some recommend 5-10 seconds)."

    I wonder why you set your proxy timeout so high? It doesn't make sense to me, because if a proxy doesn't react within a few seconds it's basically dead to me and will just slow down your submission speed.

    Set the timeout to 1 second with private proxies so you keep the fastest and get rid of the slow ones. 
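
    If you want to time them outside of SER, a quick sketch (Python with the requests package; the proxy entries are dummies in user:pass@host:port form):

```python
# Time each proxy with a hard 1-second cap; anything that cannot answer
# within the window gets dropped as too slow.
import requests

PROXIES = ["user:pass@1.2.3.4:8080", "user:pass@5.6.7.8:8080"]  # dummies
TIMEOUT = 1.0  # seconds

for proxy in PROXIES:
    url = f"http://{proxy}"
    try:
        r = requests.get("http://www.google.com",
                         proxies={"http": url, "https": url},
                         timeout=TIMEOUT)
        print(f"{proxy}: OK in {r.elapsed.total_seconds():.2f}s")
    except requests.RequestException:
        print(f"{proxy}: too slow or dead - drop it")
```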

    BTW, how do you manage that option with private proxies anyway? Do you have it set like this and parse your private proxies every time from a txt-file?
    [screenshot: proxy options]

    The reason I ask is that I think the 'Threads' and 'Timeout' settings have absolutely no effect if you don't search for new proxies every XX minutes. The proxies won't get tested when there is nothing to search for, as you are using a fixed list of private proxies?! I might be wrong about that, though.
  • Thanks guys, it was my newbie side shining through, lol

    I ran again, with some tweaks, but practically the same results.

    This got me to 1963 submitted, and 44 verified.

    Then I started to look at CS.

    I updated with the installer, and altered settings to those recommended.

    My first couple of hours today are very good:

    2081 submitted, with 79 verified.

    Making this run 118 submissions with 35 verifications = 29.66%. Great.

    The program has now paused in line with my settings, now to ramp up the campaigns.

    Grafx, I am using 25,000 proxies, in rotation with other users.
  • I think the verification rate depends on what platforms you are using. I mainly use contextual links (like blogs and wikis) and get 50-60% verification, while article directories get around 20%. When I select all platforms (articles, forums, guestbook, image, social bookmark), I get around 30-40%.
  • @Justme - OK, well there's your problem. You're using 25,000 public proxies! You should be using private proxies; your rate should rise substantially when you do. This has been stated time and time again on this forum.

    @Ozz - That's not terribly high, even though I have read about users utilizing 5-10 seconds. I was under the impression that this is the timeout for page loads through the proxies, not the time it takes for a proxy to be recognized. The lowest I would set it is 5-10 secs if the timeout truly corresponds to proxy recognition.

    I don't understand >>"how
    do you manage that option with private proxies anyway? Do you have it
    set like this and parse your private proxies every time from a txt-file?"

    I've been told specifically by Sven (I posed the exact question in another thread) that the threads set under proxies are the # of threads used by the proxies, and the main threads setting under General Options is the # of threads without proxies; they are controlled separately. The timeout under proxies I am still unsure about, as mentioned above: either a timeout for page loads or for proxy recognition.

  • Ozz
    edited January 2013
    I think you've misunderstood or confused something. Or Sven was confused by your question. Or whatever :)

    The #threads in the proxy options is the number of threads that SER uses when it's testing the proxies. Those threads open in addition to the threads defined in general options.

    The #threads in general options are used for everything but testing the proxies.

    @Sven should confirm this though, but I'm pretty sure that '#threads in proxy option' have no use at all if the proxies won't get tested from time to time.
  • Isn't the 'options' section under 'proxy list' just for scraping proxies? If you are scraping proxies, that thread option determines how many threads are used for proxy scraping, and 'timeout' is the time a proxy is allowed to take to return the test string; the lower the timeout, the faster the proxies that get saved. So basically, use that timeout option to get faster proxies.
  • Yes, that's my understanding of it, too (99% sure).
  • Hmmmm... this is quite the quandary we have here. I suppose Sven will need to clear this up for 100% closure on this case. If the proxy options are truly used for gathering/testing public proxies ONLY, then I have been in the dark this whole time.

    I'm going to dig for the "exact" question I sent Sven about this and his response now...
  • Ahaaaa!! Found it! I asked this exact question a few months ago and here is what Sven stated: https://forum.gsa-online.de/discussion/862/which-threadcount-is-being-used-for-proxies#Item_3

    Note: "proxies have there own threads handling. It's independent."
  • Sven www.GSA-Online.de
    @Ozz is absolutely right (as almost always) ;) Everything you see in proxy options is just for that...proxy testing/finding and has nothing to do with the program itself.
  • edited January 2013
    Well, I'm glad Ozz could straighten this out then, because I had been going off your answer in the thread listed above!
    :-?
  • grafx77, sorry about the confusion: the 25,000 proxies are private, with username and password.

    They are on a relayer setup

  • AlexR Cape Town
    @justme - 25,000 private proxies? Can you shoot me a PM with a little more info on how you've set up so many?