
Thread/Timeout For Index Checking.

"11.19 - new: ability to specify the type of proxies to be used for indexed-checking"

Just seen this on the changelog. It's a nice feature, but is there any way for the user to set the thread count and timeout for index checks?

Currently I use Scrapebox for this as I have it set up how I want it, but it would be easier to have the same functionality in SER.

I have dedicated proxies that I use for index checking against Google, and it seems that over a long duration, if a proxy hits Google more than once every 30 seconds it is soft-banned. I have Scrapebox set up so that each proxy hits Google at most once every 30 seconds, and I just leave it index checking 24/7 with no problems.

For example, SER could let the user say: use one thread for index checking with a 30 second timeout and one proxy, so it never gets banned; or, with 10 proxies, one thread with a 3 second timeout.
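The throttling described here, one query per proxy per cooldown window, can be sketched as a small rotator. This is only an illustration of the requested behaviour, not how SER or Scrapebox actually implement it, and the class and parameter names are made up:

```python
import time
from collections import deque

class ProxyRotator:
    """Rotate proxies so each one queries Google at most once per
    cooldown period. Illustrative only; not SER's implementation."""

    def __init__(self, proxies, cooldown=30.0, clock=time.monotonic):
        self.cooldown = cooldown
        self.clock = clock  # injectable for testing
        # queue of (next_allowed_time, proxy) pairs, all ready at start
        self.queue = deque((0.0, p) for p in proxies)

    def acquire(self):
        """Return the next proxy, sleeping if it is still cooling down."""
        next_time, proxy = self.queue.popleft()
        wait = next_time - self.clock()
        if wait > 0:
            time.sleep(wait)
        # the proxy goes to the back of the queue with a fresh cooldown
        self.queue.append((self.clock() + self.cooldown, proxy))
        return proxy
```

With 10 proxies and a 30 second cooldown, a single thread draining this rotator ends up issuing roughly one query every 3 seconds, which matches the example above.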

I'm really tired right now, so I'm not sure I have done the best job of explaining what I mean; if it doesn't make sense, just say.



  • SvenSven
You can set up that 30 seconds in options as the waiting time between search queries. 30 seconds is way too low in my eyes, so I would still use 180 there. But that's up to you.
    The number of proxies to be used is also taken from the main options.

That time however is used for each IP...meaning that if you have more proxies, the whole process is of course faster, as other proxies can be used while one is in its waiting loop.
  • shaunshaun
Ah right, I never knew that. So just to confirm I understand correctly: when I go Right Click - Verified - Verify, that is treated as a search by SER and uses the search settings in options?
  • SvenSven
  • shaunshaun

Today I tried moving my index check over to SER, and after a few minutes it started returning URLs as yellow rather than red or green. I pressed abort, and the summary at the bottom gave the breakdown. I'm guessing green is indexed, red is not indexed and yellow is banned?

Literally a few seconds later I pasted the same URLs into Scrapebox with the exact same proxies and clicked start; it has now index checked around 500 URLs with no problems reported.

    Any ideas if it could be a bug or user error?
  • SvenSven
    yellow means that this ip or proxy is banned on google.
  • shaunshaun
@Sven but they weren't banned on Google. I instantly ran the same test in Scrapebox's index checker with the exact same proxies I let run in SER, and it is still running now, hours later, with no issues.
  • SvenSven
hard to say what it is, do you know how scrapebox checks its indexed status?
  • shaunshaun
    No idea :(
• @shaun, are you guys referring to this: 2016-10-12 21.35.19.png?dl=0
    green = Google indexed? That doesn't correspond with my results..

@sven, could you add a column to the overview, like: Status, Priority, Submitted, Verified, G Indexed, DoFollow?

As for yellow being a banned proxy: won't it retry? I am using a proxy gateway, so I only have 1 IP but can run 300 threads on it. So my timeout should be set to 1 sec, right?

  • shaunshaun
@Myhq no mate. If you right click a project - Show URLs - Verified (the wording might be a little off, I am not able to get onto a server to check right now) and then press index check at the bottom of that window.
  • @shaun, cheers, got it!

Is there any way I can automate this so all my Tier 1 links are updated, say, every 5 days (just like we re-verify)?

@Sven, how would I go about the proxies with a gateway? I have 1 IP, 300 threads, and a pool of 10,000 proxies. I am allowed to use 50 threads for concurrent Google queries.
  • SvenSven
    next version will allow you to specify the waiting time between queries in a dialog.
  • myhqmyhq usa
    edited October 2016
    Hi @Sven

Could this be implemented more visibly and automated:

    1) in "Settings" add variables for global proxy, time-out and threads related to "indexed check"

2) per tier, add "Indexed Check" & "re-check (not indexed links) every ...." similar to the "verification check" (it only has to work for verified links). I would only use it for Tier 1 to my money site and disable it in others.
>> An easier implementation would maybe be a tickbox to "check if indexed", so that all verified links are automatically checked after they are verified, on the same interval as the (re-)verification check, etc.

    3) add the number of indexed links right of the "verified" column in the dashboard.

    4) from the indexed links, I would also be interested in DoFollow number..

I think it all comes down to the indexed links you create, so I consider it very valuable info to display more prominently, especially since the tools and info are already available.
  • your question: "hard to say what it is, do you know how scrapebox checks it's indexed-status?"

    @sven: they use the "info:" operator, so:

    If a result is returned, that link is indexed.

However, they easily block proxies, so it's important to retry if a proxy is blocked..
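A minimal sketch of an index check along those lines, assuming the "info:" operator mentioned above. The URL shape and the block-page marker strings here are assumptions for illustration, not Scrapebox's actual implementation:

```python
from urllib.parse import quote

def build_info_query(url):
    """Build a Google search URL using the 'info:' operator for one link.
    Hypothetical query shape; Scrapebox's exact parameters are not public."""
    return "https://www.google.com/search?q=" + quote("info:" + url, safe="")

def classify_response(status_code, body):
    """Map a raw search response to indexed / not-indexed / blocked.
    The marker strings are assumptions about what the pages contain."""
    if status_code == 429 or "unusual traffic" in body.lower():
        return "blocked"  # proxy banned; retry later (yellow in SER)
    if "did not match any documents" in body:
        return "not-indexed"
    return "indexed"
```

Splitting query building from response classification keeps both halves testable without hitting Google at all; the actual fetch through a proxy would sit between them.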
it also asks for a captcha sometimes, so integrating captcha solving there could also significantly improve the success rate..
  • Hi @sven,

    any comments on adding "indexed" (and "dofollow") to the overview? Right of Submitted.

    I guess "indexed" is the main metric everybody is/should be optimising for, especially in their first tier..
  • SvenSven
    i have noted it down...however this is nothing i can add quickly
I understand it is not a quick tweak, but it would be super helpful.

All info/ideas for the simplest implementation are in this thread. Keep me posted about this )

PS. Since the indexed check is a Google query, it is important that we can activate this per tier, and set the max number of threads for this global process separately.
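The proxy budget this implies can be sanity-checked with simple arithmetic, assuming the one-query-per-30-seconds soft-ban threshold reported earlier in the thread (an observation from this thread, not a documented Google limit):

```python
import math

def proxies_needed(checks_per_minute, cooldown_seconds=30):
    """Minimum number of proxies so that no single proxy queries Google
    more than once per cooldown window."""
    queries_per_proxy_per_minute = 60 / cooldown_seconds
    return max(1, math.ceil(checks_per_minute / queries_per_proxy_per_minute))
```

For example, 5 index checks per minute at a 30 second cooldown needs at least 3 dedicated proxies, since each proxy can only safely make 2 queries per minute.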

  • hi @Sven, I also get only yellow results. while my proxies pass google search test... What can this be?
  • SvenSven
    made some changes in 11.35 and hope it fixes things for you.
  • cheers!

    what is the best way to bring a feature request under the attention of fellow users? I think adding the "check indexed" as option to automate per tier, just like verification, and add it to the overview, is a great indicator of the success of your campaigns (especially first tier).

    The amount of indexed links, and the amount of DoFollow amongst them, are what we all do it for, right?
  • SvenSven
    well this might be a problem to do all the checks for each URL. Your proxies will not last long.
I would only activate this for the verified links in my first-tier projects, and it could throttle based on the number of concurrent Google threads a user can run. I can do 50, and have an average VpM of 40~50 over ALL tiers; my VpM over first tiers is about ~10% of that, coming down to 5 checks a minute... I would say I could do that with 1 proxy even. Everyone can decide and calculate for himself which tiers, and how many, he turns this function on for, right?

But having it running in the background, updating the overview, would be gold. It will take me hours to do with the current setup (manually, one by one).

Important is only that fails (yellow) are retried. (I use a proxy gateway, so I don't want to burn the IP, just retry until it gets the info.)
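The retry-on-yellow behaviour being asked for could look like the sketch below. `check` stands for any hypothetical callable returning 'indexed', 'not-indexed' or 'blocked' (this is not SER's API), and the backoff numbers are arbitrary:

```python
import time

def check_with_retries(check, url, max_tries=5, delay=2.0):
    """Re-run an index check until it returns a definitive answer,
    instead of reporting a blocked proxy as a failure."""
    wait = delay
    for attempt in range(max_tries):
        result = check(url)
        if result != "blocked":
            return result  # definitive: indexed or not-indexed
        if attempt < max_tries - 1:
            time.sleep(wait)
            wait *= 2  # simple exponential backoff before the next try
    return "blocked"  # still blocked after all retries; report it as such
```

Through a gateway with a rotating pool behind one IP, each retry is likely to come out of a different exit proxy anyway, so a short delay is usually enough.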
  • @Sven,

is this moving up the todo list, or should I think about a workaround?

I would just like the indexed stats of my Tier 1 projects' verified links, automatically, added to the overview.

Optimizing campaigns for the highest VpM is one thing, but optimizing for getting links indexed is far more important, right? It would also allow testing the different indexing services/methods you are affiliated with.

    Depending on my proxies, I might enable this for other tiers too, or not...

    Maybe you have some alternative ideas?
  • AxiusAxius Ukraine
    edited December 2016
Please tell me, could such a delay be related to the type of proxy server? Earlier I used a paid proxy server, then moved to a free one, and a small problem started. Thank you in advance.