
Where do you index-check links in bulk?

I have been using SER, but I only have 30 proxies in SER and I can't get many checks out of it before they get banned.

Can anyone suggest anywhere or anything to use to bulk index check?

Thanks

Comments

  • Sven www.GSA-Online.de
    If you have private proxies, make sure not to use the option to disable proxies in the proxy options.

    Thanked by 1Kaine
  • Recently, I have not been able to get good index checks in the link manager either; it mostly says index check banned, even though the proxies are working fine in Scrapebox and solving Google reCAPTCHA.

    Other than all these "redirects" showing in GSA and Scrapebox, like the failed '302', which is a newer error in these programs for me. I think it's more that the proxy "providers" are playing games with junk proxies.

    I've probably tested every setting in SER's proxy scraper recently too, and tried different sources for free scraped ones, though the ones I've used for index checking are a mix of private proxies from other "services".

    I have not found a solution so far. I don't have the boxes checked as above either. It was not testing the newly added proxies (I think I set it that way) and I locked the private ones. I didn't see an easy way to test just the "unchecked" proxies, but it was the end of a long day.

    Also, the proxy scanner tells me some are bad when I know for sure they are not. Then I highlight some that just tested bad, a smaller batch, and they turn back to green right away. So the testing is not always accurate. And I was only doing about 300 at a clip, then retesting the 17-20 you can highlight in the GUI.

    I will see if I can come up with a solution, but I have not had stable results in that area recently. I will try again and share if I find something better.

    I think people using and blocking VPN networks and Cloudflare has become a much bigger issue lately for getting tasks done on the web. Even manually it's a pain to get some tasks done.




  • cherub SERnuke.com
    I've been using https://www.indexedapi.com/ for when I need to bulk check indexing, instead of having to keep a decent set of Google-passed proxies. Though I find that checking indexing too often is one of those 'going down the rabbit-hole' situations, where you end up wasting time that would be better spent just building tier links and trusting them to perform to a certain extent.
    Thanked by 1backlinkaddict
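For anyone going the API route, the general shape is simple: batch your URLs and POST them. The endpoint, payload, and response format below are my assumptions for illustration only, not indexedapi.com's documented API; check their actual docs for the real parameters.

```javascript
// Split a large URL list into request-sized batches.
function chunk(urls, size) {
  const batches = [];
  for (let i = 0; i < urls.length; i += size) {
    batches.push(urls.slice(i, i + size));
  }
  return batches;
}

// Hypothetical bulk index check: endpoint, header name, and the
// { "<url>": true|false } response shape are all assumed here.
async function bulkIndexCheck(urls, apiKey) {
  const results = {};
  for (const batch of chunk(urls, 100)) {
    const res = await fetch('https://www.indexedapi.com/api/check', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json', 'X-Api-Key': apiKey },
      body: JSON.stringify({ urls: batch }),
    });
    Object.assign(results, await res.json());
  }
  return results;
}
```

The batching matters because most paid APIs cap request size; the per-batch loop also makes partial failures easier to retry.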

  • Yeah, APIs are the way to go for a lot more things these days.

    I also wonder about constant index checking: you're continually going out and doing a site:/backlink search, and it may seem a little strange to Google that all your backlinks are being rechecked from random proxies rather than well-known data collection bots.

    Maybe it can look suspicious, and like you say, some things are better left alone, with the time spent on other tasks.

  • I did see Speedy Index the other day, and they have an index checker now.

    They charge after 50 free index checks, and are likely using Indexed API as you mentioned.

    Way easier than having to keep a list of good proxies on hand for index checking, though, for sure!
  • Sven said:
    If you have private proxies, make sure to not use the option to disable proxies in proxy options.


    Yeah my proxy settings are not set to disable, thanks Sven.

    I finally got round to testing Speedy Index, and I wanted to properly check the indexed status of the links. I've been using Scrapebox since I had those errors in SER, and it seems to check OK.

    I sent them my free 100 URLs and so far 5 are indexed after almost 72h. I really mixed them up though: I just grabbed random links from my money site links and also included 25 from SEREngines.
  • You don't want your private proxies set to disable! 

    So long as they are good private ones.

    Otherwise, if you leave the options to keep checking for free proxies and testing, even when you lock the private ones and use other options, somehow they get disabled anyway. But if you test just the private ones again, you should find they are still good.

    So you end up with good but disabled private proxies.

    I've spent some time in here recently; you definitely don't want the private ones set to be disabled.

    I would disable the public, less stable ones.




    I would also use Scrapebox for index checking.

    For Speedy Index those are horrible results; did you send only indexable do-follows to the bot?

    I'd filter out of the list any junk links that are not going to get indexed anyway.

    I've been steadily getting 66+ percent of sent links indexed.
  • Here are recent results, over 80 percent. It's labeled as a serLib test, but it's a decent mixture of links...

    And they refund half of the unindexed links back to your credit.

    I've been seeing this pretty consistently, so 66 plus is being fair. Also, I have found more than once that the indexing rate is higher than reported.


    They did add an index check too; I haven't tried it yet. There are 50 free. I know the indexing is very cheap, so I can't see the index checking being any different, and maybe it's more accurate as well.
  • I only use private ones from Buyproxies. As for Speedy, I'm not sure which ones I sent, but I know I included 25 SEREngines sites.

    I now have this so far:



    I checked and 12/25 SEREngines are indexed so far. The others were contextual do-follow. I wasn't sure of a way to target only the engines that are indexable? I know we can do this in tiers, to select only those. But for engine selection we can't, can we?
  • Anth20 said:
    I only use private ones from Buyproxies. As for Speedy, I'm not sure which ones I sent, but I know I included 25 SEREngines sites.

    I now have this so far:



    I checked and 12/25 SEREngines are indexed so far. The others were contextual do-follow. I wasn't sure of a way to target only the engines that are indexable? I know we can do this in tiers, to select only those. But for engine selection we can't, can we?

    I thought SEREngines was gone... wow, does this still exist? If so, then No Hands SEO would be working as well, lol... Anyway, how is it performing?


    How is Buyproxies nowadays? I am also thinking of changing my proxy provider. I've been using IPv4 and IPv6 proxies from reproxy.network; they are great for XEvil but not much good, IPv4-wise, for RankerX and GSA SER.

    Speedy is a great link indexer... But I still prefer tiered link building with my lists.
  • Anth20, some of those engines won't index, and I noticed a few that seem to index only a handful if they are pointing at the same domain.
  • My bad guys, I meant to say SERlib. I'm ill with a virus atm :D
  • I think I just made the same mistake in another thread after reading this, oops :D
  • Kushphkt TMobile
    Scrapebox is the way to go. Gsa is so bad when handle operators form checking proxies to scraping. @Anth20, how about your experience with SERlib? Last time I checked it was a total mess, from RAM and high CPU usage to being hard to get indexed. It's better to spend a few more bucks and stick with RankerX; try it if you don't own it, it's much better than SERlib itself. There is a nice peer review on BHW as well.
    Thanked by 1Freshairforlife
  • Sven www.GSA-Online.de
    @Kushphkt "Gsa is so bad when handle operators form checking proxies to scraping" Please elaborate on this!
  • Kushphkt, I would find that elaboration pretty helpful as well.

    If something isn't "working" for you and you provide no example, how can @sven fix it?

    Seems like he's always eager to address issues... quickly.

    Though, the comment doesn't seem to make much sense, in English.

    If you are going to suggest another way to parse for 2024, Scrapebox is your suggestion?

    Not sure why you're comparing a $50 monthly tool that's been around for years to a newer SER add-on that was only recently being developed.

    I can't comment on the current serLib, likely due to all this negativity and people saying things that are not true with no evidence, or complaining because users don't understand what's needed for some browser-based engines to be successful and have a good stick rate.

    Likely, the developer is keeping the project or the good engines for himself. Can't say I blame him. Either way, I hope @AliTab is doing OK.

    These are serLib links... getting indexed.



    These are also mainly serLib links getting indexed... so I guess it matters what content and which links you are sending, because with a working indexing service they seem to index above 80 percent for me.



    So 80 percent in a few days, on a few tests, while triple-checking with multiple programs and different settings, is a bad indexing rate?

    I was asked to provide a test after one user was running around saying serLib does not work and crying for a $20 refund... It seemed to be looping through each engine, and a bit more aggressive on some blogs, as I had set...



    This was the same time period; this crying about a refund and trying to tarnish the service was documented here...

    I guess some users should check their settings, have basic SER knowledge, and actually understand what's going on before they make false claims.

    This is just one simple example of a quick test as well...

    Thanked by 1AliTab
  • AliTab GSAserlists.com
    Thank you for your support, @backlinkaddict.

    I've been focused on maintaining and enhancing the quality of my services, as well as developing a new service that will be launching soon. I won't go into more detail until it's officially released.


    @Kushphkt, since version 4.7 of SERLib https://gsaserlists.com/serlib-changelog/ , I have not received any reports concerning high CPU usage, and I believe this issue has been resolved within SERLib. Please note that regardless of the software you use to create and manage your Web 2.0s, they all share the same basic principles and mostly utilize Chromium, which is CPU-intensive. To run a higher number of threads effectively, a robust setup is required. The advantage of SERlib is that it centralizes management, making tiered link building much easier and fully automated.
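On the thread count point: since each headless Chromium instance is heavy, the usual pattern is to cap how many run at once rather than launching one per link. This is a generic sketch of that idea, not SERLib's actual code; the names are mine.

```javascript
// Minimal promise pool: run at most `limit` async tasks concurrently.
// `tasks` is an array of zero-argument async functions.
async function runPool(tasks, limit) {
  const results = [];
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++;              // claim the next task index (safe: JS is single-threaded)
      results[i] = await tasks[i](); // run it; result order matches input order
    }
  }
  // start `limit` workers that drain the shared queue
  await Promise.all(
    Array.from({ length: Math.min(limit, tasks.length) }, worker)
  );
  return results;
}
```

With something like `runPool(urls.map(u => () => postToSite(u)), 4)` (where `postToSite` is whatever launches a browser session), you keep CPU bounded no matter how long the URL list is.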
  • edited May 14
    @AliTab I'm glad to see you're OK!

    I have been busy building a few workarounds/projects for myself lately also :)

    You're absolutely correct about browser-based engines being resource intensive by nature; also, those orphan pages can be tricky to close down at times. This is how it's always been, not just recently; I mean forever, with all these tools. Socket-based scripts and browser-based scripts (which you need for most high-quality sites) are treated VERY differently!

    I just laughed when users were complaining about "slow scripts and cumbersome mouse movements in the engines." To be honest, these qualities are needed if you want successful account creations and backlinks that will stick and provide rankings. Where are those users now, anyway? I guess another batch of people saying this or that doesn't work, when they only have their own actions to blame.

    Here is just a very small sample of some code needed to pass a bot test...

    // sleep for the given number of milliseconds
    await sleep(2075);

    // random extra human simulation on the page object
    await simulateHumanInteraction(page);

    // find the element whose name attribute is set to "eMail"
    const emailInput = await page.$('[name="eMail"]');

    // click on the email input field (extra emulation)
    await emailInput.click({ clickCount: 1 });

    // pause again before clearing the field
    await sleep(2875);

    // must clear the field, as a double-click will not overwrite here
    await emailInput.evaluate(element => { element.value = ''; });

    // short pause before typing
    await sleep(1000);
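
The `sleep` and `simulateHumanInteraction` helpers are among the parts left out above; `simulateHumanInteraction` stays private, but a minimal `sleep` would typically look like:

```javascript
// promise-based sleep so `await sleep(ms)` pauses the script
// without blocking the event loop
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}
```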


    This is just one very small sample, and no, I didn't leave in the defined variables or anything important really; getting a browser to pass CF is a script/secret on its own. If your scripts run too fast, you are likely just building links that will be deleted or not indexed, and/or your "software/bot" will be redirected elsewhere on many of these authoritative platforms.

    This is also costing you captchas, proxies, emails, content, time, etc. I don't know about others, but I would rather have a script take 3 minutes, or even a day, to get a nice post on a high-quality engine that will stick and provide the rankings I'm really after. I'm not trying to build backlinks for the sake of seeing how many I can build.

    For me, the excitement of link building and automation is not like it was 12 years ago. Don't get me wrong, it's still very fun and things change, but you have to keep up with them to be successful and on the cutting edge.

    I think some users don't understand or appreciate all the work that goes into even one successful script, or even one account creation. For this reason, many have turned their software into something that just posts to their own sites. Another strategy, but not one of my favorites for a first tier.

    Anyway, glad to see you're back!



    Thanked by 1AliTab
  • Kushphkt TMobile
    @Sven

    When I use Google-passed proxies and check the index on verifieds in GSA, it takes ages, while in Scrapebox all the proxies are fine and there are no errors at all. But GSA starts failing after checking a few. It's been like that for ages, and I guess most of us own SB, so there is nothing to worry about. GSA SER always does its major job: submitting to a vast variety of engines.

    @backlinkaddict
    I have not used SERlib recently, so I'm not sure about the laggy thing. I opted for RankerX and never looked back. I really don't care whether a tool was just developed or is old; I am just looking for a user-friendly and affordable solution.

    I use GSA SER as it is. I feel it's clean, and that's enough for me. So far I am good with what I have. I also use Speedy Indexer, and they are pretty good at indexing on Google, but not on other search engines. If you do this with RankerX, I am pretty sure the results are even better. Speedy Indexer's indexing ratio is pretty insane and does work with Google, and account suspension is much rarer. But it never works with other engines, which is a concern. So now I use RankerX and the best backlinks indexer, which works nicely with other engines too. After all, RankerX for me.

    Sorry, I don't come here much. No, I don't scrape, as it's cheaper to buy link lists for a few bucks. I just said that checking indexing is way more reliable and hassle-free with SB; try it. If you are using Speedy Indexer, their index-checking fee is a bit higher, and GSA is nearly impossible to check indexing with, while Scrapebox with verified Google proxies always does the trick.




  • Sven www.GSA-Online.de
    @Kushphkt I don't own Scrapebox and don't know how it checks whether a URL is indexed or not, but GSA does a full search on G. with that URL to see if a result is returned with the exact URL. Maybe you can get me details on how Scrapebox performs this task.
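For reference, the exact-URL check described here can be sketched roughly as below. The helper names and normalization rules are mine, and the query variants mirror the options Scrapebox-style tools expose (plain URL, quoted URL, site: operator); neither tool's internals are documented in this thread.

```javascript
// Build the query variants index checkers typically send to a search engine.
function buildIndexQueries(url) {
  return [
    url,                                        // plain URL search
    `"${url}"`,                                 // URL in quotes
    `site:${url.replace(/^https?:\/\//, '')},`.slice(0, -1), // site: operator, scheme stripped
  ];
}

// A URL counts as indexed only if a returned result matches it exactly,
// ignoring the scheme and a trailing slash.
function isExactMatch(resultUrl, targetUrl) {
  const norm = u => u.replace(/^https?:\/\//, '').replace(/\/$/, '');
  return norm(resultUrl) === norm(targetUrl);
}
```

The exact-match step is why a full search is needed: a domain-level `site:` hit alone doesn't prove the specific page is in the index.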
  • Kushphkt, I don't need to "try" anything. I am very familiar with most SEO tools from the last 15 years. In fact, I was the one who did the Speedy Index case studies here recently. I may not be up to date with the latest weekly updates for some of them, but I've used most of these tools and know their capabilities. I'm not sure why you are trying to tell me about them?

    I really don't care wether tool is just devoped right now or old
    I think maybe you should understand this better if you're going to trash-talk these tools. Or at least give some examples of why you think this or that, so that if in fact there is an error (not just a repetition of things from other forums, or a misunderstanding) it can be addressed. That's helpful; trash talking with no examples is not...

    Also, if you already have Scrapebox and a good set of proxies, then saying it's better to buy some identified list that you could find yourself in a few minutes makes no sense to me. You can find a better list yourself much faster, and use the same filters, or even add your own footprints and filters in Scrapebox in addition to the default SER ones...

    @Sven I think he means the Scrapebox index checker: you upload a list and it checks the URLs very fast. But when I run an index check with the same proxies after verifying them, index checking with SER says index check banned for almost all of them, and it takes forever.

    In Scrapebox you upload a list and, if you want an index check, you select Google, Yahoo, or Bing. You can also adjust threads and other settings, such as using site:, the URL, or the URL in quotes, plus a delay if wanted, etc. The usual scraping stuff.

    The result comes back like this in a few seconds...


    These are some of the options on the bottom...

    I will check with the same proxies and list in SER...

    For some reason I can't import the same list into my list-checking group the way I normally import from file, so I will check another list after I reverify it and recheck the proxies.

    These are live verified links with Google proxies, and I set a delay of 60 seconds. It is also set that way for search engines and private proxies in the settings.


    The end result has been "index check banned" for all of the URLs.

    This is pretty common lately; I have not even used this feature in the link manager because I know it will take forever and not bring results back.

    It's my understanding that it is doing a bunch of checks, and that's why it's slower. But I'm still not sure why, with the same Google-passed proxies on a delay, everything comes back index check banned, when I can go to Scrapebox, run the same list with the same proxies, and get results fast.

    I'm not sure when exactly this started happening, but it seems like it's being blocked. Whether I tick proxies or not, or try different things, I get this.

    It's the same with "might be banned on query "footprint"" in the log... I see this when I know the proxy is good and the footprint is not banned. Maybe not always, but sometimes. This is for scraping lists. So something is going on where SER thinks it is blocked, or is being blocked, when the same task on the same proxy works fine manually.

    Here is the result, 405 only, with fresh proxies on live links, and it took 5 minutes or more. Even without a delay and with many proxies, it still takes a long time and I get this back.




    And I know for a fact these are all live and indexed, and the proxies are fine. Maybe we need separate proxies for index list checking or something, I don't know.

    This is the best I can explain quickly from my experience, and I have tested more than a few ways. Hopefully this helps. If you have a specific question, just ping me or PM me and I'll explain better. Limited time right now...

    The last time I saw this working, I think I was using "buyproxies"...


