
Where do you index-check links in bulk?

I have been using SER, but I only have 30 proxies in SER and I can't get many checks out of it before they get banned.

Can anyone suggest anywhere or anything to use to bulk index check?

Thanks

Comments

  • Sven www.GSA-Online.de
    If you have private proxies, make sure to not use the option to disable proxies in proxy options.

    Thanked by 1Kaine
  • Recently I haven't been able to get good index checks in the link manager either; it mostly says the index check is banned, even though the proxies are working fine in Scrapebox and solving Google reCAPTCHA.

    Other than that, there are all these "redirects" showing in GSA and Scrapebox, like failed '302', which is a newer error in these tools for me. I think it's more about the proxy "provider" playing games with junk proxies.

    I've probably tested every setting in SER's proxy scraper recently too, and I've been testing different sources for free scraped proxies, though the ones I've used for index checking are a mix of private proxies from other "services".

    I haven't found a solution so far. I don't have the boxes checked as above either. It wasn't testing the newly added proxies (I think I set it that way), and I locked the private ones. I didn't see an easy way to test just the "unchecked" proxies, but it was the end of a long day.

    Also, the proxy scanner tells me some proxies are bad when I know for sure they are not. Then I highlight some that just tested bad, a smaller batch, and they turn back to green right away. So the testing isn't always accurate. And I was only doing about 300 at a clip, then retesting the 17-20 you can highlight in the GUI.

    I'll see if I can come up with a solution, but I haven't really had stable results in that area lately. I'll try again and share if I find something better.

    I think people using and blocking VPN networks and Cloudflare have become a much bigger issue lately for getting tasks done on the web. Even manually it's a pain to get some tasks done.




  • I've been using https://www.indexedapi.com/ for when I need to bulk check indexing, instead of having to keep a decent set of Google-passed proxies. Though I find that checking indexing too often is one of those 'going down the rabbit-hole' situations, where you end up wasting time that would be better spent just building tier links and trusting them to perform to a certain extent.
    Thanked by 1backlinkaddict

  • Yeah, APIs are the way to go for a lot more things these days.

    I also wonder about constant index checking: you're continually going out and doing a site: search on your backlinks, and it may seem a little strange to Google that all of them are being rechecked from random proxies rather than well-known data collection bots.

    Maybe it can look suspicious, and like you say, with a limited amount of time, some of it is better spent on other tasks.
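    For anyone curious what these checks amount to under the hood, a bulk index check is basically one site: query per URL. Here's a minimal sketch of just the query-building step (the function name is illustrative, not taken from SER or Scrapebox):

```javascript
// Sketch: the site: query a manual index check would issue per link.
// buildIndexCheckUrl is an illustrative name, not from any tool mentioned here.
function buildIndexCheckUrl(link) {
  // drop the scheme so "site:" matches both http and https versions
  const bare = link.replace(/^https?:\/\//, '');
  return 'https://www.google.com/search?q=' + encodeURIComponent('site:' + bare);
}

// Each such request then goes out through a proxy; doing this for hundreds
// of links back-to-back is exactly why small proxy pools get banned fast.
```

    So buildIndexCheckUrl('https://example.com/page') gives you the search URL for site:example.com/page, and the "banned" message means Google started blocking whichever proxy sent too many of these.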

  • I did see Speedy Index the other day, and they have an index checker now.

    They charge after 50 free index checks, likely using the Indexed API you mentioned.

    Way easier than having to keep a list of good proxies on hand for index checking, though, for sure!
  • Sven said:
    If you have private proxies, make sure to not use the option to disable proxies in proxy options.


    Yeah, my proxy settings are not set to disable. Thanks, Sven.

    I finally got around to testing Speedy Index, and I wanted to correctly check their indexed status. I've been using Scrapebox since I had those errors in SER, and it seems to check OK.

    I sent them my free 100 URLs, and so far 5 are indexed after almost 72 hours. I really mixed them up though: I just grabbed random links from my money-site links and also included 25 from SEREngines.
  • You don't want your private proxies set to disable! 

    As long as they are good private ones.

    Otherwise, if you leave the options enabled to check for free proxies and test them, even when you lock the private ones and use other options, somehow they get disabled anyway. But if you retest just the private ones, you should find they are still good.

    So you end up with good but disabled private proxies.

    I've spent some time in here recently; you definitely don't want the private ones set to be disabled.

    I would disable the less stable public ones.




    I would also use Scrapebox for index checking.

    For Speedy Index, those are horrible results. Did you send only indexable do-follow links to the bot?

    I'd filter out of the list any junk links that aren't going to get indexed anyway.

    I've been steadily getting 66+ percent of sent links indexed.
  • Here are recent results, over 80 percent. It's labeled "serLib test" but it's a decent mixture of links...

    And they refund half the cost of unindexed links back to your credit.

    I've been seeing this pretty consistently, so 66-plus is being fair. Also, I've found more than once that the actual indexing rate is higher than reported.

    They did add an index check too; I haven't tried it yet. There are 50 free checks. I know the indexing is very cheap, so I can't see the index checking being priced differently, and maybe it's more accurate as well.
  • I only use private ones from BuyProxies. As for Speedy, I'm not sure which ones I sent, but I know I included 25 SEREngines sites.

    I now have this so far:

    I checked and 12/25 SEREngines links are indexed so far. The others were contextual do-follow. I wasn't sure of a way to target only the engines that are indexable? I know we can do this in tiers, to select only those, but we can't with engine selection, can we?
  • Anth20 said:
    I only use private ones from BuyProxies. As for Speedy, I'm not sure which ones I sent, but I know I included 25 SEREngines sites.

    I now have this so far:

    I checked and 12/25 SEREngines links are indexed so far. The others were contextual do-follow. I wasn't sure of a way to target only the engines that are indexable? I know we can do this in tiers, to select only those, but we can't with engine selection, can we?

     I thought SEREngines was no longer around... wow, does this still exist? If so, No Hands SEO would be working as well, lol... Anyway, how is it performing?

    How is BuyProxies nowadays? I'm also thinking of changing my proxy provider. I've been using IPv4 and IPv6 proxies from reproxy.network; they are great for XEvil, but the IPv4 ones are not much good for RankerX and GSA SER.

    Speedy is a great indexer... but I still prefer tiered link building with my lists.
  • Anth20, some of those engines won't index, and I noticed a few that seem to index only a handful of links when they're pointing at the same domain.
  • My bad, guys, I meant to say SERLib. I'm ill with a virus atm  :D
  • I think I just made the same mistake in another thread after reading this, oops :D
  • Kushphkt TMobile
    Scrapebox is the way to go. GSA is so bad when handling operators, from checking proxies to scraping. @Anth20, how about your experience with SERLib? Last time I checked it was a total mess, from RAM and high CPU usage to links being hard to get indexed. It's better to spend a few more bucks and stick with RankerX; try it if you don't own it, it's much better than SERLib itself. There's a nice peer review on BHW as well.
    Thanked by 1Freshairforlife
  • Sven www.GSA-Online.de
    @Kushphkt "GSA is so bad when handling operators, from checking proxies to scraping" Please elaborate on this!
  • @Kushphkt I would find that elaboration pretty helpful as well.

    If something isn't "working" for you and you provide no example, how can @sven fix it?

    He seems always eager to address issues... quickly.

    Though, the comment doesn't seem to make much sense in English.

    If you are going to suggest another way to parse for 2024, Scrapebox is your suggestion?

    Not sure why you're comparing a $50 monthly tool that's been around for years to a newer SER add-on that was only recently being developed.

    I can't comment on the current SERLib, likely due to all this negativity and people saying things that are not true with no evidence, or complaining because users don't understand what's needed for some browser-based engines to be successful and have a good stick rate.

    Likely the developer is keeping the project or the good engines for himself. Can't say I blame him. Either way, I hope @AliTab is doing OK.

    These are SERLib links... getting indexed.

    These are also mainly SERLib links getting indexed... so I guess what content and links you send matters, because with a working indexing service they seem to index above 80 percent for me.

    So 80 percent in a few days, across a few tests, while triple-checking with multiple software and different settings, is a bad indexing rate?

    I was asked to provide a test after one user was running around saying SERLib does not work and crying for a $20 refund... It seemed to be looping through each engine, and a bit more aggressive on some blogs, as I had set...

    This was the same time period; the crying about a refund and trying to tarnish the service was documented here...

    I guess some users should check their settings, have basic SER knowledge, and actually understand what's going on before they make false claims.

    This is just one simple example of a quick test as well...

    Thanked by 1AliTab
  • AliTab GSAserlists.com
    backlinkaddict said:
    @Kushphkt I would find that elaboration pretty helpful as well. [...]

    Thank you for your support, @backlinkaddict.

    I've been focused on maintaining and enhancing the quality of my services, as well as developing a new service that will be launching soon. I won't go into more detail until it's officially released.


    @Kushphkt, since version 4.7 of SERLib https://gsaserlists.com/serlib-changelog/ , I have not received any reports concerning high CPU usage, and I believe this issue has been resolved within SERLib. Please note that regardless of the software you use to create and manage your Web 2.0s, they all share the same basic principles and mostly utilize Chromium, which is CPU-intensive. To run a higher number of threads effectively, a robust setup is required. The advantage of SERlib is that it centralizes management, making tiered link building much easier and fully automated.
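    On the thread-count point: since each Chromium instance is heavy, the practical trick is capping how many run at once rather than launching one per link. A rough sketch of one way to do that in Node (runLimited and the task functions are illustrative, not SERLib internals):

```javascript
// Sketch: run async tasks with at most `limit` in flight at once.
// Each "task" would be something like launching a browser and posting to one site.
async function runLimited(tasks, limit) {
  const results = [];
  let next = 0;
  // each worker pulls the next unstarted task until none remain
  async function worker() {
    while (next < tasks.length) {
      const i = next++; // synchronous claim, so no two workers take the same task
      results[i] = await tasks[i]();
    }
  }
  // start `limit` workers; at most that many tasks run concurrently
  await Promise.all(Array.from({ length: Math.min(limit, tasks.length) }, worker));
  return results;
}
```

    With, say, 200 queued Web 2.0 submissions and limit set to 4, only four Chromium instances would ever be alive at the same time, which is what keeps CPU and RAM usage sane.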
  • edited May 14
    @AliTab I'm glad to see you're OK!

    I have been busy building a few workarounds/projects for myself lately also :)

    You're absolutely correct about browser-based engines being resource-intensive by nature; those orphan pages can also be tricky to close down at times. It has always been this way, not just recently, with all these tools. Socket-based scripts and browser-based scripts (which you need for most high-quality sites) are treated VERY differently!

    I just laughed when users were complaining about "slow scripts and cumbersome mouse movements in the engines." To be honest, these qualities are needed if you want successful account creations and backlinks that will stick and provide rankings. Where are those users now, anyway? I guess another batch of people saying this or that doesn't work, when they only have their own actions to blame.

    Here is just a very small sample of some code needed to pass a bot test:

    // sleep for a fixed number of milliseconds
    await sleep(2075);

    // random extra human simulation on the page object
    await simulateHumanInteraction(page);

    // find the element whose "name" attribute is set to "eMail"
    const emailInput = await page.$('[name="eMail"]');

    // click on the email input field (extra emulation)
    await emailInput.click({ clickCount: 1 });

    // pause again before editing the field
    await sleep(2875);

    // must clear the field manually; double-click-to-select won't overwrite here
    await emailInput.evaluate(element => { element.value = ''; });

    // short pause before typing
    await sleep(1000);


    This is just one very small sample, and no, I didn't leave in the defined variables or anything important really; getting a browser to pass CF is a script/secret of its own. If scripts run too fast, you're likely just building links that will be deleted or never indexed, or your "software/bot" will be redirected elsewhere on many of these authoritative platforms.
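    The simulateHumanInteraction helper referenced above isn't shown (deliberately). Purely as a hypothetical sketch of the general idea, assuming Puppeteer's page.mouse API, a helper like that might pace a few randomized mouse moves, something like:

```javascript
// Hypothetical sketch of a helper in the spirit of simulateHumanInteraction,
// assuming Puppeteer's page.mouse API. Real anti-bot evasion is far more
// involved; this only illustrates randomized, paced movement.

function randomBetween(min, max) {
  return min + Math.random() * (max - min);
}

async function simulateHumanInteraction(page) {
  // a few short mouse moves to random points, with human-length pauses between
  for (let i = 0; i < 3; i++) {
    const x = randomBetween(100, 800);
    const y = randomBetween(100, 600);
    await page.mouse.move(x, y, { steps: 15 }); // more steps = smoother path
    await new Promise(resolve => setTimeout(resolve, randomBetween(250, 900)));
  }
}
```

    The point is the pacing: the randomized coordinates and delays are what make the session look less like a script that fills every field in 200ms.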

    This is also costing you captchas, proxies, emails, content, time, etc. I don't know about others, but I would rather have a script take 3 minutes, or even a day, to get a nice post on a high-quality engine that will stick and provide rankings, which is what I'm really after. I'm not trying to build backlinks for the sake of seeing how many I can build.

    For me, the excitement of link building and automation is not like it was 12 years ago. Don't get me wrong, it's still very fun, and things change, but you have to keep up with them to be successful and stay on the cutting edge.

    I think some users don't understand or appreciate all the work that goes into even one successful script, or even an account creation. Many have turned their software into something that just posts to their own sites for this reason. Another strategy, but not one of my favorites for a first tier.

    Anyway, glad to see you're back!



    Thanked by 1AliTab