
Please advise me on a list to subscribe to for verified links

Hello everyone. I'm looking for advice on the most reliable list-seller service. I have searched Google for verified lists and so far cannot see any significant difference between the four offerings I have come across. I figured I would reach out to the more seasoned community, ask for suggestions/recommendations, and see if I could learn from your experiences.

The following information might be helpful in advising me:

My current setup is based on a Windows VPS instance on AWS EC2.

My core for automated link building is as follows:
  • GSA SER
  • GSA Captcha Breaker

Other tools in my kit to support these programs:
  • 2Captcha for hard captchas
  • SEO Content Machine for building my T2-and-below contextual content
  • Scrapebox (I don't have much luck with this, despite using footprint lists, proxies, etc.)
  • SpinRewriter for spinning content
  • Proxies (I have set up my own on Lightsail instances running Squid; I find this cheaper and much faster than paying for private proxies)

For indexing:

I use my own method, which is free and took 15 minutes to set up. If it shows any results, I will share the feedback here for free. Fingers crossed.

Look forward to any advice thrown my way.

Comments

  • cherub (SERnuke.com)
    I don't use any list service myself, but for the help he gives on this forum alone I would suggest @AliTab 's service: GSAserlists.com
  • Will give the basic one-month plan a try. I can always upgrade if it suits my needs. Thanks for your feedback.
  • Scraping is harder than it was 10 years ago. Much harder. But it's still easy. In Scrapebox you need to make sure your connections are set very low, probably only 1 connection (it depends on the number of proxies); the wait needs to be set correctly (again, depending on the number and kind of proxies); and the timeout needs to be set to 120.

    If you’re not getting results with scraping, it’s because of your settings and proxies. 

    Also you should scrape more than one search engine.
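    Scrapebox is a GUI tool, so there is nothing to copy-paste from it, but the throttling advice above (1 connection, a wait between requests, a generous timeout) can be sketched as a minimal Python loop. Everything here is a hypothetical placeholder: the `fetch` callable stands in for whatever actually issues the search request, and the proxy strings and default wait are just examples.

```python
import time
from itertools import cycle

def throttled_fetch(queries, proxies, wait_seconds=90, timeout=120, fetch=None):
    """Issue one query at a time (the "1 connection" rule), rotating
    through proxies and enforcing `wait_seconds` between two requests
    that leave the SAME proxy. `fetch` is any callable taking
    (query, proxy, timeout) and returning a result."""
    last_used = {p: 0.0 for p in proxies}  # last request time per proxy
    pool = cycle(proxies)
    results = []
    for query in queries:
        proxy = next(pool)
        # Sleep only the remaining part of the per-proxy wait, if any.
        elapsed = time.monotonic() - last_used[proxy]
        if elapsed < wait_seconds:
            time.sleep(wait_seconds - elapsed)
        last_used[proxy] = time.monotonic()
        results.append(fetch(query, proxy, timeout))
    return results
```

    The point of the sketch is that throughput is deliberately sacrificed: with few proxies, the wait dominates, which matches the advice that the settings depend on how many proxies you have.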
  • Scraping is harder than it was 10 years ago. Much harder. But it's still easy. [...] If you're not getting results with scraping, it's because of your settings and proxies.
    I'm not getting any results out of Scrapebox these days, whether scraping Google or Bing using the custom scraper. I'm using 20 proxies, 1 thread, a 90-second wait, and following your post I have now changed the timeout from 10 to 120. While troubleshooting, I used the detailed harvester and found that the proxies are not rotating, despite the setting being at 1. I took a couple of screenshots and sent them to Scrapebox support, but no answer so far.
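    One quick way to verify rotation outside Scrapebox: fetch an IP-echo endpoint once through each proxy and count the distinct exit IPs you observe. This is a sketch using only the standard library's `urllib`; `api.ipify.org` is a public echo service, and the proxy URL format (`http://user:pass@host:port`) is whatever your provider hands you.

```python
import urllib.request
import urllib.error

ECHO_URL = "https://api.ipify.org"  # returns the caller's public IP as plain text

def observed_ips(proxies, timeout=30):
    """Fetch ECHO_URL once through each proxy and return the set of exit
    IPs seen. If rotation is healthy, the count should approach
    len(proxies); a single repeated IP means traffic is not rotating."""
    ips = set()
    for proxy in proxies:
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
        try:
            with opener.open(ECHO_URL, timeout=timeout) as resp:
                ips.add(resp.read().decode().strip())
        except (urllib.error.URLError, OSError):
            pass  # dead proxy: skip it, but note it for your provider
    return ips
```

    If this shows all 20 proxies producing distinct IPs, the proxies themselves rotate fine and the problem is in the harvester settings rather than the proxy pool.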
  • I’m using multiple instances of scrapebox daily. I get millions of results everyday. I suggest you get better proxies.
  • I’m using multiple instances of scrapebox daily. I get millions of results everyday. I suggest you get better proxies.
    The proxies are private proxies from newipnow, testing all good in SER and Scrapebox.
  • edited July 2022
    I’m using multiple instances of scrapebox daily. I get millions of results everyday. I suggest you get better proxies.
    The proxies are private proxies from newipnow, testing all good in SER and Scrapebox.
    I'm unsure what you mean by "testing good," since there are multiple tests. Are you saying your private datacenter proxies pass when you test against Google search or Bing search? Or that they pass an anonymity test?

    I don't use datacenter proxies to scrape. It's possible, but you need 100 at a minimum if you want to avoid having them all banned rapidly, with the wait time set to 10-20 seconds and 1 connection.

    Instead of datacenter proxies I'm using port-scanned public proxies. Today I have more than 1,200 public proxies in use. That will change tomorrow, since I'm scanning for them. So like I've mentioned, it's not a Scrapebox issue in and of itself; it's a proxy issue.

    Also, scrape Yahoo and search.com.
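    The arithmetic behind that "100 at a minimum" is worth making explicit: with 1 connection and a global wait between consecutive queries, each individual proxy only fires once per (number of proxies × wait) seconds. A tiny sketch, using the numbers quoted in the comment above rather than any benchmark:

```python
def per_proxy_interval(num_proxies, wait_seconds):
    # With 1 connection, requests leave one at a time every `wait_seconds`,
    # rotating proxies, so the SAME proxy is reused once per this interval.
    return num_proxies * wait_seconds

def queries_per_hour(wait_seconds):
    # Overall throughput is capped by the global wait, not the proxy count.
    return 3600 // wait_seconds

# 100 datacenter proxies at a 15 s wait: each proxy rests 1500 s (25 min)
# between its own requests, while the run still makes 240 queries/hour.
```

    This is why adding proxies protects the proxies without speeding up the scrape; only lowering the wait (or adding connections, at the cost of bans) raises throughput.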
  • Instead of datacenter proxies I'm using port scanned public proxies. Today I have more than 1200 public proxies I'm using. [...]
    You port-scanned that many proxies and didn't get your ISP or host kicking down the door? Not bad. I know someone who did this and had their connection pulled within 5 hours, after finding literally 2 proxies.
  • Scan with a proxy :) 
  • I think they were perhaps too blatant about it, sticking to the usual suspect ports: 8888, 3128, and 8080.
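    For auditing boxes you actually own (e.g. the Squid instances mentioned earlier in the thread), a plain TCP connect check against those "usual suspect" ports is enough to see what is exposed. The port list below comes straight from the comment above; this is a sketch for hosts you control, not a scanning tool.

```python
import socket

COMMON_PROXY_PORTS = [8888, 3128, 8080]  # the "usual suspects" from the thread

def open_proxy_ports(host, ports=COMMON_PROXY_PORTS, timeout=2.0):
    """Return the subset of `ports` accepting TCP connections on `host`.
    Intended for checking your own proxy servers' exposure."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means connect succeeded
                found.append(port)
    return found
```

    If your own Squid box answers on one of these defaults, it is exactly the kind of target the port scanners in this thread are sweeping for, so bind it to a non-standard port and firewall it to your VPS IPs.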
  • edited July 2022
    Yeah, as you gain experience you learn. It's not bad to start with the usual suspects, though. Either way, there's other scraper software out there. If anyone believes the problem is Scrapebox, maybe try something else.

    There is A-Parser, and another one I just learned about this morning. I haven't actually looked at it beyond the home page of the website; it's called Octoparse.

    https://www.octoparse.com/
    https://en.a-parser.com/

    A-Parser is much more capable than Scrapebox for MANY tasks. It can do lots of things out of the box that Scrapebox can't. But I can assure you Scrapebox can scrape search engines.

  • Are you saying your private data center proxies are passing when you test against google search, or bing search? [...]
    Passing tests against Google. Besides these rented proxies, I also have Squid installed on four of my VPSs.

    Yahoo scraping works, but the results must be filtered, as there is a lot of junk in them.
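    Filtering a raw harvest can be done before importing it anywhere: keep only http(s) URLs and de-duplicate by host + path so the same page isn't hit twice under trivially different spellings. A minimal standard-library sketch; the choice of de-dup key is just one reasonable option, not a Scrapebox feature.

```python
from urllib.parse import urlparse

def clean_harvest(urls):
    """Filter a raw harvested URL list: keep http(s) only, drop entries
    without a host, and de-duplicate by lowercased host plus path
    (ignoring a trailing slash)."""
    seen, kept = set(), []
    for url in urls:
        parsed = urlparse(url.strip())
        if parsed.scheme not in ("http", "https") or not parsed.netloc:
            continue  # junk line, ftp://, bare fragments, etc.
        key = (parsed.netloc.lower(), parsed.path.rstrip("/"))
        if key in seen:
            continue  # same page already kept
        seen.add(key)
        kept.append(url.strip())
    return kept
```

    Running the survivors through SER's own platform identification afterwards then removes the remaining non-postable targets.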

  • Many people on BHW recommend serlinks. I have not bought any lists so far, so I have no personal experience of their quality or value for money.
    Momo said:
    Hello everyone. I'm looking for advice on the most reliable list seller service. [...]
  • AliTab (https://gsaserlists.com)
    cherub said:
    I don't use any list service myself, but for the help he gives on this forum alone I would suggest @AliTab 's service: GSAserlists.com
    Thanks for recommending my service. Hope he finds it useful :)