
250K list (Gscraper) and only 9 verified...

Hi guys,

I just scraped a list of 250K URLs, and out of this list I got only 9 verified URLs.
I used a 450K keyword list (still scraping) and the footprints from GSA SER.

This list was for social bookmarks, so I only used the social bookmark footprints with do-follow backlinks.
I use public proxies, but I test them often.
I use Yahoo e-mails (about 20 at a time).
I use GSA Captcha Breaker.
I run it on a VPS.

I know I should have private proxies, but I go on holiday soon and don't want to buy proxies for a month and only use them for a week.
I will buy SpamVilla for the ReCaptchas as well once I am home.

But still, even with public proxies and 'just' GSA Captcha Breaker, I assume I should have more than just 9 verified URLs.
Could anyone help me out?

Thanks.

Comments

  • s4nt0s Houston, Texas
    Did you remove duplicates before you imported the list?

    Your problems are most likely due to public proxies and not using a solver for ReCaptcha and other harder-to-solve captchas.

    When I run social bookmarks I get a lot of ReCaptcha and Solve Media type captchas, so you really should have a backup solver for best results. If you're using CB, watch the captcha window and see what kinds of captchas are showing up. If the majority are the harder-to-solve captchas, that definitely plays a role in it.

    If you aren't ready for a secondary solver, you can always try scraping for article footprints. You can get a decent amount without a secondary solver.
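    The dedup step mentioned above can also be done outside the tool before importing. A minimal sketch in Python (the file handling and order-preserving approach are assumptions, not part of GSA SER itself):

    ```python
    def dedupe_urls(lines):
        """Return unique, non-empty URLs, preserving first-seen order."""
        seen = set()
        unique = []
        for line in lines:
            url = line.strip()          # drop trailing newlines/whitespace
            if url and url not in seen:  # skip blanks and exact duplicates
                seen.add(url)
                unique.append(url)
        return unique

    # Example with an in-memory list; in practice you would read the
    # scraped list from a file and write the deduped result back out.
    urls = [
        "http://example.com/a",
        "http://example.com/a",
        "http://example.com/b",
    ]
    print(dedupe_urls(urls))
    ```

    Note this only removes exact duplicates; GSA SER's own "remove duplicate URLs/domains" options can additionally collapse multiple URLs from the same domain.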
  • 12348766754 Utrecht
    edited July 2015
    Thanks for your reply @S4nt0s.
    Once I am back from holiday I will definitely buy private proxies and a secondary solver, but for now (just 4 days) I won't.

    It's not just the social bookmarks; articles also have a very low verification percentage. Could this have the same causes? (The "try to create only no-follow backlinks" option is enabled.)
  • s4nt0s Houston, Texas
    Yep, it can be the same reason for the articles as well. For me personally, I wouldn't bother with the "no-follow" option for now. 

    A lot of times you can't tell if a site will be dofollow/nofollow until after the link is placed, and even platforms that are dofollow by default can be changed by the webmaster.

    There's nothing wrong with getting a mix of do/nofollow. If you're using a higher quality tier 1, you'd probably want those properties to be dofollow if possible, but I don't worry about it too much for tier 2.
  • Trevor_Bandura 267,647 NEW GSA SER Verified List
    90% of all social bookmarks use recaptcha. You need to use more than just GSA Captcha breaker to build links on those sites.