
Very few new sites found (000/020)

Hi! My GSA is finding very few new sites to attempt to post to, and I don't understand why. My threads are rarely maxed and my LpM is low. I'm getting 000/020 most of the time when scraping, and I believe this is the main problem: GSA simply doesn't find any new sites to work with. For one project I'm using a generic list of 150k keywords, and for the other I'm using ~10k niche-relevant keywords from the Google Keyword Tool. I'm using public proxies, most of which are Google-banned, but I can scrape from other sites.
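For what it's worth, separating Google-passed proxies from the rest comes down to firing a test search through each one and keeping only those that answer normally. Here is a minimal Python sketch; the timeout, the test query, and the `filter_proxies` helper are my own assumptions for illustration, not anything GSA does internally:

```python
import concurrent.futures
import urllib.request

def is_google_passed(proxy, timeout=5):
    """Return True if a test Google search through `proxy` succeeds.
    `proxy` is a "host:port" string; a 403/503 or a timeout counts as banned."""
    handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    opener = urllib.request.build_opener(handler)
    try:
        resp = opener.open("https://www.google.com/search?q=test", timeout=timeout)
        return resp.getcode() == 200
    except Exception:
        return False

def filter_proxies(proxies, check=is_google_passed, workers=50):
    """Run the check over many proxies in parallel and keep the ones that pass."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(check, proxies)
    return [p for p, ok in zip(proxies, results) if ok]
```

Injecting the `check` function also makes the filter easy to exercise without hitting the network.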

I have attached all the screenshots I thought were relevant - if you need more, tell me!


I have thought about the issue a lot. I believe that either
a) I need Google-passed proxies, because the other search engines are crap
or
b) My keyword lists suck

What do you think?

Thanks in advance!



Comments

  • my vote is for a)

    If you've ever scraped with SB using public proxies, you know that out of 100,000 proxies maybe 1,000 pass Google, and those will burn out after an hour. That's why you have to scrape fast ;D

    Private proxies solve most of these 000 issues.
  • LeeG

    image


    That might drop a hint, plus it puts the Ozz version to shame :D


    Looks through what's shared on here

    Looking at those screenshots, it's partly keyword choice and also search engine choice.

    You will get far better results, search-engine-wise, using the random Google system Ozz shared on here.

    Add to that a better keyword list. When you see 000/012 etc., it means no more target sites were found.

    And the above hint about public proxies applies too.

  • @Startruo @LeeG thanks for your input. It seems the proxies are to blame, and I'll have to stop cheaping out :-)

    @LeeG I appreciate you being so helpful, I have read a lot of your posts already, and they are always great! 

    I have now downloaded Ozz's little "hack" and, even with my public proxies, I'm getting quite a few scrapes, so this is a big improvement! Now I'm getting lots of "already parsed" messages, but I suppose this is because of my keyword list. Any suggestions in that department?

    Oh, and I haven't forgotten about the proxies yet :-)
  • LeeG

    Don't forget (I put a lot of effort into making this, all my own hard work)

    image


    Keyword lists: any list shared on the net will be used and abused silly.

    You're better off scraping your own keyword lists.

    I use 100k keyword lists on all projects.
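Scraping your own lists typically leaves you with several raw files to merge, normalise, and deduplicate before loading them into a project. A hypothetical helper along those lines (the file handling is my own sketch, not GSA's import logic):

```python
def merge_keyword_lists(*paths, out_path="keywords_merged.txt"):
    """Merge keyword files (one keyword per line): lowercase, trim and
    collapse whitespace, drop duplicates, preserve first-seen order."""
    seen = {}
    for path in paths:
        with open(path, encoding="utf-8") as f:
            for line in f:
                kw = " ".join(line.lower().split())  # trim + collapse spaces
                if kw and kw not in seen:
                    seen[kw] = True
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(seen))
    return len(seen)
```

Keeping first-seen order means the merged file stays stable across runs, which makes it easier to diff new scrapes against old ones.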
