
Getting too many errors with few successful submissions.

I have been running a campaign and I seem to be getting an unusual number of error messages with few submissions. There might be something I am doing incorrectly, but I cannot figure out what it is. Here are some facts about the campaign.

1. All private proxies are active and working.

2. I have almost 11k keywords.

3. All fields are filled out.

4. I have ticked all search engines to be used for submission.

5. I have ticked all "where to submit" boxes except the adult tube one.



I have a log for reference that shows these errors. Please see this ---->  http://pastebin.com/DwtqYYuF

Comments

  • 1linklist FREE TRIAL Linklists - VPM of 150+ - http://1linklist.com
    edited September 2014
    At a guess?

    I opened 20 of your failed URLs in my browser, and not a single one loaded or connected. I also see a number of clues in there that this is not even an identified list; the presence of youtube.com, amazon.com, etc., gave me that impression anyways.
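
    If you want to spot-check a list like this yourself, a rough sketch along these lines does the job (Python; "failed_urls.txt" is a made-up filename for wherever you export the failed targets, one full URL per line):

    import random
    import urllib.error
    import urllib.request

    with open("failed_urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    # Probe a random sample with a short timeout.
    sample = random.sample(urls, min(20, len(urls)))
    alive = 0
    for url in sample:
        try:
            req = urllib.request.Request(url, method="HEAD")
            urllib.request.urlopen(req, timeout=10)
            alive += 1
        except urllib.error.HTTPError:
            alive += 1  # the server answered, even if with an error status
        except Exception:
            pass  # no connection at all
    print(f"{alive}/{len(sample)} sampled URLs responded")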

     If you're scraping your own lists, them's the breaks. You can take measures to target your scrape better, but at the end of the day you're always going to have a low success rate with a raw list.

    The same applies to older lists too; die-off is a very real, and very FAST, thing that applies to any linklist.

    But maybe that's not the issue.

    Some quick changes you can make to test this theory:

    Try setting up a simpler project (ditch the 11k keywords for something like 20, just to test), validate all the emails you're using, and manually check your spins - make sure you don't have some kind of content error. If you're using captcha-breaking software, watch once you start and make sure captchas are getting passed correctly as well. (Skype, Wamp, anything that messes with port 80 can interfere with some captcha-breaking software.)
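
    For the port 80 point, a tiny sketch like this (Python, just probing localhost; nothing SER-specific) will tell you whether anything on the machine is already sitting on that port:

    import socket

    # Try to connect to port 80 on the local machine; if something answers,
    # a local service (Skype, Wamp, etc.) is already using the port.
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(2)
    result = sock.connect_ex(("127.0.0.1", 80))
    sock.close()

    if result == 0:
        print("Something is listening on port 80 - check Skype/Wamp")
    else:
        print("Port 80 looks free locally")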

    Beyond that, if your captchas, emails, proxies, and project settings are solid, the only thing I can think of is the list you were running.
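
    And if you'd rather sanity-check the emails outside of SER too, a rough IMAP login test like this works (the host and credentials below are placeholders, and it assumes the accounts allow IMAP):

    import imaplib

    # Placeholder accounts - swap in your real host/user/password.
    accounts = [
        ("imap.example.com", "user@example.com", "password"),
    ]

    for host, user, password in accounts:
        try:
            conn = imaplib.IMAP4_SSL(host)  # port 993 by default
            conn.login(user, password)
            conn.logout()
            print(user, "login OK")
        except Exception as exc:
            print(user, "FAILED:", exc)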
  • Ok thanks. I actually checked the proxies and e-mails; they all seem to be stable. I am using GSA Captcha Breaker, and the secondary is Death By Captcha. I had someone import a list for me, and maybe that list is the problem. My question is: how do I delete that list and have GSA SER scrape URLs itself to post the links? In other words, how do I remove the list? For testing, I guess we could try this. I am not using Skype or Wamp, and I am not running much else on this PC.
  • 1linklist FREE TRIAL Linklists - VPM of 150+ - http://1linklist.com
    edited September 2014
    Hey,

    1. You can delete the imported list (assuming you're not using a global site list) by right-clicking the selected project, selecting Modify Project, then selecting Delete Target URL Cache.


    2. You need to go into the project settings, then Options; about half-way down you will see the search engine settings, where you can set SER to scrape lists for you.

    Note: If you're using "Site list from globals", you'll see the option to disable that in this same section. Just untick the box.

    [Screenshot: the project Options panel showing these settings]

