
Very Low GSA SER Submission Rate

edited December 2012 in Need Help
Hi,

My GSA SER project has been running for more than 10 hours but has only made 53 submissions (2 verified).

It's a brand new project and I'm using both CaptchaSniper and DeathByCaptcha, as well as proxies for search. Configuration window screenshot: http://i50.tinypic.com/2625opj.jpg

What could have gone wrong? Thanks!

Jerry

Comments

  • I'm thinking you need way more keywords to start. Are you using filters (PR, OBL)?
  • Both filters (PR, OBL) are off.

    I've checked "use collected keywords to find new sites" --

    [screenshot]

    and ran it again for a couple of minutes; so far I've got 50 more submissions (6 verified). Is this rate considered normal/good?

    Jerry
  • Guess it's better than 50 submissions in 10 hours. You need more keywords. If you have ScrapeBox, throw your keywords into its keyword scraper and load up what you get. I normally use 500 to 2,000 keywords in a project.
  • Oh my God, 500 to 2,000 keywords? I thought my 5 keywords were enough...

    I do have ScrapeBox but don't know how to use it with GSA SER.

    Do you have a blog on these topics that I can read up on?

    Thanks.
  • You just use ScrapeBox as a quick way to gather more keywords to use in SER. The keyword scraper button in ScrapeBox is right below the window on the left where you put in your keywords (a rough scripted equivalent is sketched below).

    [screenshot]
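    ScrapeBox does all of this through its GUI, but the same idea can be scripted. Purely as an illustration, here is a rough Python sketch that expands a few seed keywords via Google's unofficial suggest endpoint (one of the sources keyword scrapers typically pull from) and writes them to a text file you can paste into SER's keyword field. The endpoint, parameters, seed words, and file name are assumptions for the example, not anything taken from ScrapeBox or SER.

    ```python
    # Rough stand-in for a keyword scraper: expand a few seed keywords with
    # Google's unofficial autocomplete endpoint and save them for SER.
    # Endpoint, parameters, and file name are assumptions for illustration.
    import requests

    SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

    def expand(seed):
        """Return autocomplete suggestions for one seed keyword."""
        resp = requests.get(SUGGEST_URL,
                            params={"client": "firefox", "q": seed},
                            timeout=10)
        resp.raise_for_status()
        return resp.json()[1]          # payload looks like ["seed", [suggestions]]

    seeds = ["seo", "link building", "backlinks", "seo software", "seo services"]
    keywords = set(seeds)
    for seed in seeds:
        keywords.update(s.lower() for s in expand(seed))

    # One keyword per line, ready to paste into the project's keyword field.
    with open("keywords.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(sorted(keywords)) + "\n")

    print(f"{len(keywords)} keywords collected")
    ```

    Feeding the output back in as new seeds and repeating a couple of times will quickly grow a handful of keywords into several hundred.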
  • I use 10k keywords, and I'm actually updating the list tonight to bump it up to 50k, though I'm not sure if GSA SER can handle that. However, I have confidence that it can :)
  • ronron SERLists.com
    edited December 2012

    Your lack of keywords probably has more to do with it than anything else. I try to roll with about 1000 keywords.

    Other questions: are your proxies public or private? How many proxies are you using? Do you have a limit on links on the Options page (2nd tab)? How big is your website file size limit in Options => Filter? Try to keep that at no more than 3 MB. What is the speed of your internet connection? Is your email on a blacklist? Do you have a captcha solver? These are all questions that pertain to efficiency.

    You don't interface SB with SER. Just do what Indy showed you: put those words in SB's keyword scraper, scrape new keywords, remove duplicates, and run again (a rough sketch of the dedupe step is at the end of this comment).

    And I would uncheck those two keyword boxes. I never use those.

    Speed is relative to your projects. You don't want a ton of links going straight to your site. Even though you have YouTube, I would still build a tiered project underneath those links you are currently building. If you build submitted links at 1,000 per hour, you are doing really well.
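    For the "remove duplicates" step, here is a minimal sketch, assuming the seed list and the ScrapeBox output are plain text files with one keyword per line (the file names are placeholders, not anything SER or ScrapeBox requires):

    ```python
    # Merge the seed keywords with the freshly scraped ones and drop duplicates
    # (case-insensitive), keeping the first occurrence of each keyword.
    seen = set()
    merged = []
    for path in ("seed_keywords.txt", "scrapebox_output.txt"):
        with open(path, encoding="utf-8", errors="ignore") as f:
            for line in f:
                kw = line.strip().lower()
                if kw and kw not in seen:
                    seen.add(kw)
                    merged.append(kw)

    # Write the cleaned list back out, one keyword per line, for import into SER.
    with open("keywords_clean.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(merged) + "\n")

    print(f"{len(merged)} unique keywords written to keywords_clean.txt")
    ```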

     

  • I used ScrapeBox for scraping keywords, but it pulls in a lot of long-tail keywords... I'm wondering if that will be a limitation.

    Using the Google Keyword Tool, however, I get 800 keywords and very few of them are long tail.
  • You're thinking too much about this; put them all in SER. This is a numbers game: more keywords = more results.
    Go with broader keywords as well to find more targets. Often my keywords are totally unrelated to my project.
  • @indylinks so you got your 10k (50k) keywords by iteratively scraping keywords in ScrapeBox?
  • No, I have lists that I keep gathering/building. I've been doing SEO for years, so I have extensive word lists. SB and the Google Keyword Tool are great to get going fast.

  • Not to beat a dead horse, but I have a quick question.

    Let's say my keyword is "SEO in San Francisco", and I have maybe 10 LSI keywords along with it. In order to help rank for my main keyword, again "SEO in San Francisco", am I going to have to scrape about 1,000 keywords that ScrapeBox finds related to it? I know about generic anchors and such, but can someone please advise how 1,000 different keywords that may or may not have anything to do with SEO in San Francisco (other than maybe containing the word SEO) will help?