
Very low success rate

edited February 2013 in Need Help
I have a very low success rate, especially for web 2.0 but for other engines as well. I have a lot of keywords, but when I look at them via the Tools button, only 3 or 4 of them were found. What am I doing wrong?

Comments

  • Can you post your full settings? Web 2.0s are very difficult; try Articles too. I have a success ratio of 10:1 for articles:web2.0 ....
  • Unfortunately the article rate is also very low: 43:2. With web 2.0 I do not use keyword search, PR is not set, none of my keywords must be present on the page, and no posting to competitors' backlinks.

    For articles it's keyword related, with PR >= 2; at least 2 of my keywords must be on the page to post, and no posting to competitors' backlinks.

    And standard badwords filter everywhere. :-?
  • spunko2010spunko2010 Isle of Man
    edited February 2013
    Try changing the email address used, and also if you are new then over a few days/weeks you will be able to see which web2.0 you have success with, then extract only those URLs. I only really have success with Wordpress on web2.0
  • I have a catch-all email on my domain. Maybe this is really the reason, but I found some people recommending a catch-all email instead of Yahoo. Project setup takes so much time and the result is so deplorable. I'm really tired of this. :-<

    Thank you very much. I appreciate your concern.
  • ronron SERLists.com

    @zuzuzuzu - Please remove the PR filter. If I had a dollar for every time people use that filter and complain about terrible submission rates, I would be a multimillionaire shooting spitballs at you guys from a cabana somewhere in the Caribbean.

     

  • Ron, in another thread you said PR for high quality links was a must. I use PR checking only for direct links to my money site; is that sufficient?
  • ronron SERLists.com

    @spunko2010 - I said I was a believer in PR, but I never, ever, said to use PR filters on SER. I do not use one single PR filter on any project on SER, and I am knocking it out of the park.

    Without getting into a major rehash of old discussions, the Google algorithm has changed quite a bit since Panda. I have noticed first hand across all projects that I don't need PR to rank for highly competitive terms.

    Yes, it goes against everything I have learned over the years. But I am obliterating PR5 and PR6 sites with my PR 0 and PR1 sites. And I am talking about highly competitive terms with 10,000 - 50,000 local exact match.

    People get so hung up on PR. In SER, 99% of the time the PR setting has to do with the parent domain - not the page. SER is just creating PR NA pages = NO pagerank. Yes, you are on a slightly higher "authority" domain, but with zero pagerank. And it will likely stay that way unless you create an active campaign to build high PR links directly to that page. So there is no point. 

  • I own Captcha Sniper X3, which I've used for several months; it usually solves around 50%. With the latest Captcha Breaker 1.57 release, which I'm only using for No Hands SEO and GSA Search Engine Ranker, I've even spent about a week going through around 100 of the captchas to get them to 100% with brute force ("find best solution for all images") for the ones I've selected... I'm averaging about a 25-30% success rate. GSA really needs to improve the success rate and solve time ASAP, because it's turned out to be a huge disappointment coming from the quality of GSA I'm used to....

  • OzzOzz
    edited February 2013
    wrong thread again as this is a thread about SER ^^

    honestly, what is wrong with you that you post your issues on every board, but refuse to get your problems solved by not posting at least a screenshot of your settings?

    like i told you at BHW, i can't rule out that there's something not working in conjunction with NHSEO, so i can only speak for SER with CB. but CB outperforms CS by miles when it comes to solving speed and memory usage, plus it has way fewer bugs. this is just my experience, though, but i assume there are a lot of people who have purchased both products and witness the same.

    furthermore, i'd like to know how you have brute-forced your definitions. did you just overwrite existing ones or add more processes? how big is your sample size for each captcha type you've improved? did you just brute-force unsolved captchas?
    i'm asking because i fear that you do more harm than good to your definitions, which is not entirely your fault due to the lack of documentation (which will hopefully change in the next few days).
  • Thanks guys for your suggestions. Ron, I can't see any reason not to remove the PR filter. I'll try it right now. In any case, I don't know what else to try if I don't remove the PR filter.

    Just a question, Ron ... remove the PR filter for only the web 2.0 campaign, or for all?
  • ronron SERLists.com
    I don't have PR filters anywhere in GSA. The only way to prove what I say is true is to crank up a couple of new websites and do it that way. Then you will have the proof.
  • LeeGLeeG Eating your first bourne

    I run PR4 on T1, PR2 on T2, and no PR filter on T3 and T4

    On a good day, I can do 300k submissions and 30k verified

  • LeeG, that PR4 filter for your tier 1 - it's domain PR, right?
  • @ron Weird that you don't use any PR filters and your sites are sticking? Or are you doing build-and-burn sites?

    I've always used PR filters on T1 and T2 and none on T3, but you know what, I'll give your theory a proper test across 10 domains ranging from 10,000 competing pages up to 7 million, and I'll let it run for a month. I can let you know my results if you'd like?
  • ronron SERLists.com
    I'll PM you.
  • I have disabled the PR filter but the results are pretty much the same. In one project I have "use keywords to find target sites" ON and in another it's OFF, but this doesn't seem to affect the results either.

    I also have the OBL = 80 filter ON. Maybe this is the reason. I'm using this campaign for a money site and have run it for 2+ weeks.  X(
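    For readers unfamiliar with the OBL filter mentioned above: it skips target pages whose number of outbound links exceeds a threshold (here, 80). A minimal sketch of that kind of check, using only Python's standard library - the class and function names are illustrative, not SER's actual implementation:

    ```python
    from html.parser import HTMLParser

    class OutboundLinkCounter(HTMLParser):
        """Counts <a href> links that point away from the given domain."""
        def __init__(self, own_domain):
            super().__init__()
            self.own_domain = own_domain
            self.outbound = 0

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            href = dict(attrs).get("href") or ""
            # Absolute links to other domains count as outbound.
            if href.startswith("http") and self.own_domain not in href:
                self.outbound += 1

    def passes_obl_filter(html, own_domain, max_obl=80):
        """Return True if the page has at most max_obl outbound links."""
        counter = OutboundLinkCounter(own_domain)
        counter.feed(html)
        return counter.outbound <= max_obl

    page = ('<a href="http://example.com/x">a</a>'
            '<a href="http://other.net">b</a>'
            '<a href="/local">c</a>')
    print(passes_obl_filter(page, "example.com"))  # → True (1 outbound link)
    ```

    The point in the discussion: the tighter you set `max_obl`, the more candidate pages get rejected, which directly lowers the submission rate.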
  • I forgot to mention that I'm using public proxies - but can the results be this low because of that anyway?
  • SvenSven www.GSA-Online.de
    Grrrr, yes it can. It depends on your public proxies and the source you pull them from. Public proxies are usually very slow and very unstable.
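    The slowness and instability Sven describes is easy to measure yourself. A rough sketch for weeding out dead or slow proxies before a run, using only the Python standard library (function names and thresholds are illustrative, not part of any GSA tool):

    ```python
    import concurrent.futures
    import time
    import urllib.request

    def check_proxy(proxy, test_url="http://www.google.com", timeout=10):
        """Return (proxy, latency_in_seconds), or (proxy, None) if it fails."""
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler({"http": proxy, "https": proxy})
        )
        start = time.time()
        try:
            opener.open(test_url, timeout=timeout)
            return proxy, time.time() - start
        except Exception:
            # Dead, refused, or timed-out proxy.
            return proxy, None

    def filter_working(proxies, max_latency=5.0):
        """Keep only proxies that respond within max_latency seconds."""
        with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
            results = pool.map(check_proxy, proxies)
        return [p for p, lat in results if lat is not None and lat <= max_latency]
    ```

    Running a scraped public list through a check like this typically leaves only a small fraction alive, which is exactly why the success rate craters.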
  • time for my little artwork again

    ===================
     PUBLIC  PROXIES  SUCKS! 
    ===================
  • Public proxies are the demon!  [..]

    Go private and you won't regret it!
  • edited February 2013
    Great artwork Ozz =D> Like it very much!

    Quality and cheap proxies (all in one :) ) would be great. For example sslprivateproxy.com offers
    private proxies for $1.40 per proxy. I would highly appreciate any suggestion.
  • I guess the web 2.0 blogs haven't got any PR anyway, so I might turn off the filters as well. Thank you...

    With the web 2.0 PR filter on, I just used to get 3 of the same websites and that was all.
  • Absolutely agree with you, content32. I've got only 2 web 2.0s and one of them was Xfire, which I later put in the URL filter to avoid posting there :)
  • OzzOzz
    edited February 2013
    sorry to say that, but to filter an engine is stupid. just uncheck the engine if you don't like it.
  • Yes exactly. I meant that under the url filter.

    So, can anybody recommend a proxy supplier - cheap and quality? >:D<
  • proxy hub
  • I have bought proxies. Let's see what will change.
    Thanks everybody!