
Why is GSA not posting many web 2.0s?

edited September 2012 in Bugs
Why is GSA not posting much to web 2.0? I have been running GSA with a new project for a couple of days, but it has submitted only 3 web 2.0s. What am I doing wrong? Here is the pic:
image

It's not only this project; I faced the same issue on my previous project, where GSA submitted only 5 web 2.0s. Here is the pic:

image

What am I doing wrong? Can anyone please help me get this to succeed?

Comments

  • Sven www.GSA-Online.de
    Do a "delete target url history" + pastebin us the log after restarting the project with just web 2.0s enabled... this will be more helpful.
  • I have just made some changes: I placed DeCaptcher as the first captcha solver instead of Captcha Sniper and ran it for 1 hour. I got 20 web 2.0s. Here is the log:

    http://pastebin.com/embed_js.php?i=Ewrsjt2h
  • I think web 2.0 is not really useful at the moment... maybe it needs more time to get stable.
  • I have the same problem. I've noticed that the submission goes well, but after that I mostly get the error: download failed.

    What does this mean? Is this an engine script failure or something else?


  • Sven www.GSA-Online.de
    "Download failed" means that nothing was returned on a download. Usually a proxy issue or firewall.
  • Ozz
    edited March 2013
    @sunny: can you "delete target url history" again, uncheck "n****.l*" in web 2.0, and pastebin the file?

    Your log only shows that SER is trying to search for n****.l* sites.
  • @Sven

    I'm using 5 private proxies and I've already made a new rule in my firewall for SER, but I get the same results. I've also deleted the target URL history and cache.

    Is it possible that my proxies are blacklisted and that this is why I don't get many web 2.0s verified?
  • Sven www.GSA-Online.de
    Hard to tell, but if you always get download failed, then yes, it might be.
  • Well, I get about 10 to 15 verified ones and then it stops. Also, I'm using a fresh Hotmail account. Would it be better if I tried it with public proxies, or is this a bad idea?
  • edited March 2013
    I have created a new project, unchecked n****.l* in web 2.0, and pastebinned the log. Here is the pastebin URL:

    http://pastebin.com/embed_js.php?i=kC4xrc8a
  • Thanks for the log. 

    15:23:02: [+] 53/55 unknown login status - http://www.sportblog.fr/pass/login/
    15:23:02: [-] 53/55 required variable "about_yourself" was not used in form.

    Registration works fine for those "Bloggorilla/Bloggospace" sites; maybe there is a problem because of the missing form?

    I have no clue, hopefully Sven can clarify this.


  • edited March 2013
    IF I understood this web 2.0 thing correctly, except for n****.l*, these are all 1-TIME-POST sites. So your project will post to them only once and try to create a blog, as these are fixed sites and not platforms.

    It's great to have all these web 2.0s, but if this is true then their value is quite diminished. I suspect that idocscript, Scribd, and all the article sites (the first 5-6 from the top) are also single-post sites. That would mean that at the start of your project you should use only a few threads, check all of these with manual captcha solving, and let GSA work on them for an hour. After that, disable all the fixed ones so you don't waste search resources, then check all the other sites you'd like to post to that are platforms and continue. I stand to be corrected on this.

    How it SHOULD be, example:
    Specify a posting interval and/or the maximum number of blogs you want on that web 2.0; then GSA will create multiple blogs on that web 2.0 with random usernames, maybe even re-post to created blogs if the site allows it.

  • Sven www.GSA-Online.de
    It is not wasting resources, as it is a fixed URL, and if the program knows it submitted there before, it will not do it again. Also, posting on these sites more than once doesn't really work, as you need a new email for most of them. But yes, it is true that you might want to submit to these sites with manual captcha solving to make sure nothing goes wrong.
  • Actually, with a one-post-per-domain project, I'm getting tons of xfire.com links; you might want to check into it.
  • I have given up on the web 2.0s in GSA. I have 6 totally separate projects in different niches, and on all 6 I have seen no more than about 6 verified web 2.0s, several of which were just web 2.0 user profiles. I have tried everything mentioned, including clearing out target URLs, fresh email addresses every time, and no criteria such as PageRank or OBL, and no matter how minimal I set the requirements, I get virtually nothing back.

    It seems a bit odd, because there are many decent web 2.0s out there; I have a total of 250+ live in UD that all work, so something seems to me to not be right with GSA and web 2.0 creation.

    If I use Ultimate Demon, I can easily get 100+ created, and that's with me being picky about the sites I use.
  • Well, same problem here too. I have been running a few projects for the last 10 days and have only collected 13 web 2.0s, most of them web 2.0 user profiles.

    I don't even have 1 in the submitted backlinks list.
  • Hmm, looking through posts here, it seems many people are having the same issue of only getting a couple of web 2.0s, and many of those just from profiles. I think I will have to leave this to another application I use.
  • Ozz
    edited October 2012
    I have a vague suspicion that there are two main reasons people have problems posting to web 2.0s, and both are related to proxies.

    1. Blocked IPs, which is obvious and can be tested by opening the URL with a proxy in your browser. This mostly happens if you are using shared proxies.

    2. reCaptcha isn't loading fast enough or doesn't load at all. I see this happen all the time when I'm testing new sites and creating a fake account by hand. I'm using a shared proxy in my browser most of the time when doing this, and I have to reload the page a few times to get the reCaptcha. I suppose this has something to do with Google, as reCaptcha is part of Google: when your IP is banned by Google, the reCaptcha field doesn't load, OR the proxy is too slow to load the reCaptcha field within the allotted time.
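    A minimal way to script that by-hand check, assuming Python with the requests library; the proxy addresses and the test URL below are placeholders you would swap for your own proxies and for whatever reCaptcha-serving page you care about:

        # Rough check: can each proxy still load Google's reCaptcha script?
        # Proxy addresses and the test URL are placeholders, not real values.
        import requests

        PROXIES = [
            "123.45.67.89:8080",   # replace with your own private proxies
            "98.76.54.32:3128",
        ]
        TEST_URL = "https://www.google.com/recaptcha/api.js"  # any page that serves reCaptcha works

        for proxy in PROXIES:
            cfg = {"http": "http://" + proxy, "https": "http://" + proxy}
            try:
                r = requests.get(TEST_URL, proxies=cfg, timeout=15)
                if r.status_code == 200 and r.text:
                    print(proxy, "-> reCaptcha loads")
                else:
                    print(proxy, "-> HTTP", r.status_code)
            except requests.RequestException as exc:
                print(proxy, "-> failed:", exc)

    If a proxy keeps timing out or erroring here while your own IP loads the script fine, that proxy is a likely culprit for the missing reCaptcha fields.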
  • Ozz
    edited October 2012
    I've googled a bit, and I think I'm right with my suspicion about IPs being banned by Google for reCaptcha:

    reCaptcha involves a call to Google-run servers (reCaptcha is Google's own anti-spam captcha).
    They ban IPs and entire blocks of IPs if they think they are being automated.

  • edited October 2012
    Hello,

    I have the same issue. My IPs are OK and I have 20 private proxies. I use DBC and CS.
  • How do you know that your IPs are OK, when Google possibly bans complete blocks of IPs if they look suspicious (unconfirmed, though)?

    If possible, test your proxies by hand and see whether reCaptcha loads properly on sites that use it.
  • I have the same problem as well. I checked my private proxies by hand in the browser and they are working fine.
  • I have been running the first project for about a week and have the same problem. There is only 1 verified web 2.0 link, to Technorati. I don't see any links in the show submitted links menu (a popup with the message "no urls to show").

    I am getting "Attention! No more targets but also no competitor analysis wanted." for the web 2.0 project at the moment.

    Other projects work, but not this one.
  • James SEREngines.com
    edited October 2013
    @richrich123 Ultimate Demon puts template/generic platforms under its "Web 2.0" banner, which can lead to confusion.

    Ultimate Demon classes a web 2.0 as:

    * Dolphin
    * Jcow
    * Phpfox
    * Elgg
    * PHPIzabi
    * Plus some individually coded ones

    SER already does all of these, but classes those generic platforms as Social Networks.

    Now do a real comparison: I see the OP has 408 social network posts, which are basically what Ultimate Demon classes as web 2.0s.

    In SER, Web 2.0s are individually coded, meaning there is only one site per Web 2.0 in the list. These hold MUCH more value than the generic platforms like Dolphin, JCow, etc.

    If @Sven decided to put all those generic platforms under the Web 2.0 group (rather than Social Network, which is where they are now), people would instantly be wowed by how many "Web 2.0s" it creates. Hey @Sven, maybe it would be a good marketing stunt to do it this way?
  • Sven www.GSA-Online.de
    Naw, I don't care about UD or the others. If people don't realize this, it's their fault. Some things have to stay ;)
  • But why is the post rate so low? I have only one posted, and now it can't find any more targets.

    If the low rate is due to something we did wrong on our side, any recommendation on what we should check?
  • James SEREngines.com
    edited October 2013
    @samk As you are using the internal Web 2.0s that are included with SER, I can't speak for the health of them. Web 2.0 sites update all the time, so maintenance of the scripts is a must.

    One thing to remember, though, is that all of those Web 2.0s in the list only allow 1 account per project (version 6.68 and below of SER), so once SER goes through that list (usually within the first hour or so, depending on how many other projects you are running) it won't retry them. So it works like this:

    1. Try to register on Web2.0
    2. Fail or succeed to register and post
    3. Never touch this web 2.0 again within this project

    The Web 2.0s are not like the other engines within SER; each one is a single fixed site in the list. SER doesn't go out and find more targets for them, as they are all individual sites and not generic platforms (see the sketch at the end of this comment).

    The next version of SER will however change this and allow intelligent retrying and multiple accounts of a single Web 2.0 within the same project..
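    To make that flow concrete, here is a toy sketch of the one-account-per-project behaviour described above. This is not SER's actual code and the names are made up; it only illustrates the idea that each Web 2.0 is a fixed site that gets exactly one attempt per project and is never revisited afterwards (on 6.68 and below):

        # Toy illustration only - hypothetical names, not SER internals.
        WEB20_SITES = ["site-a.example", "site-b.example"]  # fixed list, no target searching

        def try_register_and_post(site):
            # Stand-in for the real work: create account, solve captcha, publish post.
            return False

        def run_web20_pass(project_history):
            for site in WEB20_SITES:
                if site in project_history:      # already attempted in this project
                    continue                     # never touched again, pass or fail
                ok = try_register_and_post(site)
                project_history[site] = "verified" if ok else "failed"

    Once every site in the list has an entry in the project history, further passes produce nothing new, which is why the verified count stops growing after the first hour or so.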
  • Thanks James for the detailed explanation. I understand how they differ from other engines; I just wanted to be sure what is causing this almost non-existent success rate. As far as I can see, the issue is with GSA itself, and there isn't anything we can do about it on our side.