
Why does submitted count go down?

edited March 2013 in Need Help
My first 2- and 3-tier project (tier 3 using the URLs from tier 2) appeared to have completed, with the last activity in the log being a proxy check. I set up a new one and then noticed that the first project seemed to restart while the second project was running. However, the 'submitted' field in the first project's tier 3 level started to go down whilst the 'verified' field increased. I can't understand why this should happen; any help would be gratefully received!

Comments

  • from the unofficial FAQ:

    Why are there more verified than submitted links?
    If a submitted URL times out (too many unsuccessful checks), it is removed, so the number of submitted links can decrease and is sometimes lower than the number of verified links.
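
    A minimal sketch of that bookkeeping in Python (the names and the retry threshold are made up for illustration; SER's internals aren't public):

        import random

        MAX_FAILED_CHECKS = 3  # assumption: a link is dropped after this many failed re-checks

        submitted = [{"url": f"http://example.com/{i}", "fails": 0} for i in range(10)]
        verified = []

        def recheck(link):
            """Stand-in for the real verification request (here: a coin flip)."""
            return random.random() < 0.3

        while submitted:
            for link in list(submitted):
                if recheck(link):
                    submitted.remove(link)
                    verified.append(link)       # 'verified' goes up
                else:
                    link["fails"] += 1
                    if link["fails"] >= MAX_FAILED_CHECKS:
                        submitted.remove(link)  # 'submitted' drops, nothing gets verified
            print(f"submitted={len(submitted)} verified={len(verified)}")

    Every timed-out link leaves 'submitted' without ever reaching 'verified', which is how the two counters can cross.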
  • OK, I see that. However, why did the first project restart when I loaded up a new project? It had been inactive for over 24 hours.
  • Without any knowledge of your setup or which settings were active, it's hard to know what happened on your end.
  • So how can you tell when a project has completely finished?
  • Ozz
    edited March 2013
    It never finishes when you are using search engines along with a good number of keywords for scraping. If SER doesn't find any new target sites, though, then it has nothing to post to.

    Maybe post a log file and screenshots of your settings, and tell us some more about your setup. The most common issue inexperienced users have is the use of bad proxies, which gets them lots of "download failed" messages in the log window.
  • That's very helpful, thanks. I am getting a lot of 'download failed' messages, but I'm using the built-in proxy scraper. I use semi-dedicated proxies with other software but figured that GSA would burn through them too quickly. If it's left to run with the proxy scraper, does it just keep going by scraping new ones? If it's just working constantly in the background, I don't see speed as being an issue.
  • Ozz
    edited March 2013
    Public proxies are the devil, and SER doesn't burn through proxies as fast as Scrapebox does, for instance. Many users use just 10-20 proxies for their campaigns, and even shared proxies work well for most.

    Those public proxies are banned on many search engines within no time, or are dead already, so you either don't find any targets to post to or can't post to the targets you do find because you can't get a connection (= "download failed").

    Just use some of your private proxies for a while and see how things work out. The worst that can happen is that they get banned from search engines. Sooner or later they will be blacklisted too if you create links with blog comments or forum profiles; uncheck those engines if you want to avoid this.
    That said, the blacklists and search engine bans don't touch the functionality of your proxies at all (I believe), as long as you don't use them for blog commenting or forums with your other tools. If you just want to use them for posting with your other tools, you should be good.
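
    Since "download failed" mostly just means no connection could be made through the proxy, a quick way to sanity-check a proxy list outside of SER is a small script like this (a rough sketch using Python's requests library; the test URL, timeout, and sample entries are arbitrary choices, not anything SER itself uses):

        import requests

        TEST_URL = "http://www.google.com"  # arbitrary test target
        TIMEOUT = 10                        # seconds

        def proxy_works(host_port):
            """Return True if an HTTP GET through the proxy succeeds."""
            proxies = {"http": f"http://{host_port}", "https": f"http://{host_port}"}
            try:
                r = requests.get(TEST_URL, proxies=proxies, timeout=TIMEOUT)
                return r.status_code == 200
            except requests.RequestException:
                return False  # connection refused or timed out: the "download failed" case

        candidates = ["1.2.3.4:8080", "5.6.7.8:3128"]  # placeholder entries
        working = [p for p in candidates if proxy_works(p)]
        print(f"{len(working)}/{len(candidates)} proxies responded")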
  • Brilliant, I will take your advice on the proxies. Many thanks for your help, Ozz.
  • AlexR Cape Town
    @Ozz - where is your graphic about public proxies... I think it needs to come out again. ;-0
  • Better late than never, I guess ;)

    ===================
     PUBLIC  PROXIES  SUCK! 
    ===================
  • AlexR Cape Town
    :-)
  • Ozz, if you have proper tools, they are awesome.
    For example, I have so far harvested over 20 million unique URLs from Google with just public proxies :)
  • They may be good for harvesting; I use them myself for such purposes.

    But if I have to rely on a stable connection to get things working, then private proxies are necessary IMO, unless you have some really good proxy sources which aren't listed anywhere and which you found yourself with advanced techniques and tools like "nix proxysuite", for instance.
  • @Ozz, I don't know whether any of my proxies are private or not,
    but per run I am able to get over 2,000 proxies that work with Google, out of 580k total proxies.

    None of them have a password.

    I don't use any proxies with GSA so far, since I am not using any search engines at all :)
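
    Checking a list that size one proxy at a time would take days, so this kind of test is usually run concurrently. A minimal threaded sketch, reusing the hypothetical proxy_works helper from earlier in the thread:

        from concurrent.futures import ThreadPoolExecutor

        def filter_working(candidates, threads=200):
            """Check many proxies in parallel and keep the responsive ones."""
            with ThreadPoolExecutor(max_workers=threads) as pool:
                results = pool.map(proxy_works, candidates)  # proxy_works: see sketch above
            return [p for p, ok in zip(candidates, results) if ok]

    With roughly 580k candidates, 200 threads, and a 10-second timeout, a worst-case pass finishes in about 8 hours instead of the two months a serial check would need.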