
Days with No Submissions/Verifications on two projects

Okay, I'm having a little bit of trouble with a couple of projects and I can't figure out what's wrong; maybe someone else can shed some light.

I have had GSA running for a little over 2 months now and everything was working really well, but the latest two projects just aren't making any links. The two projects both target social bookmarking platforms, each with a different URL set, set of keywords, email address, etc.
One project has just 25 keywords, the other 759 keywords. Neither of the email addresses is on the blacklists. I have 50 private proxies and all are working fine... I don't know what it could be.

Anything I am missing?

Comments

  • AlexR Cape Town
    Try clearing the target URL history and the target URL cache.

    Also, check your proxies using a method other than GSA's own test. This solves it most of the time!
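
    One way to test them outside GSA is a short script that fetches a known page through each proxy. A minimal sketch, assuming HTTP proxies listed one ip:port per line in a file called proxies.txt (the file name and test URL are placeholders):

        import concurrent.futures
        import requests

        TEST_URL = "http://www.example.com"  # any reliable page works
        TIMEOUT = 10  # seconds; raise this if your connection is slow

        def check_proxy(proxy):
            """Fetch TEST_URL through one proxy and report success."""
            via = {"http": "http://" + proxy, "https": "http://" + proxy}
            try:
                r = requests.get(TEST_URL, proxies=via, timeout=TIMEOUT)
                return proxy, r.status_code == 200
            except requests.RequestException:
                return proxy, False

        with open("proxies.txt") as f:
            proxy_list = [line.strip() for line in f if line.strip()]

        with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
            for proxy, ok in pool.map(check_proxy, proxy_list):
                print(("OK  " if ok else "DEAD"), proxy)

    Proxies that fail here but pass GSA's test are usually too slow rather than dead; try a larger timeout before discarding them.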
  • Okay, I have deleted the target URL cache and the target URL history, but there's still no difference.
    I did think it could be the proxies, but I had new proxies issued the other day and that made no change. Earlier this morning I added around 4,000 working public proxies, hoping at least 1 submission could get through if it was a proxy problem; still nothing though.

    Is there a way I can check to see what problems the projects are having?
  • AlexR Cape Town
    Are there a few submissions or 0 submissions?

    Also, can you check that you can manually access your emails?
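
    If you'd rather check an account outside a mail client, a few lines of standard-library Python against the provider's POP3 server will confirm the login works; the host, user, and password below are placeholders:

        import poplib

        HOST = "pop3.example.com"     # placeholder: your provider's POP3 host
        USER = "account@example.com"  # placeholder credentials
        PASSWORD = "secret"

        conn = poplib.POP3_SSL(HOST, 995, timeout=15)
        try:
            conn.user(USER)
            conn.pass_(PASSWORD)        # raises poplib.error_proto on a bad login
            count, size = conn.stat()   # message count and mailbox size in bytes
            print("Login OK: %d messages, %d bytes" % (count, size))
        finally:
            conn.quit()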
  • Well, it had originally been 0 submissions, but they both seem to have 1 submission now (they've been running on their own for the last 2 hours). Both submissions are to the same site, even though site lists are now turned off.

    Tried one of the email accounts and it's working fine; logged in manually and it's sitting empty.

    Willing to give someone (Sven?) access if they want to try and diagnose the problem
  • Please right-click your log -> save to file and let it run a while. Then upload the log to a file hoster (MediaFire, RapidShare) and share the link with Sven, or post it in this thread.

    Most probably your proxies are causing this. They are either too slow or dead. If that's the case, you'll get a lot of "download failed" errors.
  • I was thinking it could be the proxies, but they seem to be working fine in the tests I do, and they all pass the GSA proxy test with the timeout set to 4 seconds.

    Here is the log, though I was using public proxies at the same time. If you want me to capture a log using just the private ones, let me know and I will.
  • Ok, I think I know what you did. You duplicated your projects, edited the URLs, and didn't clear the history cache, right?

    What you need to do is delete the history cache first: right-click project -> modify -> delete history cache.
    Then right-click project -> import target URLs -> from file and select your SB site lists, in case you checked the option to create site lists. This will speed up posting, as you won't need to find the same target URLs again with the search engines.
  • Yeah, I did duplicate it, but I deleted the history and cache earlier when GlobalGoogler suggested it, and it still didn't pick up any submissions.

    I did it again, though, and imported my SB site lists, then left it to run for the last 10 minutes; still no more submissions.
    One thing I did notice is that they were both showing 1 submission each earlier, but now project2 is back to showing 0 submissions.
    UPDATE: About 30 seconds after I posted that, I noticed that project1 was also back to showing 0 submissions.
  • AlexR Cape Town
    My guess is your proxies are fried! I had a similar thing previously where it showed all proxies working but it would not get any submissions. Changed proxies... and all was back to normal.
  • I'm starting to think it's not the proxies; I have had links built on other projects while they were running with the same proxies. I did also scrape a load of (4,000+ tested) public proxies to use, and it was still getting the same results.

    Okay, I just set the thread count down to 1 and turned off the use of proxies: still no submissions after 10 minutes of letting it run.
  • Ozz
    "download failed" due to bad proxies: 456
    "already parsed": 624
    "PR-? too low: 52

    Do yourself a favour and buy some proxies, delete the history cache, and restart with the imported site lists again.

    You need to monitor and understand the log messages. Just saying "still no submissions after 10 minutes of letting it run" doesn't help to determine your issues.
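
    As a starting point, once you've saved the log to file you can tally the message types with a few lines of Python. The substrings below are assumptions based on the messages quoted above, and activity.log is a placeholder file name:

        from collections import Counter

        # Message substrings to tally (taken from this thread; adjust as needed).
        PATTERNS = ["download failed", "already parsed", "too low", "no engine matches"]

        counts = Counter()
        with open("activity.log", encoding="utf-8", errors="replace") as f:
            for line in f:
                lowered = line.lower()
                for pattern in PATTERNS:
                    if pattern in lowered:
                        counts[pattern] += 1

        for pattern, n in counts.most_common():
            print("%-20s %d" % (pattern, n))

    A log dominated by "download failed" points at the proxies or connection; one dominated by "already parsed" points at duplicates in your target list.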
  • But I have bought proxies; I've got 50 at the moment, so I thought that would be enough to handle at least 1 project (I also tested them in Scrapebox and they work there too). I've deleted the history/cache twice now, but I will try a third time to see if that makes a difference.

    By stating "still no submissions..." I was trying to point out that it probably wasn't the proxies.

    Is there a guide to what the log messages mean, then? It can't really be assumed that everyone knows that "download failed" is due to bad proxies, etc.
  • "download failed" is what it says. either download failed due to bad connection (most probably proxy is dead) or the site is just down or dead.
  • Okay, I have since bought 5 dedicated private proxies for the job; after all, they are brand new, and being private, no one else can mess with them. However, running just the 1 project is still getting no results.

    At first it looked like it might have been working, just very slowly; it took around half an hour to show 6 submits. I had imported the site list as suggested, so it was working its way through that. I went off and had some dinner, and when I came back it had gone back to 0 submits and 0 verifications.
    Now I imagine it's using the search engines to search for targets, and the log shows that: lots of "no engine matches" and PAGE result entries.

    It feels like I am back to square one now
  • Ozz
    Maybe you need to increase your HTML timeout if your connection or proxies are really slow (Options -> Submission -> ~140s). But with only 1 project this should not really be an issue.

    I wonder what happens when you create a new fake project with some test data, without any filters, and import some file lists into it.

    And you should save and post the log when you're having issues. It doesn't really help when you only describe what SER is or isn't doing and the issue isn't obvious or very common.
  • I did increase the timeouts before; the HTML timeout is currently set to 120 seconds, so that should be plenty.

    As a test I set up a brand new project (blog commenting only) to see whether it would hit the same problem, and after letting it run for nearly 10 minutes there were 16 submissions, so a big improvement on that one at least.
  • Just tried one of the other problematic projects again to see if anything had changed, and after running it for nearly 10 minutes it was showing 5 submissions!


    I was using a mix of the 50 private proxies I had (BuyProxies) and the 5 new dedicated proxies (ChangeMyIP). I see towards the end that there were a fair few "download failed" errors, so would that suggest the proxies are going bad? I wouldn't have expected that to happen with 55 proxies after just 5 minutes.
  • Ozz
    Ok, next you test only SB on your Testproject. Blog commenting seems to work ;)

    - delete duplicates from your site list (options -> advanced -> tools)
    - delete the history and target URL cache for the Testproject
    - uncheck all search engines for your Testproject (project option -> options -> right-click SE -> select none)
    - remove all PR filters
    - stop all other projects
    - import a couple of SB URLs from your site lists (PHPDug, Pligg, ...)
    - test your proxies in options -> submission -> configure
    - start only this Testproject and save the log to file; all other projects should be inactive
  • AlexR Cape Town
    This could be turned into a great post. Maybe it should be stickied...

    Maybe entitled "Steps 1 to 10 to take when you are not getting submissions." I've seen it asked a few times and I battled through it myself to get it working again. 

    Maybe it should be added as a link to the FAQ? (@erbsensuppe)
  • Right: further testing on the Testproject seems fairly positive. I did all the steps you mentioned, Ozz, as well as turning on the social bookmarking platforms (turning off blog commenting), and ran it through a selection of social bookmark sites from my site list. In the end I think it made around 45 submissions in a matter of minutes.

    I did remove duplicate URLs, though the log still shows duplicates. Anyway, here is the log: https://www.dropbox.com/s/4axvcoysknukdzx/activity3.log

    So it's not a problem with the social bookmarking side of things, then. I am going to set up a new project for 1 of the problem projects, copy the information across manually, and run it to see if that makes a difference.
  • I set up an identical project using all the information from my original. When I imported the site list entries, it processed through them and ended up with 59 submissions in total: https://www.dropbox.com/s/zbl4d9e8yhg316o/activity4.log

    With that working, the next step was enabling the search engines and letting it scrape results and submit to them. I ended up leaving it for about 5-10 minutes, but it didn't complete any submissions. I have included the log for that one too: https://www.dropbox.com/s/xd46xx4tvl4nre9/activity5.log

    I have all the UK search engines enabled (13), and have the proxies set up to disable any that are faulty, but to test them every 30 minutes and re-enable any that test okay.
  • Ozz
    Yes, with "delete duplicate URLs" you will keep many duplicates of the same domain in your list. If you are only using the SB module in SER, it may be best to delete duplicate domains instead, so you won't get any "already parsed" messages for newly created projects anymore.
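
    To illustrate the difference, here's a minimal standard-library sketch that trims a site list by domain rather than by full URL, keeping the first URL seen per domain. It treats the hostname (minus a leading "www.") as the domain key, assumes the URLs include http://, and the file names are placeholders:

        from urllib.parse import urlparse

        seen, kept, total = set(), [], 0
        with open("sitelist.txt") as f:          # one target URL per line (assumed)
            for line in f:
                url = line.strip()
                if not url:
                    continue
                total += 1
                # Fall back to the first path segment if the URL lacks a scheme.
                host = (urlparse(url).netloc or url.split("/")[0]).lower()
                key = host[4:] if host.startswith("www.") else host
                if key not in seen:
                    seen.add(key)
                    kept.append(url)

        with open("sitelist_by_domain.txt", "w") as f:
            f.write("\n".join(kept) + "\n")

        print("kept %d of %d URLs" % (len(kept), total))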

    Regarding your SEs, you should keep an eye on which SEs are useful and give results for you. Test all of the UK SEs (one by one) with a simple footprint like "Powered By Pligg" "Legal: License and Source" with the help of "Search online for URLs" (Options -> Advanced -> Tools). Keep the ones that give you results in your projects. You could also add international SEs, because I doubt that you'll get UK-related results only with UK SEs; most probably you'll get .com, .net, .org and .info results anyway. So why limit yourself to UK SEs only?

    Furthermore, you need to check whether your keywords are "exhausted" on the SEs. You need to know that SBs are limited by nature because they are not as common as WordPress blogs, for example. Every search query will give you up to 1,000 results; of those 1,000 results, many are duplicates and many don't belong to SB platforms at all.
    If you have the opportunity, scrape your list with Scrapebox using different footprints and generic keywords. As they are social bookmarks, the keywords don't need to be niche specific; you could use first and last names or the most common words, for example, to build up your list. The important thing is that you get as many different target URLs as possible.
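
    To make the footprint-plus-keyword idea concrete, a sketch like this builds the query strings you'd feed to a scraper. The keywords and the second footprint are just example placeholders (the Pligg footprint is the one quoted above):

        import itertools

        footprints = [
            '"Powered By Pligg" "Legal: License and Source"',
            '"powered by phpdug"',   # assumed example footprint
        ]
        # Generic keywords are fine for social bookmarks; names and common words work too.
        keywords = ["travel", "recipes", "weather", "john", "smith"]

        queries = [fp + " " + kw for fp, kw in itertools.product(footprints, keywords)]

        with open("queries.txt", "w") as f:
            f.write("\n".join(queries) + "\n")

        print("wrote %d queries" % len(queries))  # each returns at most ~1,000 SE results

    Every extra keyword multiplies the number of distinct queries, which is what works around the per-query result cap.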