
Proxies, HTML Timeout, Threads - Max Efficiency


Comments

  • Alex, I see from your graph that you got most of your links from blog comments. Which blog engines give you the most links?
    And are you using your own footprints, or the defaults?
  • @SEOMystic Have you unchecked the secondary captcha option? Sending to a secondary service like DBC will really slow down your LPM.

    Also, I see you have the PR unchecked, so make sure you have disabled it elsewhere too. See here: https://forum.gsa-online.de/discussion/comment/30367/#Comment_30367
  • @baba It's really based on your keywords and footprints. I filter the good footprints and add mine too. You can see them in Advanced - Show stats - verified. For mine I check General Blogs, Blogspot and ShowNews. On the other hand I'm scraping for more BlogEngine with SB, and others like wikis, social bookmarks and networks with GScraper. Though I haven't fed any projects with the scraped list. Just like LeeG said, it might hurt the LPM xD
  • So, in my quest to join the "Century Club", I grabbed all the SER engine footprints and ran them through the "Google Competition Finder" plugin in SB, which gets the number of results found in Google for each footprint.  I then turfed out any footprint with less than 1000 results.  I'm currently running SER with them now to see how that works.  Not seeing any great changes yet, but it's only been running for about 10 minutes so far. Currently seeing around 35 LPM.

    Just wondering what the other gurus around here used as a cut off point.  I ended up removing 126 of 1223 footprints.  I'm thinking maybe I need to remove more ???
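The cutoff step described above can be sketched in a few lines. This assumes footprint/result-count pairs as plain tuples; the actual export format of SB's Google Competition Finder is not shown in the thread, so the input shape here is an assumption:

```python
# Sketch of the footprint-cutoff step: keep only footprints whose Google
# result count meets a threshold. The (footprint, count) input format is
# an assumption, not SER's or Scrapebox's actual export layout.
CUTOFF = 100_000  # threshold debated in this thread; 1_000 was the first try

def filter_footprints(rows, cutoff=CUTOFF):
    """Return the footprints whose result count is at or above the cutoff."""
    return [footprint for footprint, count in rows if int(count) >= cutoff]

rows = [
    ("powered by BlogEngine.NET", 250_000),   # kept: plenty of targets
    ("leave a comment obscure-cms", 800),     # dropped: too few results
]
print(filter_footprints(rows))  # ['powered by BlogEngine.NET']
```

Raising the cutoff trades breadth of footprints for ones that actually return enough results to be worth querying, which is the trade-off the LPM numbers in this thread are probing.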
  • 1k, seriously? Or am I doing it wrong? I filter out footprints under 100k results at least... For general blogs I even filter out footprints under 200k xD
  • @Alex considering I'm only getting around 35 LPM, I'd say you're doing it right, and I'm the one doing it wrong ;)  I felt I was being a bit conservative, that's why I asked the question.
  • @dogGoogles I'm running at 500 threads and my LPM is around 90-100. If you're running way fewer threads than me, then sure, you'll have to reconsider...
  • @Alex . . . I'm running 200 threads . . . . I'm going to try your 100K cutoff for footprints and see what happens  . . . I'll report back after I make the change and run it for a while
  • Been running about 15 hours now after cutting off all footprints with less than 100K search results.  LPM is around 60 . . . definitely an improvement, but I still need to do some more digging.
  • Can anybody help me understand why there are so many wiki links? It's almost 65%, and my submissions are getting lower daily.
    image
  • I now know I have over optimized SER when this shit happens (facepalm)

    image

    This is 6 hours of running SER and I have now totally lost control of it. The GUI just goes blank with "Not responding", then comes back with some LPM information, then "Not responding" again (facepalm)

    The only way to stop it now is to go to Task Manager and end the process (facepalm)

    I think LeeG has a better VPS than me :( I've reached my hardware limit...
  • ron SERLists.com
    edited June 2013

    Just my opinion, but I think you need at least 4 GB of RAM and ideally 4 processors for ultra smooth sailing. I have never had an issue with this type of set-up. And those are great stats, so kudos!

  • LeeG Eating your first bourne

    Ron, you missed the amount of proxies in that image, all 4076 of them

    No doubt scraped public proxies

    When will people catch on that public proxies are the spawn of satan

     

    40 private proxies and I did this in 24hrs

    image
  • ron SERLists.com
    You are so right, I missed that. Where is @Ozz and his "public proxies suck" banner? We need it.
  • LeeG Eating your first bourne
    Probably looking for a good optician to help you :D
  • ron SERLists.com
    :-B
  • If I'm not mistaken, Alex is running on private proxies. Public proxies are for scraping sites with GSA.
  • Yes I only use private proxies for GSA SER. Public proxy scraping is to feed GScraper. 
  • What kind of system do you have GSA set up on, Alex?
  • I don't get what you mean @Merlin22 ? Do you mean my VPS spec?
  • Just done an experiment...
    Normally I put 1 email for each project... But I was fed up with checking every day whether it's blacklisted or not...
    So this time I put 10 emails in each project... Of course almost no email was blacklisted...
    But now my verified URLs and LPM have decreased by 30-50% :(
    Damn... Going back to 1 email per project...
  • LeeG Eating your first bourne

    If you're checking whether your emails are blacklisted, what about the proxies you use and the domains you're submitting to?

    This is why email blacklist checking is nothing more than a waste of time

    I've shown examples on here plenty of times

    http://www.stopforumspam.com/ipcheck/91.239.15.153

    That's a good example above

    That guy has to register on forums to be able to be added to that list

    I haven't changed emails for three months and I'm still doing 300,000+ submissions daily

     

  • Hmm... so you think stopforumspam and email blacklisting are bullshit, and we don't need to care about them?
    Are you using 1 email per project or more, @LeeG?
  • LeeG Eating your first bourne

    I have been saying that for months, and each time I give similar examples from my massive list of examples

     

    So you swap out an email address because it's blacklisted

    Five minutes taken, 1 paid captcha if you're using an email creator

    Now, how do you know that the very next time that email address gets used, it won't be listed on SFS?

    It could be one submission, it could be 1 million submissions

    Remember SFS also registers IPs. So do you check your proxies? Get them swapped out because you, or someone you're sharing them with, has got them listed on there?

     

    I use Email Account Creator Plus, made by theorbital, and not the clone that came about afterwards that's also sold on here

    I set mine up when you could still make forwarding addresses with Hotmail accounts

    I run one single address per project and tier with a lot of spun emails pointing at them.

    The only reason is for making more accounts. Forums for example are normally one email address per registration. In theory, 50 email accounts, 50 profiles on a forum
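    LeeG's point that SFS registers IPs as well as emails can be automated rather than checked one ipcheck page at a time. A minimal sketch against StopForumSpam's public JSON API follows; the endpoint and field names (`ip`, `appears`) are assumptions based on their published API, not verified here:

    ```python
    # Sketch: check an IP against the StopForumSpam API.
    # The endpoint URL and JSON field names are assumptions from the
    # public SFS API docs; treat them as illustrative, not authoritative.
    import json
    import urllib.request

    SFS_API = "http://api.stopforumspam.org/api?json&ip={ip}"

    def is_listed(payload: dict) -> bool:
        """Return True if an SFS JSON response marks the IP as listed."""
        return bool(payload.get("ip", {}).get("appears", 0))

    def check_ip(ip: str) -> bool:
        """Fetch and evaluate the SFS record for one IP (network call)."""
        with urllib.request.urlopen(SFS_API.format(ip=ip), timeout=10) as resp:
            return is_listed(json.loads(resp.read().decode()))

    # Pure-function example on a hypothetical response shape:
    sample = {"success": 1, "ip": {"value": "91.239.15.153", "appears": 1}}
    print(is_listed(sample))  # True
    ```

    Looping `check_ip` over a proxy list would apply the same test LeeG describes for emails to the IPs you actually submit from.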

     

  • image

    Yay I'm half LeeG (facepalm)
    Been submitting for 24 hrs. Just updated both SER and CB. Now I need to verify all this shzt...