
Proxies, HTML Timeout, Threads - Max Efficiency


Comments

  • Good idea.

    LeeG, do you also edit the engines' footprints and insert your own, or do you leave them at the default?
  • LeeG Eating your first bourne

    I edited the footprints on all the engines I use

    On gscraper pro, there is a tool built in which shows the search result counts for footprints

    Only in the pro version though

    Went through and killed all footprints that have low results

    But I have noticed that a lot of the engine files now in SER have got a lot better

  • ronron SERLists.com
    @baba - It isn't so much about adding new footprints as it is about removing the bad ones. I did it across all engines, and I would estimate roughly that I got rid of a third of the footprints.
  • Thanks ron, understood.

    LeeG, are you using private proxies for scraping through gscraper, or public ones? If private, how many and from where?

    And what minimum result count do you keep for any footprint in gscraper?

    Again, thanks for answering my queries.
  • LeeG Eating your first bourne

    I don't use gscraper that much.

    I get better results letting SER run on its own

  • ok thanks
  • edited June 2013
    Wow guys! Guess what? Just wow!

    image

    As per @Ozz's statement, I've already made $117.38 while sleeping! Lol! This new CB makes me laugh every time. This will surely boost the sales of CB. Great work, Sven.

    Too bad I forgot the freak settings about getting public proxies every 4 hours to feed gscraper. SER crashed so I had to end its process... Lost the submitted and verified ones. Damn!

    Well, until yesterday I was eager to update whenever a new version was released. But now I'm scared and tired as hell of re-updating all my engine files :(
  • @Alex Please post again your S/V at the end of the day. Eager to know S and V for the day and if you're able to maintain LpM at that rate.
  • @gsarver  and @ozz
    Please let me know whether my CB and SER captcha configurations are correct? I've applied the same config as @gsarver, but my LPM is 0.5. I've also edited the engine files, so I guess it's due to the captcha parameters.
    CB Config:
    image
    GSA Option(Tier 1):
    image

    Submission Settings (I have 25 privates proxies) 

    image
    Thanks a lot for your help.

  • - uncheck 'run as webserver'
    - 'only solve if success rate is 55' <-- where is that coming from? lower that to 20 or 10 (default), imo. depending on how many retries you are doing, you might be surprised how the solving rate increases when you do the math.

    1 retry = 2 tries to solve a 20% captcha overall = 36% solving rate.

    if you are doing 0 retries then 55% might be ok, but it feels a bit too high in my opinion.
    at least for your T1s i suggest using 1-2 retries. for the kitchen sink 0 retries is ok.


    however, i doubt that the CB settings are causing such a low LPM. it feels more like your proxies are a problem and/or your engine selection plus filters.

    please read every single post in this thread and also those in the sticky 'Compiled list of tips..' thread. that's the best advice anyone could get, as it's all in there.


    edit: just saw that you 'skip' a form field if it can't be filled. change that to 'random'.
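
Ozz's retry arithmetic above can be checked with a few lines of Python. This is a minimal sketch assuming each solve attempt is independent and has the same per-attempt success rate; the function name is made up for illustration:

```python
# Ozz's retry math: with a per-attempt solve rate p, n retries mean
# (n + 1) attempts, so the chance of solving at least once is
# 1 - (1 - p) ** (n + 1).  Independence between attempts is assumed.
def overall_solve_rate(per_attempt_rate: float, retries: int) -> float:
    attempts = retries + 1
    return 1 - (1 - per_attempt_rate) ** attempts

# 1 retry on a 20% captcha: 1 - 0.8 ** 2 = 36% overall
print(round(overall_solve_rate(0.20, 1), 2))  # 0.36
```

By the same formula, 2 retries on a 20% solver gives roughly 49%, which is why 1-2 retries on T1s can beat a high "only solve if success rate" cutoff.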
  • @SEOMystic: You should change Skip option to Choose Random as Ozz said. About CB, I personally don't filter the % solve rate. My option is unchecked.
    About editing your engine files, pull out the footprints that give low performance. Search for "use blog search=1" and change it to 0.
    Only check the engines that give you the best results; cut out the ones that give only "hundreds" of verified... It was hard for me to decide to uncheck any engines, because yeah, diversifying our platforms/engines is good, but I already have enough engines that give me good results.
    Also about email settings you didn't mention, but hotmail gives good results. I'm using hotmail/live/outlook as stated in my post on previous page. 
    That might take a couple of hours but it sure will increase your LPM.

    About me, my LPM has again dropped to ~90-100LPM :( I thought it could maintain the ~190 this morning for the whole day..*sigh*... But anyway, it's 10 times faster than yesterday already xD
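
The "use blog search=1" edit described above can be scripted rather than done by hand. A minimal sketch, assuming the engines are plain-text .ini files in one directory; the directory layout, the `-custom` suffix, and the function name are illustrative, not SER's actual conventions:

```python
# Minimal sketch (paths and suffix are illustrative): flip
# "use blog search=1" to "=0" in every engine .ini file, writing each
# change to a renamed copy so a version update that overwrites the
# stock files leaves the edited copies alone.
from pathlib import Path

def disable_blog_search(engine_dir: str, suffix: str = "-custom") -> list[str]:
    changed = []
    # sorted() materialises the listing before we add new files to the dir
    for ini in sorted(Path(engine_dir).glob("*.ini")):
        text = ini.read_text(encoding="utf-8", errors="ignore")
        if "use blog search=1" in text:
            patched = text.replace("use blog search=1", "use blog search=0")
            out = ini.with_name(ini.stem + suffix + ini.suffix)
            out.write_text(patched, encoding="utf-8")
            changed.append(out.name)
    return changed
```

Saving under a new name is what lets you re-run this safely after each SER update.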
  • @Alex Exactly what I got too. About 130 LPM with 125 threads, but drop to around 60-70 at the end of the day.

  • About email, I use hotmail (for each email I have 10 aliases).
    Does the change of "use blog search=1" to 0 concern only engines which have poor results?
    I use 25 private proxies. I've checked them with SER and they seem to be good.
    OK, I'll apply the changes about captcha. I don't check exactly the same platforms as @Alex, I think; I have to uncheck some more platforms. (it hurts!! :()

    Thank you a lot for your help, guys. I'll post my results shortly.
  • @audioguy: Well, your LPM is wayyyy higher than mine relative to threads, since I'm running 8 times more threads than you. I'm running 1000 threads with 600 private proxies. It's now stable at 90-100 LPM. I swear to myself I won't touch GSA products today, to see the results for the whole day. Lol. Just a sneak peek:
    image
  • lower your threads to 250 for a while to see if it really makes a difference compared to your 1000. i think someone posted here a few days ago that "lowerish" thread counts didn't have such an impact.
  • LeeG Eating your first bourne

    I'm version hopping at the moment after seeing a big drop in verified overnight

    Something I know Ozz used to hate. So I post comments like that from time to time just to wake him up :D

    My LpM dropped to under 140 with the present release

    But, we all get good and bad days

  • OzzOzz
    edited June 2013
    nah, i don't hate it if people like you know what they are doing. i just hate it when people hop around without any clue.
  • Omg nooooo 6.04 is out :(
  • @Alex What's wrong? Search and replace take 30 seconds with Notepad++, or have you done heavy duty edits? :)
  • I've edited engine footprints, not just the "use blog search" setting... now I'm really uncomfortable with updates -___- I will update when there's something more important. In the next 9 hours I will post my screenshot for a 24-hour run of SER. Currently: ~93k S and 27k V. Oh, by the way, I saved $234 with CB lol!
  • ronron SERLists.com
    edited June 2013

    Alex, just rename your files. You are making this way too complicated. Then obviously have those modified ones checked in projects. And then you never have to worry about regular version updates overwriting your engine files.

    Next, you want to keep track of when @sven actually changes something inside an engine file. So keep a little text document or spreadsheet with the names of the engine files (that you have modified) and the 'date' of those original files. Then all you have to do is occasionally look for date changes on the originals you modified. The reason you would do this: @sven does improve engine performance when there's a problem, so you want to make sure you pick up those improved engine files. Then you can turn blog search off in the new file and save it as your new modified file. The change with that particular bit of code is very small.

    The actual changes to engine files are not too frequent, so this is a thing you can manage without extraordinary effort.
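
ron's bookkeeping step can be sketched in a few lines. This assumes file modification time is a good stand-in for the 'date' column of his spreadsheet; the directory path, file names, and function names are illustrative:

```python
# Sketch of ron's tracking idea (paths illustrative): record the
# modification times of the stock engine files you customised, then
# compare later to see which originals have been updated since.
from pathlib import Path

def snapshot(engine_dir: str, tracked: list[str]) -> dict[str, float]:
    """Map each tracked engine file name to its current mtime."""
    return {name: (Path(engine_dir) / name).stat().st_mtime for name in tracked}

def changed_since(engine_dir: str, old: dict[str, float]) -> list[str]:
    """Names of tracked files whose mtime moved past the snapshot."""
    now = snapshot(engine_dir, list(old))
    return [name for name, mtime in now.items() if mtime > old[name]]
```

Run `snapshot` once after customising, save the result, and run `changed_since` after each update to see which engine files need re-editing.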

  • edited June 2013
    Damn it I woke up late (facepalm)
    image

    Anyway that's around ~110k submission and well ~37k verified...

    @ron Thanks, I will find a way to work around this.
  • Just realized how dumb I am (facepalm)
    Here's a closer look.

    119k S 42k V This ratio is sick...

    image

    image
  • @Alex Agreed. That's a high percentage of Vs there. So besides editing engine files, you also chose the engines with high verification counts, right?

    Thanks for posting the stats as well.
  • @Alex what type of footprint query do you remove from the engines? I have almost the same option settings as you (I use the CB settings too) but only get like 5 LPM, so pathetic.

    I use 300 thread, 40 semi-dedicated US proxy, html timeout 120s, GSA runs on a dedicated server 24GB Ram, 1 Gbps port.

    Literally scratching my head on how to improve lpm. 8-|
  • edited June 2013
    @audioguy: No, basically I choose the engines that give me the highest submissions. As you know, verified links are quite hard to predict, so I just choose what I can submit the most of, and hopefully they get verified. Guess I'm lucky to have that 30% verified though...

    @Monty: I bet you just select all the platforms and all the engines, right? Like I already mentioned above, or on the previous page... Just choose the ones that give you the highest performance. Filter out the "hundreds"; just check what gives the best results. Then, within those best engines, filter out the weak footprints and keep the footprints that have a huge number of search results. That's what makes my LPM higher. There are just footprints you need to filter; I don't get the "type" of footprints you mentioned above.

    I'm running at 500 threads only, but it still gives me around 90+ LPM, while 1000 threads was killing my CPU and only gave me a stable 105 LPM. But aww, it's just a VPS; should I kill it by raising to 1500 threads xD
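
The footprint filtering described above boils down to a threshold pass over per-footprint result counts. A minimal sketch; the sample footprints, their counts, and the 100k cutoff are made-up illustrations, and how you obtain the counts (e.g. gscraper pro's footprint checker mentioned earlier in the thread) is up to you:

```python
# Minimal sketch of threshold-filtering footprints by search result
# count.  The cutoff and the sample data are illustrative only.
def filter_footprints(counts: dict[str, int], min_results: int = 100_000) -> list[str]:
    return sorted(fp for fp, n in counts.items() if n >= min_results)

counts = {
    '"powered by exampleCMS"': 2_500_000,  # keep: plenty of targets
    '"exampleCMS guestbook"': 40_000,      # drop: too few results
}
print(filter_footprints(counts))  # ['"powered by exampleCMS"']
```

Footprints that survive the cutoff go back into the engine file; the rest are deleted so SER stops wasting queries on them.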
  • edited June 2013
    @Alex right on the money! i use almost all the platforms inside Article, Blog Comment, Directory, Forum, Guestbook, Image Comment, Microblog, Social Bookmark, Social Network, Wiki.

    Trying to filter out the shitty footprint engines now. Oh, one more thing: do you use your own list of keywords, or keywords scraped with SB for the campaign's particular niche? Let's say I ran a project on SEO; should I use the scraped keywords for that term, or just a keyword list that has nothing to do with it?
  • @Monty: It's all your choice... If you want to submit to niche-relevant sites, use your keywords. If not, use some popular keywords like "weightloss fast", "payday loans" or so :) Keyword scraping is also your choice, if your main keywords aren't enough.
  • LeeG Eating your first bourne

    What version of Windows are you running? That looks like the TARDIS version

    Just had to check the date on my computer in case I had missed a few days here and there :)

    Those stats are for verified and not submissions?

  • I'm running it on a VPS with win 2008 r2. 

    First pic is for submission: 119k, verified: 42k. Woohoo I popped the 100k cherry ;)