
Proxies, HTML Timeout, Threads - Max Efficiency


Comments

  • edited June 2013
    Since disabling all PR checking I'm not getting banned anymore, but previously I was getting 30-50 proxies banned pretty quickly only running 30 threads. My server host thinks "G" changed something in the last update because they have been noticing bans much more quickly than before.

    @ron You are absolutely correct. Good call. ^:)^ This project is building links to about 24 Web2.0s I built with SERengines a couple weeks ago, so I'm letting it build links to some blogs and directories also. I opened my submitted list and sorted by type. It looks like the blog comments are about 30% of the links. My verified will probably be lower because of it. Here is what I have checked on the project:
    [screenshot]

    I have now unchecked the blog comments to reduce the %. I left the project running over night and here is the new screenshot this morning:

    [screenshot]

    I really have to say holy crap, CB is good! The solve % on CB is pretty impressive.

    @LeeG I finally figured out why you were testing CSX as a secondary to CB. Simple math really. If you look at the screenshot above the average solve time is 0.346 for CB. 60 / 0.346 = 173.41, so it would be impossible to break the LPM cap of 173.41 with an average solve time of 0.346. By using CSX as a secondary you pick up some of the slack and CSX can grab the retries while CB moves on to new captchas.
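    For anyone who wants to replay that arithmetic, here is a minimal sketch (Python, purely illustrative; like the post, it assumes each link waits on one captcha solve at a time):

    ```python
    # Back-of-the-envelope LPM ceiling from average captcha solve time,
    # assuming one serial captcha solve per submitted link.

    def lpm_ceiling(avg_solve_seconds: float) -> float:
        """Maximum links per minute if every link costs one serial solve."""
        return 60.0 / avg_solve_seconds

    print(round(lpm_ceiling(0.346), 2))  # 173.41, the cap quoted above
    ```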
  • @Ozz >>> i wonder if Google have combined proxy bans for PR checking, searching and recaptcha for instance? in the past those were handled separately but things might have changed.

    I didn't even use them for PR checking. Maybe they just have an eye on advanced operators now (since 99.9% of all human internet users don't use them) and ban proxies for that. Just a vague theory of course, but if I were G I would do it ;)
  • Ozz
    edited June 2013
    From what I've witnessed, the "inurl:" operator was already causing bans faster in the past. You could see it when working on footprints all day long: as soon as the "inurl:" operator was used frequently, you got an "are you human?" message pretty soon.
    I don't know if they have tweaked something more in that direction, but I don't see any differences yet and everything works fine.

    What about recaptcha? Or translations? All these things could add up if Google is logging all IPs across all services now. Using recaptchas (with retries) regularly or making use of the #trans macro frequently could MAYBE be a cause for bans.
  • LeeG Eating your first bourne

    In all honesty, I don't use csx any more

    More programs running means more resources used. The same resources are better used by ser and cb

     

    These results were gained with nothing more than ser and cb

     

    [screenshot]
  • ron SERLists.com
    edited June 2013

    @Lee, that is an insane number of verifieds. I know you have done more work than anybody on getting that number up high. ^:)^

    Any tips for the rest of us mortals on how to boost up verifieds?   

  • LeeG Eating your first bourne

    Two words "hard work" and another two "use brain" :D

    I have shared the methods I use to hit those numbers. Same method as getting a decent LpM

    Stats pages like this look good, but they are short-lived. As soon as you kill the dead links on t2, the pretty patterns die :((

    [screenshot]

     

    I average 50k verified daily and on a good day 90k

    Readable content doesn't get deleted when it's checked by humans, either

    The last two or three days I have been testing engines again: test, check stats and kill the crap

    Since people keep banging on about good LpM with kitchen sink links, I thought I would try adding some to see what happens

    I try a dozen engines at a time and run them for 24 hrs; anything with zero submissions gets binned. Those that are left run for a few days, and the poor verifiers get binned too

    I did say I was working on getting verified up, once I beat the 1/4 million daily submissions

  • edited June 2013
    I've been following @LeeG's posts, reading best practices, and I've been pulling my hair out :(

    I have done a lot of experiments but I still don't know what I'm doing wrong to get such a low LPM of ~30-40. Sometimes it rises to ~100 LPM, but after a few hours it drops back to 30 or even worse, ~12 LPM... My submissions per day are around 18k, verified around 3k (these numbers are pulled together in the quick sanity check at the end of this post).

    I have viewed the list of verified platforms by going to Options - Advanced - Tools - Show Stats - Verified, and chose the platforms with higher numbers. I've also unchecked collect keywords, and analyse and post to competitor backlinks. But still...

    Please help me out! 

    My machine: AMD 2.09 GHz, 2 processors, 14 GB RAM, 1 Gbps uplink.
    Proxies: 600 semi-dedicated proxies from buyproxies (I use them for scraping with Scrapebox, too).

    GSA SER Settings: 

    Overall settings:
    Threads: 1300 (I could even raise this to 2500; SER still runs well and it's not even close to my machine's limit)
    HTML Timeout: 140
    Use Private Proxies: Everywhere except Verification. No public proxies. Custom time: 4 sec.
    Captcha: GSA CB; inside CB I use a 3rd-party service, Antigate. ReCaptcha is toggled to always use the 3rd-party service to save time...
    Indexing: Only "Use Indexification" is checked
    Filter: Checked the first 6 sites. Update interval: 1440 minutes. Maximum filesize: 4 MB.

    Detailed chosen platforms: 
    Article: BuddyPress, vBulletin Blog, Article Script, Drupal - Blog, UCenter, Wordpress Article, XpressEngine.
    Blog Comment: Blogspot, General Blogs, KeywordLuv, ShowNews, BlogEngine.
    Directory: Aardvark, cpDynaLinks, EasyLink, phpLinkDirectory, phpLinkDirectory-Login, PHP Link Manager, eSyndicat, Freeglobes, indexU,
    Forum: Aska BBS, Burning Board, E-Blah, ExpressionEngine, FluxBB, e107, Vanilla, MyBB, phpBB, PHP-Nuke, PunBB, SMF, vBulletin, XenForo, XMB, XOOPS, YaBB
    Guestbook: AlexGuestbook, Advanced Guestbook,AkoGuestbook, Basti Guesbook, BeepWorld Guestbook, BurningBook, Easy Guesbook, Guestbook, write2me.nl, PhocaGuestbook
    Image Comment: DatsoGallery
    Micro Blog: Blogtronix, StatusNet
    Pingback & Trackback: all
    Social Bookmark: Pligg, Drigg, Hotaru CMS, PHPDug, Scuttle
    Social Network: Dolphin, Elgg, JCow, Oxwall, PHPFox, Plone, Ground CTRL, NING, SocialEngine
    Wiki: all

    Projects Settings: 

    Data: 
    Use a random URL from above
    Use anchor text variations with: 20%
    Uncheck "Automatically insert your URL at the bottom"
    Insert up to: 4 random URLs for random words
    For keywords, I take 40 keywords, paste them into the Scrapebox keyword scraper, and import the results into the Keyword field (around 3-5k keywords per project).
    Other data fields are filled in normally or left at defaults
    Options: 
    Ask all services to fill captchas - Choose Random
    Verified links must have exact URL
    When to verify: Custom time - 1440 minutes
    Send verified Link to Indexer Services (I use Indexification)
    Try to always place an URL with anchor text
    For TAGS use: Anchor text
    Search engines used: 9 random Google engines (e.g. BG, BH, BS, BR, ES, UK, etc.), no international
    Use URLs from global site lists: Verified
    Other options are unchecked
    Email verification: 1 email (Live, Hotmail or Outlook). I check it daily to see if it's blacklisted or not. Checked: delete messages when verification link was found; delete message if older than: 3 days.

    Please point out what I'm doing wrong... I really need some advice :( Thanks a lot guys!

    Edit: Some screenshots:
    [screenshots]
    Some platform lists are too long to screenshot; please have a look at the list I've stated in the project settings above. I hope this helps other new users get good results too.
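    Here is the quick sanity check mentioned above, just putting the post's numbers side by side (Python, illustrative only; every figure is copied from the post, nothing extra is assumed):

    ```python
    # Figures quoted in the post above, nothing measured here.
    submissions_per_day = 18_000   # reported daily submissions
    verified_per_day = 3_000       # reported daily verified
    threads = 1_300                # configured thread count
    private_proxies = 600          # semi-dedicated proxies from buyproxies

    lpm_average = submissions_per_day / (24 * 60)   # links per minute over a full day
    verify_ratio = verified_per_day / submissions_per_day
    threads_per_proxy = threads / private_proxies

    print(f"average LPM over 24h: {lpm_average:.1f}")        # ~12.5, close to the reported lows
    print(f"verified/submitted:   {verify_ratio:.1%}")       # ~16.7%
    print(f"threads per proxy:    {threads_per_proxy:.1f}")  # ~2.2
    ```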
  • Ozz
    edited June 2013
    What's your LPM without any 2nd captcha service and with "skip hard to solve captchas" checked? I bet that using Antigate, with humans answering captchas, decreases your LPM a lot. Just do the math: 30% of all your submissions are recaptchas and every solve for those lasts >10 seconds, instead of skipping them and doing the rest with CB at <0.5 seconds on average (a rough version of that math is sketched below).

    If you want to do it like LeeG, then do it without 2nd captcha services.
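
    A rough version of that math, as a sketch (Python; the 30% share, >10 s human solve and <0.5 s CB solve are the figures assumed in the comment above, not measurements):

    ```python
    # Average time spent per captcha under the two strategies described above.
    recaptcha_share = 0.30   # assumed share of submissions hitting recaptcha
    human_solve_s = 10.0     # assumed human/Antigate solve time per recaptcha
    cb_solve_s = 0.5         # assumed average CB solve time

    with_human_service = recaptcha_share * human_solve_s + (1 - recaptcha_share) * cb_solve_s
    skipping_hard_ones = (1 - recaptcha_share) * cb_solve_s  # hard captchas skipped entirely

    print(f"avg captcha time, human service for recaptcha: {with_human_service:.2f} s")  # 3.35 s
    print(f"avg captcha time, skipping hard captchas:      {skipping_hard_ones:.2f} s")  # 0.35 s
    print(f"slowdown factor: {with_human_service / skipping_hard_ones:.1f}x")            # ~9.6x
    ```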
  • LeeG Eating your first bourne

    It took me about three months of testing and analysis to get ser to 200k plus submissions daily.

    A lot of work went into it in the days I went from 6 LpM upwards

    I remember when I popped the 100k cherry with ser and thought that was good :D

    Bit like breaking the sound barrier

    I know editing the blog engine files can give a boost. One edit on each file: change the "use blog engine" setting, or however it is worded, from a 1 to a 0

  • How much do 4281 private proxies cost? Are you sure these are not public? If they are, that would be your bottleneck, I would have thought.
  • edited June 2013
    Thank you guys for your comments.

    @Ozz: I will try disabling captcha services and ticking "skip hard to solve captchas" and see if it helps.
    @LeeG: Okay, I'm gonna do it, thanks for your advice. But I have to back this up, right? Because whenever SER updates, it overwrites all these. Edit: I've just had a look at Article Beach, for example: edit a project, double-click Article Beach, search for "use", but there's nothing even close to "use blog engine". Could you please do a sprint search? Or am I searching in the wrong file?
    @Brumnick: No, I only have 600 proxies from buyproxies; the rest of the 4281 showing there are public proxies, which I use for GScraper. I use private proxies for all actions.

    Note that I don't feed the projects above any lists, just global site lists (around 300k), and SER itself scrapes on Google. At the time I composed the post above it was around 21 LPM, and now it has dropped to 12 LPM. Damn!
  • wow! How much do 600 private proxies cost?
  • Also, how many projects are you running?
  • LeeG Eating your first bourne

    Any engines I use, I rename the files, then update at my own leisure after Sven tweaks them

    Takes a bit of time setting up initially, but it also makes mass deletion of poor performers a breeze.

    Kill one engine file and the job's done

  • edited June 2013
    @Brumnick: It's not as much as you think =) but let's focus on my LPM... I've stated above that I have a total of 9 projects running.
    @LeeG: You mean I should delete engines that I don't check, right? 
  • whats your average solve time in CB?
  • @Brumnick: My avg solve time in CB is 0.374 sec.
  • LeeG Eating your first bourne

    What I do is an initial test with the basic engines

    Use out the box engines so to speak.

    The ones I keep and use, I then go through the footprints and only use footprints that return a high number of search results.

    That then leaves you with another problem: each ser update, the engine files get overwritten

    So I edit the names of the engine files and add a simple -me on them

    Each ser update, those engine files remain untouched

    My engine files folder has a lot of engines like this in it

    [screenshot]

    Then, if I find an engine that's not performing that well after I decide to use it, I can delete the "-me" version and it's deleted globally in seconds (see the sketch at the end of this post)

     

    The other day I spent two hours editing engines to the latest version

    Only to have a ser update done with a load more engines edited. Trust me, I was so happy :D
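
    One way to script the renaming idea described above (a minimal sketch, not LeeG's actual workflow; the engine folder path, the .ini extension and the engine names are assumptions for illustration):

    ```python
    # Copy selected engine files to "-me" versions so SER updates leave them alone.
    # ENGINE_DIR and the engine names below are placeholders; adjust to your install.
    import shutil
    from pathlib import Path

    ENGINE_DIR = Path(r"C:\GSA Search Engine Ranker\Engines")   # assumed install path
    KEEPERS = ["Article Script", "General Blogs"]                # hypothetical picks

    for name in KEEPERS:
        src = ENGINE_DIR / f"{name}.ini"
        dst = ENGINE_DIR / f"{name}-me.ini"
        if src.exists() and not dst.exists():
            shutil.copy2(src, dst)            # keep your edits in the "-me" copy
            print(f"created {dst.name}")
    ```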

  • If you want to just replace one line in your scripts, like "use blog search=1", I suggest using the "Find in Files" tool of Notepad++.
    Just do this after each update and you are good.
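
    A script equivalent of that "Find in Files" replace, as a sketch (the engine folder path and .ini extension are assumptions; back the files up first and re-run after each SER update):

    ```python
    # Flip "use blog search=1" to "=0" in every engine file found in ENGINE_DIR.
    from pathlib import Path

    ENGINE_DIR = Path(r"C:\GSA Search Engine Ranker\Engines")   # assumed install path

    for ini in ENGINE_DIR.glob("*.ini"):
        text = ini.read_text(encoding="utf-8", errors="ignore")
        if "use blog search=1" in text:
            ini.write_text(text.replace("use blog search=1", "use blog search=0"),
                           encoding="utf-8")
            print(f"patched {ini.name}")
    ```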
  • Just spent 2 hours editing footprints of the engines and changed "use blog search=1" to 0.

    Checked "skip hard to solve captchas" and got rid of the human captcha service.

    Now I'm gonna let my projects run overnight; let's see if tomorrow morning there will be sparkles...
  • Lee,

    You should just sell your engine files! ;)

    Seriously, though... Thanks for all the advice and tips. I have been tweaking/testing based on the info that you and others have shared and I am slowly but steadily climbing up in LPM and verified links.

    It would have taken so much longer if not for the "nudges in the right direction" that have been shared, so I appreciate it!

  • edited June 2013
    Ha! Guess what? I couldn't sleep, so I sat up and checked stuff. Here is the result. Big thanks to @Ozz and @LeeG. Let's see how many verified there are tomorrow night. Still need more tweaks to catch up with the LeeGendary lol

    [screenshot]
  • LeeG Eating your first bourne
    Wait until you rename engines, then see messages in the update notes like today's "updated some engines", then look at the dates the files were updated and find they have all been updated 8-} :((
  • Go to bed. You already earned $24.32 while sleeping ;)
  • ron SERLists.com
    edited June 2013

    Speeding Up LPM - Another Easy Method

    Another way to get a lot of links built is to understand the nature of your projects.

    For example, let's say you have 12 T1's, and 60 junk tiers.

    You know you have low link limits on your T1's, and their link requirement will get filled pretty damn quick. The problem is these projects are pretty much on pause the majority of the day because they got filled very quickly at the beginning of the day.

    Instead, start your day with only the small projects being active => the T1's, and any other low-link project aimed at your moneysite. Give that 2 hours (or less) to get those done first.

    Then stop SER, inactivate those T1's, and now activate all other junk tiers. And let that run for 22 hours. Now you will see some serious LPM.

    This is not to replace what @Lee said. This is something you can do in addition to what @Lee suggests.  

  • LeeG Eating your first bourne

    Tried that idea ron and it sucks :D

    I found you get a better LpM and submission rate by running on scheduler mode

    The t1's hit their max and stop anyway

    But that's just my own experience

  • ron SERLists.com

    I'm doing it right now, and it is rocking - but I am using scheduler. So scheduler for the T1's, and scheduler for the junk. I'm just separating the runs.

    For whatever reason that I can't explain, some of my T1's have a hard time getting their orders filled when mixed with junk tiers. It's almost like SER wants to keep going fast, so it favors the junk tiers even though I have them on lower priority. When I just run the T1's together, I get all the links I'm supposed to have. And that's what started me down this path.

  • LeeG Eating your first bourne

    So highlight the t1's and run the second scheduler option then :D

    I still miss the early days of the scheduler running in order. You knew all tiers were getting a fair bash then

  • Thanks LeeG and Ozz

    I have checked and found that 90% of the engines are already set to "use blog search=0",
    so why do we modify the engine files?
  • LeeG Eating your first bourne

    I personally don't use any of the blog search engines listed among the search engines I use

    So why have engines set to use blog search
