
My optimised engines.

24 Comments

  • Yes, correct.
  • ron SERLists.com
    Yes @Villain, and then save that engine selection (right-click), give it a name, and use it for similar projects.
  • OK, I've added a LOT more optimized engines. Enjoy, and please give feedback on results for everyone's benefit.

    grab here

    I'll await feedback before I do any more.
  • Thanks Zeusy,
    Will give these a try.
  • Thanks man! Would be awesome if @sven could add this in the next update!
  • Thanks Zeusy and Ron.

    Added the new ones too, thanks again Zeusy.
  • Getting more submitted links, faster than I did before, using the new modified engines.

    Tks
    Zeusy
  • edited March 2013
    You're welcome, hope there's more feedback. If enough people are happy I'll do my TA tiers next week.

    P.S. The general blog engines took hours to do, but it fricken flies now :)
  • yeah great stuff again ;) thx
  • Is this only for searching, or does it also improve submissions?
  • I only optimized the searching.
  • Thanks for the answer. Did you add any new footprints, or only optimize the current ones?
  • edited March 2013
    many removed, many added.
  • I see. I should recompose my query list with your updated footprints :D
  • If you optimised the submission routine, then please share :) That's why we're here, to help one another. What goes around... comes around :)
  • edited March 2013
    I have self-developed Google harvesting software; I don't use the GSA harvester. I use GSA only for submission. This dramatically increases LPM.

    So I am using GSA footprints to harvest results with my roughly 20k keyword list. I've already generated 42M queries ^^ (a rough sketch of that expansion is below this post).


    By the way, you have 174 new footprints.
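
    A rough sketch of that footprint x keyword expansion, assuming plain string concatenation; the footprints and keywords below are made-up examples (the real ones come from SER's engine files and your own keyword list):

```python
# Illustrative only: expand every footprint with every keyword to build
# search queries -- the same cross-product idea described above.
footprints = ['"Powered by Pligg"', 'inurl:story.php']   # example footprints
keywords = ["seo tips", "link building"]                 # stand-in for a ~20k list

queries = [f"{fp} {kw}" for fp in footprints for kw in keywords]
print(len(queries), "queries")
# ~2,100 footprints x 20,000 keywords would give roughly the 42M queries mentioned.
```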
  • I haven't been able to crack 10 LPM using GSA only for searching / submitting (i.e. global lists off).

    I've been experimenting with GScraper to get lists to import into GSA to build up my global list base. What speed is everyone scraping at in GScraper? I'm using public proxies I've scraped, or its built-in ones, and getting about 1,400 links per minute scraped.

    I then delete duplicate domains and import the list into GSA (a quick sketch of that step is below).

    Also Zeusy: I'm getting about 80 LPM with your modified engines. Cheers, thanks for uploading (with global lists turned on though).
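
    A minimal sketch of that "delete duplicate domains" step before importing into SER, assuming one URL per line in a text file; the file names are placeholders:

```python
# Keep only the first scraped URL per domain, mirroring the
# "delete duplicate domains" step described above.
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    seen, unique = set(), []
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        if domain and domain not in seen:
            seen.add(domain)
            unique.append(url)
    return unique

with open("scraped_urls.txt") as f:         # placeholder input file
    urls = [line.strip() for line in f if line.strip()]
with open("unique_domains.txt", "w") as f:  # placeholder output file
    f.write("\n".join(dedupe_by_domain(urls)))
```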
  • edited March 2013
  • Ron identified that I named one of the files wrong; this is the correction file. Ning was named as a social bookmark and not a social network... sorry for this error. Replacement
  • royalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE: ---> asiavirtualsolutions
    edited March 2013
    Thanks for all the hard work

    Unfortunately the second-to-last download link is not working:

    http://anonym.to/?http://www.2shared.com/file/OvHC_byF/ser_zeus_files.html
    The file link that you requested is not valid. Please contact link publisher or try to make a search.
  • Yes, it's removed as there was an error in it.
    Grab the replacement file in the post above.
  • royalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE: ---> asiavirtualsolutions
    Thanks  Zeusy
  • Hi, is there any mirror site? I can't download from 2shared. Thanks.
  • What's your email?
  • Thanks for this Zeusy, I'm seeing a big increase in LPM and verifieds.
  • You're most welcome m8ty :)
  • I'm confused: what did you optimize, the engines themselves or the search keywords? Either way, thanks for the help man.
  • ron SERLists.com
    Zeusy optimized the footprints so a lot more targets can be found.
  • edited April 2013
    @zeusy thanks for making the changes; I've combined them with my own changes.

    Can I ask everyone: I know @ron can get high LPM without global lists, but what LPM do people get running Zeusy's engines with 12 projects (4 tier 1, 8 kitchen sink):

    a) with global lists
    b) without global lists

    For me:

    a) I get 150 LPM for a day; then, I suspect when the global lists run out of targets and you keep getting "already parsed" messages, the LPM drops down to 50 LPM, and on the 3rd day down to 20 LPM. If I clear my target URL history on my kitchen sink tiers, it shoots back up to 150 LPM.

    b) I get about 5-10 LPM.

    I'm running 20 private proxies, a 3-second timeout between engine searches, and a 120-second HTML timeout.
  • @Zeusy did you change the "search term" only, or did you also change the "page must have" and "url must have" options?

    I found that there is a lot of room for improvement in the "page must have" and "url must have" options: GSA SER is finding good URLs because of the search terms, but then not finding the right engine because of wrong "page must have" and "url must have" values. For instance, I found some Pligg sites where I could post with other tools, but SER was giving me a "no engine matches" (a toy sketch of the matching logic is below).

    @sven I guess this is difficult to fix because SER supports all these platforms and it's based on scripts. Maybe add an option to tell SER to submit to a list without searching for matches, telling it that the list is all one specific engine, something like an "import target urls for a specific engine" that wouldn't search for a matching engine, just try to submit (will add this to the Feature Requests).
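
    A toy sketch of why a wrong "page must have" value causes "no engine matches", assuming SER-style matching means every required string must appear on the page or in the URL; the key names and values here are illustrative, not the contents of any actual engine file:

```python
# A page is only attributed to an engine when all of its
# "page must have" / "url must have" strings are present.
def matches_engine(url, html, engine):
    page_needles = engine.get("page must have", [])
    url_needles = engine.get("url must have", [])
    return (all(n.lower() in html.lower() for n in page_needles)
            and all(n.lower() in url.lower() for n in url_needles))

pligg = {"page must have": ["Powered by Pligg"], "url must have": []}

# A Pligg site with a customized footer no longer contains the needle,
# so matching fails even though posting would actually work:
print(matches_engine("http://example.com/story.php",
                     "<html>custom footer, no credit line</html>", pligg))  # False
```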


  • You can run a compare on what changes @zeusy made using http://www.quickdiff.com/
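
    If you'd rather diff locally, here's a quick sketch using Python's standard difflib; the file paths are placeholders for your backed-up originals and the modified engine files:

```python
# Show a unified diff between an original engine file and a modified one.
import difflib

with open("Engines_backup/General Blogs.ini") as f:  # placeholder path
    old = f.read().splitlines()
with open("Engines/General Blogs.ini") as f:         # placeholder path
    new = f.read().splitlines()

for line in difflib.unified_diff(old, new, "original", "modified", lineterm=""):
    print(line)
```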