
My optimised engines.

13 Comments

  • @sonic81 I compared most of them and only saw modifications to the search terms. My reply was somewhere between asking for confirmation and asking for opinions about this; I don't know whether Zeusy didn't think about optimizing this or whether there are reasons not to do it (maybe it isn't worth the time).



  • ronron SERLists.com

    @kaene, I find your comment very interesting:

    "I found that there is a lot of room for improvement in the "page must have" and "url must have" options, and GSA SER is finding good urls because of the search terms, but then not finding the right engine because of wrong "page must have" and "url must have" options. For instance, I found some Pligg sites where I could post with other tools but SER was giving me a "no engine matches" "

    I know this is important and I'm glad you mentioned it.
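
    For illustration, here is a hypothetical engine .ini fragment showing how these options fit together. The key names mirror the options quoted above ("search term", "page must have", "url must have"); the values are invented for the example and are not Zeusy's actual edits:

```ini
; Hypothetical Pligg-style fragment, for illustration only.
; "search term" finds candidate pages in the search engines;
; "page must have" / "url must have" are what SER checks before
; it will match the page to this engine.
search term="powered by pligg"
page must have=pligg|submit a new story
url must have=story.php|submit
```

    If "page must have" or "url must have" is too narrow or outdated, SER can scrape a perfectly good target and still report "no engine matches", which is exactly the symptom described above.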

  • Noob question:

    Is it OK to enable both duplicate engine.ini files (the SER default and Zeusy's)?





  • Good morning,

    kaene

    To confirm: I just updated the footprints and keywords, and added the keyword to the search. Yesterday I had another look at the likes of the Article Beach engine and, to be honest, the "page must have=" line needs updating on many engines too. Using Google and a new footprint I found over 1 million pages, but when I gave the engine the search term and tested it, it didn't identify many of the article sites, which confirmed that they need updating.

    lrichard2112, you can use both, but you may get a lot of "already parsed" results; I would go with one or the other.

  • edited April 2013
    Shouldn't the option "Put keyword in quotes when used in search queries" add + "keyword" to the footprint?
  • edited April 2013
    I'm thinking it might be greatly beneficial if the engines had more than just one search string section.
    In the Search section we have syntax options for:

    Site:
    Inurl:
    link:
    Intitle:

    I would recommend adding another: intext:

    In the engine files it would be nice to have separate "search term=" sections.
    Example:

    search term site:=
    search term inurl:=

    This would allow focusing footprints on the SEs that support them, without wasting resources.

    For example, if Google is my search engine, it can handle inurl:,
    while other search engines can't, which is a waste.
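
    A sketch of what the proposed per-operator sections might look like in an engine file. This is hypothetical syntax: SER does not support separate per-operator "search term=" keys as of this thread, and the values below are made up for illustration:

```ini
; Proposed (not real) engine.ini keys: one search term list per operator,
; so SER would only send inurl:/intext: footprints to search engines
; that actually support those operators.
search term site:=pligg.example.com
search term inurl:=story.php "submit"
search term intext:="powered by pligg"
```

    SER could then match each section against the capabilities of the selected search engine instead of sending every footprint everywhere.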
  • Is there any way you could add the mini platform icons to the left of our backlinks (in the "Last verified url" field)?
  • Hey @Zeusy, keep improving. Great job!
  • edited April 2013
    What you guys are doing with GScraper, can it be done with ScrapeBox too?
  • LPM went from 10 to 5 :/ Guess I'm missing something. By the way, what VPS are you guys on that are getting 50+ LPM?
  • Hmm,

    I use only your edited engine.ini and I can't get any verified links from a 24-hour run.

    http://prntscr.com/yt433
  • He only changed the search footprints, so it must be something else. Check your email maybe?
  • @Irichard2112 Have you got your verification set to 1440 minutes?

    I just switched from public to private proxies. I didn't expect this big of a difference: 30 LPM now.

    20 private proxies > 500 public, apparently :P
  • @Musk: yes, it's set.

    http://prntscr.com/yu0dz

    Dunno why this is happening
  • ronron SERLists.com

    I know @Ozz would love for me to set him up here (and he even has a special banner for this that he hand made), but:

    PUBLIC PROXIES SUCK

    Translation: Don't use them.

  • ronron SERLists.com
    @svobada - The answer is no. SB cannot help with this type of thing.
  • @Ron so what Zeusy did with the engines, does it only work when used with GScraper, or can it be used solely with the internal GSA scraping as well?
  • ronron SERLists.com

    @eLeSlash - What @Zeusy did specifically helps users of SER. Every person on this forum should get Gscraper, obtain top notch footprints, and stick those in the engine footprints of SER.

    This is one example where people should not rely on one another. The people who do this will very much benefit in getting more links and higher LPM. 

  • @zeus, did you also get their proxy service, or do you just use semi/private proxies?

  • I feel like I shouldn't be going for MORE results so much as for footprints that yield a high V:S ratio.
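
    Reading V:S as verified-to-scraped, the idea can be sketched in a few lines of code. The footprints and counts below are made-up illustrations, not measured data:

```python
# Rank footprints by verified-to-scraped (V:S) ratio rather than by
# raw result count: a footprint that scrapes fewer URLs but verifies
# more of them is the better use of proxies and CPU time.
def rank_by_vs_ratio(stats):
    """stats: {footprint: (scraped, verified)} -> [(footprint, ratio)], best first."""
    return sorted(
        ((fp, verified / scraped)
         for fp, (scraped, verified) in stats.items()
         if scraped),  # skip footprints with no scraped URLs
        key=lambda item: item[1],
        reverse=True,
    )

# Illustrative numbers only.
stats = {
    '"powered by pligg"': (50_000, 400),
    'inurl:story.php "submit"': (8_000, 320),
}
for fp, ratio in rank_by_vs_ratio(stats):
    # The inurl: footprint wins despite scraping far fewer URLs
    # (320/8000 = 4% vs 400/50000 = 0.8%).
    print(f"{fp}: {ratio:.2%}")
```

    The same bookkeeping works with SER's own submitted/verified counts if you prefer a verified-to-submitted ratio instead.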
  • @Ron I know that, but I still don't know the answers to my questions.
  • unikbit

    I use private proxies only, for everything.
  • @Zeusy the engines are working great; got a lot of submissions with your fixed engines, thanks!
  • Again... what do you guys use GScraper for?

    I've formed an idea of what to do with @Zeusy's files... but I don't get the point of using GScraper here.

    Am I missing something?
  • ronron SERLists.com
    edited April 2013

    The bottom line is that if you have better footprints, you will:

    • Find more targets
    • Submit more links
    • Get more verified links because you submitted more links
  • So GScraper is used for finding more footprints?
  • ronron SERLists.com

    Not for finding footprints, for testing footprints.

  • You lost me here... LMAO
  • Which part is confusing you: the footprints in general, or testing the footprints?
  • I know what footprints are, but I don't know how to test them, or how to find more of them!?