
Links Per Minute


Comments

  • ron SERLists.com

    So everyone take note.

  • edited March 2013
    Without Gscraper pro my efforts are useless
    But I don't think I have a problem with having to search too often. I don't know why my LpM won't exceed 50
  • LeeG Eating your first bourne

    On the pro version, there is a tool that shows how many results each footprint returns

     

    [screenshot: GScraper Pro footprint results checker]

    You can see from that screenshot that some terms will throw up a lot of results

  • edited March 2013
    That's awesome. But then how do you know which footprint to put in each .ini file? That must take a lot of time. There's hundreds of files in the engines folder, but when you export them with Santos's tool they get lumped into one big file
  • LeeG Eating your first bourne

    That's why I make ser dance the way I do.

    I spend the time doing it right.

    Edit each ini file, one at a time

    A few days spent doing the boring stuff and months sitting back wondering what I can tweak next

    There is no quick way to get the results I do on a daily basis. A few shortcuts have been added recently, like being able to copy engine selections from one project to the next, but you need to spend time with ser to make it run sweet and fast

  • edited March 2013
    Dude, that's crazy. That's all I can say. No way I'm going in there and editing all of those files. I just scrolled down... and there are a lot of damn .ini files.

    So I got the 5.22 copy.
    I got the 7 Google SE's
    I got the Global Site Lists
    I got only the successful Engines to post to
    -------------------
    50 LpM
  • Ozz
    edited March 2013
    maybe you won't get more LpM with the engines you've selected ATM. if it's just about the numbers for you, then add guestbooks and blog comments.
  • ron SERLists.com
    edited March 2013

    But @king818, that's the kind of stuff you do if you want to make a ton of submissions. All you do is stick a few footprints in GScraper with the same keywords, and see what yields the most results.

    Then you look at the results and edit that engine file. Keep the edited file, under the same name, in a separate folder you can use to copy over the original.

    And if you keep using 5.22, you make the change once and you're done, as long as you don't update. Or you just copy over after an update.

    This is the kind of busy work that pays dividends every day going forward.
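
    If you keep using 5.22, that copy-over step is also easy to script. A minimal sketch in Python; both folder paths are assumptions, so point them at your own backup folder and SER's Engines folder:

    # Minimal sketch: copy tweaked engine .ini files back over SER's
    # Engines folder after an update. Both paths are assumptions.
    import shutil
    from pathlib import Path

    TWEAKED = Path(r"C:\ser-tweaks\Engines")             # your edited copies
    LIVE = Path(r"C:\GSA Search Engine Ranker\Engines")  # live engine files

    for ini in TWEAKED.glob("*.ini"):
        shutil.copy2(ini, LIVE / ini.name)
        print("restored", ini.name)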

  • @Ron I'm willing to do it, I'm just not sure it's possible with GScraper Free. I can't figure out if you're talking about GScraper Free or Paid.

    @Ozz I might have to add guestbook and blog comments. Do they really crank up LpM? So far I've been sticking to more quality links
  • free does it. get 200 free proxies from BHW, test them in the proxy test of SER, then load them in gscraper free. export footprints from SER or use some found on BHW... let it run for a day. done
  • i just said this as you seem to be obsessed by numbers ;)

    just stick to your quality links for T1 and for the other tiers lower your standards regarding quality links
  • @Ozz Sounds like a good compromise :)
  • LeeG Eating your first bourne

    The free version does not have the footprint checker on it

    You can import footprints, but it does not have the bit in the screenshot above

    That's the joy of the paid version: you can see how many results are possible for each footprint.

  • ron SERLists.com
    edited March 2013

    Blog comments sure do. I love them on lower tiers. I seem to be able to get a ton more blog comments than other kinds of links.

    Paid (Pro). $38. One-time fee. You get free use of their proxies and a 7-day trial when you pay. So if you are in the mood and have the spare time, you could take advantage of that offer and really help yourself out.

    I would just make sure you get organized ahead of time, grab the footprints, know which engines you want to target, yadayada, so when you pull the trigger you come out swinging.

  • If I just go in Google and type "Powered by 4images" "Agregar comentario" or whatever the footprint is, and check how many results it has, is this doing the same thing GScraper does, albeit slower?
  • LeeG Eating your first bourne
    Similar idea, just slower :D
  • Ok so now I'm starting to understand. Using inireader, Santos's tool, I get 1,240 lines of footprints.
    When I go into, say, general blogs.ini, I get crap like:


    [setup]
    enabled=1
    default checked=1

    engine type=Blog Comment
    description=Submits a comment to the blog and leaves a link to your website with a defined anchor text.
    dofollow=2
    anchor text=1
    creates own page=0
    uses pages=0

    How do I edit these .ini files to remove the footprints I don't want?
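
    The block above is just the [setup] header; the footprints themselves sit further down each file, on a pipe-separated "search term=" line (ron pastes a full one below). As a minimal Python sketch, assuming that stock layout, you can list each engine's footprints separately instead of as one lumped export; the Engines path here is an assumption:

    # Minimal sketch: print each engine's footprints separately, assuming
    # footprints live on a pipe-separated "search term=" line as shown
    # later in the thread. The Engines path is an assumption.
    from pathlib import Path

    ENGINES = Path(r"C:\GSA Search Engine Ranker\Engines")

    for ini in sorted(ENGINES.glob("*.ini")):
        for line in ini.read_text(encoding="utf-8", errors="ignore").splitlines():
            if line.lower().startswith("search term="):
                footprints = line.split("=", 1)[1].split("|")
                print(f"{ini.stem}: {len(footprints)} footprints")
                for fp in footprints:
                    print("   ", fp)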
  • scp
    edited March 2013
    @king818 - I would familiarize myself with the Search box in the upper right-hand corner :) Most of your questions have been asked and answered numerous times.
  • edited March 2013
    I have yet to find a thread on this forum which elaborates on editing the .ini files. That's pretty advanced SER stuff. Additionally, searches for "ini" and "footprint" yield no applicable results.

    I have the footprints I like and don't like. I just don't know how to edit these files appropriately.
    Thanks for the help guys :)
  • LeeG Eating your first bourne
    If you click the help button on ser, there is a nice script manual which will help you
  • When all else fails, check the thread you're posting in. I'll give you a hint, it's page 6 ;)
  • edited March 2013
    Thanks guys, I think I'm good. This stuff may be too complicated for me anyhow; that's why I'm trippin'. I think I'll stick with my 80% SER optimization :) You guys are experts on another level
  • LeeG Eating your first bourne

    On the private internet marketing forum I'm a moderator on, the guys have called me the Dr Frankenstein of ser for the tricks I pull out of the bag

    Each time I'm getting closer to the half million submissions in a day, without list feeding

    Some tricks can be easy, others just take time to implement and test

    This one's easy with the right tool for the job, just time consuming

  • ron SERLists.com

    @king818 - Just to give you an example. Open up the blogtronix.ini file in Notepad++. About 10 or 12 lines down you see this (notice it is in a modified spin syntax using "|"):

    search term="Sign up with your email address. There are already * registered members."|"Powered by Blogtronix"|"Attached Image: " "Powered by Blogtronix"|"Powered by Sharetronix"|"External Profiles" "Last online" "About Me"|"users can communicate using quick status updates of 160 characters or less." "This free flowing dialogue lets you send messages, pictures and video to anyone"|"It`s also easy to find and connect with other people for private threads and to keep track of their updates."

    Each one of those phrases between the pipes is being used to find targets. Now say you run some scrapes in gscraper or SB.

    You set up each scrape with only one of those phrases, and get all of them running.

    Now let's say you find that the 3rd phrase gets you 90% of the total results. Now don't you think it would be a tremendous improvement in efficiency to remove all the other phrases except the good one?

    It's that easy. Time consuming? Yes. Worthwhile? A big Yes.

    Now let's say that those footprint lists I mentioned earlier had 10 other footprints for this engine that are completely different and unique.

    Test those as well. It's just a scrape to find out the answer. Which ones are worthwhile, and which ones are crap.

    Please don't turn it into rocket science. You are selling yourself short. These are the kind of things you do if you want to win at this game.

    It is mindless crap that will get you more links and make you more money.
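
    For anyone who would rather script that pruning step than hand-edit in Notepad++, here is a minimal Python sketch; the engine file and the surviving footprint are only examples lifted from the post above, so swap in whichever phrases your own test scrapes validated:

    # Minimal sketch of the pruning step: rewrite an engine's pipe-separated
    # "search term=" line, keeping only the footprints that scraped well.
    # The path and the KEEP set are illustrative, not recommendations.
    from pathlib import Path

    ini = Path(r"C:\GSA Search Engine Ranker\Engines\blogtronix.ini")
    KEEP = {'"Powered by Blogtronix"'}  # phrases your scrapes validated

    lines = ini.read_text(encoding="utf-8", errors="ignore").splitlines()
    for i, line in enumerate(lines):
        if line.lower().startswith("search term="):
            phrases = line.split("=", 1)[1].split("|")
            lines[i] = "search term=" + "|".join(p for p in phrases if p in KEEP)

    ini.write_text("\n".join(lines) + "\n", encoding="utf-8")

    Keep the edited copy in a separate folder, as ron describes, so you can copy it back over after an update.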

  • Tks for this post ron

    Tommy
  • AlexR Cape Town
    @Ron & @LeeG - right on! It looks like we have come full circle again to my earliest concern on SER, the huge overlap in SE results and the continual parsing of them again and again and again with a slightly modified keyword.

    Do you guys know of a tool that does the following:
    1) Takes first keyword as master keyword/baseline.
    2) Takes a bunch of keywords and checks in an SE what percentage of unique results they generate compared to the master.
    3) Has option to delete anything below X %?

    I have been looking for something like this for 6 months! 
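
    Nothing off the shelf comes to mind, but once the scrapes are done the comparison itself is easy to sketch. A hypothetical Python version, assuming you have already scraped one URL list per keyword into text files; every file name and the threshold here are made up:

    # Hypothetical sketch of the keyword-overlap check AlexR describes.
    # It runs over already-scraped URL lists (one .txt per keyword) and
    # does not query any search engine itself.
    from pathlib import Path

    THRESHOLD = 20.0  # keep keywords adding at least 20% unique results (the X%)

    def urls(path):
        return set(Path(path).read_text(encoding="utf-8", errors="ignore").split())

    master = urls("master-keyword.txt")               # 1) baseline keyword
    for f in sorted(Path("keywords").glob("*.txt")):  # 2) the other keywords
        results = urls(f)
        pct = 100.0 * len(results - master) / len(results) if results else 0.0
        verdict = "keep" if pct >= THRESHOLD else "drop"  # 3) cut below X%
        print(f"{f.stem}: {pct:.1f}% unique -> {verdict}")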

  • Awesome advice Ron and LeeG. Fiddling with my settings as I type!
  • why do you make it so hard for them... just let them use gscraper and import the results in some lower tier and build the verified lists...
  • LeeGLeeG Eating your first bourne

    Again you're missing the point thisisalex

    Why run a secondary link scraper and use more resources?

    The whole idea of ser is to be able to scrape results and post links with no need for extra methods of adding link targets

    A simple set and forget method

    Setting up scrapebox or gscraper to scrape links is missing the point of what ser does best

    All the time you're using tools like that to import lists, you're not spreading the links very widely

    The more keywords you use, the more results you pull

    The better the footprints, the more places you pull to place links

    You might be lacking the will to do things properly and gain good results, but don't think everyone else has your same lazy attitude 

  • edited March 2013
    Just to quote your own words: I wonder about you contradicting your own statement now:

    https://forum.gsa-online.de/discussion/2325/can-i-speed-up-the-importindentify-function/p1

    "

    Ozz, don't give them any more methods. They will be desperately confused and trying to over-complicate things by adding g-force calculus and pi x infinity type calculations to work out how many times a sparrow flaps its wings and then adding that to the final method of getting the best footprints to use

    Not saying ron does that, but I might be implying it

    A little trick with gscraper free to get your 100k results maxed is to remove duplicate domains at scraping

    I'm testing the free version at the moment to see how good it is

    It's more basic than scrapebox. Less confusing for us old wise ones

    I'm also running a proxy scraper 24/7 now to feed it fresh proxies on each run.

    If anyone asks, Proxy Multiply. 7-day free trial

    "

    "
    LeeG

    February 21
    Flag



    How do some of you lot posses the ability to breath at times

    Come on, scrape a list with gscraper free version, use the free tool by Santos to extract the footprints

    Leave that running overnight with its 100,000 url limit and bang, the following day add them to a t4

    Easily sorted into the two main site list areas.

    At worst you have a load of junk links on a lower tier

    No doubt given time you will find a method to complicate a simple task by adding your quantum physics girly tool methods

    "





    and still, I am helpful and have a positive attitude. You are insulting again.