
Links Per Minute


Comments

  • To get high LPM, do we need to scrape sites using other tools?
  • LeeG Eating your first bourne

    I don't do much link feeding personally, unless I see a drop in certain link types.

    My VPS is running flat out with SER and CB.

    I have found you pull more results just running it with a good search engine selection and plenty of keywords.

  • ron SERLists.com

    @micb - It all depends on the competition. If I am steadily increasing rank with just a T1 & T2, I leave it there to see if I get to #1.

    If I get caught in a plateau and it stays there for a while, then that is when I usually start up a T3.

    Remember, you always have the ability to create multiple T2's. You are not just limited to T1, T2, T3, T4, etc.

  • edited February 2013
    I never run a multi-tier campaign in one go. It's a waste of time and resources. The best way is to steadily create your Tier 1 links, then find your VIP links. I use the following criteria for my VIP links:

    - Must be older than 30 days [so I am sure it's not going to get deleted]
    - Must be dofollow [I never build lower tiers for nofollow links]
    - Must have my 'exact anchor text'
    - Must be indexed [I never build multi-tier links to non-indexed links]

    Once I have enough VIP links, I sort them by 'domain PR'. Then I start a separate T2 campaign for only the T1 links which pass the above criteria and have a high-PR root domain.

    This strategy is working amazingly for me, and it doesn't take too many resources. I promote 25-30 sites through my SER and never went above 30-40 LPM, because I don't need that many links :-)

    PS - Whenever I see my ranking dropping for a particular keyword, the first thing I do is 'POWER UP MY VIP LINKS'. This always pushes my keyword up. Then I try to increase the number of my VIP links.
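
    The VIP-link filter described above can be sketched in a few lines. This is a hypothetical illustration, not anything SER provides: the `Link` fields (creation date, dofollow flag, anchor, indexed flag, domain PR) are assumed to come from your own exported link data.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Link:
    url: str
    created: date
    dofollow: bool
    anchor: str
    indexed: bool
    domain_pr: int

def vip_links(links, target_anchor, today=None, min_age_days=30):
    """Apply the VIP criteria, then sort survivors by root-domain PR."""
    today = today or date.today()
    cutoff = today - timedelta(days=min_age_days)
    vips = [l for l in links
            if l.created <= cutoff          # older than 30 days
            and l.dofollow                  # dofollow only
            and l.anchor == target_anchor   # exact anchor text
            and l.indexed]                  # indexed links only
    # Highest root-domain PR first; point the T2 campaign at the top of this list.
    return sorted(vips, key=lambda l: l.domain_pr, reverse=True)
```

    The separate T2 campaign would then be fed only the URLs this returns.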
  • LeeG Eating your first bourne

    There is an email address creation tool being developed by a third party that will make boosting lower-level links easier.

    Stop Forum Spam will be on overload when it gets released.

    At the moment, if you're not using a catch-all, you have to create multiple Hotmail-type accounts and then add a divert to the account that you're using to check registrations.

    With the one that's being developed, you add the email address you want emails forwarded to, add proxies, and enter the number of accounts needed. Which is about 3x the proxies used per 24hrs.

    Once done, add the created accounts to SER in spintax.

    So you will be able to add more forum profiles etc. and cover your footprint more.
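
    A rough sketch of the workflow described (generate many throwaway addresses, then hand them to SER as spintax). The account names and hotmail.com domain here are purely illustrative; the real tool would also register the accounts and set up the forwarding.

```python
import random
import string

def make_accounts(n, domain="hotmail.com"):
    """Generate n illustrative throwaway addresses. In practice each
    account would be registered through a proxy and set to forward
    to a single inbox."""
    accounts = []
    for _ in range(n):
        user = "".join(random.choices(string.ascii_lowercase + string.digits, k=10))
        accounts.append(f"{user}@{domain}")
    return accounts

def to_spintax(accounts):
    """SER accepts alternatives in spintax form: {a|b|c}."""
    return "{" + "|".join(accounts) + "}"
```

    Pasting the resulting `{addr1|addr2|...}` string into a project's email field lets SER rotate through the accounts.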

  • Mm, I am having a hard time finding new social networks + articles to post, and I don't know why I can't find a lot of these. How many verified social network + article links do you have, @ron or @LeeG?
  • LeeG Eating your first bourne

    I'm not even going to attempt looking for the next couple of hours.

    Last time I needed to check a setting when I was on a personal best roll, I crashed SER.

    If you're running short of a type of link, run Gscraper or Scrapebox to find more target sites.

  • LeeG Eating your first bourne

    Another trick to boost article submissions is to copy and paste the engine in the engines folder.

    Then add that as well.

    So you're running double the searches for any engine types you're low on.
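
    Mechanically, the trick is just copying the engine's .ini file under a new name so SER lists it as a second engine. A minimal sketch; the folder path and engine filename are placeholders for your own install:

```python
import shutil
from pathlib import Path

def duplicate_engine(engines_dir, engine_name):
    """Copy '<engine>.ini' to '<engine> Copy.ini' so SER treats it as a
    separate engine and runs its searches a second time."""
    src = Path(engines_dir) / f"{engine_name}.ini"
    dst = Path(engines_dir) / f"{engine_name} Copy.ini"
    shutil.copyfile(src, dst)
    return dst
```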

  • So you mean I can double up the same engine if I copy and paste it again... mmm, smart move.
  • LeeG Eating your first bourne

    You can also try adding more footprints to the copied one, so you're pulling a lot more results.

    There is a thread on BHW with a lot of extra footprints:

    http://www.blackhatworld.com/blackhat-seo/black-hat-seo/491042-get-huge-search-engine-optimization-footprints-collections.html

  • @LeeG - What's the syntax to add additional footprints to an engine? Do you wrap each footprint in quotes and separate with a |?
  • LeeG Eating your first bourne

    If you edit the files, you will see each footprint is separated by "|" without the ".

    Easy enough to do.

    Just remember to back up the edited engines. Each update, they get overwritten.
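
    A sketch of merging extra footprints into an engine file's pipe-separated list, with the backup recommended above. The "search term=" key is an assumption about the .ini layout; check your own engine files before editing.

```python
from pathlib import Path

def add_footprints(engine_file, new_footprints, key="search term="):
    """Append extra footprints to the engine's '|'-separated list,
    skipping duplicates. Writes a .bak copy first, since SER updates
    overwrite edited engine files."""
    path = Path(engine_file)
    text = path.read_text(encoding="utf-8")
    path.with_suffix(path.suffix + ".bak").write_text(text, encoding="utf-8")
    lines = text.splitlines()
    for i, line in enumerate(lines):
        if line.startswith(key):
            existing = line[len(key):].split("|")
            merged = existing + [f for f in new_footprints if f not in existing]
            lines[i] = key + "|".join(merged)
    path.write_text("\n".join(lines) + "\n", encoding="utf-8")
```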

  • Perfect, thanks for the assistance and tip.
  • LeeG Eating your first bourne

    Just hit my own all-time recorded personal best with SER.

    [screenshot]

    See if I can bump those results even more once I start adding a lot more fresh email accounts.

  • How do you add multiple email accounts?
  • Thanks for the awesome thread.

    How many projects are needed to judge if my settings are running well?

    I'm trying to figure out if I am using GSA properly because my LPM is only at 0.51 even after tweaking according to this thread (I only have 3 projects going though... too early to judge?).
  • ron SERLists.com
    Way too early. Put up 50 T3's with no limits and then see what happens :)
  • LeeG Eating your first bourne
    edited February 2013

    Oi, less of that. It's 27 T3's I run.

    One guy I have been helping is doing 200k a day on three projects with no tiers.

  • 200k a day on three projects with no tiers?

    Lol, I have 3 projects and I do 0.51 LPM...

    Need to put up some more projects to see what happens.

    Thanks @ron @LeeG
  • I can do 180 LPM with just spam/junk links, or 5-10 LPM with contextual... don't worry, it all depends.
  • @skyf: How many threads are you using? If you are using <50, then try to use your private proxies for posting only, uncheck the options to modify "query time" and "use proxies for PR check", and see if that works out for you.
  • LeeG Eating your first bourne

    @skyf, several people have found that two main tricks can give a big boost in LpM.

    Only using about five Googles: 4 random + Google international.

    Plus only using engines that your captcha breaker can crack.

    With CB that's easy to do, just by comparing the engines listed in SER's settings against the engines listed by type in CB.

    I personally don't do a lot of the junk that's listed, i.e. image comments, guestbooks, video links, referrer and pingback. Plus I don't do the web 2.0's, since I own other software to create those types of links.
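
    The SER-vs-CB comparison amounts to a set intersection. A trivial sketch, with made-up engine names for illustration:

```python
def engines_to_enable(ser_engines, cb_supported):
    """Keep only the SER engines that Captcha Breaker reports it can solve."""
    supported = set(cb_supported)
    return sorted(e for e in ser_engines if e in supported)
```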

  • New personal record: 400K, woohoo!

    [screenshot]

    Thanks for the tips guys :)

    Details:
    - ~25 projects in GSA SER (12 new projects and 13 projects that are 3 weeks old)
    - 3 projects run at the same time
    - Switch to next project after 20 minutes
    - 240 threads; HTML timeout 70 seconds
    - SER skips recaptchas
    - 5-10 Googles used per project
  • LeeG Eating your first bourne

    That's the downside: you're doing about 1 in 40 verified.

    I do between 1 in 10 and 1 in 20.

    Just think if you were using a paid captcha service with that lot :o

    @ewandy, what are you doing for proxies?

  • Verifications are set to 3000 minutes; that's enough for these projects :)

    No proxies. I'm sure I was reported to my (dedi) hosting a lot of times, but it seems like they don't really care (good for me :D ).
    I've been with them for many months.
  • Brandon Reputation Management Pro
    @ewandy do you scrape as well with your dedi IP?
  • LeeG Eating your first bourne

    No doubt there will be questions asking which captcha software is used.

    And there is only one that you can disable proxy types with :D

    Looks like Captcha Breaker strikes again for helping increase submissions :D

  • @Brandon
    Yes, in SER I use the dedi IP only for scraping (I used proxies for scraping before, but those were really slowing me down).
    I scrape separately with Scrapebox too and feed those lists into SER.

    @LeeG
    Yes, it's Captcha Breaker (solve rate is at least 15%).
    The max I was able to get with Captcha Sniper was 150 LpM. :)
  • So after a problematic Sunday, today I'm back to full 200 LpM posting... strange. Must be VPS speed/shared usage, proxy speed, or global internet usage after all.
  • Brandon Reputation Management Pro
    @ewandy that's what I assumed; there is no feasible way that I know of to scrape a decent amount with one IP, but I was open to a new idea!