
Why do I have such a low LPM?

Hello! I am running GSA SER on my VPS, which has over 20 GHz and 6 GB of memory, at 1000 threads with 200 dedicated proxies. However, I have an LPM of 15. I have scraped and imported over 6 million unique domains into the software, but I still get a very low LPM.

For tier 1 I only make backlinks on domains with a PR above 0, and for tier 2 I don't mind the PR. I also only build backlinks on do-follow, contextual platforms.

Could anybody please help me out? I would really appreciate it. Thanks in advance!

Comments

  • "Good LPM" = "Quality Target URLS (for example prepared verified list) + No Filters + Tweaked Engines"

    Apparently you could do wonders with SER scraping also but it requires immense tweaking in my undestanding. 

    1-3 active projects would never give you any decent LPM at all. 

    Try this: 
    1. Take dummy project
    2, Turn off all filters
    3. Disable engines with not many available links

    ;)
  • Yes, I understand that I can get a higher LPM if I remove every filter and start making spammy links, which aren't even that good for my tier 2-3, or even tier 1. Do you think the SEO quality will still be good if I remove the filters for my tier 2+? What are your settings, actually? I saw some people getting over 400 LPM... Also, I do not use link shorteners.

    Thanks!
  • ronron SERLists.com
    edited July 2014
    @bymm4 - Running scraped lists always gives a low LPM - that is very normal. The LPM on a scraped list is always way lower than if you let SER scrape, or if you run a verified list that you bought.

    The best thing to do is to split that file up into chunks, set up some dummy test projects to act as your processors, have no filters on these test projects, and give each test project a chunk of that large file you scraped. Have them run simultaneously. Let them find the good links and write them to verified.

    Then have your real projects run using 'sitelist:verified', and let them pick and choose the proper links based on your filters. That is the only sane way to do what you are doing.

    Never process a raw scrape against real projects - ever.
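
    A minimal sketch of that chunk-splitting step in Python, assuming the raw scrape is a plain text file with one URL per line (the filenames and chunk count here are made up, not anything SER-specific):

```python
# Split a big raw scrape into one chunk file per dummy test project.
CHUNKS = 20  # one chunk per test project

with open("raw_scrape.txt", encoding="utf-8", errors="ignore") as f:
    urls = [line.strip() for line in f if line.strip()]

size = -(-len(urls) // CHUNKS)  # ceiling division
for i in range(CHUNKS):
    chunk = urls[i * size:(i + 1) * size]
    if not chunk:  # fewer URLs than chunks
        break
    with open(f"chunk_{i + 1:02d}.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(chunk))
```

    Each chunk file then gets imported into its own test project.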
  • So what should I do now? I have processed my raw scrapes against my real projects :\. How can I fix all this mess now and start getting a high LPM? Your help is really appreciated. Thanks!
  • ronron SERLists.com
    @bymm4 - You didn't create a mess. You were just doing this inefficiently. So now you just stop doing it the way you were, and create some test projects pointing at wikipedia or yahoo, with no filters, new emails, no limits on the link building, and just give each project a portion.

    To be honest, if you plan on doing this a lot, then I would create at least 20 test projects. Just set one up the right way, give it a name like Test -1, and then just clone it a bunch of times. SER will keep giving it a new number.

    Just make sure you create the first test project from scratch, so it has no history, no cache, etc. Check all engines. In other words, create the perfect spam test project from hell - before you clone it.

    And after they are all cloned, give them all fresh emails as a group. Then just let it run.

    In the meantime, just change all your real projects to use sitelist:verified.

    And make sure to dedupe your verified sitelist at the URL and domain level each day. Keep that file as skinny and perfect as you can so you don't constantly have "already parsed" messages.
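
    A minimal sketch of that daily dedupe in Python, assuming the sitelist is a plain text file with one URL per line (the filename is made up; SER also ships its own remove-duplicate tools, which do the same job):

```python
# Dedupe a sitelist at the URL level, then at the domain level,
# keeping the first URL seen for each domain.
from urllib.parse import urlparse

seen_urls, seen_domains, kept = set(), set(), []

with open("sitelist_Verified.txt", encoding="utf-8", errors="ignore") as f:
    for line in f:
        url = line.strip()
        if not url or url in seen_urls:
            continue
        seen_urls.add(url)
        domain = urlparse(url).netloc.lower()
        if domain in seen_domains:
            continue
        seen_domains.add(domain)
        kept.append(url)

with open("sitelist_Verified.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(kept))
```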
  • Hmm, OK, sure :). So you want me to create a spam project with every filter disabled, then duplicate that project and split my identified links, which I've imported from my scrapes into GSA SER, across all of those test projects. Is there any quick way to split the identified URL list across multiple projects in GSA SER? Thanks!
  • ronron SERLists.com
    edited July 2014
    @bymm4 - I think you just have a giant file, from the sound of it. I would just copy a chunk and then use Import Target URLs from clipboard. Let SER identify it as it processes the list you imported.
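
    If you'd rather script the copy step than select text by hand, a tiny sketch using the third-party pyperclip package (the chunk filename is made up):

```python
# Put one chunk on the clipboard, ready for
# "Import Target URLs -> From Clipboard" in SER.
import pyperclip

with open("chunk_01.txt", encoding="utf-8") as f:
    pyperclip.copy(f.read())
```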
  • is that 1 project per chunk?
  • @ron - Your help is really appreciated! Thanks for everything! I am looking forward to subscribing to SERLists.com soon!
  • ronron SERLists.com
    @adamze - Yes.
  • @ron GSA is only going to post so fast, or as fast as your hardware will let it. Is there really any advantage to splitting up large verified lists to run through?
  • ronron SERLists.com
    It isn't a verified list. We were discussing how to run a raw scrape efficiently.
  • @ron what about tier 2? What are your settings here: http://prntscr.com/3ywa6o to let GSA SER make backlinks from a domain to every tier 1 link?
  • ronron SERLists.com
    Guys, I'm heading out on vacation. Back Monday...
  • NO SIR!! WE HAVE MILLIONS AND MILLIONS OF QUESTIONS YOU MUST ANSWER!!!

    j/k have fun on vacation. We have no life, we'll be here. 
  • @ron NOOOOOOOO You can't leave us like  this.. What will we dooooooo