
Need help with low LPM

A friend gave me a 5 million URL personal list. He says it's not been killed by the SEO world.

It's been running for a good 12 hours now and the LPM (links per minute) is at 1.04.
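For scale, that works out to only a few hundred verified links so far. Back-of-envelope in Python, using just the numbers above:

    # Rough arithmetic on the figures in this post.
    lpm = 1.04    # verified links per minute
    hours = 12    # time the campaign has been running
    links_so_far = lpm * 60 * hours
    print(f"~{links_so_far:.0f} verified links so far")  # ~749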

These are all contextual-type links, by the way.

The campaign running now is only there to verify good links.

What am I doing wrong, guys?

Comments

  • bangkoklad
    Share the list with me and I'll give you help :))
  • Freshly scraped, or verified in the first place?
  • @bangkoklad I would have taken you up on that, but the person who gave me this list made me swear I wouldn't share it.

    Freshly scraped list, I believe. I have a good, fast VPS, 50 shared proxies, and 10 campaigns running.

    I have been playing with the thread count. I could run 600 threads and it only used around 60% CPU. I've lowered it now to see if that helps, but the LPM has got even worse.
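    Rough math on the proxy side, in case that's part of it (Python, just the figures above - 12 threads per shared proxy may be more than search engines will tolerate before throttling them):

        threads = 600
        proxies = 50
        print(threads / proxies)  # 12.0 threads hammering each shared proxy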

    Any ideas what I am doing wrong, lads?
  • How do you solve captchas?
  • @cozziola of course it was a joke. I wouldn't share it with a complete stranger either :)
  • A freshly scraped list of 5 million does not equal a verified list of 5 million. Your LPM will be awful, but you might end up with a decent-sized verified list after SER has gone through all your imported URLs.
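    To put a purely illustrative number on it - assuming, say, a 2% verify rate (a guess, not a measured figure):

        raw_urls = 5_000_000
        assumed_rate = 0.02  # illustrative assumption, NOT a measured figure
        print(f"~{raw_urls * assumed_rate:,.0f} verified")  # ~100,000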
  • @cozziola: IMHO, your list is crapola. Even if the list was once verified, it degrades very quickly, and a list that big wasn't accumulated overnight. At this point it's FUBAR - dead, dead, dead. You'd be better off scraping or buying a high-quality list.

    When @Gooner, @Ron, @2Take2 and myself got together, we swapped our verified lists. After we deduped, we wound up with around 1 Million verified. We then tested the list, and the LPM was awful - 90% of the list was dead. The list had gone bad. We were absolutely shocked.

    That's how we got into the list business. We pooled our efforts building verified lists and decided to monetize it. Building a killer list is a big investment. We spend over $1,200 per month on expenses (servers, software, etc.) to build our lists, not to mention manpower. Lists don't build themselves.
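    For anyone wondering, the dedupe itself is the easy part; a minimal sketch, assuming plain one-URL-per-line text files (the filenames are made up, and this dedupes by exact URL - SER can also dedupe by domain):

        # Merge several verified lists, dropping exact-duplicate URLs.
        seen = set()
        inputs = ["list_a.txt", "list_b.txt", "list_c.txt"]  # hypothetical filenames
        with open("merged_deduped.txt", "w") as out:
            for path in inputs:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for line in f:
                        url = line.strip()
                        if url and url not in seen:
                            seen.add(url)
                            out.write(url + "\n")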
  • Looks like I'd best build me a list! I just watched Matthew Woodward's video on GSA. Very good! One thing I found crazy in the video is that he set GSA to find new places to publish as it is publishing. I need a list of some sort to start a campaign, right? Can't I have GSA find me some and go from there? Anyone have any tips when it comes to the anchors and keywords? Roughly how many keywords should I be using? How about the anchors?
  • davbel UK
    @Cozziola have you got Scrapebox or GScraper? If not, get one of them today, learn how to use it to scrape and build a list, and within a few weeks you'll have more targets than SER can handle.
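    The core trick in either tool is crossing platform footprints with keywords to generate search queries; a tiny sketch of the idea (both lists here are placeholder examples, not a recommended set):

        import itertools

        footprints = ['"Powered by WordPress"', 'inurl:guestbook.php']  # placeholder footprints
        keywords = ["fly fishing", "golf clubs"]                        # stand-in for a big KW list

        with open("queries.txt", "w") as out:
            for fp, kw in itertools.product(footprints, keywords):
                out.write(f'{fp} "{kw}"\n')  # feed queries.txt to Scrapebox/GScraper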

    Your other option is to use SER as a scraping tool: set up a campaign that posts links for something untouchable like Amazon/Bing/a YouTube video/a made-up URL.

    Create a project that uses all site types and a massive KW list - if you look on the forums you'll find one with 100k KWs. Once you've set it up and got it working, duplicate it 10 times and set each of those projects to search in different-language search engines. Set them going.
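    An optional extra, not something SER needs: give each duplicate its own slice of the KW list so the 10 projects don't all burn the same queries. A quick round-robin split, assuming a keywords.txt with one KW per line (filenames are just examples):

        # Split one big keyword file into 10 chunks, one per duplicated project.
        with open("keywords.txt") as f:
            kws = [k.strip() for k in f if k.strip()]

        for i in range(10):
            with open(f"keywords_project_{i + 1}.txt", "w") as out:
                out.write("\n".join(kws[i::10]))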

    With all your subsequent projects, go to the options screen and set the global list to the one your scraping projects have been saving sites to.

    Doing this means your first 10 projects are building your list for you, and your new ones will have enough sites to post to.

    Finally, the easiest option is to start buying lists. It's the quickest solution, but as @Satans_Apprentice says, bought lists can get burnt out quickly. If this is the route you choose, then I can vouch for SERlists.

  • I own SB, davbel. I think I am going to buy the $49 video course on here that teaches how to scrape. I will hopefully be a mega scraper after watching, haha.

    I am not sure what you mean about posting and scraping in other languages, but I will read up on it.

    I have just bought a 1.2 million unique, contextual-links-only gig.

    My GSA is still working through the 5 million list. Almost 24 hours deep now.
  • It's just finished this second :)