Links Per Minute


Comments

  • LeeG Eating your first bourne
    edited March 2013

    I'm not using any hard-and-fast method of finding the low-yield footprints

    I do things for ease and speed

    Extract the footprints, import them into GScraper Pro and press a button to see how many results each returns, then remove the low-yield ones

    Now is that being above and beyond technical? I think not

    It's a set-and-forget method once done

     

    Rather than: extract footprints, scrape keywords, pull 100k results

    Add to a lower tier, repeat constantly with each engine you're targeting

     

    I'm not a fan of using scraped lists

    I can get the same or better submission results day in, day out just by letting SER run under its own steam

    And no need for a middleman list scraper and another PC running, pulling the lists
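
    [Editor's note] LeeG's pruning step (test each footprint's yield, drop the weak ones) can be sketched outside GScraper as well. A minimal Python sketch, assuming a hypothetical CSV export of footprint/result-count pairs; the file format and threshold are made up for illustration, not a GScraper feature:

```python
# A minimal sketch of the footprint-pruning step, done outside GScraper.
# It assumes a hypothetical CSV export with "footprint,result_count" rows;
# the file format and the threshold are illustrative assumptions.
import csv

MIN_RESULTS = 100_000  # yield threshold; tune to taste

def prune_footprints(csv_path, min_results=MIN_RESULTS):
    """Return only the footprints whose result count meets the threshold."""
    keepers = []
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for footprint, count in csv.reader(fh):
            if int(count) >= min_results:
                keepers.append(footprint)
    return keepers
```

    The kept footprints would then go back into the relevant engine file, which is the "one day editing engine files" step LeeG describes.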

  • ron SERLists.com

    @LeeG you said that about me???

    I was just finishing reading a new book...the Best of Albert Einstein, lol. 

  • LeeG Eating your first bourne

    As I always say, there is one simple equation to using SER properly

    time + effort = high LpM and verified

    And this relatively easy method, which can boost submissions, is only time-consuming

     

    Proof the above works

     

    image

     

    image

     

     

    Over 5k of those verified yesterday were WikkaWiki

     

    One day editing engine files with the footprints

    Four consecutive days with double my normal verified

    Not rocket science to do. Even Ron's not been able to overcomplicate this yet

    Give him time on that one :D

  • ron SERLists.com

    That's a very large amount of verified links in one day.

    Maybe people will pay attention to what you say. :)

  • Thanks for the informative thread.

    Is this the right approach? In the engines I've targeted PHPMotion, as I get a reasonable verified / identified ratio.

    I load up all the search-term footprints into GScraper and filter out search terms with fewer than 1 million indexed results.

    I'm left with 3 terms.

    "powered by phpmotion" Blogs
    "Powered Free by PHPmotion" Blogs
    "Blog Menu" "Create Blog" "My Blogs" "PHPmotion"


    I go and edit the PHPMotion engine and restart GSA.
  • They don't seem to be yielding many results early on:

    17:19:22: [ ] 000/001 [Page END] results on google CX for PHPMotion with query "powered by phpmotion" Blogs
    17:19:22: [ ] 000/001 [Page END] results on google CO for PHPMotion with query dating "Powered Free by PHPmotion" Blogs
    17:19:30: [ ] 000/005 [Page END] results on google TD for PHPMotion with query "Blog Menu" "Create Blog" "My Blogs" "PHPmotion"
    17:19:31: [ ] 000/001 [Page END] results on google CO for PHPMotion with query "powered by phpmotion" Blogs
    17:19:37: [ ] 000/100 [Page 001] results on google CO for PHPMotion with query "Blog Menu" "Create Blog" "My Blogs" "PHPmotion"
    17:19:38: [ ] 000/001 [Page END] results on google CC for PHPMotion with query "powered by phpmotion" Blogs
    17:19:46: [ ] 000/006 [Page END] results on google CX for PHPMotion with query "Blog Menu" "Create Blog" "My Blogs" "PHPmotion"
    17:19:47: [ ] 000/114 [Page 002] results on google CO for PHPMotion with query "Blog Menu" "Create Blog" "My Blogs" "PHPmotion"
    17:19:54: [ ] 000/103 [Page 003] results on google CO for PHPMotion with query "Blog Menu" "Create Blog" "My Blogs" "PHPmotion"
    17:19:56: [ ] 000/000 [Page END] results on google TD for PHPMotion with query "powered by phpmotion" Blogs
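
    [Editor's note] Log lines like the ones above can be tallied per query so low-yield footprints stand out at a glance. A rough sketch of my own (not part of SER), assuming the second counter in the `NNN/NNN` pair is the per-page result count, which is a guess about the log format:

```python
# Rough sketch: tally SER-style log lines per query so low-yield
# footprints stand out. Assumes the second "NNN/NNN" counter is the
# per-page result count (an assumption about the log format).
import re
from collections import Counter

LOG_RE = re.compile(r"\d+/(\d+) \[Page [^\]]*\] results on .+ with query (.+)$")

def tally_queries(log_lines):
    """Map each query string to its summed result count."""
    totals = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m:
            totals[m.group(2).strip()] += int(m.group(1))
    return totals
```

    Running it over a saved log would show, for instance, that the two `"powered by phpmotion"` footprints above return almost nothing while the three-operator query still pulls pages of results.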
  • Don't you need blog SE?
  • I've been using that too, with similar results...
  • @LeeG I noticed you wrote that you got 5000 verified Wikkas. That is 5000 spread over many projects, right? For a single project I have 102 verified after scraping for a whole night and not allowing posting twice on the same domain.
  • LeeG Eating your first bourne

    On lower tiers I allow posting on the same domain twice

    If you're posting to blogs, how do you know you're not going to miss out on a PR6 page

    I'm now using blogs as a secondary-type link. Got fed up with seeing the damn things in my verified

     

  • I'm not talking about blogs - just doing a Wikka run :) So, just to be sure, you don't have 5000 unique Wikkas verified? (If you do, I'm missing out :P)
  • Trevor_Bandura 267,647 NEW GSA SER Verified List
    First I want to say this is a great thread. Just from yesterday to today, I have already doubled my LpM.

    There is one thing I read here about the Blacklist setting that I'm not really sure about.

    Am I supposed to have this "Checked" so SER checks whether a domain is on that list and skips the submission if it finds it there, or do I leave it "Unchecked" so it doesn't check the Blacklist at all?
  • edited April 2013
    @leeg those are fantastic results, well done!

    Do you just scrape generic keywords (must be at 100k per day???)?

    As always, thanks for your time, effort and generosity in sharing so much.
  • @Trevor For my clients I have it checked, and for my own sites I leave it unchecked because I don't care if "unsavoury" sites link to me
  • LeeG,
    my question is:
    what is your bandwidth / connection to the www?

    I also strictly use GSA to do all the work and get excellent results, MUCH better than with ANY other tool I've used before (UD/AMR and other online services)

    but my limit seems to be the number of threads I can run

    in bad times I have 8-12 threads
    in the best times 22-44 threads

    living / working in Cambodia,
    depending on the time of day, weekday and the ISP I use (I need at least 4 different mobile 3G providers to assure an almost 100% uptime),
    my bandwidth varies around 250 kbps - 500 kbps upload and 1-4 Mbps download MAX

    my max submissions per day are in the range of 800, and 10-15% are verified

    I am almost sure your results are achieved with much more bandwidth
  • LeeG Eating your first bourne

    No idea on monthly bandwidth in all honesty. I did monitor it when I first moved to a VPS, and the totals reset at the beginning of each billing cycle

    And since I don't get any warnings about maxing out my bandwidth from the company I get my VPS from, it's one less thing to be concerned or worried about

    From memory it was a couple of terabytes a month, or a couple more for good luck

    The internet connection is 100 Mbit on my VPS

  • Hello everyone!

    I've been using GSA seriously for about a month now. I just moved from Berman to Powerup hosting.

    Here's what my numbers look like at the moment:
    image

    Now, I've been reading this forum (this thread especially), and playing around with GSA for the past 12 or 13 hours straight, and here's where I'm at:
    image

    I'm running all of this on a VPS with the following specs:
    - 4 CPUs @ 3.6 GHz
    - 120 GB hard disk
    - 4 GB RAM
    - unmetered bandwidth
    - 100 Mbps connection

    On top of that I have 30 semi-dedicated proxies from Buy Proxies, and an Indexification license.

    Here's what I've done already:
    - dropped the HTML timeout to 130
    - chose 6 English Google SEs
    - took SpamVilla OFF, running only GSA Captcha Breaker
    - put the verification to 1440

    I'm running 14 projects. All have a Tier 1 as well.

    Now, for every project I have a few project-related keywords, and then that 99k+ list I kept tweaking and honing for probably something like 8 hours to get rid of the bad keywords.

    I'm beginning to run out of ideas here... I've removed the duplicate URLs and domains every day.

    Still, I only hit these 60-70 LpM numbers for short periods of time, and then I'm right back in the 10-20 range.

    You guys have any ideas where to go from here? :)
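
    [Editor's note] The daily "remove duplicate URLs and domains" cleanup mentioned above can also be run on an exported target list outside SER. A small sketch; the one-URL-per-line text file is an assumption about the export format:

```python
# Sketch of the "remove duplicate domains" cleanup on an exported
# target list (one URL per line). Keeps the first URL seen per domain.
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Return the URLs with only the first occurrence of each domain kept."""
    seen, kept = set(), []
    for url in urls:
        url = url.strip()
        domain = urlparse(url).netloc.lower()
        if domain and domain not in seen:
            seen.add(domain)
            kept.append(url)
    return kept
```

    Deduplicating by domain is the stricter of the two cleanups: dropping only duplicate URLs keeps multiple pages per site, while this keeps one page per site.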

  • Brandon Reputation Management Pro
    How many threads? If you want higher LpM I would suggest importing lists. With only 30 proxies you won't scrape very many targets.
  • get rid of the public proxies
  • Okay, well, a couple of things have happened since I posted my question.

    I actually read all of this thread again (who wants to sleep at night, anyway?)

    Most important updates:
    - got myself 30 PRIVATE proxies, and tweaked the PROXY timeout and thread settings as Buy Proxies suggested => no more burning proxies!
    - got myself another VPS to run Gscraper on, and I've imported some lists from there as well

    It seems like I'm settling at about 50-60 LpM right now, although I DID visit the 900+ LpM zone for a while there ;)

    @Brandon, I'm running 300 threads.

    One thing I noticed helping me was importing my submitted list into the projects. This makes sense as well, since I can build a bigger verified list that way.
  • gsa8mycows forum.gsa-online.de/profile/11343/gsa8mycows
    Okay, I necroed this thread. Get over it.
    But it is amazing to see what you ancients used to do.
    On contextual dofollow article engines only, I get about 10 LpM on a good VPS, which honestly sucks.
    Now I have to go back and tinker with settings.
    Au revoir!
  • nycseo NYC
    edited May 2015
    400 LpM, 500 threads, 200,000+ verified

    massive Tier 2 blast
  • andrzejek Polska
    edited May 2015
    I suggest you guys stop focusing on your LpM and start focusing on the results you want to get.
    LpM depends on many things: engines, platforms, the number of e-mail addresses, the email providers, your GSA settings, your proxies, your project settings, your captcha-breaking solution, your engines.ini file, the domains that you scrape, how you import lists into projects, how you submit... the list is long. For example:

    when blasting a fresh scrape for blog comments you won't get much LpM,
    but when blasting blog comments using a verified list and "allow to post on same site again", your LpM will blow up...

    I'd rather keep 20 LpM than 2000 and get my website ranked
