
Links Per Minute


Comments

  • ron (SERLists.com)

    @sonic81 - The more 'no limits' projects you have, the higher your LPM will be. Everyone thinks a person with 10 'no limits' projects will have the same LPM as a person with 30 'no limits' projects. It isn't so.  

  • Just got my "old" 3.3 GHz quad-core, installed Server 2008 and am testing now...

    Been using Parallels + Win XP on a Mac Mini 2012, which was a bit slow to work on at times too...

    Submit % rate is still poorer when using junk / spam links...
  • LeeG (Eating your first bourne)

     @sonic, try experimenting with the amount of SEs you're using

    I personally prefer to use a lot less than you have selected

    Over 150 less than you're using

  • Personally I only have 6 engines selected; I have been doing this for a while and I have huge site lists.

     I am looking at the figures posted above by @doubleup and @LeeG

    I can honestly say that getting these numbers isn't really that difficult. Good proxies, a good rig, and scaling down the engines/platforms that don't have high success rates, and you can get these numbers as well.

    Another thing: @doubleup is running at 800 threads. He is pushing the hell out of that machine. What is amazing to me is that low RAM usage. I have had the software all the way up to 2000 threads and the 2GB limitation just eats me alive.

    These days though I have found that it isn't about the threads so much as it is about the proxy load. I have several machines that run at 150-250 LpM with thread counts between 200-300. The machine runs more stable there and I don't have to worry about it locking up when I am away.

    I am thinking about other things though. Verified links per minute would be a more valuable stat. Just my opinion.

    [screenshot]

    4 samples from different machines. 
  • AlexR (Cape Town)
    @krushinem - 6 engines? You mean SEs or platform engines? If SEs, would you mind PMing them to me? I'd like to give them a test.
  • LeeG (Eating your first bourne)
    edited January 2013

    I only run 5 search engines on any tier.

    Or to be more exact, I only use one engine which is google and then choose four random countries plus the .com version

    Reduces getting your proxies blocked for too many search queries

    20hrs of running, I'm still under the 2GB threshold
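    LeeG's selection scheme above can be sketched in a few lines. This is only an illustration of the idea; the country domains and function name are assumptions, not SER's actual engine list:

    ```python
    import random

    # Illustrative country-level Googles; SER's real engine names differ.
    COUNTRY_GOOGLES = [
        "google.de", "google.fr", "google.co.uk", "google.it",
        "google.es", "google.nl", "google.pl", "google.se",
    ]

    def pick_engines(n_random=4):
        """Return google.com plus n_random distinct random country Googles,
        mirroring the 'one engine, .com plus four random countries' scheme."""
        return ["google.com"] + random.sample(COUNTRY_GOOGLES, n_random)
    ```

    Rotating a small random subset like this spreads queries across Google endpoints, which is the point of the scheme: fewer repeated queries per endpoint, so proxies get blocked less.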

  • AlexR (Cape Town)
    @LeeG - do the Googles for different countries overlap, so that if you select several of them you get your IP blocked?

    If so, it would be a neat feature for it to randomly select any x number of Google SEs...
  • @GlobalGoogler there is nothing special about the SE's I selected.  I just wanted to select some.  
  • Ozz
    edited January 2013
    >If so, would be neat feature for it to randomly select any x number of Google SE's...

    Just select all Google with a mask and you have random Google.
  • LeeG (Eating your first bourne)

    I'm now banging out 1/4 million submissions daily

    It's just that you need to use the magic formula of

    time + effort = high lpm and verified

    time analysing your results to see which engines you get high results submitting to

    effort making the changes so you only submit to those engines.

    There is no magic button in SER to do that

    Depending on the search engines and words you're trying to rank for, using any Google won't help.

    Some words get used in most languages (football, sex, viagra etc)

     

  • edited January 2013
    I've been reading this thread with great interest since my verified per day fell drastically after I imported a lot of scraped sites, and I have an idea why, I just hope you guys can help me so my LpM can get back to normal - or even better.

    I tried to compare stats to weed out the platforms that didn't perform. Stats were the same for identified, submitted and verified. Then I noticed that GSA (at least my install of GSA) saves these urls in the same GSA root folder. Shouldn't they be in different folders? As it is right now, choosing a project to pick only from the verified site list doesn't really work, as they are all mixed in together - am I right about this?

    [screenshot]
  • Ozz
    edited January 2013
    Just open that folder and you'll see there are all kinds of different files labelled 'identified', 'submitted' and so on.

    To increase your submission rate, just use the 'submitted' or 'verified' list, depending on how big your lists are. To do that, uncheck 'save identified sites to' in the options of your screenshot, or select what you want to use in your project options.
  • @Ozz that's the problem, I don't have files labelled like that. I only have files starting with:
    sitelist_Article-BuddyPress
    sitelist_Guestbook-TPK Guestbook
    etc.

    Nothing with identified, submitted, verified etc.

    That is probably also why, when I choose to view the stats for identified, submitted and verified, the numbers are all the same - because all urls are being saved to the same sitelist_* files?

    So - is my GSA installation bugged, or do I just need to point it to new subfolders?
  • I hope that I can help you here.  Don't worry about the file names.  Look at the folder in which the files lie.  They should be verified, submitted, identified, failed.  You will have the same file names in each folder.  I hope that helps.  
  • @krushinem

    Have a look at my screenshot. Identified, submitted and verified urls are ALL being saved in the same root dir of GSA, and there is only one sitelist per platform. It looks to me like ALL 3 types of URLs are being saved to the same sitelist files, which renders the whole idea of separating them useless.

    So - is my GSA FUBAR? Do I need to reinstall or am I missing something here? As I wrote before, when I view stats in the advanced section, the numbers for identified, submitted and verified are all the same - which of course they shouldn't.
    Okay, I looked at your pic again from above. Most of us have this:

    C:\Users\krushinem\AppData\Roaming\GSA Search Engine Ranker\site_list-identified

    C:\Users\krushinem\AppData\Roaming\GSA Search Engine Ranker\site_list-success

    C:\Users\krushinem\AppData\Roaming\GSA Search Engine Ranker\site_list-verify

    C:\Users\krushinem\AppData\Roaming\GSA Search Engine Ranker\site_list-failed

    Here is what I would do. You have all of your verified, submitted, and identified going to the same folder.

    If you want to weed through that (I don't know what projects you have nor how big it is)  

    Download Scrapebox DupeRemove

    put all the files together -

    now you have one list; you can clean the list by removing dup urls

    If you have been using blog comments you can't use remove dup domains, because you will eliminate a portion of your blogs.

    Now you have the total list

    reset your folders

    import the list however you would like

    Go back at it.  Again hope that helps.  
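    The merge-and-dedup steps above (minus the Scrapebox GUI) can be sketched in Python. The function name and the `sitelist_*` file pattern follow the filenames quoted earlier in the thread; everything else here is an illustrative assumption, not part of SER:

    ```python
    import glob
    import os
    from urllib.parse import urlparse

    def merge_site_lists(folder, keep_dup_domains=True):
        """Merge all sitelist_* files in `folder` into one de-duplicated URL list.

        Duplicate URLs are always removed. Duplicate *domains* are kept by
        default, since dropping them would discard distinct blog-comment pages
        on the same blog (the caveat raised above).
        """
        seen_urls = set()
        merged = []
        for path in sorted(glob.glob(os.path.join(folder, "sitelist_*"))):
            with open(path, encoding="utf-8", errors="ignore") as f:
                for line in f:
                    url = line.strip()
                    if url and url not in seen_urls:
                        seen_urls.add(url)
                        merged.append(url)
        if not keep_dup_domains:
            seen_domains = set()
            unique = []
            for url in merged:
                domain = urlparse(url).netloc
                if domain not in seen_domains:
                    seen_domains.add(domain)
                    unique.append(url)
            merged = unique
        return merged
    ```

    The resulting list could then be written to a file and imported back however you like, as in the steps above.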
  • @krushinem thanks man - you've been very helpful :-)

    My site-lists are huge - GSA must be struggling hard with them. I only took notice today when I saw how efficient some of you guys are with GSA :)
  • GNARR!!!

    Knew there was a reason behind 'remove dup websites / urls' :-(
  • LeeG (Eating your first bourne)

    @claus10 sounds like you're one of the older hands at SER

    In the early days, that's how SER sorted the lists

    Another quick way to do it, which will take about ten minutes, maybe a bit longer depending on list sizes:

     

    Go into the location where they are stored at present.

    If the folder names are not there, right click and create a new folder, name it verified, then repeat three more times with the other names.

    Then set those locations as where the files should be saved.

    Next the time saving clever bit

    Hit the tools button on the advanced tab > add urls from projects > submitted and again for verified

    Tools again > remove duplicate urls
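    For reference, the folder-creation step can also be scripted. A minimal sketch, using the four folder names from the paths krushinem quoted above; the function name is mine, and the base path is whatever your own SER install uses:

    ```python
    import os

    def create_site_list_folders(base):
        """Create the four site-list folders SER expects under `base`,
        e.g. C:\\Users\\<you>\\AppData\\Roaming\\GSA Search Engine Ranker.
        Existing folders are left untouched."""
        for name in ("identified", "success", "verify", "failed"):
            os.makedirs(os.path.join(base, "site_list-" + name), exist_ok=True)
    ```

    After the folders exist, the tools steps above (add urls from projects, then remove duplicate urls) repopulate and clean them from inside SER.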

  • edited January 2013
    @LeeG

    Yeah - I've had it for a loooooooooooong time - before the BHW thread reached 5 pages :-)

    Thanks for the import tip!
  • LeeG (Eating your first bourne)

    You're just plain showing off with your copy.

    One install on your original pc / vps and still running strong, with only using the update button :D

    Testament to Sven's programming

    Which bhw thread, the sales thread or the original discussion :D

  • edited January 2013
    @LeeG Exactly - when I bought CB it was the first time in many years that I bought a piece of software or hardware without researching reviews first. I just knew that Sven and his team would produce quality. No need to check up on it.

    I think I bought it around the end of January 2012.
  • One last question, which I asked in another thread actually, but never got an answer to.

    5 projects using the exact same keywords. Would it not make sense to have one project scraping and the last 4 projects just using the global site list without scraping?
  • LeeG (Eating your first bourne)

    I run a similar idea to that, but on a much bigger scale

    Everything scrapes from the engines and also uses the global sites lists

    I have managed to use every Google listed in the process. That's four random Googles plus the .com version used, over all my projects

    Sometimes you can be on a time limit block on a search engine

    So you can scrape once and get zero results.

    Scrape again and get some results

    And if Google does the dirty and rolls out a new algo, you might be running a few thousand keywords behind on one of the other projects and pick up the new sites that have jumped places.

    That's my theory

  • Nice thinking actually yeah :-)

    I'm trying to maximize the effect of 40 threads running on very quick proxies, so I'd like to keep the scraping as low as possible. Does my approach make sense then (disable scraping on 4/5)?
  • LeeG (Eating your first bourne)

    Try it for 24hrs and see how it goes.

    You have an idea of your present submissions and verified daily ratio

    Which, after you make the changes to the global site lists, should get a boost

  • @LeeG, maybe you can help me?
    [screenshot]
    I run 7 projects at a time, rotating every 20 min. I submit to Article, SN, SB and Web 2.0... and I don't know why I have such a low LpM
  • @dudz1ok: read this thread and scan all posts of LeeG.

    LeeG is very open minded when it comes to sharing his techniques, but I don't think he will give you a blueprint. I can't speak for him though, but for me it isn't motivating at all if everything has to be repeated again and again. Just read, learn and understand his techniques.

    Furthermore, a screenshot of your status bar helps no one, as we don't know your settings. So do your homework, implement what you learn from this board, and modify and test your setup first.
  • edited January 2013
    @Ozz, these are my settings (I DELETED THE LINK)... please check my settings and help me!
  • I already told you in another thread. 

    "...post some quality question like @hyde did, for example."

    As I said, I don't like to repeat myself. So do your homework first and ask some intelligent questions later. 