
Links Per Minute


Comments

  • Would deleting target URL Cache/History increase LpM if I've never done it?
  • @king for me it does, sometimes.
  • ronron SERLists.com
    Go ahead and try it. Nothing will blow up. I do it every so often.
  • Just don't delete the created accounts as well. I get huge LpM boosts from deleting the cache/history.

  • When deleting history and it asks "Do you want to delete accounts data as well", that's a no-no, right?
  • If you delete that, GSA will try and create new accounts, but the problem is it will use the same email, so most of them will just fail as the email is already registered.

    Only use that option if you are creating a new project that you duplicated based on an old one, not for these maintenance routines.

  • I never do that, except when I duplicate projects. I sometimes log in to profiles and change some weird anchor text, right - there should be a safety "do you really want to" question asked.

    Most software has too many "do you really want to..." questions, but here I miss it.
  • edited March 2013
    I have tweaked SER to the best of my ability and am only getting 50 LpM.

    -5.22
    -Submitting to only successful engines
    -Global site lists
    -7 search engines
    -Many projects (rotated w/ scheduler)
    -230 threads/130 timeout
    -10 private proxies, used everywhere
    -Verification 1440 mins

    Can't think of anything else.
    Please help me guys, I want to get into the 230 LpM range.
  • 10 private proxies and 230 threads - that sounds unbalanced to me, but as long as it's working, why not. Maybe monitor your log for a while to see if that's the case.

    You could also try not using any proxies for verification.
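
    Rough arithmetic on that ratio (not from the thread, just for scale): 230 threads / 10 proxies = 23 concurrent connections per proxy, and the same 10 proxies also carry every search query, so the setup leans much harder on each proxy than the thread count alone suggests.
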
  • Well, if they are good, that could work.

    Just watch for "download failed" messages.

    But he has a high timeout.


  • edited March 2013
    I have 100k keywords loaded into 35 Projects all T3 deep. Could that be the problem?
    I'll buy more proxies if that would speed things up, but I can't see that jumping me to 5x the speed. I don't know how these guys are pulling in 200+. I did everything they've written.
  • To what platforms are you posting? If you do comments, you can easily get high three-figure LpMs...

    ... but with low re-verified counts...

    If you do profile-based links like socialB, forums, etc., it's much slower.

    Give yourself some time.
  • OzzOzz
    edited March 2013
    Do you use those 100k keywords on each project for the same engines you target? That doesn't make sense to me, but I fear many people do that. Splitting that keyword list into different parts, plus adding some self-created lists on other projects, makes more sense to me (see the sketch below).
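
    A minimal sketch of that splitting idea (assuming the keywords sit in a plain text file, one per line; the file names and the 35-way split are just illustrative):

        # split_keywords.py - divide one big keyword file into one chunk per project
        def split_keywords(src="keywords.txt", parts=35):
            with open(src, encoding="utf-8") as f:
                keywords = [line.strip() for line in f if line.strip()]
            chunk = -(-len(keywords) // parts)  # ceiling division
            for i in range(parts):
                block = keywords[i * chunk:(i + 1) * chunk]
                if not block:
                    break
                with open(f"keywords_project_{i + 1:02d}.txt", "w", encoding="utf-8") as out:
                    out.write("\n".join(block))

        if __name__ == "__main__":
            split_keywords()

    Each project then gets its own keywords_project_NN.txt instead of all 35 projects hammering the same 100k list against the same engines.
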
  • LeeGLeeG Eating your first bourne

    I also use edited engines to bring in a lot more results, which most people never take the time to do

    Kill off low-yield footprints, add high-yield ones

    Try and keep footprints that will return millions of results, rather than thousands

    A good tool for checking the results a footprint can return is gscraper pro

    Extract them with the tool by s4nt0s, then import them into GScraper and you can easily see what to keep and what to bin (a rough sketch of the same keep/bin idea is below)
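
    A rough sketch of that keep/bin idea without GScraper. The "About X results" parsing is an assumption about Google's current markup, which changes and gets rate-limited quickly - GScraper Pro does this far more reliably:

        # Estimate how many results each footprint returns and keep only the big ones.
        import re, time, urllib.parse, urllib.request

        def estimated_results(footprint):
            url = "https://www.google.com/search?q=" + urllib.parse.quote(footprint)
            req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
            html = urllib.request.urlopen(req, timeout=30).read().decode("utf-8", "ignore")
            match = re.search(r"About ([\d,]+) results", html)  # wording/markup is an assumption
            return int(match.group(1).replace(",", "")) if match else 0

        def filter_footprints(footprints, keep_above=1_000_000):
            kept = []
            for fp in footprints:
                count = estimated_results(fp)
                print(f"{count:>12,}  {fp}")
                if count >= keep_above:
                    kept.append(fp)
                time.sleep(10)  # go slowly or the engine blocks the queries
            return kept
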

  • edited March 2013
    @thisisalex Yeah, I don't do blog comments - that's like Scrapebox, isn't it?
    I do Article, Directory, Social Network, Social Bookmark, Forum, Video

    @Ozz I just do the same for all my projects.

    @LeeG Where do I go to view/extract them? Are these the GSA preloaded footprints?
  • LeeGLeeG Eating your first bourne

    So alex, I'm holding back the sniggers and belly laughs at your above post

    Please answer me this, I'm trying not to laugh at the stupidity of your comment

     

    How much longer does it take to post to an article site or wiki, than a blog comment or forum profile?

    Enough to slow down 200 LpM to 2 LpM?

  • BrandonBrandon Reputation Management Pro
    @LeeG there are significantly more locations and the search results are measurably higher with blog comments than with Elgg installs, for example. I know you know this and you're being facetious, but someone will come along later and be confused :)
  • ronron SERLists.com

    Great point @LeeG on editing the engines. +1

    Yeah, I want to see the backup stats from @thisisalex on how those platforms slow down SER.

    So is it the racecar or the driver? :)

  • @Leeg - Do you back up your edited engines before each update and then restore them after the update?
  • edited March 2013
    @LeeG keep your insults to yourself, or post some LpM screenshots - that's more helpful.

    Quote Ozz"
    posting a comment to a blog is for sure faster than registering to a
    social network and post after account got verified. the less steps to
    take, the faster the submission. but i don't know how much faster it is
    overall. "

    That's what I mean. And I don't insult people; I help them and they help me back.
  • AlexRAlexR Cape Town
    @LeeG - "A good tool for checking the results a footprint can return is gscraper pro" - just want to say a big thanks. This was on my todo tomorrow and was going to see if I could get a coder to automate something like this...now there is no need. Cheers!  :-)
  • AlexRAlexR Cape Town
    @LeeG - can you offer your thoughts on:

    Would be much appreciated. 
  • edited March 2013
    Can someone help a noob find this "tool by sant0s", so that I can extract footprints and then feed them into GScraper Pro to keep only the high-yield footprints?
  • Posting a comment to a blog is for sure faster than registering on a social network and posting after the account gets verified. The fewer steps to take, the faster the submission. But I don't know how much faster it is overall.
  • I got that from blackhatworld, uploaded here:

    http://rapidupload.net/f/5136
  • s4nt0ss4nt0s Houston, Texas
    edited March 2013
    @king818 - Grab it here: http://www.mediafire.com/?0ca7er18azzlw67

    How to use it:

    1) Click the search button and load the directory with the .ini files (usually located at C:\Program Files (x86)\GSA Search Engine Ranker\Engines).
    2) Check which engines/.ini files you want to extract footprints from.
    3) Click "Extract Search Terms"
    4) Click "Save" and choose where to save the .txt

    *extra option* Check the Append %KW% box if you want %KW% to be added at the end of every footprint. Easier for merging keywords with Scrapebox.

    ----------------------------------------------------

    Keep in mind I had this created before you could just pull the footprints right out of the advanced options tab lol. (A rough script version of the same extraction is sketched below.)

    *edit* - Thanks alex.
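
    For anyone who just wants the gist of what the tool does, here is a minimal sketch. It assumes the footprints sit on "search term=" lines separated by "|" inside the engine .ini files - check a couple of the files yourself, the exact key names are not guaranteed:

        # Walk the Engines folder, pull search-term footprints from each .ini,
        # and optionally append %KW% for keyword merging.
        import os

        ENGINE_DIR = r"C:\Program Files (x86)\GSA Search Engine Ranker\Engines"

        def extract_footprints(engine_dir=ENGINE_DIR, append_kw=False):
            footprints = set()
            for name in os.listdir(engine_dir):
                if not name.lower().endswith(".ini"):
                    continue
                path = os.path.join(engine_dir, name)
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for line in f:
                        key, _, value = line.partition("=")
                        if key.strip().lower().startswith("search term"):
                            for fp in value.split("|"):
                                fp = fp.strip()
                                if fp:
                                    footprints.add(fp + " %KW%" if append_kw else fp)
            return sorted(footprints)

        if __name__ == "__main__":
            with open("footprints.txt", "w", encoding="utf-8") as out:
                out.write("\n".join(extract_footprints(append_kw=True)))
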
  • LeeGLeeG Eating your first bourne

    thisisalex, I'm not insulting you, I just want to know how you have come to the conclusions about the drop in LpM etc. that you claim to know about

    You're constantly giving advice at present that's way off the mark

     

    The footprint extractor - I can't find the link to the thread on BHW, but this is the download link for it

    In the thread by s4nt0s on BHW, he gives precise info on how to use the tool in conjunction with Scrapebox

    http://www.mediafire.com/?0ca7er18azzlw67

    I have a backup of my edited files, and just paste them back after each SER update

    A better method would be to simply rename the edited files. That way they are not overwritten on each update, and if any engines are updated, you can easily see the new ones by date (a rough backup/restore sketch is at the end of this post)

    Some engines I'm dropping down to one single footprint - BlogSpot and BlogSpot.es, for example. There's no point going after footprints that return thousands of results when you can be pulling in a lot more results and reducing the "already parsed" messages

    And AlexR, I know one of the admins over there; I already told him about you, mate. He asked if I knew a good online retailer of valium :D (joke)
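
    A rough sketch of that backup/restore routine (the paths and the list of edited .ini names are made up for illustration - point them at your own install and the files you actually edited):

        # Back up edited engine files and restore them after SER overwrites them on update.
        import shutil
        from pathlib import Path

        ENGINES = Path(r"C:\Program Files (x86)\GSA Search Engine Ranker\Engines")
        BACKUP = Path(r"C:\SER_engine_backup")
        EDITED = ["Article-Wordpress.ini", "Blog Comment-General Blogs.ini"]  # hypothetical names

        def backup():
            BACKUP.mkdir(exist_ok=True)
            for name in EDITED:
                shutil.copy2(ENGINES / name, BACKUP / name)

        def restore():
            # run after SER updates itself and overwrites the engine files
            for name in EDITED:
                shutil.copy2(BACKUP / name, ENGINES / name)
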

     

     

  • edited March 2013
    Thanks Guys. Let me see what I can manage
  • ronron SERLists.com
    Also, there is a sh*t ton of footprints at BHW in the sticky if you want them. 
  • LeeGLeeG Eating your first bourne

    That's where I got most of my extra footprints from

    And a few that have been shared on here

    GScraper Pro has a built-in feature that shows you the number of results a footprint returns

    That's when GScraper is working and not under a DDoS attack like today

    So you can see the better ones to use - the ones that return a lot of results - and in turn reduce the number of calls to the search engines, thus reducing slaps on your proxies for making too many searches too quickly
