
Improve LPM by tweaking the footprints in the engines

bman United States
edited March 2015 in Need Help
Hi all,

I am relatively new to GSA, but am beginning to get to grips with it, thanks to some knowledgeable folks on here (LeeG, Ron, Ozz).
I have a VPS with 3xCPU and 3GB RAM from Powerup.
I have 10x dedi proxies and 20x semi dedi proxies from buyproxies.org.
I also have GSA Captcha Breaker and ScrapeBox set up.

I first bought a list from SERLists, which helped bring my LPM up to a nice 70, whilst running 350 threads.
Obviously, once I had exhausted the list, it dropped down considerably.

I have read about users who were able to get high LPMs by tweaking their system. This is what I gleaned as the main points to achieve this:
  1. Removing the engines that perform badly
  2. Improving the footprints in those engines
  3. Selecting Google INT and 5 x random Google engines
I have removed the engines that don't get many results and selected the Google engines, which has helped, but I am not sure how to handle point 2.
How can I tell which footprints don't perform well?
How can I tell which ones are not submitting?

I obviously want to avoid GSA wasting time searching against footprints that don't yield the results that I am after, but I am not sure where to start.
Any pointers?

Answers

  • How do you know which engines give you results? Did you use one engine per project?
  • bman United States
    I'm only running one project to test.
    I checked the verified stats to see which ones were giving me verified links.
  • @bman - In the quest for the holy grail of high LPM, the advice is to remove engines that perform badly, as you've done, BUT you need to occasionally run a test that uses ALL engines just to make sure that sneaky old @sven hasn't updated the engines... I had Joomla K2 removed, and now something's changed and it's working great.
  • bman United States
    @filescape - Thanks for the tip. I will keep this in mind in the future.
    I only limited the engine selection recently, so I should be good for a while.
  • bman United States
    @IdentifiedURLs Thanks for the link.
    I have read the post and like your recommendations. It does prompt a couple of questions.
    As you don't use GSA SER to find URLs, how do you set up ScrapeBox to find them?

    In my mind, I would want to export all of the footprints for the engines I have selected after the cleaning task.
    Then merge that with my list of keywords and scrape.
    Would that make sense?
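    The merge step described above (every footprint paired with every keyword to form one scrape query each) can be sketched in a few lines. This is a minimal sketch, not anything GSA SER or ScrapeBox ships with; the footprint and keyword lists here are placeholder examples standing in for the exported footprint file and your keyword file:

    ```python
    import itertools

    def build_queries(footprints, keywords):
        """Pair every footprint with every keyword to form one scrape query each."""
        return [f'{fp} "{kw}"' for fp, kw in itertools.product(footprints, keywords)]

    # Placeholder examples; in practice these lists would be read from the
    # file exported by the footprint editor and from your keyword file.
    footprints = ['"Powered by WordPress"', 'inurl:guestbook']
    keywords = ["fitness", "gardening"]

    for query in build_queries(footprints, keywords):
        print(query)
    ```

    The resulting query list can then be saved to a text file and imported into ScrapeBox as the list to scrape against. Note that the list grows multiplicatively (footprints x keywords), so a large keyword list quickly produces a very large scrape job.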
  • bman United States
    Awesome. Thanks for the info. I am going to give it a go. 

    Any tips on how to export the footprints from GSA?
    I'm using SER Footprint Editor at the moment to do it.
  • bman United States

    I just made it to triple digits!
    Before this, I was only getting 50-70 at most.

    That was with a single project, using a list I scraped earlier with ScrapeBox.
    It isn't a massive list, so the project will run out of steam soon, but it proves that the theory is sound.

    Thanks @IdentifiedURLs for the tips.
  • @bman, thanks for the shout out!

    Glad to see that some people take my advice seriously here on the forum and can prove its worth to others.

    The next thing you need to improve is your VPMs.
  • bman United States
    I set my verification interval to 1440 minutes, so the VPMs won't catch up till tomorrow.