Don't just use global site lists!

I was just using global site lists, and I was getting 30 LPM across all engines using only the verified global list. Terrible, right? (With 150 threads and 10 proxies!)
I went to Options -> Advanced -> Tools -> Export site lists -> Verified.
I then imported that verified .sl list into one project with no limits.
[screenshot]
If I went any higher thread-wise, my PC would probably shoot off like a fucking NASA rocket to join the Mars rover.
So what I learned from this: global lists suck, verified lists rule.

Comments

  • Tim89 www.expressindexer.solutions
    This is fairly normal.

    From what I understand, when using the verified global site list, the project obviously picks its targets from the verified file, but it does not work its way through the list in a linear fashion; it picks targets at random (see the toy simulation at the end of this comment). This may contribute to the 'already parsed' messages and the reduced LPM compared to directly importing targets.

    When a list is directly imported into a project, that project runs through the entire list from start to finish; it is more reliable in that respect.

    Selecting targets from a site list by ticking a checkbox is more automated. I'm someone that uses SER to the maximum, normally running around 200-250 projects on each of my installs with a massive verified site list. When I try to import my verified list directly into over 200 projects, SER simply freezes and crashes due to the sheer number of targets being imported into each and every project.

    If you're using SER casually, without so many projects, and you don't mind loading up your targets manually, then yes, directly importing your site list into the project is a lot faster.
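
    A quick toy simulation of the random-picking point above (a Python sketch, not SER's actual code): picking with replacement keeps revisiting targets, which is exactly what shows up as 'already parsed' noise.

        import random

        # 1,000 targets, 1,000 picks: random picking revisits targets,
        # while a linear run through an imported list covers each one once.
        targets = [f"http://site{i}.example" for i in range(1000)]

        random_picks = [random.choice(targets) for _ in range(1000)]
        print("unique via random picking:", len(set(random_picks)))  # ~632 on average
        print("unique via linear import:", len(set(targets)))        # 1000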
  • I don't know how you manage to get 166 VPM with 150 threads and 10 proxies.

    I'm running GSA SER on 200 threads with 20 private proxies and I'm still at 10 VPM.
  • Tim89 www.expressindexer.solutions
    If you clean your list and import it directly into a project (making sure there are no duplicates), you'll see your LPM/VPM increase.

    Importing targets directly means that SER doesn't need to load more targets between runs, and it also stops SER from picking the same targets twice and throwing 'already parsed' messages, which in turn slows down your link building.
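
    If you'd rather do the cleaning outside of SER, a minimal dedupe pass in Python might look something like this (the file names are just placeholders):

        # Dedupe a plain-text URL list before importing it into a project.
        def dedupe_urls(in_path: str, out_path: str) -> None:
            seen = set()
            with open(in_path, encoding="utf-8", errors="ignore") as src, \
                 open(out_path, "w", encoding="utf-8") as dst:
                for line in src:
                    url = line.strip()
                    if url and url not in seen:
                        seen.add(url)
                        dst.write(url + "\n")

        dedupe_urls("verified_export.txt", "verified_deduped.txt")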
  • Idk, but I was getting that VPM with only around 40 threads, as my CPU was shooting through the roof.

  • Cleaning = deduping and re-verifying, right?
  • Tim89 www.expressindexer.solutions
    Yes
  • Your objective should be to run some projects that "build" a global list on many platforms. Then you'll have a nice list, or lists, of 100k verifieds you can shoot at a churn and bake site in just a few hours. Remember to always remove the duplicate domains from your global verified list so you'll have a nice, IP-diverse domain list.
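
    If you'd rather do that domain-level dedupe outside of SER, a rough Python sketch (the paths are placeholders) would be:

        from urllib.parse import urlparse

        # Keep only the first URL seen per domain, for IP/domain diversity.
        def dedupe_domains(in_path: str, out_path: str) -> None:
            seen = set()
            with open(in_path, encoding="utf-8", errors="ignore") as src, \
                 open(out_path, "w", encoding="utf-8") as dst:
                for line in src:
                    url = line.strip()
                    domain = urlparse(url).netloc.lower()
                    if domain and domain not in seen:
                        seen.add(domain)
                        dst.write(url + "\n")

        dedupe_domains("verified_deduped.txt", "one_per_domain.txt")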

    Matt
  • @Matt: I have one question related to your reply. Is there any way to save all the links scraped by GSA SER? I mean, say you use a project to "build" global lists with contextuals only. I've noticed that this happens very often: even though GSA SER scrapes for contextuals only, a good percentage of those scraped links are from other engines... So I was wondering if we can just save all the scraped links. :)
  • Under Options > Verified: make sure that box is checked and all your verified links will be saved there.

    Matt
  • Tim89 www.expressindexer.solutions
    edited January 2015
    Hello @Matt, you seem new here.

    The question wasn't about churn and burn; I think you got this thread confused with another.

    Why would there be a reason to build a global list across many platforms if you are only interested in the important ones?

    You should indeed have a way of constantly building your verified list, but I don't think wasting time gathering up all platforms is a good way of doing things.

    All anyone needs to do to process their raw links is this:

    Set up a new project and name it 'Processing'. Select the platforms you want to start building a verified list for, then add some content/emails etc., untick all site lists and the global site list box, hit OK and you're done.

    Every time you have completed a scrape, deduplicate the list, then simply go to your 'Processing' project, right-click, import the targets and let it run. Depending on how fast you want this done, pause other projects in the meantime.

    With one 'Processing' project receiving fresh targets regularly, all other projects can be set to verified lists only.
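
    To prep each scrape before feeding it to the 'Processing' project, a sketch along these lines would do (all paths are assumptions): merge the scrape files, dedupe them, and drop anything already in your verified list.

        import glob

        # Merge fresh scrape files, dedupe, and skip anything already
        # verified, so 'Processing' only receives genuinely new targets.
        def prepare_targets(scrape_glob: str, verified_path: str, out_path: str) -> None:
            with open(verified_path, encoding="utf-8", errors="ignore") as f:
                verified = {line.strip() for line in f if line.strip()}

            seen = set()
            with open(out_path, "w", encoding="utf-8") as dst:
                for path in glob.glob(scrape_glob):
                    with open(path, encoding="utf-8", errors="ignore") as src:
                        for line in src:
                            url = line.strip()
                            if url and url not in seen and url not in verified:
                                seen.add(url)
                                dst.write(url + "\n")

        prepare_targets("scrapes/*.txt", "verified_deduped.txt", "fresh_targets.txt")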

    Tim, high five Matt _o/
  • Churn and bake was just an example. You're exactly right. Maybe I didn't explain myself correctly, probably not, since I've been up about 22 hours working. What you described is exactly how I do things to build a list: I have a "processing" project that attempts to post; objective complete, target stored in verified.

    As for targets that are of value, that would be all of them except exploit, RSS, referrer, pingback and indexer; those are junk in my opinion. All the other targets have their place in structures. There are higher-value targets such as article, wiki, etc., and then there are junk links that are still of value for, say, tier 2 or 3, such as blog comments, guest posts, trackbacks, etc.

    So what I do is build those target lists, and then I can decide what goes where within my skeleton depending on what type of project I'm running, e.g. churn and bake or long-term tiered sites. Yes, you are correct in your thinking, and you explained it better than I did the first time.
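
    As a rough illustration of that sorting, using only the engine groups named above (the mapping is an example, not a definitive list):

        # Example tier mapping based on the groupings described above.
        TIER_MAP = {
            "skip":    {"exploit", "rss", "referrer", "pingback", "indexer"},
            "tier1":   {"article", "wiki"},
            "tier2_3": {"blog comment", "guest post", "trackback"},
        }

        def tier_for(engine: str) -> str:
            name = engine.lower()
            for tier, engines in TIER_MAP.items():
                if name in engines:
                    return tier
            return "tier2_3"  # unknown engines: treat as lower-tier by default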

    Matt