Memory Issue

edited November 2012 in Bugs
Before the last 2 updates, GSA started at and stayed around 120 MB of memory. I was always super impressed by that.

After the updates, it now starts at approximately 400 MB of memory. Then, after 24 hours, it jumps up to about 900 MB. Very disappointing...

*The only change I made was adding the 450,000-keyword list floating around to all my campaigns. No new campaigns were added.

So, is there a memory bug? Or do the additional keywords affect the memory usage?

Comments

  • AlexR Cape Town
    @VbKing - where did you find that keyword list floating about? Would be super useful to share. 
  • Sven www.GSA-Online.de
    450,000 more keywords... as a project file is loaded into memory, you can imagine that a lot more memory is used. I don't find that many keywords useful at all.
  • edited November 2012
    Well, then PLEASE build an option where keywords are not necessary. I've been asking for this for a while on BHW.

    I continually find campaigns that slow down to nothing... and it is quite tedious to come up with unnecessary keywords when I'm only concerned with contextual relevance. I just don't see many people getting successful keyword relevance on anything other than blog comments.

    Here is the link that s4nt0s posted:
    http://www.blackhatworld.com/blackhat-seo/buy-sell-trade/447629-gsa-search-engine-ranker-extremely-powerful-linkbuilding-software-free-5-day-trial-90.html#post4878225
  • Or please strategically come up with a list of around 10,000 keywords that would probably encompass all websites, such as "privacy", "contact", "link", "blogroll", "us", "home", etc.

    Thoughts?
  • Not sure if this is relevant or not. I'm going to assume that you let SER search for sites to post on, correct? If so, I've never once had a problem with campaigns stopping or slowing down or anything of the sort.

    Are you checking the option to get new keywords from scraped sites, and then using those new keywords to find more targets?

    I have always used those 2 options, and I only import around 100 keywords to start; then GSA will just keep importing new keywords on its own without me ever doing anything. I've had campaigns running nonstop for 3 months now and they have never slowed down or anything.
  • Well, I do not use that option. I guess I should.

    I still think my point is valid.
  • While memory can be an issue -- I also believe that HD access times are important.

    I am running SSDs on all my machines -- and they tend to handle more load than the usual machine -- and these are NOT power machines by any stretch of the imagination (2009 boxes).
  • Well, that's discouraging. So, I will need to go in and remove the 450K keyword list then.
  • Either that -- or get a super powerful box with like 24 GB of RAM so you can keep it going.

    But I notice that with a fraction of that keyword list size, my machine slows down even with an SSD in it.

    So I reboot -- which reclaims my memory -- and restart GSA.

    May I ask -- what kind of system are you running GSA on?  Motherboard/RAM/etc...?

    And also -- where exactly are you adding this 450K keyword list in GSA?
  • edited November 2012
    I have a link to the list above. s4nt0s posted it on BHW.

    The speed of my system is not relevant because I really like how it only ran at around 120 MB. It was super disappointing to see the memory quadruple initially and then go up to 1 GB.

    I really think it is the most amazing piece of software in our industry... so I can't fault them for this. It is a LOT of keywords.

    I just wish they would build an option to ignore keywords and just search for sites using the standard platform footprint...

    **I went ahead and put together a list of the most common keywords found on the web with the most common sight words mixed in:
    http://pastebin.com/tQgef0rp

    With this and these two options checked, you will never run out of sites to post to:
    "Collect keywords from target sites"
    "Use collected keywords to find new target sites"
  • Okay -- nuf said -- system irrelevant.

    And yes -- this truly is an amazing piece of software for sure.

    And thanks for sharing your MOST common keywords list.

    I totally agree that it's a LOT of keywords -- and while the more the merrier -- unfortunately, it's mighty large!  :-D

    Hmmmmm -- now I'm thinking.

    Is there a possibility to have the keyword list as an EXTERNAL .dat file that keywords are grabbed from, so that memory is not used for storing the keywords?

    Just a thought
  • Sven www.GSA-Online.de

    the keywords can be used from an external file if you use e.g.

    %spinfile-<keywordfile>%

    in the keyword field of your project.
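    For example, with a file named keywords.dat next to the project (the filename here is just an illustration):

    %spinfile-keywords.dat%

    SER then reads the keywords from that file instead of storing the whole list inside the project.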

  • if you get creative you could also spin your keywords.
  • AlexR Cape Town
    @Ozz - just to check I'm following you.... 

    You mean like have a file with 
    {big|great|top|best|worst} Keyword 1
    {big|great|top|best|worst} Keyword 2
    {big|great|top|best|worst} Keyword 3

    This would be super neat. Thanks for the share. You are always full of great ideas. 
  • Ozz
    edited November 2012
    {big|great|top|best|worst} {keyword 1|keyword 2| keyword 3}

    you don't need a file for that. just put the spins in the keyword field, separated by ","
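    If you want to see what a flat spin like that expands to, here's a minimal Python sketch (nothing GSA-specific, and it doesn't handle nested spins):

    import itertools
    import re

    def expand_spintax(text):
        # split the template into fixed text and {...} option groups
        parts = re.split(r"(\{[^{}]*\})", text)
        options = [p[1:-1].split("|") if p.startswith("{") else [p] for p in parts]
        # the cartesian product of all groups yields every variant
        return ["".join(combo) for combo in itertools.product(*options)]

    for variant in expand_spintax("{big|great|top} {keyword 1|keyword 2|keyword 3}"):
        print(variant)
    # -> big keyword 1, big keyword 2, ..., top keyword 3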
  • AlexR Cape Town
    Ozz for captain! :-)
  • I'm not using GSA to scrape right now, but couldn't you just break your huge keyword list into a bunch of smaller files, say 10k each, and then spin the %spinfile-<keywordfile>% token like {%spinfile-<keywordfile>%|%spinfile-<keywordfile2>%|and so on...}? I would think that would reduce the memory load, as it's only ever reading from a single, relatively small file, but I have not tested this as I'm scraping in SB and Hrefer.
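    In case anyone wants to try it, here's a minimal Python sketch of the splitting step (filenames are just examples; it assumes the big list has one keyword per line):

    CHUNK_SIZE = 10_000           # keywords per output file
    SOURCE = "keywords_450k.txt"  # one keyword per line (example name)

    with open(SOURCE, encoding="utf-8") as f:
        keywords = [line.strip() for line in f if line.strip()]

    names = []
    for i in range(0, len(keywords), CHUNK_SIZE):
        name = f"keywords_{i // CHUNK_SIZE + 1}.txt"
        with open(name, "w", encoding="utf-8") as out:
            out.write("\n".join(keywords[i:i + CHUNK_SIZE]))
        names.append(name)

    # build the spun token for the keyword field:
    # {%spinfile-keywords_1.txt%|%spinfile-keywords_2.txt%|...}
    print("{" + "|".join(f"%spinfile-{n}%" for n in names) + "}")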