
What's the most GSA SER can do?

I'm pretty disappointed with the number of backlinks SER is building. I know some of the factors that cause it. What I'd like to know is: what is the best machine for GSA SER to do fast backlinking, and how many projects can it run at the same time?

I need to run 15 projects at the same time, but GSA tells me I'm running out of RAM or disk space or something. What machine specs do I need to run GSA smoothly with a couple of projects running simultaneously?

thanks

Comments

  • I was going to say it's your settings, but if you're running out of memory with 15 projects then you need a better machine to run it on :D I used to get that on a 2 GB VPS with 100 projects or so, but since moving to 4 GB I haven't seen that message.
  • @PeterParker, since you're around, can you tell us which VPS you're using right now?
  • Use a VPS (SolidSEOVPS Geek or Wizard plan), then if you still have problems run the scheduler (swap 20 projects every 20 minutes, or something similar).

    If you still want more, get a better VPS or a dedicated server (check that SEO tools are allowed first), or a better machine and a faster home connection (check with your ISP for data transfer/bandwidth limits).

    I have 400 projects running with the scheduler. I need to buy another license (or two) and set up another VPS, but clients are slow at paying this month for some reason... oh well, their projects are getting stopped; any more days late and I will start deleting links.
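To get a rough sense of how that scheduler rotation works out, here is a quick back-of-the-envelope sketch using the figures from the post above (400 projects, 20 swapped every 20 minutes); the variable names are just illustrative:

```python
# Rough arithmetic for SER's scheduler rotation, using the numbers
# mentioned in this thread. Not an official formula, just a sanity check.
total_projects = 400
batch_size = 20        # projects active at once
swap_minutes = 20      # scheduler swap interval

batches = -(-total_projects // batch_size)   # ceiling division
cycle_minutes = batches * swap_minutes       # time until every project has run
print(f"{batches} batches, full cycle every {cycle_minutes} min "
      f"(~{cycle_minutes / 60:.1f} h)")
# → 20 batches, full cycle every 400 min (~6.7 h)
```

So with those settings, every project gets a 20-minute slot roughly every 6.7 hours.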
  • That's pretty cosy, Peter; you're probably making good money out of GSA SER. I haven't seen much climbing since I bought GSA SER, even with 9k links on low-competition keywords (based on the top 10 Google results), so I'm very sceptical about investing in add-ons (a VPS) for this tool.
  • By the way, does anyone know about the keyword list everybody is talking about? They say they import a 100k keyword list into GSA... where do you do that? How do you do that?

    thanks
  • Find one online or scrape your own. Import the keywords into the keyword box in Project > Options. Make sure you don't use those keywords as your anchors; fill in the anchors with your chosen keywords.


  • I am in the same boat, James: slow climbs and only a couple of keywords just scraping the front page at positions 9 and 10. For the record, the VPS upgrade didn't actually increase the speed of GSA, just so you know; it only stopped the memory errors.
  • gooner SERLists.com
    edited November 2013
    @peterparker - I was having problems too. As of a week or two ago I'm not scraping with SER at all. It was killing my proxies and producing bad results all the time.

    So I changed to scraping with Scrapebox and only posting with SER, and things are much better. You could try that?
  • Nice info. So you finally took the leap, eh :D? Is it a lot more hassle? I have a bit more time freed up now that I've got everything set up, so now would actually be a good time to look into that. If I'm going to do it, I'll do it in style and use Hrefer, the daddy of scrapers :)
  • gooner SERLists.com
    @peterparker - Nah, it's not much more hassle. I'm mostly using global lists as targets, since I've got millions of verified URLs, and then using Scrapebox to find new URLs and import them into a few projects to keep the global list growing.

    Add to that "find urls from verified" (or whatever that option is called) and SER is flying.

    I've found a couple of different methods to use with SB, so I'm scraping more than I will ever need.

    Happy days :D
  • Good stuff. I'll give it a punt soon then.
  • gooner SERLists.com
    Cool mate. The other plus side is it's a lot more consistent - I was getting tired of 100k verified one day and then 20k the next. This way you can be sure you'll get a steady amount every day.
  • Brandon Reputation Management Pro
    I never scrape with SER; there are more efficient programs for scraping, and I want all of my SER resources to be used for posting. Get into a public proxy program that gives you new proxies every day, use those with Scrapebox, and then import the results into SER.
  • @Gooner, would you mind sharing how you scrape URLs for SER using SB? I really want to know this, and I don't know which footprints to scrape with. Also, won't you end up with a list of general sites this way, which kills relevancy to the search term?
  • gooner SERLists.com
    edited November 2013
    @jamesmurren - I don't pay attention to the relevancy of the search term; I only use contextual links to the money site, so it doesn't really matter.

    There are two methods I use. The first is to get more links from already-verified URLs; see here:

    http://www.blackhatworld.com/blackhat-seo/black-hat-seo-tools/605958-tut-how-easily-build-huge-sites-lists-gsa-ser.html

    The second is the method @2take2 wrote up on this forum, for getting URLs from keywords, but I can't find the thread right now.

    Basically it involves taking the footprints from SER, pasting them into an Excel file, adding the keyword macro, pasting the result into a text file, then adding keywords in SB and merging those with the footprints text file.

    @2take2 - Can you provide a link to that post? It was a step-by-step guide which I'm sure will help James a lot.


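The footprint/keyword merge described above can be sketched in a few lines of Python. This is only an illustration of the idea: the footprint strings and the `%KW%` macro name are placeholders I've made up, not SER's actual footprint file or macro syntax:

```python
# Hypothetical sketch of merging SER-style footprints with a keyword list
# to build Scrapebox search queries. Footprints and the %KW% macro name
# are illustrative placeholders, not taken from SER itself.
footprints = [
    'Powered by Pligg %KW%',
    'inurl:"story.php?title=" %KW%',
]
keywords = ["weight loss", "dog training"]

queries = [
    fp.replace("%KW%", f'"{kw}"')   # quote the keyword for an exact-match query
    for fp in footprints
    for kw in keywords
]
for q in queries:
    print(q)
# e.g. first line: Powered by Pligg "weight loss"
```

The output (footprints × keywords) is what you would save to a text file and load into Scrapebox's harvester.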
  • 2Take2 UK
    edited November 2013
    Hi @Gooner, I just dropped by to post the link to the thread, but it looks like you've found it already - glad you liked it.

    It sounds like you're really hitting the scraping hard. Are you running Scrapebox on its own dedicated box?


  • gooner SERLists.com
    Hey @2take2 - it's working out nicely right now. At the moment I'm running SB off a spare desktop I have lying around.

    I'm running two instances of SB: one for your method and one for the link-extraction method. With a total of 150 threads and 100 private proxies, I'm scraping a million links in around three hours with your method, and a good few with the other method too, with no noticeable slowdown on the connection.

    Very happy! Thanks for posting that method.
  • @gooner - that thread is awesome. That can be added to next week's things to do list ;) 
  • Cool, thanks everyone.
  • Why do I have 94.00 karma?
  • @gooner, you said: "Add to that 'find urls from verified' (or whatever that option is called) and SER is flying." Are you talking about the "Use URLs linking on same verified URLs" option found under the "Options" tab in a project's menu? I never figured out what that was for, so I always leave it unticked.
  • @2Take2 or @gooner, can you confirm my understanding of importing a keyword list?

    When creating a project, the "keywords" field is where I input the 100k keywords, right? And I should make sure not to use those keywords as anchor text?

    Thanks in advance; I'm really a noob at this kind of stuff.
  • Tim89 www.expressindexer.solutions
    edited November 2013
    I personally think it's a GSA issue. My machine was more than capable of running 800 threads with around 140 projects; however, now I'm getting the 'out of memory' warning running 150 threads and roughly 25 projects!
  • gooner SERLists.com
    edited November 2013
    @jamesmurren - Yes, you're right.
    @jampackedpsam - Yes, that's the option I meant. As far as I know, it works like the Scrapebox method above for extracting links from URLs.
  • gooner SERLists.com
    You're welcome.
  • @gooner, about @2take2's strategy of scraping URLs based on keywords... do you split the keywords? I scraped 15k keywords, and when I load them into SB, SB crashes...
  • gooner SERLists.com
    I use just enough keywords to get at least 1 million links. That will depend on how many footprints you are using; for me it's about 500 keywords.

    But I think @2take2 loads up as many keywords as SB will take without crashing; maybe he will confirm for you.
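If Scrapebox chokes on one big keyword file, splitting it into smaller files before loading is one workaround. Here is a minimal sketch; the chunk size and output file names are illustrative, not anything Scrapebox requires:

```python
# Split a large keyword list into smaller files so Scrapebox can load
# them one at a time. Chunk size and file names are illustrative.
def chunk(items, size):
    """Yield successive slices of `items` of length `size` (last may be shorter)."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Stand-in for a scraped keyword list; in practice, read your own file.
keywords = [f"keyword {n}" for n in range(1500)]

for n, batch in enumerate(chunk(keywords, 500), start=1):
    with open(f"keywords_part{n}.txt", "w") as f:
        f.write("\n".join(batch))
```

Each `keywords_partN.txt` can then be loaded into SB separately (or into separate SB instances).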
  • @gooner, it doesn't make sense. I split my scraped keywords into batches of 500, loaded them into SB and merged them with the footprints, so the queries look like this:

    Powered by Pligg "this is my  keyword here"
    What Is Pligg? "this is my specific keyword here"
    intitle:"Pligg Beta 9" "this is my specific keyword here"
    http://www.pligg.com "this is my specific keyword here"
    Published News "Upcoming News" "this is my specific keyword here"
    Submit a New Story "Powered by Pligg" "this is my specific keyword here"
    inurl:"story.php?title=" "this is my specific keyword here"

    And when I start harvesting, it's not harvesting at all! No URLs harvested...