I was reading some threads and posts here and I keep asking myself: what threads are they talking about? I'm a total newbie with GSA, so I really do need some support! Thanks in advance.
Several hundred projects is no problem. But if you use SER for scraping, then 500 threads will probably be too much for 50 proxies. Try 200; if you find the proxies get banned fast, you will need to lower the thread count.
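As a rough illustration of why 500 threads strains 50 proxies: assuming threads spread roughly evenly across proxies (a simplification, since SER doesn't expose the setting as a ratio), the load per proxy is just the division below.

```python
# Back-of-envelope arithmetic only; the even-spread assumption is illustrative,
# not how GSA SER actually schedules threads onto proxies.
proxies = 50

per_proxy_at_500 = 500 / proxies  # 10 threads hitting each proxy
per_proxy_at_200 = 200 / proxies  # 4 threads hitting each proxy

print(per_proxy_at_500, per_proxy_at_200)
```

Ten simultaneous queries per proxy is the sort of load that gets an IP temporarily banned by search engines much faster than four.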
@isagani - It's very hard to answer your questions because you are at the beginning and there is a massive gap in your knowledge. First, I suggest you read the FAQs and learn how to use the software; then you will be able to answer most of these things for yourself.
1 project can make 1000s of links. 1 project usually relates to one URL you wish to build links for, but 1 project can contain multiple URLs.
Scraping is searching for sites that you can post links to. SER looks online via the search engines to find sites you can post to, but you can also use Scrapebox or GScraper, for example, to scrape and then import the sites into SER.
Yes, you can just use SER. It's easier, until you want bigger and better things from SER; then you can start using Scrapebox or GScraper. Practice on some throwaway domains or someone else's sites to learn the how, the what, the who and the when of it all before letting loose on your own domains (or create Blogspots under different email accounts).
Each thread is like a single connection to the internet, so 200 threads = 200 connections.
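A minimal sketch of that idea (illustrative Python, not GSA SER code): each worker thread holds its own outbound connection while it works, so running N threads means up to N simultaneous connections. The counter below tracks how many "connections" are open at once.

```python
# Sketch: each thread = one concurrent connection. The peak counter shows
# how many simulated connections were open at the same time.
import threading

active = 0
peak = 0
lock = threading.Lock()

def worker():
    global active, peak
    with lock:
        active += 1              # "open a connection"
        peak = max(peak, active)
    # ... a real thread would perform its HTTP request here ...
    with lock:
        active -= 1              # "close the connection"

threads = [threading.Thread(target=worker) for _ in range(200)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(peak)  # never exceeds the 200-thread ceiling
```

The exact peak depends on scheduling, but it can never exceed the thread count, which is why the threads setting is effectively a cap on simultaneous connections.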
Does that help?
Maybe it's best you start with the FAQs: https://forum.gsa-online.de/discussion/155/inofficial-gsa-search-engine-ranker-faq/p1
Maybe 500 will be OK; if you have just a few projects, then maybe even more. You need to test for yourself.