
A few important questions about GSA SER capabilities and working mechanism

Alright, my first questions are about how it behaves in the following scenarios:

1: I disable search engine usage and import the root URLs of 1M websites. Does GSA SER check any URLs other than those imported 1M URLs? Let's say some of them are blogs where blog comments can be made. Do I need to provide the URLs of those blog posts for GSA SER to be able to post comments?

2: How big a domain list text file can I import into GSA SER? Currently I have around 130M domains, and with my software I am checking which of them are alive; my rough estimate so far is about 40M. I am planning to import them all and let GSA SER process them. Do I need to split the file and import each part separately? How many websites can it handle in its repository? What limitations do I have?
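As an aside, the kind of liveness pre-check described here (the poster's own tool is not named) can be sketched with Python's standard library. The names `is_alive` and `filter_alive` are illustrative, and treating "answers an HTTP request" as "alive" is only one possible definition:

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.error import URLError
from urllib.request import Request, urlopen

def is_alive(domain, timeout=5):
    """Return True if the domain answers an HTTP HEAD request."""
    try:
        req = Request(f"http://{domain}", method="HEAD")
        urlopen(req, timeout=timeout)
        return True
    except (URLError, ValueError, OSError):
        return False

def filter_alive(domains, check=is_alive, workers=50):
    """Check domains concurrently and return only the live ones, preserving order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        flags = pool.map(check, domains)
        return [d for d, ok in zip(domains, flags) if ok]
```

For a list in the 130M range you would stream domains from disk in chunks rather than holding them all in one Python list, but the filtering idea is the same.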

3: Is there any option to make GSA crawl an added URL list to a certain depth? If there is, I would like GSA to crawl all URLs found on each given page (depth 1).
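If the tool itself cannot crawl to a given depth, a depth-1 link extraction can be done externally and the resulting URLs imported as a plain list. A minimal sketch using only Python's standard library (all names here are illustrative):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect the href target of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def depth_one_links(url, html=None):
    """Return every link found on `url` (depth 1). Pass `html` to skip the fetch."""
    if html is None:
        html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = LinkCollector(url)
    parser.feed(html)
    return parser.links
```

Running `depth_one_links` over each root URL and writing the collected links to a text file would produce an import list of the inner pages (e.g. blog posts) rather than just the roots.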

Ty very much

Comments

  • SvenSven www.GSA-Online.de

    1. Depends on the engine. For blog comments it is useful to give the direct URL; other engines like a Forum or Social Bookmark will accept anything. Basically, every engine that does not post on its own created page will need you to import the correct URL and not the root one.

    2. No need to split anything. SER will read the imported file in 1MB batches and process it.

    3. No
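The 1MB batching Sven describes can be illustrated in Python. This is only a sketch of the general streaming idea (read a fixed-size chunk, carry any cut-off last line over into the next chunk), not SER's actual implementation:

```python
def read_url_batches(path, batch_bytes=1024 * 1024):
    """Yield lists of URLs, reading roughly `batch_bytes` of the file at a time,
    so a multi-gigabyte list never has to fit in memory at once."""
    with open(path, "r", encoding="utf-8", errors="replace") as f:
        leftover = ""
        while True:
            chunk = f.read(batch_bytes)
            if not chunk:
                if leftover.strip():
                    yield [leftover.strip()]
                return
            chunk = leftover + chunk
            lines = chunk.split("\n")
            leftover = lines.pop()  # last line may be cut mid-URL
            batch = [ln.strip() for ln in lines if ln.strip()]
            if batch:
                yield batch
```

This is why splitting the file by hand is unnecessary: the consumer only ever holds one batch in memory, regardless of the total file size.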

  • @Sven can you give a list of which engines exactly need to be provided direct URLs?

    And for which engines root URLs would be sufficient?

    Thank you very much
  • SvenSven www.GSA-Online.de
    Mousing over an engine shows its description. Basically all Blog Comment and Image Comment engines, and some Guestbooks as well.