
best method for importing multiple url domains (comments etc)

Curious how you guys are doing it for max results/efficiency

Do you bulk-upload your entire scrape, de-duplicated only at the URL level, and let GSA try posting comments to all the inner URLs? Or do you de-duplicate at the domain level, import that, let GSA run through it, then take the verified comment list, do a site: search to grab all the inner URLs of the working comment domains, and upload those?
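For the second approach, here is a rough sketch of the domain-level de-dup plus the site: query generation. This isn't anything GSA does itself, just a plain Python helper; the file names (scrape.txt, verified_comments.txt) are placeholders for your own lists.

```python
from urllib.parse import urlparse

def load_urls(path):
    """Read one URL per line, skipping blank lines."""
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]

def dedupe_by_domain(urls):
    """Keep the first URL seen for each host (www. stripped)."""
    seen, kept = set(), []
    for url in urls:
        host = urlparse(url).netloc.lower().removeprefix("www.")
        if host and host not in seen:
            seen.add(host)
            kept.append(url)
    return kept

def site_queries(verified_urls):
    """Turn verified comment URLs into site: queries for re-scraping inner pages."""
    hosts = {urlparse(u).netloc.lower().removeprefix("www.") for u in verified_urls}
    return [f"site:{h}" for h in sorted(hosts) if h]

if __name__ == "__main__":
    # scrape.txt / verified_comments.txt are placeholder file names
    deduped = dedupe_by_domain(load_urls("scrape.txt"))
    with open("deduped_domains.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(deduped))

    queries = site_queries(load_urls("verified_comments.txt"))
    with open("site_queries.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(queries))
```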

If a particular page or root domain does not have comments enabled, will GSA spider other URLs on the same domain (once it's identified as a comment-based platform) to find a URL with comments enabled, or will it just skip it?

Thoughts?