Sitelist usage (quick question)
I scraped 4k URLs with Scrapebox and imported them into GSA. Interestingly, I only scraped for contextual footprints (article, social, web 2.0, wiki), but after importing I also have lots of non-contextual platforms in there.
My question: if I set up a project to post to this sitelist, but I only want contextual links (i.e. on the left-hand side I tick only web 2.0, wiki, social and article under "where to submit"), will GSA only post to those platforms from the sitelist?
Or does it ignore that selection, and "sitelist" simply means it will post to absolutely every engine on that list?
If "no" to the above questions, will I additionally have to check the according "type of backlinks to create" boxes (i.e. only contextual ones)?
Thanks guys... I feel pretty overwhelmed at the moment, despite dozens of hours of testing already... really appreciate your support.
Comments
One more thing: I'm not really sure how to handle duplicates.
For example, when I scrape a new list of sites with SB every day and import it into GSA, the chance of dupes between the SB list and what's already in GSA rises with every scrape.
I'm asking mainly because my "verified" list isn't a single file (which I could simply compare against the SB file to remove dupes from it); it's dozens of files, one per platform/link type.
How do you handle that?
Or does GSA recognize dupes automatically upon import and just ignore them?
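To make the question concrete, this is roughly what I imagine doing outside GSA before importing: a minimal sketch that dedupes a fresh Scrapebox export against every per-platform file in a sitelist folder. The folder path, export filename and one-URL-per-line file format are assumptions on my part, not documented GSA behaviour.

```python
# Hypothetical sketch: filter a Scrapebox export against all per-platform
# sitelist files. Paths and the one-URL-per-line format are assumptions.
from pathlib import Path
from urllib.parse import urlsplit

SITELIST_DIR = Path(r"C:\GSA\site_list-verified")     # assumed folder of per-engine .txt files
SCRAPEBOX_EXPORT = Path(r"C:\scrapes\harvested.txt")  # assumed SB export, one URL per line

def domain(url: str) -> str:
    """Normalise to the host so http/https and trailing paths don't hide dupes."""
    host = urlsplit(url.strip()).netloc.lower()
    return host[4:] if host.startswith("www.") else host

# Collect every domain already present across all per-platform files.
known = set()
for f in SITELIST_DIR.glob("*.txt"):
    for line in f.read_text(errors="ignore").splitlines():
        if line.strip():
            known.add(domain(line))

# Keep only Scrapebox URLs whose domain is not already in the sitelist.
fresh = [u for u in SCRAPEBOX_EXPORT.read_text(errors="ignore").splitlines()
         if u.strip() and domain(u) not in known]

Path("scrapebox_deduped.txt").write_text("\n".join(fresh))
print(f"{len(fresh)} new URLs kept for import into GSA")
```

Comparing on the domain rather than the full URL is a deliberate choice in this sketch; if you want to keep multiple URLs per site, drop the domain() normalisation and compare the raw lines instead.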
1. If I have a global site list of "failed" and then set up a dummy project to run through these URLs and filter out the good ones... will those URLs be removed from "failed" and moved to "verified" automatically?
Or will they all stay in "failed" (but also get added to "verified")?
This might seem obvious to some, but it looks like I've already posted to some of these URLs, yet the count in my "failed" list is still the same.
2. I'm seeing odd stuff in my log since adding the global site list, things like:
Loaded 1/67 URLs from site lists
Loaded 1/67 URLs from site lists
Loaded 1/67 URLs from site lists
Loaded 1/67 URLs from site lists
Loaded 4/134 URLs from site lists
Loaded 1/67 URLs from site lists
Loaded 1/67 URLs from site lists
But that site list currently has 3k+ URLs on it (and in this project's settings, those are all engines I want to post to).
Any ideas?