Someone PMed me asking which Scrapebox addon I used above. For the benefit of others, I'm pasting my reply here:
Scrapebox Google Competition Finder - allows you to check the number of pages indexed in Google for a list of keywords, to help perform competition analysis.
If your footprint shows very few pages indexed in Google, remove it...
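To illustrate that filtering step: here is a minimal sketch of culling low-yield footprints once you have the indexed-page counts exported from the Competition Finder. The footprints and counts below are hypothetical, and the threshold is a judgment call, not an addon setting.

```python
# Hypothetical export from the Competition Finder: footprint -> pages indexed.
footprint_counts = {
    '"powered by wordpress" inurl:blog': 4_200_000,
    '"powered by obscure-cms"': 312,                    # very few results: drop it
    'intext:"leave a comment" inurl:guestbook': 88_000,
}

MIN_INDEXED = 10_000  # pick a cutoff that suits your niche

# Keep only footprints with enough indexed pages to be worth scraping.
keepers = [fp for fp, count in footprint_counts.items() if count >= MIN_INDEXED]
```

Anything below the cutoff gets removed from your footprint file before you scrape.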
gsanewb29
Hi all,
Sorry to chip in late on this one.
I scrape in SB and leave it running overnight, so I end up with a list of, say, 100k URLs. I then delete duplicates (but do not trim to root).
I then go to GSA and upload the list with "identify platforms".
Am I doing this all wrong? Sorry to ask; the list bit is a little unclear to me.
Should I be trimming to root in SB?
Should I not be uploading all these links to a global list to use across all tiers/projects?
Olve1954
>Should I be trimming to root in SB?
No need. SER will know what to do with the raw URLs.
>Should I not be uploading all these links to a global list to use across all tiers/projects?
If you're using "identify and sort in", you're uploading to the global list.
the_other_dude USA
edited August 2014
If you think about it, some of the platforms live in a subdirectory. If you trim to root, SER won't know the CMS is there, since it was installed in a subdirectory. Trimming to root can make you miss targets.
gsanewb29
@olve1954 Thanks, that helps clear things up. To confirm: building up the global list is what I want to be doing? And choosing to run a set list on a tier would 'turbo' boost that specific tier?