Collecting URLs
Hi Sven,
Maybe a nice idea for an update.
I think I, and many others, could use this very well.
Why not allow creating a project that does not submit any content, but only collects URLs for a given keyword and saves them to a log file?
GSA SER already has all the tools to search for these URLs or domains, but when we create a project we have to fill in content, a destination website and anchor texts.
It would be great if there were an option to skip all of that and set up a project that only collects relevant, high-quality URLs to use for manual posting
(like the harvesting option in Scrapebox).
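To illustrate what I mean (this is not GSA SER code, just a rough Python sketch of the behaviour I'd like): combine footprints with keywords into search queries, collect the result URLs, deduplicate them and append them to a log file. The `search_engine()` callback and the footprints/keywords here are only placeholders.

```python
# Rough sketch of a "collect only" project (illustration, not GSA SER code).
from typing import Callable, Iterable


def build_queries(footprints: Iterable[str], keywords: Iterable[str]) -> list[str]:
    """Combine every footprint with every keyword into a search query."""
    return [f'{fp} "{kw}"' for fp in footprints for kw in keywords]


def harvest_urls(
    queries: Iterable[str],
    search_engine: Callable[[str], Iterable[str]],
    log_file: str = "harvested_urls.txt",
) -> set[str]:
    """Run each query, deduplicate the result URLs and append them to a log file."""
    found: set[str] = set()
    for query in queries:
        for url in search_engine(query):
            found.add(url)
    with open(log_file, "a", encoding="utf-8") as fh:
        for url in sorted(found):
            fh.write(url + "\n")
    return found


if __name__ == "__main__":
    # Dummy backend so the sketch runs without hitting a real search engine.
    def fake_engine(query: str) -> list[str]:
        return [f"http://example.com/?q={query.replace(' ', '+')}"]

    queries = build_queries(["inurl:guestbook", '"powered by wordpress"'], ["fishing"])
    print(harvest_urls(queries, fake_engine))
```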
Think about it.
Kind regards,
Comments
Hi Sibolo, I have played with this option several times, and it is great for finding sites by footprint. But how do you filter on PR, keywords, etc.?
Yes, you are right.
Anyway, maybe it is possible with GSA to find URLs with a certain PageRank, footprint, keyword, etc., but I love the harvesting tool in Scrapebox. For now I will stick with Scrapebox when it comes to searching for URLs on a given keyword.
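For the filtering step, what I do manually after harvesting looks roughly like this sketch: keep only URLs that mention one of my keywords and meet a minimum PageRank. The `pagerank_of()` lookup is just a stand-in for whatever PR checker you use.

```python
# Stand-in filter over a harvested URL list (illustration only, not a GSA SER feature).
from typing import Callable, Iterable


def filter_urls(
    urls: Iterable[str],
    keywords: Iterable[str],
    min_pr: int,
    pagerank_of: Callable[[str], int],
) -> list[str]:
    """Keep URLs that contain at least one keyword and meet the minimum PageRank."""
    kws = [kw.lower() for kw in keywords]
    return [
        url
        for url in urls
        if any(kw in url.lower() for kw in kws) and pagerank_of(url) >= min_pr
    ]


if __name__ == "__main__":
    harvested = ["http://fishingblog.com/guestbook", "http://random-site.net/page"]
    # Dummy PR lookup so the example runs standalone.
    print(filter_urls(harvested, ["fishing"], min_pr=3, pagerank_of=lambda u: 4))
```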