
Collecting URLs

edited October 2012 in Feature Requests

Hi Sven,

Maybe a nice idea for an update.

I think I, and many others, could make good
use of this.

Why not make it possible to create a project that doesn't submit
content, but only collects URLs for a given keyword and saves them to a log
file?

GSA SER already has all the tools to search for
these URLs or domains. But when we create a project, we have to fill in
content, a destination website, and anchor texts.

It would be great if there were an option to
skip all of that and set up a project that just collects relevant, high-quality
URLs for manual posting.

(like the harvesting option in ScrapeBox)
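
To make the idea concrete, here is a rough Python sketch of what I mean by a
"collect only" run. It has nothing to do with SER's internals; the seed pages,
keyword, and log file name are just placeholders:

```python
# A rough sketch of a "collect only" run: fetch a few seed result pages,
# pull out links whose URL or anchor text contains the keyword, keep one
# URL per domain, and append them to a log file.
# SEED_PAGES, KEYWORD, and LOG_FILE are placeholders.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

SEED_PAGES = ["https://example.com/results"]  # placeholder seed pages
KEYWORD = "gardening"                         # placeholder keyword
LOG_FILE = "collected_urls.log"

class LinkCollector(HTMLParser):
    """Collects (href, anchor text) pairs from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._current = None  # [href, anchor text] while inside an <a> tag

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self._current = [href, ""]

    def handle_data(self, data):
        if self._current is not None:
            self._current[1] += data

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            self.links.append(tuple(self._current))
            self._current = None

seen_domains = set()
with open(LOG_FILE, "a", encoding="utf-8") as log:
    for page in SEED_PAGES:
        try:
            with urllib.request.urlopen(page, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip seed pages that fail to load
        collector = LinkCollector()
        collector.feed(html)
        for href, text in collector.links:
            url = urljoin(page, href)
            domain = urlparse(url).netloc
            # one URL per domain, and only if the keyword shows up
            if (domain and domain not in seen_domains
                    and KEYWORD.lower() in (url + " " + text).lower()):
                seen_domains.add(domain)
                log.write(url + "\n")
```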

Think about it.

Kind regards,

Comments

  • You can do it through Options->Advanced->Tools->Search online for URLs

    There are some more specific topics in the forum if you want to know how it all works...
  • Hi Sibolo, I have played with this option
    several times, and it is great for finding sites by footprint. But how do you
    filter on PR, keyword, etc.?
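
    For the keyword part, one workaround could be to post-filter the saved list
    with a small script afterwards. Just a sketch (the keyword and file names
    are placeholders; I left PR out because I don't know a reliable way to
    query it automatically):

    ```python
    # Post-filter for a harvested URL list: keep only URLs whose page
    # contains the keyword. KEYWORD and the file names are placeholders.
    import urllib.request

    KEYWORD = "gardening"
    IN_FILE = "collected_urls.log"    # one URL per line
    OUT_FILE = "filtered_urls.log"

    with open(IN_FILE, encoding="utf-8") as src, \
         open(OUT_FILE, "w", encoding="utf-8") as dst:
        for line in src:
            url = line.strip()
            if not url:
                continue
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    page = resp.read().decode("utf-8", errors="replace")
            except OSError:
                continue  # skip URLs that fail to load
            if KEYWORD.lower() in page.lower():
                dst.write(url + "\n")
    ```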

  • I understand now what you mean. I think I talked about this with Sven some time ago; in fact, I asked him if it was possible to extend the Custom Mode to just list the URLs, so that when I have some time I can post manually through the Custom Mode. Otherwise we would need to stay in front of the PC all the time...
  • Yes, you are right.

    Anyway, maybe it is possible with GSA to
    find URLs with a certain PageRank, footprint, keyword, etc. But I love the
    harvesting tool in ScrapeBox, so for now I will stick with ScrapeBox when it
    comes to searching for URLs on a given keyword.
