
A few questions that I can't figure out

Hi everyone,

Sorry for asking this; I have searched the forum but couldn't find any info about it.

I am looking to start a project for a dating site, and I would want GSA SER to build links on sites in the same niche. I have been looking at the setting called "Always use keywords to find target sites", and there is another one called "At least XX of my keywords must be present on the site".

From the looks of it, these two settings would somewhat ensure that links are posted on sites in a similar niche?
Have I understood this correctly, or am I way off here? If I am wrong, is there a way for GSA to post to relevant sites, and not just random sites with PR above XX?
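As I understand it, the "at least XX of my keywords must be present" check boils down to counting keyword hits on a fetched page. A rough Python sketch of that idea (not GSA's actual code; the function name and threshold are made up for illustration):

```python
def keywords_present(html, keywords, minimum):
    """Return True if at least `minimum` of the keywords occur in the page text.

    Hypothetical re-creation of the 'at least XX of my keywords must be
    present on the site' filter; case-insensitive substring matching.
    """
    text = html.lower()
    hits = sum(1 for kw in keywords if kw.lower() in text)
    return hits >= minimum
```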


Thanks for taking the time to read this; all help is very much appreciated.

BN
Norway


Comments

  • SvenSven www.GSA-Online.de

    You understood that correctly, but keep in mind that you will not find many sites for engines that use e.g. something like:

    "powered by XYZ" inurl:/register/blahah.asp YOUR_KEYWORD


    ... because your keyword will most likely not be on the page that is used to register. Some engines also don't need such keywords, as they build their own sites, like article sites or bookmarking sites. So you may want to split your project into engines where you use these options and some where you don't.
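Combining a footprint with each project keyword, as in Sven's example, can be sketched like this (illustrative Python; the footprint and keyword values are made up, and real register-page footprints like the one above will rarely co-occur with niche keywords):

```python
# Hypothetical engine footprint and niche keywords (illustrative values)
footprint = '"powered by XYZ" inurl:/register/'
keywords = ["dating", "singles", "relationships"]

# One search query per keyword; each query only returns pages that
# carry both the engine footprint and the niche term
queries = [f"{footprint} {kw}" for kw in keywords]
```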

  • Thanks Sven for taking the time to answer me :)

    Cheers
  • AlexRAlexR Cape Town
    I've been thinking about this keyword-and-footprint problem and trying to come up with a solution, since if we all use the same footprints without keywords, we'll spam the same sites over and over.

    But as Sven says, when you add keywords to the various footprints, results are drastically reduced, since the keywords are not on the register page.

    @Sven - what about a two-part solution?
    1) It scrapes with a footprint that can work with keywords, e.g. "powered by XYZ" + KEYWORD.
    (Yes, the results may not be that accurate, but it will mean a LOT of new places to post to are found.)
    2) It then takes this large list and quickly runs through it to see if each site matches a known engine and to find the register page; alternatively, it does a second scrape over the list, this time with just the footprint that has a higher match to the engine.

    This would allow it to find a huge number of unique places to post to, since each user is using different keywords.

    I'm not sure of the implications, or if it's even possible, but I'm just trying to find a solution for using keywords to find more targets with some of the more limiting footprints.
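    The two phases above could be sketched roughly like this (a hypothetical Python outline, not a GSA feature; the function names, footprint, and engine markers are invented for illustration):

```python
def phase_one_queries(footprint, keywords):
    # Phase 1: broad scrape - pair the loose footprint with each keyword
    # (inaccurate, but surfaces many candidate sites per user's keyword list)
    return [f"{footprint} {kw}" for kw in keywords]

def phase_two_filter(urls, engine_markers):
    # Phase 2: keep only candidates whose URL matches a known engine
    # signature, e.g. a recognisable registration path
    return [u for u in urls if any(marker in u for marker in engine_markers)]
```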

