Country filter vs. skip sites with words in URL
What is most efficient if you only want targets from a certain country other than English-speaking ones?
1. Country filter
a) Does GSA do search engine queries with e.g. "&cr=countryFR" in the URL parameters,
b) or does it do a normal search query and just filter out the countries selected in the project options? ... which would be very inefficient for targeting a specific country.
2. Footprints per language
a) Does the option "always use keywords to find target sites" make sense at all for languages other than English?
Because the engine footprints are often more complex than "powered by wordpress", there would be very few results.
Example query: my French keyword plus the footprint "RSS Feeds" "Add us to favorites" "Make us your home page" "Submit Articles"
b) Would it make sense to add your own language-specific footprints to the engines, or is it too much work, since you would also have to edit the submission and verification scripts?
Comments
If you only want e.g. French targets ...
"Country filter (all except France)" and "skip sites with words in URL" (e.g. !.fr) just run a normal Google search and filter out domains in other languages afterwards. That means 99% of the results will be English and therefore filtered out, right? Unless you enable the option "always use keywords to find target sites" and mix all queries with keywords in your language, which in turn will return few results anyway when combined with English footprints.
So if the above is right, those options don't help much for other languages.
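If that is how the "skip sites with words in URL" filter behaves, it can be sketched as a simple post-filter over scraped URLs. This is a hypothetical illustration, not GSA's actual code; the "!" prefix is assumed to invert the match (skip URLs that do NOT contain the word):

```python
def keep_url(url, filters):
    """Return True if the URL survives all filter words.

    A plain word means "skip URLs containing this word";
    a "!"-prefixed word is assumed to mean "skip URLs NOT containing it".
    """
    for f in filters:
        if f.startswith("!"):
            # inverted rule: URL must contain the word to be kept
            if f[1:] not in url:
                return False
        else:
            # normal rule: URL must not contain the word
            if f in url:
                return False
    return True

urls = [
    "http://blog.example.fr/article",
    "http://blog.example.com/article",
]
print([u for u in urls if keep_url(u, ["!.fr"])])
# → ['http://blog.example.fr/article']
```

Note that this only matches the literal string ".fr" anywhere in the URL, which is exactly why such a filter also catches things like "example.fr.com" and why it says nothing about where the site is actually hosted.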
It would help if we could use those Google URL parameters (e.g. "&cr=countryFR").
A solution could be this: a new option where everything works like before, except that the above URL parameters are appended whenever Google is searched. Probably easy to implement.
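As a sketch of what that option would do, appending the restriction parameters to each query URL might look like this. `cr` and `lr` are Google's country/language restriction parameters; the function itself is purely illustrative, not how GSA builds its queries:

```python
from urllib.parse import urlencode

def google_query_url(query, country=None, lang=None):
    """Build a Google search URL, optionally restricted by country/language."""
    params = {"q": query}
    if country:
        params["cr"] = "country" + country   # e.g. cr=countryFR
    if lang:
        params["lr"] = "lang_" + lang        # e.g. lr=lang_fr
    return "https://www.google.com/search?" + urlencode(params)

print(google_query_url('"powered by wordpress"', country="FR", lang="fr"))
# → https://www.google.com/search?q=%22powered+by+wordpress%22&cr=countryFR&lr=lang_fr
```

The point of filtering at query time rather than after the fact is that every result returned already matches the country, instead of 99% of them being thrown away locally.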
Just read Ozz's post and there's some good stuff in there. I use heavy filters on my T1, but even if you use !.ca and !.fr you will still get links from sites that aren't hosted in France or Canada (and it will be even harder to get Canadian sites in French). Also, Sven has said that using the "always use keywords to find target sites" option can really slow down SER.
I'm not an expert, but I think you would be better off scraping URLs with something like Scrapebox, using French keywords combined with English footprints. Then you could use the !.ca and !.fr filters in SER. It would dramatically limit how many links you build, but the ones you do get would be highly targeted.
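That workflow can be sketched roughly like this. The footprint and keyword lists, and the scraped URLs, are made up for the example; the TLD check stands in for SER's !.fr / !.ca filters:

```python
from urllib.parse import urlparse

# illustrative footprints (English) and keywords (French) -- made up
footprints = ['"powered by wordpress"', '"Submit Articles"']
keywords = ["recette", "voyage"]

# one scrape query per footprint/keyword combination
queries = [f"{fp} {kw}" for fp in footprints for kw in keywords]

def is_target_tld(url, tlds=(".fr", ".ca")):
    """Keep only URLs whose host ends in one of the target TLDs."""
    return urlparse(url).netloc.endswith(tlds)

# pretend these came back from the scraper
scraped = ["http://cuisine.example.fr/recette", "http://blog.example.com/post"]
targets = [u for u in scraped if is_target_tld(u)]
print(targets)
# → ['http://cuisine.example.fr/recette']
```

Checking the parsed hostname rather than the raw URL string avoids false matches like a path that merely contains ".fr", though like any TLD filter it still can't tell you where a .com site is hosted.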
I don't have this on my GSA.