
[Local SEO Links] Feature

Hi

If you want to target a certain TLD, I'd like to see the following feature in GSA SER:

Let's say that I want to target .de domains. I add a filter !.de, and just enable my sitelists since I know there are many .de domains in there.

It seems that GSA SER grabs a random selection of URLs from the sitelist, loads them up, then checks whether they match the filter. It only keeps the ones that pass (i.e. with .de in the URL) and discards the others, since the !.de filter rejected them. Given that most of my sitelists are not .de URLs, it spends a lot of time doing this!

I'd like to see an option so that, if you want to target a specific country or TLD, GSA SER first searches the sitelist for ".de" and then only loads a random selection of those .de domains. That would be much better than grabbing random URLs from the sitelist and filtering them afterwards.
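
In rough pseudocode, what I'm asking for looks something like this (a minimal Python sketch of the idea only; the file path, TLD, and sample size are assumptions, not SER's internals):

    import random
    from urllib.parse import urlparse

    def sample_tld_targets(sitelist_path, tld=".de", sample_size=1000):
        matches = []
        with open(sitelist_path, "r", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue
                # handle entries with or without a scheme
                parsed = urlparse(url if "://" in url else "http://" + url)
                host = parsed.netloc.split(":")[0]
                if host.endswith(tld):  # keep only the target TLD
                    matches.append(url)
        # sample only from URLs that already passed the TLD check
        return random.sample(matches, min(sample_size, len(matches)))

    # usage (hypothetical path):
    # targets = sample_tld_targets(r"C:\sitelists\verified\Articles.txt", ".de", 500)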

@sven - is this possible?

Comments

  • Sven www.GSA-Online.de
    Just import all the site list URLs directly into the project. The filter removes them while the project is active.
  • AlexR Cape Town
    edited March 2014
    @sven - I've tried that, but it's not working well.

    For example, a few projects have 4,000,000 target URLs and only 16 verified and 32 submitted over 3 weeks! It's for contextual links. I imported my sitelists for those engines, and I have this on many projects. I think it's just importing URLs that get rejected by the URL filter (!.de), which wastes so much time: loading all of these URLs only to reject them. (I know there are many URLs in the list that do match the filter, but it's just not getting to them because there are so many that don't!)

    [FEATURE] Is there no way you could add an option to apply the URL filter to the imported sitelists before, or while, they get loaded? This would be such a nice feature for local SEO.
  • Sven www.GSA-Online.de
    After import you can show the remaining target URLs -> select the ones you don't want -> delete.
  • @Alex R can you not trim the list to only have .de domains in it before importing to SER? 

    Also, I found that adding a 2-million-URL list would take SER forever, but split it up into 50k chunks and it rattles through them so much faster - Scrapebox can split files easily. Check out gooner's posts on keeping SER 'thin' in terms of imported lists.
  • AlexR Cape Town
    The issue is the number of platforms, and that they are all in different .txt files.

    I'm just testing this method and it's working well (a rough script version of the same idea is sketched below):
    1. Import the sitelist (depends on the engines selected), i.e. all contextuals.
    2. Select by mask.
    3. Export to .txt.
    It trims the list right down, and then I can just import the exact URLs from my exported .txt. :-)
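
    For anyone who wants to do the trimming outside SER, here is a rough Python sketch of the same idea, combined with the 50k-chunk tip from earlier in the thread. The folder layout, TLD, and chunk size are assumptions, not anything SER does itself:

        import glob
        import os
        from urllib.parse import urlparse

        def trim_and_split(sitelist_dir, out_dir, tld=".de", chunk_size=50_000):
            os.makedirs(out_dir, exist_ok=True)
            kept = []
            # scan every per-engine sitelist .txt file in the folder
            for path in glob.glob(os.path.join(sitelist_dir, "*.txt")):
                with open(path, "r", errors="ignore") as f:
                    for line in f:
                        url = line.strip()
                        if not url:
                            continue
                        parsed = urlparse(url if "://" in url else "http://" + url)
                        if parsed.netloc.split(":")[0].endswith(tld):
                            kept.append(url)
            # write the surviving URLs in chunks of at most chunk_size lines
            for i in range(0, len(kept), chunk_size):
                chunk_path = os.path.join(out_dir, f"de_targets_{i // chunk_size:03d}.txt")
                with open(chunk_path, "w") as out:
                    out.write("\n".join(kept[i:i + chunk_size]) + "\n")

        # usage (hypothetical paths):
        # trim_and_split(r"C:\GSA\site_list_verified", r"C:\GSA\de_only")

    The resulting chunk files can then be imported into the project one at a time, so SER only ever sees URLs that already match the TLD filter.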