
[DOWNLOAD] - Free sorted list of 8,700+ AA sites for SER

Hinkys SEOSpartans.com - Catchalls for SER - 30 Day Free Trial
Hey guys,

Here's a sweet share for all you fellow GSA SER users out there, especially those that don't have powerful setups.

Basically, it's an export of 8,700+ verified links from one of my projects. All of those links were submitted using out-of-the-box Captcha Breaker and spun, computer-generated content.

Download:

The easiest way to make use of this entire list is to take the file and use "import target URLs" on one of your projects. Ideally, you would set up a dedicated project that will go through the list and sort it nicely for you, but you can use one of your regular projects if you wish. Just make sure you're not using any paid per-solve captcha service (such as DeathByCaptcha), as this will eat through your balance rather quickly.

If you need some help with setting up a project like this, you will find it in the tutorial I just published:

Basically, the list shared here is only about 20% of the links I created with just one run of my method for building a site list for SER.

In the tutorial I did my best to explain how you can do this method yourself and really start building up your site list.

Whatever you do, just make sure SER is set to save identified / successfully submitted / verified sites before importing the list (if you want to keep them for later use).

Hope you put this to good use. ;)

Comments

  • nice list... using it already, I will try this method to scrape more links... thanks

    How did you create that Excel file with all the details? Can you do that with SER?
  • Trevor_Bandura 267,647 NEW GSA SER Verified List
    Thanks
  • Thanks Hinkys - much appreciated.
  • Awesome job! Will thank you on BHW.
  • Hinkys SEOSpartans.com - Catchalls for SER - 30 Day Free Trial
    @rodol - yeah, just go to your verified / submitted list and export - all. Now just clean up the CSV and save it. ;)
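
    For that cleanup on Linux, a minimal sketch (assuming the export is a CSV with the URL in the first column; the filenames are just examples):

      # keep the URL column, drop the header row, dedupe
      cut -d',' -f1 verified_export.csv | tail -n +2 | sort -u > clean_urls.txt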
  • Cool share, thanks
  • Hinkys SEOSpartans.com - Catchalls for SER - 30 Day Free Trial
    @Metster - Well, you will find that SER can post to a very good number of those sites. And they are sites just like any other; filter out the unindexed ones if you wish and you will end up with a good list, just like you would if you were scraping them directly (only you will get far more sites this way).
  • Big thanks for that. I ran a little test, and from 200 blog comments I extracted 150k URLs. To my surprise, when I imported them into GSA SER it started to post like mad at 50 LPM. Thank you man.
  • AlexR Cape Town
    @Hinkys - how does this work in scraping contextual platform links?
  • Hinkys SEOSpartans.com - Catchalls for SER - 30 Day Free Trial
    @AlexR - Well, you will get loads of article / wiki / social network sites (as well as most of the other platforms supported by SER) this way. :D
  • AlexR Cape Town
    @Hinkys - what's the best way to scrape .co.za contextual links? I'm struggling with that as we have limited SE's and it keeps finding the same ones. 
  • @AlexR
    using SB requires substantially less bandwidth than SER

    I have been using the above method successfully since I read the full article on BHW / downloaded the list

    SB's external link extraction, however, gets you ALL links from the URL list you add to SB

    filtering for only .co.za would then have to be done offline
    I do such offline filtering using Linux and a shell script that extracts only those platforms or URLs I want to target

    one long SB run for me is about 8-12+ hrs and results in some 800k URLs
    after filtering I usually have some 50k-70k deduped target URLs
    and maybe 2,000-10,000 deduped domains to be imported into SER
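
    a rough sketch of the idea behind that filtering (not the exact script; it assumes urls.txt is the raw SB export, one URL per line):

      # keep only URLs whose host ends in .co.za, dedupe by full URL
      grep -Ei '^https?://[^/]+\.co\.za(/|$)' urls.txt | sort -u > coza_urls.txt

      # dedupe by domain for importing into SER
      awk -F/ '{print tolower($3)}' coza_urls.txt | sort -u > coza_domains.txt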

    since the external links you find are all approved links (= verified, else that guy would never have placed a link there) ...
    your LpM will be substantially higher than using regular SB search or SER search

    but
    searching only for .co.za ...
    what about those South African sites on .com, .net, .org, ... etc.?
    I just checked my previous list of 7,000+ deduped URLs and found only about a dozen .co.za

    what about including your neighbor countries: Namibia, Zimbabwe, Botswana, etc.?
    besides the fact that such visitors may also turn into customers when traveling, from an SEO/SERPs point of view there may be no big difference

    or start the above method
    but first create your own start list using SB
    doing a Google search with site:co.za + keyword for each platform (and the same for the neighbor countries?) - see the query sketch at the end of this post

    then see which platforms are open for auto approval,
    then run SB external link extraction on those sites ...
    or use the above method on all sites you have currently submitted / verified
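
    as for building those site:co.za queries, a quick sketch (hypothetical file names: footprints.txt with one platform footprint per line, keywords.txt with one keyword per line):

      # build every footprint + keyword combination restricted to .co.za
      while read -r fp; do
        while read -r kw; do
          printf '"%s" %s site:co.za\n' "$fp" "$kw"
        done < keywords.txt
      done < footprints.txt > queries.txt

    swap site:co.za for the neighbor country TLDs to cover those too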
  • AlexR Cape Town
    @hans51 - thanks. Using something similar but wanted to see if there was a quicker way to extract all the .co.za targets. :-)
  • Hinkys SEOSpartans.com - Catchalls for SER - 30 Day Free Trial
    @AlexR - the external links method isn't a very good option if you're only after a specific platform, specific domain extension or w/e. It's when you want to populate your site list with...erm...everything. xD 

    Sure, using this method on a seed list of .co.za blogs might yield more .co.za links after extracting the external links, but it's still far from a targeted approach and I'd be willing to bet you would still find only a small number of them. Doing this might be faster, but if you're ONLY after .co.za domains and don't care about any other links, then there's really no point in doing it this way.

    The best approach would be to use SB (or SER's "search online for URLs" if you don't have SB) and scrape like you normally would, with site:co.za + keyword for each platform, as @hans51 said. Then clean the list and import it into a sorting project in SER.
  • worst case scenario:
    get a list of ALL .co.za domains EVER registered
    then run SB or SER on the full list
    NOT on your $5/GB line
    but from another flat-rate line, or for one single month on a VPS = still cheaper than paying per GB

    maybe WE ALL check our lists and "donate" our .co.za URLs to you
    just for fun or love among family members ...

    how many .co.za URLs do you have already ??

    and how did you extract the OTHER TLDs located in S. Africa ... by IP allocations? or

    maybe the maxmind.com GeoLite version (free) can help find all IPs for your country, and then you can find / create another solution to get all the sites you need
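
    a sketch of that GeoIP idea (assuming the legacy free GeoLite Country CSV, where column 5 is the country code - check the column layout of your copy):

      # pull all IP ranges allocated to South Africa (ZA)
      awk -F'","' '$5 == "ZA" {print $1 "-" $2}' GeoIPCountryWhois.csv | tr -d '"' > za_ip_ranges.txt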

    I just ran a quick test across my 3 SB runs from the past 24 hrs
    and got some 600+ matching URL lines BEFORE filtering for SER URLs ... from a total of about 640k URLs
  • Very big thanks, and nice job.
  • Days back, I saw this thread and forgot to bookmark it. Today it took me nearly half an hour to find it again. I hate myself.
  • OK, there are some bugs, at least for me. After removing some lower-volume footprints from Article Beach, the .ini file ended up with only 16 lines; all the rest was removed lol.

    And some quotes on the footprint line disappeared too, so it now looks like this:

    search term=upload your articles and keep updated about new articles."|"Website Design and Developed by ArticleBeach"|"upload your articles and keep updated about new articles."|"Here are the most popular 100 articles

    Obviously, the first and the last quote are gone :)
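
    With the outer quotes restored, that line would presumably read:

    search term="upload your articles and keep updated about new articles."|"Website Design and Developed by ArticleBeach"|"upload your articles and keep updated about new articles."|"Here are the most popular 100 articles"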
