
RankerX and GSA

Hello,

I've been using RankerX for a while now and I'm happy with the results. However, I'd like to take it a step further, and I've heard that GSA is good for tier 2/3 links on top of the RankerX tier 1 links. I have a few questions I'm hoping someone can help me clear up:
1. Most of the campaigns in RankerX already have tiers 1/2/3. Should I import all three tiers of links into GSA, or only certain tiers?
2. I currently use XEvil with RankerX. Can I also use XEvil with GSA, or do I need GSA Captcha Breaker as well? Is there any point in buying GSA Captcha Breaker if I've already got XEvil?
3. Do I need to buy an SER list if I'm not building tier 1 links? If so, can anyone recommend a cheap but decent company?
4. After I've set everything up and imported the RankerX links into GSA, what happens when I have new links in RankerX? Do I add them to existing campaigns in GSA, or create a new campaign for the new links? Also, is there an option to duplicate campaigns in GSA?
5. I currently have 10 proxies from buyproxies.org. Will these be enough to run both RankerX and GSA?

Please let me know if you have any other advice that can help.

Thanks

Comments

  • 1. You can boost any tiers you want.
    2. If you have XEvil already, that solves Solve Media and reCAPTCHA v2/v3 plus some hCaptcha, so you'll be fine. Maybe add CapMonster Cloud to get more hCaptcha solves.
    3. You need sites to post to, so scraping or buying a list will be needed to go beyond shorteners, comments and a few other easy engines.
    4. You can export them to a file which SER can then read from at an interval; in the advanced options there is an option to read verified links from a folder.
    5. Running both on 10 proxies may be wishful thinking. Personally I would grab more, and keep them separate, but it really depends what you're doing with them, I think.
    Thanks for the quick response :)
    1. Do you think it would be best to copy all three tiers of links from RankerX into GSA for maximum results, then?
    2. I also use a third-party captcha provider on top of XEvil, but I'm hoping to keep those costs down, so do you think it would be pointless buying GSA Captcha Breaker, or would it help a little?
    3. Do you have any recommendations for a cheap but decent SER list provider?
    4. What I mean is: if I import more links from RankerX into GSA in the future, do I add them to an existing campaign or a new one? My thinking is that if I add new links to an existing campaign, it won't keep building backlinks to the URLs already submitted for the older links. Or will it?
    5. I think I'll give it a go with the 10 and see what happens; I can always buy more if needed. I'm assuming it will just run a bit slower and I'll need to buy more to speed things up?
  • It would be great if someone could help me with this, as I'm looking to buy everything I need as soon as possible but want to make sure it's all possible first.
  • 3. Build your own list; paid ones are spammed to death, unless you find a seller whose list isn't. Buy one month and see where it submits. If it's submitting to URLs in that list which already have 20,000 blog comments, you know what to do.
  • sickseo London, UK
    1. You should be using GSA to point links at all tiers in RankerX. Any link that has zero inbound links has very little chance of being indexed. What is a waste of time, though, is pointing links at their bookmarking module: they've added a ton of Google redirect URLs which are blocked from indexing, so it's absolutely pointless to point links at those.
    2. If you have XEvil already, there won't be much left for Captcha Breaker to solve, so there's no point getting it.
    3. serverifiedlists isn't bad. I'm testing it right now and it seems to have a wide range of working targets. Personally I prefer scraping my own targets; you'll get better results than using over-spammed site lists. I don't normally buy lists, so there's nothing in particular I would recommend.
    4. Best to set up new campaigns for new RankerX campaigns. You will find it hard to keep up with RankerX: each multi-tier template makes thousands of links, which means you'll need hundreds of thousands of links to power them up. Best to keep GSA campaigns separate so that each RankerX link gets proper attention. If you overload a GSA campaign with too many RankerX links, many links won't get any links built to them.
    5. I use datacenter proxies. They can handle many more threads than your normal IPv4 proxies. Any decent proxy provider should be able to handle 100 threads per proxy, so 10 dedicated proxies means you should be able to push 1,000 simultaneous threads through them before performance deteriorates.
  • Thanks sickseo, that really helped clear things up :)

    I've been testing GSA Captcha Breaker on top of XEvil with RankerX for the past two days, and so far Captcha Breaker hasn't been able to solve any of the captchas that XEvil passed over; they've been forwarded to 0captcha instead. I'm going to retest with GSA once it's up and running, but I've got a feeling it will probably be the same. I'm thinking about using the DeepSeek option instead of 0captcha as a backup. Does anyone have experience with this? Is it cheaper than 0captcha?

    I've just bought a list from GSA Verify Link List - Shen e-Services (recommended by GSA). I've been speaking to him on WhatsApp and he even set everything up for me :)

    I just need to figure out the email account options and then I think I'm done with the setup. Any advice on the email side of things? I'm thinking about creating an email account for each domain, for example seo@test1.com for test1.com backlinks and seo@test2.com for test2.com backlinks. What do you think? Will one email account be OK, or will I need a couple per domain? Can I use the same email for all campaigns, including new ones?
  • sickseo London, UK
    I use DeepSeek for text captcha solving in GSA. It seems to work, and I am getting links on sites that have text captchas. The solve numbers in DeepSeek are way higher than my actual verified links, though; I'm guessing I'm hitting sites that have registration disabled. But it's cheap enough to keep running anyway.

    The 0captcha service isn't much different to XEvil's solving. I tested a campaign with just XEvil and then the same campaign with just 0captcha, and the overall link numbers were pretty much identical. I actually wonder if the 0captcha service is powered by XEvil; many captcha services do just that.

    For emails I use catchalls: one catchall email per GSA install. You may get better results with one catchall email per project, but I've not noticed a difference in performance or link numbers between the two options. If you use the spin file format for the catchall, you'll get hundreds of thousands of unique emails generated from just one catchall email. The software will do it automatically.
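    To illustrate why one catchall goes so far, here is a minimal sketch of the idea in Python ("example.com" is a placeholder for your real catchall domain):

        import random
        import string

        # A catchall domain accepts mail for any local part, so random
        # local parts give effectively unlimited unique addresses that
        # all land in one inbox.
        CATCHALL_DOMAIN = "example.com"  # placeholder

        def random_email(length: int = 10) -> str:
            local = "".join(random.choices(string.ascii_lowercase + string.digits, k=length))
            return f"{local}@{CATCHALL_DOMAIN}"

        for _ in range(5):
            print(random_email())  # e.g. k3x9q2mzp1@example.com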
    Thanked by atmosphereswamp
  • I'm not sure about that, because I have been using RankerX - XEvil - 0captcha for a while and XEvil has missed a couple that 0captcha managed to solve. However, 0captcha has been costing me around £5 per month, which isn't a lot and I'm happy to pay it, but I really like the idea of using AI (DeepSeek), so I'm thinking of trying RankerX - XEvil - DeepSeek - 0captcha out of curiosity, to see if I can get rid of that £5 per month I'm spending on 0captcha as the backup. If it works, I'll do the same with GSA once it's set up. I might also add Captcha Breaker into the mix, for example GSA - XEvil - Captcha Breaker - DeepSeek - 0captcha, but I have my doubts, because I've tried RankerX - XEvil - Captcha Breaker - 0captcha and Captcha Breaker missed all of them.

    I'm thinking about setting up a catchall myself in cPanel on my main domain, but I'm worried about the domain getting blacklisted or other negative effects?
  • sickseo London, UK
    If your main domain is on shared hosting, then it will likely cause issues with server performance and affect other users. Best to put it on a VPS with cPanel or DirectAdmin. I've got mine set up with SolidSeoVPS on their Linux servers with cPanel, but there will be other options. The main thing is that the domain is privately hosted and not on shared hosting.

    I'll have to test 0captcha again, although my costs of running it could be quite high as I've got multiple RankerX instances running 24/7, so I prefer the fixed-cost option with just XEvil.
    Thanked by atmosphereswamp
  • AliTab https://gsaserlists.com
    For creating catchall emails, check out my latest product. It’s designed to simplify GSA SER email management, a concern I’ve had for years. Despite using catchalls, I constantly needed to update projects with fresh emails. That’s why I developed CatchE.

    https://forum.gsa-online.de/discussion/33628/catche-catchall-email-creator-for-any-tool-automatic-gsa-ser-project-email-updater-email-hosting
  • @backlinkaddict What's the reason for changing host IP and domains? Do they get blacklisted after a while? I was going to use my main domain, but this has got me worried, and I'm wondering if I should buy another domain just to use with GSA?
  • AliTab https://gsaserlists.com
    backlinkaddict said:
    Congrats on getting into production, looks good! It is quite annoying to manage many email accounts through cPanel; I am constantly changing host IP and switching domains as well.
    Thank you! Yes, having fresh domains is very important.
  • I've been using the same 5 domains for over 2 years for catchall emails. All 5 domains sit on the same server, so same IP. I just checked them against https://mxtoolbox.com/blacklists.aspx and it seems none of the domains are blacklisted.
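    If anyone wants to script that kind of check rather than use the web tool, here's a minimal sketch of a standard DNSBL lookup (assuming the dnspython package; zen.spamhaus.org is just one example of the many zones mxtoolbox queries):

        import socket
        import dns.resolver  # pip install dnspython

        # A DNSBL lookup reverses the IPv4 octets and queries them as a
        # subdomain of the blacklist zone; an A-record answer means "listed".
        BLACKLIST_ZONE = "zen.spamhaus.org"  # example zone

        def is_listed(ip: str, zone: str = BLACKLIST_ZONE) -> bool:
            reversed_ip = ".".join(reversed(ip.split(".")))
            try:
                dns.resolver.resolve(f"{reversed_ip}.{zone}", "A")
                return True   # an answer came back: the IP is on the list
            except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
                return False  # no record: not listed

        ip = socket.gethostbyname("example.com")  # your catchall domain here
        print(ip, "listed" if is_listed(ip) else "clean")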
  • AliTab https://gsaserlists.com
    The server's IP doesn't play a significant role when it’s just being used as an inbox for receiving emails, but I have noticed a big difference in the success rate when using fresh domains for the emails.
  • @sickseo Are these your website domains, or separate domains you've bought just to use for catchall? What's your reason for using 5 domains? Do you use a different one on each software install? I'm just trying to get an idea so I can figure out which route to go down.

    To give you an idea of my situation: I'm currently using RankerX to build backlinks to my 6 websites, and I've now bought GSA to build backlinks on top of those backlinks. I also have separate hosting (cPanel) just for my email accounts. I have around 100 campaigns running in RankerX, so there will be another 100 running in GSA once it's all set up.
  • @AliTab I'm glad to see contributions in SEO, and it's funny that I have a very similar system.

    Years ago I developed a little script that lets me create thousands of emails in a few seconds; then I just import them into cPanel and GSA SER.

    I usually use 1 domain per campaign, but if that campaign is going to involve building a lot of backlinks, I'll expand to 3 or 5 domains to avoid any spam considerations.
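    A script like that doesn't need to be complicated. Here's a minimal sketch of the idea (the domain, count and CSV layout are placeholders; the rows could then be fed to something like cPanel's UAPI Email::add_pop in a loop and imported into GSA SER projects):

        import csv
        import secrets
        import string

        # Generate thousands of address/password pairs in one go.
        DOMAIN = "mydomain.com"  # placeholder campaign domain
        COUNT = 5000
        alphabet = string.ascii_lowercase + string.digits

        with open("emails.csv", "w", newline="") as fh:
            writer = csv.writer(fh)
            for _ in range(COUNT):
                local = "".join(secrets.choice(alphabet) for _ in range(10))
                password = secrets.token_urlsafe(12)
                writer.writerow([f"{local}@{DOMAIN}", password])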

  • These are separate domains that were purchased for the sole purpose of being used for catchall emails. Best to keep them separate and not use your main domain/money-site domain.

    The reason I have 5 is that the cPanel hosting I have is for 5 domains; it just turned out that way. There are unlimited subdomains, so unlimited catchall emails. I use them in RankerX and GSA, and also XRumer now and again. It's probably quite an expensive setup, as DirectAdmin would be way cheaper, and there are likely other alternatives cheaper still. But it's always worked, and the only issue I've ever had was not having enough storage. I suppose I should really change the domains to fresh ones, but I've never got round to it.

    I think the reason why the domains aren't on any blacklists is due to the blacklist filter I use in GSA.
  • It's already set to auto-delete emails older than 1 day in GSA and RankerX; I just have too many emails in use.

    The RankerX emails can get quite large, sometimes 200-300 MB per account, depending on how many campaigns and which templates I'm running. 16 installs with 5 catchall emails in each is 80 catchalls just for RankerX.

    The GSA setup uses even more emails, but storage per account is normally very low. I suspect GSA does a better job of emptying the inbox than RankerX does.

    I've had to upgrade the storage twice now as it wasn't enough.
  • I think I've figured out the email account side of things now. I'm just wondering if anyone has suggestions on the settings, or do you think the defaults are good?


  • "RankerX" was a good web 2.0 writing registration program in the beginning.
    However, recent updates are also meager, and there are so many posts that get deleted.
    Rather than being deleted after an index to Google,
    Even if the quantity is small, it is better to write it by hand and not delete it and keep it for a long time.
    Everyone says "RankerX" is good, but if you look up the backlinks that are actually deleted,
    Will it be a good program?
    For web 2.0 with large filtering in the first place, it is correct to do it by hand.
    If you rely solely on automation, you will experience numerous deletion backlinks.
  • I use this style of spintax, {C|C.}{h|h.}{a|a.}{u|u.}{n|n.}{c|c.}@gmail.com, plus Cloudflare catchall emails.
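    For anyone unfamiliar with that trick: Gmail ignores dots in the local part, so every spin looks like a different address but delivers to the same inbox. A minimal sketch of expanding that kind of pattern ("chaunc" is just the example local part from above; note a local part can't end in a dot):

        import itertools
        import random

        # Gmail treats "c.haunc@gmail.com" and "chaunc@gmail.com" as the
        # same inbox, so an optional dot after each letter (except the
        # last) yields 2^(n-1) distinct-looking addresses for one account.
        def spin_dots(local: str, domain: str = "gmail.com"):
            choices = [(ch, ch + ".") for ch in local[:-1]] + [(local[-1],)]
            for combo in itertools.product(*choices):
                yield "".join(combo) + "@" + domain

        variants = list(spin_dots("chaunc"))
        print(len(variants))            # 32 variants from 6 letters
        print(random.choice(variants))  # e.g. c.ha.unc@gmail.com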
  • We use both of them together: Gmail spins and catchall domains.
  • I'm almost done setting up GSA. I'm just wondering if anyone has advice on which site types to submit to just for tier 2? I'm not looking to build tier 1 links, because I'll be using RankerX for those, and I don't want to waste time ticking them all and adding all the data if I don't need to.


  • I also haven't got around to looking at the options yet; any recommendations? Again, this will only be for tier 2 links.


  • sickseo London, UK
    edited September 2024

    That's what I run on my last tier. Personally, I think blog comments and guestbooks are an absolute must, as these are pages already indexed in Google, so anything they point at will get crawled by Google. Sometimes I run indexing campaigns with just blog comments and guestbooks, as these are the most effective for indexing links, but they tend to use a lot of CPU.

    URL shorteners and indexers are questionable, as these days they don't get indexed as easily as they used to. They do still create another path to your T1 links for Google to crawl, which is why I still use them. You just don't know what impact a particular URL will have as a tier 2 on the T1 link.

    Exploits you can probably deselect. I still use them, as they're 100% do-follow with keyword anchors, but indexing rates on them will be non-existent.

    Articles, forums, social networks and wikis are my core engines, which I normally use on T1, but I do put them in as T2 because 90% of my site list there is no-follow links, and for platforms like wiki, gnu and dwqa I have thousands of sites. Both Google and Bing use no-follow links as a hint for crawling, so T2 is the best place to put these types of links to get your T1 links crawled.

    I don't have many options selected. The scheduled/additional posting can be a good one to use, as it will make more links on sites that support it, mainly article sites like wiki and WordPress. So with 5 accounts and 5 posts, the supported sites will make 25 links each time instead of 1.


    Thanked by atmosphereswamp
  • I just pasted in the domains from the Money Robot software and the RankerX site list too. If you post to the MR site list without the Money Robot software, the link gets deleted automatically. Their sites end up in my site list from the scraping I do, so blacklisting them is the only way to prevent GSA posting to them.

    I do, however, use MR to post directly to money sites. It moves rankings; that's why I use it.

    I pay zero attention to the data in SEMrush. It's incomplete poop data that people still pay for. I'm not a fan of any of these third-party data services.
  • @sickseo I noticed you haven't got Web 2.0 ticked; do you not use it for tier 2? I also noticed you haven't got GSA SEO Indexer ticked. Do you not use GSA SEO Indexer or GSA URL Redirect PRO? I own both, but they seem heavy on the CPU, and I'm wondering whether they're needed for tier-2-only links and whether they're even worth using.

  • @sickseo I'm also wondering what sort of options you're using at the top of the data tab?
    P.S. Really appreciate your help :)
  • The web 2.0 engine doesn't make any links, so I don't use it.

    SEO Indexer and Redirect Pro I do use, but as standalone tools to link-blast new domains or individual URLs. Combined they make about 25k links.

    Connecting them to every GSA SER install I have just creates an endless queue of backlinks to process; basically, the two tools can't keep up with the volume of links coming from GSA SER. Besides, the same link sources are in my GSA SER site list and are already being used as T2.
    Thanked by atmosphereswamp

  • If it's the last tier, then I normally do this:



    I'll use a large number of keywords mixed with domain and generic anchors.
    Thanked by atmosphereswamp
  • @ksatul Could I ask what kind of tool you use to scrape your lists? Is it GSA SER, ScrapeBox or something else? I tried it with ScrapeBox, but GSA SER didn't select any of the scraped URLs for posting. I'm having trouble getting anywhere with ScrapeBox. Thank you for your response.
    Muriel
  • highsave usa
    edited October 2024
    I recently purchased GSA and RankerX. Can you share your experience using RankerX? How do you generate tier 1 articles?
  • londonseo London, UK
    @sickseo - Do you connect both SEO Indexer & URL Redirect Pro to each other?
  • sickseo London, UK
    I've tested that. There are two issues with doing so: you're just creating tiers of redirects, neither of which is likely to get indexed, and the sheer volume of links that one tool makes creates an endless queue of links to process in the second tool. You can reduce link numbers by applying the PR filter, though.

    To get these types of links indexed, you need to be pointing other types of links at them: blog comments, guestbooks, wikis, articles. These types of links are good for crawling/indexing, so you'd need to point them at the redirects for them to have any chance of being crawled/indexed.

    Currently I use these two tools as domain-boosting tools only. Many of the sites have decent DA/DR, so pointing them at the homepage of money sites and then running a second tier on them using those link sources mentioned above is probably the best use.

    Indexing them these days is tough. Google recently targeted redirects in one of their spam updates, so things have changed.
    Thanked by londonseo
  • londonseo London, UK
    @sickseo - for the combined 25k links, would you set SER to post 100 times per account and make 250 accounts?
  • sickseo London, UK
    I just set up a T2 campaign in GSA and let it run for several weeks. My setup auto-resets campaigns every 24 hours, so it's completely hands-free. No need to use scheduled posting and reposting on multiple accounts.

    But sure, you can do it that way as well. As long as you build T2s, that's all that matters. The only benefit of using the reposting settings is that the project will make more links before it needs to be reset; otherwise it will reach the end of your site list and stop making new links.
  • londonseo London, UK
    Thanks for your help and contribution to this forum. B)


    Is there an option in SER to auto-reset campaigns?


  • sickseo London, UK
    It's not built into GSA SER, so I had to make my own solution for it.

    First you need to install a piece of software called Cyber-D's Autodelete on the same machine where you have GSA SER.

    Then you add the above filters, which will auto-delete different files from your GSA install. This is the file path to the GSA projects folder: C:\Users\Administrator\AppData\Roaming\GSA Search Engine Ranker\projects

    These files correspond to target URLs, the target URL cache, submitted links, verified links and accounts. It replicates the steps of deleting the target URL cache, history and accounts, as well as deleting verified and submitted links. In essence it's doing a full reset of the link-building data.

    So if you don't want to delete any of these, it's probably best not to use it, or to set up your own filters based on what you want reset.

    The .success filter will remove verified links, so that's the one to be careful of: if you are running T1 campaigns, you may not want your T1 links deleted.

    To control this, I rename my projects with "Power Up", and the above filter uses that as a mask (*Power Up.success*), so that only projects labelled "Power Up" have their verified links deleted each time the software runs.

    Then I use Windows Task Scheduler to schedule when the program runs. Currently I've got it set to run every 24 hours, but it can be scheduled for any number of days or even hours.

    If you want the settings file for the above setup, I can share it. Then you can use the import function in the Autodelete software to get the same filters as above.
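    If you'd rather not run a separate tool, the same reset can be approximated with a small script triggered from Task Scheduler. A minimal sketch, using only the projects folder and the *Power Up.success* mask described above (be careful: deleting .success files removes verified links):

        import glob
        import os

        # Approximates the Cyber-D's Autodelete filter described above:
        # delete the .success (verified links) files for projects labelled
        # "Power Up" so GSA SER starts them fresh on its next run.
        PROJECTS = r"C:\Users\Administrator\AppData\Roaming\GSA Search Engine Ranker\projects"
        MASK = "*Power Up.success*"  # same mask as the filter above

        for path in glob.glob(os.path.join(PROJECTS, MASK)):
            try:
                os.remove(path)
                print("deleted", path)
            except OSError as exc:  # e.g. file locked while SER is running
                print("skipped", path, exc)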
  • londonseo London, UK
    Thanks a lot for this! B)
  • londonseo London, UK
    edited October 2024
    @sickseo I noticed that using the multiple URLs option does not give me ~25k links each.

    I have to do it individually.

    Did you have this issue?


  • sickseo London, UK

    Getting just under 25k URLs combined with the multiple URLs option.
    Thanked by londonseo
  • londonseo London, UK
    edited October 2024

    A single URL gives under 25k links combined.

    So, if I had 3 URLs and used the multiple URL option, it should give under 75k links, but this is not the case.

    @sven - Is there a reason for this?


  • sickseo London, UK

    That's using the single URL option. There's a slight difference in numbers, but that could be down to connection issues, proxies, or sites not working during submission.

    URL Redirect Pro is giving an extra 1,300 links when using the single URL option.
  • londonseo London, UK

    Yes, so with the multiple URL option, one should get X times the number of links.
  • sickseo London, UK
    edited October 2024
    You do get X times the number of links with multiple URLs, but the link numbers shown in the software are for the current run only. When it starts processing a new URL, those stats are reset.

    Are you using the "save to file" and "create reports" options in the tools? I'm getting separate files of links for each URL from each tool, and the link numbers in each file do correlate with the above.
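    A quick way to sanity-check those totals is to count the non-empty lines in each report file. A minimal sketch (the reports folder and *.txt pattern are placeholders for wherever you point "save to file"):

        import glob
        import os

        # Tally the per-URL link reports written by the tools to confirm
        # the combined numbers add up across runs.
        REPORTS = r"C:\Tools\reports"  # placeholder folder
        total = 0
        for path in sorted(glob.glob(os.path.join(REPORTS, "*.txt"))):
            with open(path, encoding="utf-8", errors="ignore") as fh:
                count = sum(1 for line in fh if line.strip())
            print(os.path.basename(path), count, "links")
            total += count
        print("combined:", total)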
    Thanked by londonseo
  • londonseo London, UK
    OK, I was not using the "save to file" and "create reports" options.

    Let me try that.

    Thanks again B)