
Low LPM


Currently I have this problem of low submissions. I am using a verified list, proxies from Storm Proxies and MyPrivateProxy, emails from catchallboxes.com, and XEvil for captchas. It has been very slow the last few days. What might the problem be? I've had this issue for a week now, since a recent update.









Comments

  • Sven www.GSA-Online.de
    Sorry but I don't know what "slow" means for you and even then I would not see anything without logs.
  • hyprgandr Indonesia
    edited November 3
    I have less than 1-3 LPM on my projects; usually it is around 10 or 20 LPM. I noticed the drop after the update, and I really don't know what I need to change to get the LPM back up.

  • Sven www.GSA-Online.de
    That log was from all kinds of projects and not helpful. I guess the amount of targets is just not enough. Are you relying only on site lists?
  • hyprgandr Indonesia
    Sven said:
    That log was from all kinds of projects and not helpful. I guess the amount of targets is just not enough. Are you relying only on site lists?
    Yes, I am relying on the site list from serpgrow. Can the site list be the problem? If so, what verified list do you suggest for the best speed and quality right now?
  • Sven www.GSA-Online.de
    I'm sorry, I have no comparison on that. Maybe someone else can make a suggestion here.
  • sickseo London, UK
    edited November 4
    LPM/VPM can be affected by quite a few different things. The list makes a big difference, and not all sites that you have found to work with GSA SER will keep working. Some sites have a shelf life, especially ones that are moderated by their owners. Some list sellers will sell the same list to a few hundred different people, so the sites get hammered; if you owned one of those sites, you'd likely do something to stop all that content being posted on it.

    First thing I noticed is that you are using the serpgrow verified list, have put it in the verified folder, and have also ticked the box next to the verified folder. So you are telling the software to save any new verified links to the same folder where your purchased list is. This is no good, as you are trying to write to the purchased list that is being synced into your Dropbox. Keep them separate.

    Second thing is that you are only using private proxies for submission. You should also use them for email checking and verification.

    1. The email server will eventually block your requests as they come from the same IP over and over. Most servers have limits on simultaneous connections, for example, 3 simultaneous connections per IP.
    2. During verification, you are using the same server IP over and over again, which is no good: sites that detect and block repeated requests from one IP will eventually block the software from retrieving the verified link. At high speeds, some sites will think they are under attack and their security software will kick in. So use private proxies here.
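The two points above amount to a small per-IP scheduler: rotate the pool, and never let one IP carry more than a few simultaneous connections. A minimal Python sketch of the idea (the proxy addresses, the 3-connection cap and `check_email` are all hypothetical; GSA SER handles this internally once proxies are enabled for email checking and verification):

```python
import threading

# Hypothetical proxy pool; GSA SER manages its own pool internally.
PROXIES = ["192.0.2.10:8080", "192.0.2.11:8080", "192.0.2.12:8080"]
MAX_CONN_PER_IP = 3  # mirrors the "3 simultaneous connections per IP" example

# One semaphore per proxy caps how many checks share that IP at once.
_locks = {p: threading.BoundedSemaphore(MAX_CONN_PER_IP) for p in PROXIES}
_rr = 0
_rr_lock = threading.Lock()

def next_proxy():
    """Round-robin proxy selection so no single IP gets hammered."""
    global _rr
    with _rr_lock:
        proxy = PROXIES[_rr % len(PROXIES)]
        _rr += 1
    return proxy

def check_email(account):
    """Acquire a slot on the chosen IP before touching the mail server."""
    proxy = next_proxy()
    with _locks[proxy]:  # blocks if 3 checks already use this IP
        # ... connect to the mail server through `proxy` here ...
        return (account, proxy)
```

The same shape works for verification requests: the semaphore is what keeps any one IP under the server's connection limit.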

    Create A Clean Verified List First

    You should have the purchased list in a different folder, hooked up to any of these three folders: Identified, Submitted, Failed. Let's use the Identified folder as an example.
    Your Verified folder should be empty to start with. (As the box is ticked, any newly verified links that work with your setup will be saved there.) Then set one project to use the folder where your purchased list is, the Identified folder. This project will process the purchased list and filter out non-working sites; it basically cleans the list. This project will run slowly, as the list contains a mixture of working and non-working sites.
    Set all other projects to use only the Verified folder. These projects will run super fast, as every link has been tested to work with your setup. The list now contains only working sites.

    Making just that change will increase your LPM/VPM significantly. I recommend doing it at least once per month, or whenever you notice a drop in LPM/VPM. This is just maintenance.
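The cleaning step described above boils down to: run every URL from the purchased list through a test, and keep only the ones that still work. A rough Python sketch of that idea (the plain HTTP liveness check is a stand-in; a real GSA SER cleaning project also matches engines and actually attempts a post):

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def is_alive(url, timeout=10):
    """True if the site still responds. Hypothetical stand-in for the
    full submit-and-verify test that a GSA SER project performs."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False

def clean_list(urls, checker=is_alive, threads=20):
    """Filter a purchased list down to responding sites only,
    preserving the original order."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        flags = list(pool.map(checker, urls))
    return [u for u, ok in zip(urls, flags) if ok]
```

The surviving URLs are what would land in the (initially empty) Verified folder; every other project then reads only that cleaned output.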

    Performance for me right now is excellent. On a dedi with 1000 threads I get 1000+ VPM continuously with v2/v3 solving enabled. On a VPS with 300 threads I get anything from 200 to 400+ VPM.

    Speed also depends on the site selection. Faster links are fast indexers, exploits and URL shorteners; these platforms can run very fast, as there are no emails, content or captchas to solve on most of them. Less work for the software, so links are made faster.
    Slower links are articles, blog comments, forums, social networks, wikis and any other link sources that require emails, content and captchas.

    V2/V3 sites will significantly slow things down, as average solve times for V2/V3 are 50-65 seconds per submission. Captchas on sites without V2/V3 get solved in milliseconds.

    Aside from the above, you can further optimise for speed, such as upgrading your VPS to one with a faster internet line (1 Gbps or even 10 Gbps) and then using proxies that support the same speed, located in the same region as your VPS.

    Hope that helps.
  • sickseo London, UK
    Also, 30 threads is incredibly low. The more threads you use, the higher the LPM/VPM. Try Hetzner Cloud; the VPS in their Helsinki location are incredibly fast. Their CPX31 can run GSA SER at 300 threads and XEvil v2/v3 solving at 50 threads, and costs less than 15 euros per month with a 1 Gbps connection. I've moved all my VPS there now.
  • hyprgandr Indonesia
    sickseo said:
    Also 30 threads is incredibly low. […]
    Thank you for the really helpful advice. If I may, could I get a screenshot of your submission and advanced settings, especially regarding the verified list?

    1. May I also ask which captcha solver and proxies you are using?

    2. My proxy provider (Storm Proxies) said I cannot use more than 10 threads or they will ban me, since I was using their rotating proxies. If you have any better recommendations, please share some of your providers; that's why I use a low thread count on my projects. I currently run 10 projects at the same time.

    It would be great to get at least 100 LPM.

    Once again, I am thankful for your advice.
  • sickseo London, UK
    edited November 4
    Here you go: [settings screenshots]

    Captcha

    For captcha I have different set ups running on different machines.

    1st Set Up: The best one uses GSA Captcha Breaker and XEvil with V2/V3 solving. That install generates the most links, and using both captcha-solving tools together will let you make links on a lot more sites. Most list sellers use both programs to build their lists; serpgrow definitely does. Each tool is good at solving different types of captchas, so it's worth having both.

    2nd Set Up: On the other machines I'm using just GSA Captcha Breaker and XEvil with v2/v3 solving disabled. Slightly fewer sites, but things run a lot faster when v2/v3 is disabled. Plus I don't have to pay extra for XEvil proxies to solve the v2/v3.

    3rd Set Up: No captcha solving. Far fewer sites get verified links, but the VPM is pretty nuts on the VPS I'm using.



    I still see ranking increases, but that's due to the list I've built for myself and the strategy I use. Even without captcha solving, I can still build links on a few thousand sites if you include blog comments and URL shorteners. If you include indexers, this install can make links on over 80k domains with no captcha solving lol. That's plenty to rank with.

    So it's up to you. All 3 setups work great, but the 1st is the most expensive, as you need 2 sets of proxies: one for GSA and one for XEvil.

    Proxies

    Definitely need to be using dedicated proxies here. Avoid public proxies, and avoid shared proxies if you want to max out the speed. Storm Proxies for GSA link submission/verification is a big no-no. I recently started using 10 Gbps private proxies from here: https://www.leafproxies.com/collections/monthly-datacenter-proxies/products/october-monthly-uk-proxies-10gbps

    The VPS I have is in Helsinki, so UK proxies still run pretty fast for me. At the very least you need to be using 1 Gbps private proxies if you want to see an improvement in your VPM, along with a VPS that has a 1 Gbps connection.

    I've got 35 proxies in total and they're used across 39 different VPS, each running 300 threads. That's roughly 12,000 threads in total going through 35 dedicated proxies. Speeds are amazingly fast. Proxies are $2 each, so it's not expensive. As they are 10 Gbps proxies, I can run 10 times as many threads through them compared to my previous supplier, which had 1 Gbps speeds. Plus it's unlimited: run as many threads as your VPS can handle.

    Storm Proxies is only good for scraping non-Google search engines and for CTR/direct-traffic bots.

    Once you've built your verified list and changed your proxies to dedicated, you'll notice a massive improvement in vpm/lpm. At 300 threads you should be hitting between 200 and 400 lpm/vpm easily.
    Thanked by: bolaplay, hyprgandr
  • bolaplay Indonesia
    @sickseo, I thought that with SEREmail you don't need to enable proxies for email checking and verification, thanks for the tips. Anyway, for your XEvil, what proxies do you recommend that have a high success rate?
  • sickseo London, UK
    For XEvil, I'm using these proxies:

    https://reproxy.network/#prices

    Best prices I've found with excellent solve rates.

    I can't speak for SEREmail; I don't use their service. I use my own private catchall emails set up on a VPS from SolidSeoVPS. I have a lot of installs running with a lot of projects, so the server receives a lot of simultaneous connections and starts to reject mine.

    It depends on how fast you run it, how many projects you have and how many installs you're running. If you run just one install with 10 projects, you probably won't notice an issue. But if you run 6000 projects simultaneously, you will definitely start to run into issues, and you will need to use proxies for email verification along with additional delays between connections per email account.
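The per-account delay mentioned above is just a timestamp check before each connection: remember when each mailbox was last touched and sleep off the remainder. A minimal sketch (the function name and delay value are hypothetical, not a GSA SER setting name):

```python
import time

# Hypothetical per-account throttle: enforce a minimum gap between
# connections to the same catchall mailbox.
_last_seen = {}

def wait_turn(account, min_delay=30.0):
    """Sleep until `min_delay` seconds have passed since this account
    was last checked, then record the new check time."""
    now = time.monotonic()
    last = _last_seen.get(account)
    if last is not None:
        remaining = min_delay - (now - last)
        if remaining > 0:
            time.sleep(remaining)
    _last_seen[account] = time.monotonic()
```

Different accounts don't block each other; only repeat hits on the same mailbox get spaced out.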
    Thanked by: bolaplay
  • bolaplay Indonesia
    Your suggestion makes sense even for a low project count; since I have dedicated proxies, why not use them for email too :) I have switched to using proxies for email checking and verification, and proxy latency is still good at 200 threads. I also turned off "skip for identification".
  • googlealchemist Anywhere I want
    sickseo said:
    Here you go: […]
    Great write-up, thank you.

    How many proxies do you need per XEvil thread to prevent them from getting banned?

    With the "slightly fewer sites" of set up 2 vs set up 1: is set up 2 so much faster that it builds more overall links per day than set up 1? I'm also wondering whether some or most of the sites using reCAPTCHA v2/v3 are higher quality overall, and worth the extra time and cost to get a link from.
  • sickseo London, UK
    So for XEvil proxies, this is what I'm using: https://reproxy.network/buy.php

    I'm getting hardly any bans. They charge by how many solving threads you'll use; 50 threads is plenty for one GSA install, but this may depend on how many v2/v3 sites are in your list.

    With regards to set up 2 vs set up 1: yes, you're absolutely right. Definitely worth getting links on more sites and using v2/v3, especially in tier 1. The proxies only cost 5 bucks for IPv6 with 50 solving threads.

    Set up 1 is what I'll be using for my own business. Although my tests with the no-captcha site list are still moving rankings up, so in my case it's actually not necessary to use any captcha solving at all. Also, when I see 500+ VPM with no captcha solving, I can finish 3 days of campaigns in literally 1 day. The impact on rankings when you can build links that quickly is quite phenomenal. Hence why I prefer speed over raw site numbers, given the choice.
  • sickseo said:
    Here you go: […]

    How do you scrape Google with just 35 dedicated proxies?
  • sickseo London, UK
    edited November 19
    Firstly, I don't scrape Google. It's too expensive to even try. To do it properly, you'd need proxy subscriptions from places like Smart Proxy or OxyLabs.

    Those installs I have are just doing link building. For scraping new sites, I use hrefer on separate machines, with either public proxies or Storm Proxies, scraping non-Google search engines. This syncs to a folder which then gets processed by Platform Identifier, which creates an identified list for me automatically. That identified list then gets processed by one project on one of my GSA installs. It's all automated, and it adds sites to my working verified list. The verified list is also synced to all my installs, and all servers build new links from it.

    I prefer to use GSA SER for what it's really good at, which is building links at speed. If you do both link building and scraping at the same time inside GSA SER, you will notice your VPM suffer considerably, as you are using its resources to scrape sites and then test them, most of which won't be working sites. While your GSA is scraping and testing, mine is just building links 24/7 from a pre-filtered, tested, working verified list. That's why my VPM is usually very high.

    Hence why I keep the 2 processes separate, and even use different tools for the scraping job.


    Thanked by: draculax
  • googlealchemist Anywhere I want
    sickseo said:
    Firstly, I don't scrape google. […]
    Mind saving me some testing time and letting us know what proxy-to-thread ratios you've found work for Bing/Yahoo?
  • sickseo London, UK
    Most of my installs run at 10-25 threads due to thread limits on the proxy packages. Otherwise I'm running at 100 threads with public proxies from GSA Proxy Scraper, scraping 8 different search engines including Bing. Bing quite literally spits out the results, as do Rambler and seznam.cz. I could easily run more threads if I had more, or better-quality, proxies. Although it depends on the footprints and keywords used, I'm easily pulling 100,000+ scraped results per day per install.
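The scraping described above is essentially footprint × keyword query generation: every engine footprint gets paired with every niche keyword, which is why result counts grow so quickly. A tiny sketch of the idea (the footprints and keywords shown are made-up examples, not hrefer's actual lists):

```python
from itertools import product

def build_queries(footprints, keywords):
    """Combine engine footprints with niche keywords into search
    queries, the same idea hrefer-style scrapers use to find targets."""
    return [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

# Hypothetical example inputs.
footprints = ['"powered by wordpress"', 'inurl:guestbook']
keywords = ["gardening", "fishing"]
queries = build_queries(footprints, keywords)
```

With a few hundred footprints and a few thousand keywords, the cross product alone explains six-figure daily result counts.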

    I'm testing another proxy service Proxy Rack which seems to be giving me google results but I'm still seeing a lot of ip bans with it.
  • googlealchemist Anywhere I want
    I don't mean to be dense, but I'm still confused about how many threads per total proxies you mean.
    How many total public proxies are you using to run 100 threads?
  • sickseo London, UK
    Anything from 400 to 3000 public proxies. Numbers vary, as it depends on what GSA Proxy Scraper has tested as working. In any case, I still run at 100 threads with public proxies regardless of how many I have. It's also split across 8 search engines, so the same search engine doesn't get the same proxy used often.

    Best to just test for yourself and see what works for you and your set up. 
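Splitting the pool across engines, as described above, can be sketched roughly like this (the helper names are hypothetical; GSA Proxy Scraper's real rotation logic is its own):

```python
from collections import defaultdict, deque

def shard_proxies(proxies, engines):
    """Deal the proxy pool across engines so each engine cycles its own
    slice and rarely repeats the same IP back-to-back."""
    shards = defaultdict(deque)
    for i, proxy in enumerate(proxies):
        shards[engines[i % len(engines)]].append(proxy)
    return shards

def next_for(shards, engine):
    """Take the front proxy for this engine and rotate it to the back."""
    q = shards[engine]
    proxy = q[0]
    q.rotate(-1)
    return proxy
```

With 8 engines and a few hundred proxies, each engine sees a long rotation before any IP repeats, which is what keeps per-engine ban rates low even at 100 threads.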
  • sickseo said:
    Anything from 400 to 3000 public proxies. […]
    Hi,
    I am using proxies from Webshare; they are very cheap and almost all of them work all the time.
    I have also tried proxies from Instant Proxy, but the results are the same.

    My LPM is very low, like 5-20 LPM, when I run GSA at 800 threads with 90 proxies.
    I am not using URL shorteners because they are useless.
    I got my list from www.serverifiedlists.com


    1. Do you have any suggestions for how to increase LPM?
    These are the engines I use: [screenshot]

    My setup:
    XEvil + GSA
    System: Ryzen 7 @ 4.5 GHz
    32 GB RAM at 3800 MHz
    NVMe SSD (super fast)
    RTX 3060

  • sickseo London, UK
    On that setup with 800 threads, you should be seeing around 800 VPM. Sounds like it's your list. Have you processed the purchased list and separated out the working sites?

    I'm guessing you are running all your projects on that purchased list? If you make a clean verified list first, it will run a lot faster.
  • Hi, I just got the list and I'm working on it with 4 campaigns.
    Yes, all of them run on the same purchased list.

    How do I make a clean verified list?
  • Please check the selected engines; I am not using URL shorteners or anything like that.
  • sickseo London, UK
    sickseo said:
    LPM/VPM can be effected by quite a few different things. […]

    I explained it here.
  • sickseo London, UK
    @ProGamer Is that for pointing at your tiers? Any reason why you don't have articles selected? That's the best source of links.

    Ignoring URL shorteners is a big mistake in my opinion. There are a ton of high-DA sites if you can find them. Most are 301 redirects, which pass link juice. Many have keywords in the URL, which is a strong signal. And a ton of them are dofollow with decent DA. Plus they dilute your anchors if you point them at your money site, as there is no anchor text on most of them.
    I've got backlinks from Google on multiple TLDs by using URL shorteners.
    I actually see no reason to avoid them.

    This is what I do: I use one setup for my T1 links, and a different setup for my tiers.

    For T1 (dofollow campaigns, to money site): [screenshot]

    For tiers (dofollow campaigns, use on all tiers): [screenshot]

    Those settings max out the link juice available from dofollow links. All nofollow platforms are deselected.

    I've got different set ups for no follow links too, but I use those much less.

    That is just one way of doing it.
  • Okay, I'll try this out for tier 1.
    Also, is there a place to get a good list? To be honest, they all seem pretty trash.

    So should I use URL shorteners and microblogs on tier 1?
    Can't I use blog comments on tier 1?
  • sickseo London, UK
    There are no list sellers that I could recommend, unfortunately. I've been scraping and building my own lists. My list appears to be way bigger than any of the current list sellers' lol

    For tier 1, I literally use any link that is dofollow and has a keyword as anchor text. If you build a 3-tier structure with enough T2 and T3 links, you will rank for every keyword used in the T1 links. Even those exploit links are a hidden ranking gem that no one really knows about. Probably because they're too scared to use them on their money site lol

    URL shorteners are great for tiers, especially if you just use the dofollow ones. I've used them to the money site as tier 1, but as there is no keyword anchor text, they don't directly influence rankings for a particular keyword. However, they will boost the page authority of any page they're pointed at, which helps rankings for any keywords that appear on your landing page. It's like "indirect" SEO.

    There are plenty of other benefits, explained already. I'd do it once to all money-site URLs as part of a 3-tier campaign, just to get the boost in DA. But to actually move keywords up the rankings, the engine selections above are what I'd recommend, as the majority of those link sources are dofollow with keywords as anchor text.

    Microblogs I use because there is one site in my list with very good DA. No big deal whether you use these or not; it depends on what you have in your site list.

    Blog comments can be very good for tier 1, as there are literally millions of them, so it's an easy way to boost your DA. Links from unique domains that are dofollow contribute to boosting a site's DA. I set a filter on projects for a maximum of 100 OBLs (outbound links). This filters out the heavily spammed sites with high OBLs; I've seen plenty of sites with 10,000+ OBLs lol. You want to avoid using those, as there is literally no SEO value when the OBL count is that high. The only place for those types of links is to be pointed at links that need indexing.
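The 100-OBL filter above just counts off-site anchors on a page and rejects pages over the threshold. A rough Python sketch of that check (the class and function names are hypothetical; GSA SER exposes a similar outgoing-links filter in its project options):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Counts anchors that point off-site (outbound links, OBLs)."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.obl = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        if host and host != self.own_domain:
            self.obl += 1  # absolute link to a different domain

def passes_obl_filter(html, own_domain, max_obl=100):
    """True if the page stays under the outbound-link threshold."""
    counter = LinkCounter(own_domain)
    counter.feed(html)
    return counter.obl <= max_obl
```

A page with 10,000 outbound links fails instantly; relative links to the page's own domain don't count against the threshold.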
  • praveen India
    edited December 1
    Use the right verified lists. Correct private proxies and emails are very important.

    You can use gsaserlist.one for a verified list. It's very useful.
  • sickseo said:
    There are no list sellers that I could recommend unfortunately. […]
    Yeah, I am kind of scared to use the exploit engines on my tier 1.
    I am using Webshare proxies (400 of them) and they are working for now; since I run 2000 threads, it could get expensive if I had to buy 200 proxies at $1 each.
    I'm getting 140 LPM now at 2000 threads. I'm sure it can go higher with a better list; I'm using serverifiedlists right now.
    After I updated my list, my LPM increased a lot.

    1. When I use shorteners, it feels like almost all the backlinks that get created are just URL shorteners now.

    I just spent many hours creating 8 campaigns (5 tiers each) for my website; let's see how fast it ranks now. I will be updating here!

    I'm really confused about which engines to use, because I don't want to harm my sites.
    If anyone genuinely knows a good list then please let me know, no referrals or anything like that LOL.
    For now I selected engines using the right-click option on the engine bar: "uncheck engines that use no contextual links".
