
VERIFICATION SUCCESS RATE [need help]

TimothyG (Indonesia)
Need help, please.

I have been running this campaign for a week, but out of the 10,000 to 30,000 links submitted, only about 250 are verified. Why?

I have made sure to use private IPv4 proxies, and I also use the 0captcha and XEVIL services. For email I use my own hosting. I have roughly 200 articles in spintax format, and verification is set to run every 3 hours.

FYI: the campaign runs all engines except Redirect, Short URL and Pingback.
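The spintax format mentioned above works by picking one option from each {a|b|c} group, so one template yields many article variants. A minimal sketch of such an expander (illustrative only, not SER's actual implementation):

```python
import random
import re

def spin(text: str, rng: random.Random = random) -> str:
    """Expand spintax such as '{Hello|Hi} {world|there}' by repeatedly
    resolving the innermost {a|b|c} group until none remain."""
    pattern = re.compile(r"\{([^{}]*)\}")  # matches innermost groups only
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

# Seeded RNG so the same template can be re-spun reproducibly
print(spin("{Hello|Hi} {world|there}", random.Random(0)))
```

Resolving the innermost group first is what makes nested spintax like `{x{1|2}|y}` work.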

Comments

  • Where did you get the target list?
  • TimothyG (Indonesia)
    I bought it from a trusted seller in Indonesia. He is an Indonesian YouTuber, so it seems he really maintains his credibility.
  • edited July 2
    TimothyG said:
    I bought it from a trusted seller in Indonesia. He is an Indonesian YouTuber, so it seems he really maintains his credibility.
    No. You got scammed. That's most likely an identified list. I have been scraping lists for a very long time, and that's what my identified list looks like after I sort a raw scrape through GSA PI. Your LPM is much better than a raw scrape would give, and right on par with trying to post to an identified list.

    An identified list is just a raw scrape that has been run through GSA PI or SER itself and sorted by identified platform. There's no guarantee you can post to them. You turn an identified list into a verified list by running campaigns like you are doing now.

    If you're going to buy lists, buy a verified list. There's a buy/sell/trade forum here. Those sellers are far more trustworthy than a scammer on YouTube with a bunch of fake accounts hyping the service.
  • sickseo (London, UK)
    Extremely high submitted numbers can mean either an issue with your emails or that the sites no longer work for creating new accounts and links.

    You've got a lot of "already parsed" messages in the logs - this means your site list contains duplicates.

    It does not look like a clean verified list with duplicates removed. The rate of new links being verified is way too slow. So yes, it does look like you've wasted your money on this list.

    A good verified site list should be able to make at least 100 new verified links per minute (contextuals and profiles at 300 threads).

    With other settings configured, such as scheduled reposting, my servers can go as fast as 400-500 verified links per minute, which equates to over 500,000 links per day per server.

    That is what the software is capable of if you feed it a good verified list.
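The "already parsed" duplicates mentioned above can also be stripped from an exported site list outside the software. A rough sketch, assuming one URL per line and deduping by domain (SER's own duplicate-removal tools do the same job):

```python
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Keep only the first URL seen for each hostname, preserving order.
    Mirrors a 'remove duplicate domains' cleanup on a site list."""
    seen, unique = set(), []
    for url in urls:
        host = urlparse(url.strip()).netloc.lower()
        if host and host not in seen:
            seen.add(host)
            unique.append(url.strip())
    return unique

site_list = [
    "https://example.com/blog/post-1",
    "https://example.com/blog/post-2",   # duplicate domain, dropped
    "https://another-site.org/forum",
]
print(dedupe_by_domain(site_list))
```

Deduping by domain rather than by exact URL is stricter, which suits contextual/profile engines where one account per site is enough.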
  • TimothyG (Indonesia)
    sickseo said:
    Extremely high submitted numbers can either be an issue with your email or that the site no longer works for making new accounts and links.

    You've got a lot of "already parsed" messages in the logs - this means your site list contains duplicates.

    It does not look like a clean verified list with duplicates removed. The rate of new links being verified is way too slow. So yes, it does look like you've wasted your money on this list.

    A good verified site list should be able to make at least 100 new verified links per minute (contextuals and profiles at 300 threads).

    With other settings configured, such as scheduled reposting, my servers can go as fast as 400-500 verified links per minute, which equates to over 500,000 links per day per server.

    That is what the software is capable of if you feed it a good verified list.

    What factors/variables determine such good results?

    I have tried to maximize my campaign with good articles, IPv4 proxies, 3 layers of captcha solving (GSA Captcha Breaker, XEVIL, 0captcha) and email on my own hosting. What else needs to be improved?
  • sickseo (London, UK)
    Your configuration of the software, proxies, captcha solving and emails sounds fine. You could also add DeepSeek to solve text captchas - very cheap.

    You just need more working link sources. Buying site lists from different list sellers is one way to do this. The better way is to scrape them yourself, test them and build your own working verified list. You will find sites that most list sellers won't be selling in their lists.

    If you want even more link sources, take a look at the new SERnuke engines: https://forum.gsa-online.de/discussion/33465/sernuke-custom-gsa-ser-engines I've added a few thousand new sites to my site list from these engines alone, and I'm still scraping and finding more every day.

    (Video: GSA SER High VPM.mp4) That's one of my installs running close to 400 VPM using a list of around 6,000 domains. Scheduled posting is enabled - it runs quicker by reposting to the same accounts.

    For scraping new sites I've built my own bots with ZennoPoster that scrape the Google, DuckDuckGo, AOL, Ecosia, Brave, Yandex and Seznam search engines. There are many more search engines out there that I could scrape; I'm only running 30 bots that target a small group of engines.



    This is where your biggest opportunity is right now. You need a good scraping solution and also need more engines that you can scrape sites for. The more sites you have in your list, the easier it will be to rank when you start building your tiers with them.
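Scraping setups like the one described above typically combine platform "footprints" with keywords to produce the queries sent to each search engine. A hypothetical sketch of that query-building step (the footprints and keywords here are illustrative, not from sickseo's actual setup):

```python
from itertools import product

def build_queries(footprints, keywords):
    """Pair every platform footprint with every keyword to produce
    the search queries a scraper bot would send to each engine."""
    return [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

# Example footprints for a hypothetical target platform
footprints = ['"Powered by GnuBoard"', 'inurl:bbs/board.php']
keywords = ["seo tools", "link building"]
for query in build_queries(footprints, keywords):
    print(query)
```

The footprint count times the keyword count gives the total query volume, which is why even 30 bots can keep finding new sites every day.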
  • TimothyG (Indonesia)
    edited July 3
     sickseo said:
    Your configuration of the software, proxies, captcha solving, emails sounds fine. You could also add deepseek to solve text captchas  - very cheap.

    You just need more working link sources. Buying site lists from different list sellers is one way to do this. The better way is to scrape them yourself, test them and build your own working verified list. You will find sites that most list sellers won't be selling in their lists.

    If you want even more link sources, take a look at the new SERnuke engines: https://forum.gsa-online.de/discussion/33465/sernuke-custom-gsa-ser-engines I've added a few thousand new sites to my site list from these engines alone, and I'm still scraping and finding more every day.

    (Video: GSA SER High VPM.mp4) That's one of my installs running close to 400 VPM using a list of around 6,000 domains. Scheduled posting is enabled - it runs quicker by reposting to the same accounts.

    For scraping new sites I've built my own bots with ZennoPoster that scrape the Google, DuckDuckGo, AOL, Ecosia, Brave, Yandex and Seznam search engines. There are many more search engines out there that I could scrape; I'm only running 30 bots that target a small group of engines.



    This is where your biggest opportunity is right now. You need a good scraping solution and also need more engines that you can scrape sites for. The more sites you have in your list, the easier it will be to rank when you start building your tiers with them.
    DeepSeek as a captcha solver? How can DeepSeek be used to solve captchas?

    By the way, where can I get a high-quality verified list?
  • sickseo (London, UK)
    Support for DeepSeek text-captcha solving is already configured in the software. You just need to add your API key and it will work great. I fund my DeepSeek account with $10 and it lasts several months before topping up again.

    For site lists, this is probably one of the better ones out there: https://gsaserlists.com/ They also include sites for the new SERnuke engines. Their verified list does contain working sites, so you should see an increase in speed when using it.
  • TimothyG (Indonesia)
    sickseo said:
    Support for deepseek text captcha solving is configured in the software already. You just need to add your api key and it will work great. I fund my deepseek account with $10 and it lasts several months before topping up again.

    For site lists, this is probably one of the better ones out there: https://gsaserlists.com/ They also include sites for the new sernuke engines. Their verified list does contain working sites, so you should see an increase in speed when using it.
    Thank you for your help earlier; it really helped me improve my GSA performance.

    Do you have any recommendations for reliable proxy and Remote Desktop Protocol (RDP) services?
  • TimothyG (Indonesia)
    sickseo said:
    Support for deepseek text captcha solving is configured in the software already. You just need to add your api key and it will work great. I fund my deepseek account with $10 and it lasts several months before topping up again.

    For site lists, this is probably one of the better ones out there: https://gsaserlists.com/ They also include sites for the new sernuke engines. Their verified list does contain working sites, so you should see an increase in speed when using it.


    Because of your advice I just decided to buy GSA SER Link Lists.

    The question is: do I have to use SERnuke or not?
  • sickseo (London, UK)
    For proxies, any dedicated proxy provider should be OK if you're just running 1 install.

    Personally, I have high resource needs, as I have multiple installs running the software at high thread counts, and most proxy providers will limit your threads and bandwidth. The only provider I've found with truly unlimited threads/bandwidth is privateproxy.me. I use their static datacenter proxies and got them at 50% off with code 50OFF. Their starter package with 13 proxies is more than enough for running 1 GSA install. The proxies are fast.

    For RDP I'm using Terminals, which is free.

  • sickseo (London, UK)
    TimothyG said:
    sickseo said:
    Support for deepseek text captcha solving is configured in the software already. You just need to add your api key and it will work great. I fund my deepseek account with $10 and it lasts several months before topping up again.

    For site lists, this is probably one of the better ones out there: https://gsaserlists.com/ They also include sites for the new sernuke engines. Their verified list does contain working sites, so you should see an increase in speed when using it.


    Because of your advice I just decided to buy GSA SER Link Lists.

    The question is: do I have to use SERnuke or not?
    I'd recommend it. These new engines add a lot of new contextual and profile link sources, which are the best types of links to use for rankings.

    Once you have the SERnuke licenses, you'll be able to build links on a lot more sites than with just the default engines. Links from more unique domains will lead to higher authority for your money sites if you power up the T1s with tiered links.

    The more sites you can build links on, the easier it will be to rank your projects. Of course, you will still need to get your links indexed in Google to see results in rankings - but that's another challenge.
  • TimothyG (Indonesia)
    edited July 4
    sickseo said:
    TimothyG said:
    sickseo said:
    Support for deepseek text captcha solving is configured in the software already. You just need to add your api key and it will work great. I fund my deepseek account with $10 and it lasts several months before topping up again.

    For site lists, this is probably one of the better ones out there: https://gsaserlists.com/ They also include sites for the new sernuke engines. Their verified list does contain working sites, so you should see an increase in speed when using it.


    Because of your advice I just decided to buy GSA SER Link Lists.

    The question is: do I have to use SERnuke or not?
    I'd recommend it. These new engines add a lot of new contextual and profile link sources, which are the best types of links to use for rankings.

    Once you have the SERnuke licenses, you'll be able to build links on a lot more sites than with just the default engines. Links from more unique domains will lead to higher authority for your money sites if you power up the T1s with tiered links.

    The more sites you can build links on, the easier it will be to rank your projects. Of course, you will still need to get your links indexed in Google to see results in rankings - but that's another challenge.
    If I only use GSA SER Link Lists, without SERnuke, can the backlink-creation process still run? And will it increase my VPM compared to before?
  • sickseo (London, UK)
    Yes, of course. The software will still run well without the SERnuke engines add-on.

    The GSA SER list should give you much better VPM than your previous list, as it contains working sites - dead sites are frequently removed from the list and new sites are added automatically.
  • TimothyG (Indonesia)
    sickseo said:
    Yes, of course. The software will still run well without the SERnuke engines add-on.

    The GSA SER list should give you much better VPM than your previous list, as it contains working sites - dead sites are frequently removed from the list and new sites are added automatically.
    I've been running this campaign for a week, using a site list from gsaserlists.com.

    The number of verified links is still not optimal.

    In your opinion, what could be improved? What else can I do to make it better?


  • sickseo (London, UK)
    edited 12:02PM
    You've still got a high number of submitted links not being verified. This normally means that the site no longer works, or that there is an issue with your email preventing verification emails from being confirmed. Are you using public proxies for submissions? That is a big no-no.

    To optimise things further, you should clean that site list first and remove all non-working sites. Once you've done that, things should run a lot faster. Although, if you are only getting 800+ links (including blog comments) from that site list, it hardly seems worth bothering!

    Also, blog comments use a lot of CPU and run the slowest of all the link sources. I wouldn't expect things to run fast with blog-comment engines enabled.

    You really need to learn how to scrape and test sites so that you can build your own site list. Until you do that, you'll never tap into the true potential of the software.

    It looks like you are running T2 campaigns, so this is a good time to use the re-posting options so that the software creates multiple accounts and posts from the same site list.



    That's one example, but you may have to tweak the settings. Enabling the "per URL" option will re-use the same site list and point each site at each T1, so it will make a lot more links a lot quicker.

    The software is capable of making 800 links in a few minutes - it shouldn't take a week to make 800 links.
  • edited 12:37PM
    Shame that you bought a list from a reputed seller and you're getting unverified targets with no form at all. Looks like another scam to me.

    Agree with cleaning that list up to remove non-working sites. That should speed it up.

    Here’s what I like to do to build a site list with a lot of (spammy) targets quickly:

    Scrape blog comments and guestbooks, then use GSA PI to identify targets and clean up the list. GSA PI also has a feature to extract external links. Alternatively, you can use Scrapebox to crawl all the internal links of the websites and then extract the external links. Do one of the two.

    Doing this yields targets that people have already posted to using GSA SER.
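The extract-external-links step described above boils down to resolving each href against the page URL and keeping only off-domain links. A minimal sketch of that classification logic (the href list would come from a crawler such as Scrapebox or GSA PI, which is assumed here):

```python
from urllib.parse import urljoin, urlparse

def external_links(page_url, hrefs):
    """Resolve each href against the page URL and keep only links
    pointing at a different host - the candidate new targets."""
    base_host = urlparse(page_url).netloc.lower()
    found = []
    for href in hrefs:
        absolute = urljoin(page_url, href)   # handles relative hrefs
        host = urlparse(absolute).netloc.lower()
        if host and host != base_host:
            found.append(absolute)
    return found

# hrefs as a crawler might collect them from a guestbook page
hrefs = ["/about", "https://target-site.com/profile/123", "#top"]
print(external_links("https://example.com/guestbook", hrefs))
```

Internal links and fragments resolve back to the same host and are dropped, so only other people's posted targets remain.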
  • sickseo (London, UK)
    You are right - that list is incredibly disappointing, especially considering it's supposed to be one of the better ones out there.
  • sickseo said:
    You are right - that list is incredibly disappointing, especially considering it's supposed to be one of the better ones out there.
    I think the last good list was the one sold back in 2013 that had a red list and a blue list, separated by the types of captchas on the sites. One list had reCAPTCHA, the other was for GSA CB. Those lists rocked! I hate to see people wasting time and money. That sucks!
  • sickseo (London, UK)
    Oh absolutely - those lists were amazing. There were thousands of contextual link sources. I still remember them well.

    No one is doing anything like that anymore. To be fair, though, it's harder these days to build a decent list with just the default engines, especially do-follow contextuals to use as T1 link sources. GnuBoard was the last CMS that was great for this, but over the last couple of years the number of working sites has been dropping pretty quickly.

    Which is why it's unlikely you will find any list seller offering a list with thousands of working sites. You'll be lucky to get a few hundred contextual sites, and even then maybe 80% of them will be no-follow - no good for T1.

    Without the SERnuke engines I was very close to abandoning the software, as the majority of my targets have been blog comments, guestbooks, redirects and indexers. None of these are any good for T1.

    I've been building my PBN network for the last 3 years and that's the route I'm likely to continue with. Automated tools like GSA SER and RankerX are better for T2/T3; they can't be sustained as a T1 link source.
