Appreciate your input. I'm tempted to buy your list and test it myself with my setup. Looking at the following, the submitted numbers are crazy high with very low verified numbers, and he's enabled blog comments as well.
Yes, of course. The software will still run fine without the SerNuke engines add-on.
The GSA SER list should give you a much better VPM than your previous list, as it contains working sites - dead sites are frequently removed from the list and new sites are added automatically.
I've been running this campaign for a week, using a sitelist from gsaserlists.com.
The number of verified links is still not optimal.
In your opinion, what could be improved? What else can I do to make it better?
It could be an issue with his setup, such as the emails or even the proxies. But I'll test it and see.
I’d be happy to support this effort to test that list. Let me know if you would like me to pay for half of the expense of the list.
I'm sorry, but this isn't possible as I'm strictly trying to keep the lists limited to a small number of buyers. Currently, the monthly subscription is also sold out, with 60 members already joined. I've already granted access to @sickseo and will consider his feedback to implement better automated approaches.
Since this topic is about GSASERLists.com, I’d like to explain that my lists go through a live algorithm to ensure all links are active and match the engines. I'm not using GSA Platform Identifier for this process, unlike most other GSA link list providers. This means I can confidently say that most of the links are active, and any non-working ones are automatically removed throughout the day. A link may be re-added to the lists if the algorithm detects it has come back online. To verify this, I checked the verified link list, which includes around 757K targets, using Scrapebox’s Alive Check add-on. I’ve uploaded a video of the results in this comment. Even the few URLs marked as "Not Alive" were actually live, likely because I didn't use proxies during the check or due to the size of the webpages.
What concerns me is that you mention using Scrapebox, as well as a custom algorithm, to check for live links.
I don't use such things to build and check my verified lists. I use GSA SER to do this. I suspect every user that purchases your list will also do the same. This is the only way to ensure that a site works and results in a verified link.
A site may still be live but have registration disabled or sending of emails disabled - only GSA SER can check if a site still yields a verified link - Scrapebox and a custom algorithm cannot do this.
I also don't use Scrapebox to check the lists. Everything is written in Python, and the algorithm follows the engines. That said, you're right. Although each target has gone through GSA SER, later checks can only verify if the target is GSA SER-friendly. They can't determine if registration is still possible without further checking, which is time and resource-consuming. So, I'm leaving those targets for GSA SER.
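Out of interest, a recurring alive-check like the one described can be sketched in a few lines of Python. This is only an illustration, not the seller's actual algorithm; the HEAD request, the thread count, and the "any HTTP answer counts as alive" policy are all assumptions:

```python
# Hypothetical sketch of a recurring alive-check: split a target list into
# alive/dead, so dead targets can be re-checked later and re-added if they
# come back online. Not the seller's actual code - an illustration only.
from concurrent.futures import ThreadPoolExecutor
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen


def is_alive(url: str, timeout: float = 10.0) -> bool:
    """True if the target answers at all (even with a 4xx/5xx status)."""
    req = Request(url, method="HEAD", headers={"User-Agent": "Mozilla/5.0"})
    try:
        urlopen(req, timeout=timeout)
        return True
    except HTTPError:
        return True  # server answered with an error status, but it's up
    except (URLError, OSError):
        return False  # no response at all


def split_targets(urls):
    """Partition targets into (alive, dead) using a thread pool."""
    urls = list(urls)
    with ThreadPoolExecutor(max_workers=50) as pool:
        results = list(pool.map(is_alive, urls))
    alive = [u for u, ok in zip(urls, results) if ok]
    dead = [u for u, ok in zip(urls, results) if not ok]
    return alive, dead
```

As noted in the thread, a check like this only proves the site responds - it can't tell you whether registration still works, so GSA SER remains the final filter.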
I didn't realise that you limited your sales of the site list. This is a very good thing - same with the limited sales of the SerNuke licenses - it all helps keep sites alive for longer if fewer users are using them.
I understand about the resource usage to recheck lists. I'm running your list and the software is maxing out at 3 GB of RAM already. Most likely because of the blog comment/guestbook sites with high OBLs. Normally I run just contextual and profile sources and I never see any issue with high memory usage.
@sickseo Thanks for leaving your honest review. I'm also interested in having you test my CatchE program, as I believe you're an expert in this field. If you're interested in testing the CatchE software, please let me know and I’ll provide you with access.
It's a set-and-forget program for managing your project's emails using fresh catch-all addresses. It would be great if you could also review the program and share your insights. I believe it's the best option available on the internet.
I'd be happy to test it out and let you know my thoughts. I've taken a brief look at it but never played with it properly.
I've been using my own system through zennoposter. Have a cheap cpanel hosting with 10 domains on it and zenno creates the catchalls for me via cpanel and exports the catchall emails in the right format for gsa or rankerx. Hence why I've not had a need to look elsewhere.
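For anyone without ZennoPoster, the export half of that workflow can be approximated with a short script: generate random mailbox names at each catch-all domain and write them out as `address:password` lines. The domains below are placeholders, and the exact import format GSA SER / RankerX expects is an assumption - adjust to your own setup:

```python
# Hypothetical sketch of the catch-all export step. The domains are
# placeholders, and "address:password" one per line is an assumed import
# format - check what your tool actually expects before using it.
import secrets
import string

DOMAINS = ["example1.com", "example2.com"]  # your catch-all domains (placeholders)


def make_catchall_emails(domains, per_domain=5, password_len=12):
    """Generate random mailboxes at each catch-all domain, one line each."""
    alphabet = string.ascii_lowercase + string.digits
    lines = []
    for domain in domains:
        for _ in range(per_domain):
            user = "".join(secrets.choice(alphabet) for _ in range(10))
            password = "".join(secrets.choice(alphabet) for _ in range(password_len))
            lines.append(f"{user}@{domain}:{password}")
    return lines


if __name__ == "__main__":
    print("\n".join(make_catchall_emails(DOMAINS)))
```

The creation of the catch-all itself (what ZennoPoster does via cPanel here) still has to happen on the hosting side; this only covers generating addresses that the catch-all will accept.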
That sounds great. I’ll stay in touch with you via PM.
You've still got a high number of submitted links not being verified. This normally means that the site no longer works, or that there is an issue with your email preventing emails from being verified. Are you using public proxies for submissions? This is a big no-no.
To optimise things further, you should clean that site list first and remove all non working sites. Once you've done that, things should run a lot faster. Although, if you are only getting 800+ links (including blog comments) from that site list, it hardly seems worth bothering!
Also, blog comments use a lot of CPU and run the slowest out of all the link sources. I wouldn't expect things to run fast when you have blog comment engines enabled.
You really need to learn how to scrape and test sites so that you can build your own site list. Until you do that, you'll never tap into the true potential of the software.
It looks like you are running T2 campaigns, so that's a good time to be using the re-posting options so that the software creates multiple accounts and posts from the same site list.
That's one example, but you may have to tweak settings. Enabling the "per url" option will have the effect of re-using the same site list and pointing each site at each T1 URL, so it will make a lot more links, a lot quicker.
The software is capable of making 800 links in a few minutes - it shouldn't take 1 week to make 800 links.
Based on your suggestion about verification emails, I'm now using CatchE from gsaserlists.com. Thanks for the helpful advice.
I'm always using private proxies, btw.
Oh yes, you said you could reach 300 VPM. Did you activate the indexer, pingback, URL shortener, or exploit engines at that time?
Shame that you bought a list from a reputed seller and you're getting unverified targets - no form at all. Looks like another scam to me.
Agree with cleaning that list up to remove non working sites. That should speed it up.
Here’s what I like to do to build a site list with a lot of (spammy) targets quickly:
Scrape blog comments and guestbooks. Use GSA PI to identify targets and clean up the list. GSA PI also has a feature to extract external links. Alternatively, you can use Scrapebox to crawl all internal links of the websites and then extract the external links. Do one or the other.
Doing this yields targets people have posted to already using GSA SER.
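The "extract external links" step works roughly like this sketch: collect every `<a href>` on a page and keep only the URLs that point at other domains. GSA PI and Scrapebox do this at scale; this is just to illustrate the idea:

```python
# Illustration of external-link extraction: parse a page's HTML and return
# the absolute URLs whose host differs from the page's own host.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkCollector(HTMLParser):
    """Collects all <a href> values from an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)


def external_links(page_url, html):
    """Return the set of absolute URLs in `html` pointing at other domains."""
    parser = LinkCollector()
    parser.feed(html)
    base_host = urlparse(page_url).netloc
    out = set()
    for href in parser.links:
        absolute = urljoin(page_url, href)  # resolve relative links
        host = urlparse(absolute).netloc
        if host and host != base_host:
            out.add(absolute)
    return out
```

Run over a scraped list of spammy blog comment pages, the collected external URLs are exactly the "targets people have posted to already" that the post above describes.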
I don't use any of those link sources at the moment, but they do run really fast. In my tests they are all ineffective now, so I'm only running articles, forums, social networks, wikis and sernuke engines across all my tiers - everything else is disabled.
The crazy speeds I'm getting are from enabling reposting on each site with multiple accounts. Like this:
This results in 1 post per account being made every day on every site in my list, and it will keep doing that for 30 days. Extra posts are only supported on the contextual engines. Profile link sources will only create 1 link per account.
On the first day, you won't get those crazy speeds because of the delay settings above - it's still making accounts and verifying emails.
From day 2 onwards you'll see the speed increasing as the reposting on sites with pre-created accounts begins.
I've got 6 of these running right now - speeds run between 200 and 400+ VPM.
Wow!! Amazing results!
So, currently, the configuration I'm using is as follows:
- VPS: Contabo, European server
- Proxy: private
- Captcha: XEvil, CapMonster self-hosted, GSA CB
- Sitelist: GSASERLists
- Email: CatchE, GSASERLists
- GSA settings: following your suggestions
What else should I do to get hundreds of VPMs like you?
Your setup sounds fine. For the VPS I'm using Layer7 - a 4-CPU AMD core VPS - but Contabo should be fine.
Main thing is to make sure your list is clean and has all non-working sites removed. To do this you need to start with an empty verified folder - then run 1 project to process any old lists you have. The software will save all verified links into your verified folder - these will be 100% working sites.
I repeat this process maybe once a month or even once every 2-3 months. That's how I maintain the high speeds.
Currently, the progress shows 1 VPM.
If I use SerNuke, I wonder, will the VPM increase?
If you add more working sites to your list, then yes, the VPM will increase. The SerNuke engines have the highest number of working sites, but you will need to scrape them or buy a list that contains these sites. These are the link numbers I processed for GSA SER Lists:
This is my own list that I recently processed:
This is the breakdown of sites by engine, although I'm still scraping and testing to find more:
But 1 VPM usually means there are no more working sites in your list, or that the site list you are processing contains mostly non-working sites. If you enable important messages for projects, the software will leave messages for each project once there are no more targets to post to.
Once the software reaches the end of the list, the projects will need to be reset by performing delete target url cache, history and accounts. This will make the software re-use the site list again from the beginning.
I don't see any real issues with your selection of tools or project set ups.
Paid site lists are usually the cause of low vpm as they don't clean their site lists. They expect the user to do that.
The group 1 projects have very low verified numbers - contextuals and profile link sources are quite rare these days for the software.
It's the high submitted numbers that are a concern. This usually means an issue with emails not being verified, or in your case it's more likely that the sites no longer work - site owners have disabled the registration process. Those sites probably used to work but don't anymore. You'll have to clean the list and remove non-working sites.
The group 2 projects are more of a concern though as you have engines like indexer and redirects selected - these site numbers are normally in excess of 10,000 each - yet you've only got a few hundred. They don't even need emails to make a link.
Although I've just noticed you've set a custom time delay of 2 days - that's far too long - that's why you've got a ton of links still sitting as submitted, waiting to be verified.
Set it to verify automatically instead. Then see if your verified numbers increase.
Also de-select "skip hard to solve captchas". You have XEvil, so there's no need to use this option.
Thank you very much for your guidance. I just changed the settings as you suggested. And could you please tell me how to clean up the list and remove non-working sites?
Go to Options > Advanced
Click the "Site Lists" tab
Tick the list(s) you want to clean
Click "Tools" to the right
Then "Clean-Up"
I tried to clean it up according to the steps above, but it seems to throw an error.
Just select all the site lists then Tools > Clean-Up
I did not include "Open Folder" in the steps earlier on.
Still the same problem. Is it related to the fact that I don't have permission to edit the files synchronized by serverifiedlists.com? The site list path here is synchronized from the Dropbox file.
Try deselecting those ones and select the other 5 and Clean-Up
The others can work fine.
Do you own Scrapebox? I was thinking to download the serverifiedlists.com list manually and clean-up with scrapebox or something similar.
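If you don't own Scrapebox, a rough stand-in for that clean-up is a small script that dedupes the downloaded list by domain and drops targets that no longer respond. The file names are placeholders, and the HEAD check is a simplification - it can't tell whether registration still works, only whether the site answers:

```python
# Rough Scrapebox alternative for list clean-up: dedupe by domain, then
# drop targets that no longer answer. File names below are placeholders.
from urllib.error import HTTPError, URLError
from urllib.parse import urlparse
from urllib.request import Request, urlopen


def dedupe_by_domain(urls):
    """Keep only the first URL seen for each host."""
    seen, out = set(), []
    for url in urls:
        host = urlparse(url).netloc
        if host and host not in seen:
            seen.add(host)
            out.append(url)
    return out


def responds(url, timeout=10.0):
    """True if the target answers at all (any HTTP status counts)."""
    try:
        urlopen(Request(url, method="HEAD"), timeout=timeout)
        return True
    except HTTPError:
        return True  # answered with an error status, but the site is up
    except (URLError, OSError):
        return False


def clean_list(in_path, out_path):
    """Read a downloaded site list, dedupe it, and keep only live targets."""
    with open(in_path) as f:
        targets = dedupe_by_domain(line.strip() for line in f if line.strip())
    with open(out_path, "w") as f:
        f.write("\n".join(u for u in targets if responds(u)))
```

As discussed earlier in the thread, only running the cleaned list through GSA SER itself will confirm which surviving sites still yield verified links.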
Sorry, I don't have this tool. I just contacted serverifiedlists.com regarding the above issue. Thanks again for your guidance.
When I mentioned cleaning the site list, what I meant was to first start with an empty verified folder.
Then run your different site lists as normal, making sure that your verified folder is set like this:
- "Read only" option must be disabled.
- "Add URLs" type must be set to verified.
- All your other folders containing site lists must have read only enabled.
This will save any newly verified/working sites to your empty verified folder.
All your other projects can be set to use the verified folder for making links - you will see a significant increase in vpm on projects using the verified folder.
I repeat this process every few months - that's how to maintain the performance - it will give you the best VPM, as your site list will contain only working sites.
I don't see any real issues with your selection of tools or project set ups.
Paid site lists are usually the cause of low vpm as they don't clean their site lists. They expect the user to do that.
The group 1 projects have very low verified numbers - contextuals and profile link sources are quite rare these days for the software.
It's the high submitted numbers that are a concern. This usually means an issue with emails not being verified, or in your case it's more likely that the sites no longer work - site owners have disabled the registration process. Those sites probably used to work but don't anymore. You'll have to clean the list and remove non-working sites.
The group 2 projects are more of a concern though as you have engines like indexer and redirects selected - these site numbers are normally in excess of 10,000 each - yet you've only got a few hundred. They don't even need emails to make a link.
Although I've just noticed you've set a custom time delay of 2 days - that's far too long - that's why you've got a ton of links still sitting as submitted, waiting to be verified.
Set it to verify automatically instead. Then see if your verified numbers increase.
Also de-select "skip hard to solve captchas". You have xevil, so no need to use this option.
Thanks to your advice, I purchased one of the engine options on Sernuke. My VPM also increased from 1 to 9-10.
Your advice and insights have really helped me
Thank you so much for your helpful answers. I hope your business continues to be successful
I understand about the resource usage to recheck lists. I'm running your list and the software is already maxing out at 3 GB of RAM - most likely because of the blog comment/guestbook sites with high OBLs. Normally I run just contextual and profile sources and never see any issue with high memory usage.
@sickseo Thanks for leaving your honest review. I'm also interested in having you test my CatchE program, as I believe you're an expert in this field. If you're interested in testing the CatchE software, please let me know and I’ll provide you with access.
It's a set-and-forget program for managing your project's emails using fresh catch-all addresses. It would be great if you could also review the program and share your insights. I believe it's the best option available on the internet.
I've been using my own system through zennoposter. Have a cheap cpanel hosting with 10 domains on it and zenno creates the catchalls for me via cpanel and exports the catchall emails in the right format for gsa or rankerx. Hence why I've not had a need to look elsewhere.
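The catch-all setup described above could be sketched roughly like this - a hypothetical generator for random catch-all addresses. The `user@domain:password` output format is an assumption; check the exact import format your tool (GSA SER or RankerX) expects, and note the domains and password here are placeholders:

```python
# Hedged sketch: generate random catch-all email addresses for import.
# Output format "user@domain:password" is an assumption, not confirmed.
import random
import string

def make_catchalls(domains, per_domain=10, password="ChangeMe123"):
    """Return one 'user@domain:password' line per generated address."""
    lines = []
    for domain in domains:
        for _ in range(per_domain):
            # Random local part; the catch-all accepts anything @domain.
            user = "".join(random.choices(string.ascii_lowercase, k=10))
            lines.append(f"{user}@{domain}:{password}")
    return lines

# Placeholder domains - substitute your own cPanel catch-all domains.
for line in make_catchalls(["example1.com", "example2.com"], per_domain=2):
    print(line)
```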
Based on your suggestion about verification email, I'm now using CATCHE from gsaserlists.com. Thanks for the helpful advice
I always use private proxies, btw.
Oh yeah, you said you could reach 300 vpm. Did you have the indexer, pingback, URL shortener, or exploit engines enabled at that time?
Currently I'm using a list bought from gsaserlists.com, btw.
Do you mean GSA Platform Identifier?
The crazy speeds I'm getting are from enabling reposting on each site with multiple accounts. Like this:
This results in 1 post per account being made every day on every site in my list, and it will keep doing that for 30 days. Extra posts are only supported on the contextual engines. Profile link sources will only create 1 link per account.
First day, you won't get those crazy speeds, because of the delay settings above. It's still making accounts and verifying emails.
From day 2 onwards you'll see the speed increasing as the reposting on sites with pre-created accounts begins.
I've got 6 of these running right now - speeds run between 200 - 400+ vpm.
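The back-of-envelope math behind those reposting speeds looks something like this - all numbers below are illustrative assumptions, not figures from this thread:

```python
# Rough throughput math for the reposting setup described above.
sites = 5000           # contextual sites in the cleaned list (assumed)
accounts_per_site = 3  # accounts pre-created per site (assumed)

# From day 2 on, reposting makes 1 post per account per day on every site:
posts_per_day = sites * accounts_per_site
vpm = posts_per_day / (24 * 60)  # spread evenly over a full day

print(posts_per_day)   # 15000
print(round(vpm, 1))   # 10.4
```

So a single project's vpm scales with list size times accounts per site, which is why several of these projects running in parallel on a large, clean list can reach triple-digit speeds.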
So, currently, the configuration I'm using is as follows:
- VPS: Contabo, European server
- Proxy: Private
- Captcha: Xevil, Capmonster (self-hosted), GSA CB
- Sitelist: Gsaserlists
- Email: Catche, Gsaserlists
- GSA settings, following your suggestions
What else should I do to get hundreds of VPMs like you?
Main thing is to make sure your list is clean and has all non-working sites removed. To do this you need to start with an empty verified folder - then run 1 project to process any old lists you have. The software will save all verified links into your verified folder - these will be 100% working sites.
I repeat this process maybe once a month or even once every 2-3 months. That's how I maintain the high speeds.
This is my own list that I recently processed:
This is the breakdown of sites by engine, although I'm still scraping and testing to find more:
But 1 vpm usually means there are no more working sites in your list, or that the site list you are processing contains mostly non-working sites. If you enable important messages for projects, the software will leave a message for each project once there are no more targets to post to.
Once the software reaches the end of the list, the projects will need to be reset by deleting the target URL cache, history, and accounts. This will make the software re-use the site list from the beginning.
Group 1 and 2 submit config
Group 3 submit config
Go to Options > Advanced
Click the "Site Lists" tab
Tick the list(s) you want to clean
Click "Tools" to the right
Then "Clean-Up"
I tried to clean it up according to the steps above, but there seems to be an error.
Here, I recorded a video. Please help check it. Thank you very much.