
Finally all set up again I think? (I have a couple of questions)

edited January 14 in Need Help
Hello,

It's been a long time since I last did anything to do with SEO. I first bought GSA tools in 2013 and had massive success with them. Since I stopped around 2017 (I think) I have run all kinds of PPC and still do.

Anyway, I finally stopped procrastinating and got everything I need (I think) so far to get the ball rolling. What I have bought/set up so far:

  • I've started paying for GSASERlists
  • I've started paying for catchall emails
  • I use Solidseo VPS
  • I paid for Xevil
  • I added some money to OpenAI and get articles from there into SER
  • I just paid for Spinrewriter and I'm now using their API to spin the AI articles.
  • I've bought 30 proxies from buyproxies (20 for SER, 10 for Xevil) good idea or not?
  • I got the MOZ chrome extension for PA/DA.
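On the Spin Rewriter step: spintax is the `{option1|{nested|options}}` format these tools produce, where each brace group collapses to one randomly chosen option. As a rough illustration of how that resolution works (a minimal sketch, not Spin Rewriter's actual code):

```python
import random
import re

def spin(text, rng=random):
    """Resolve spintax like '{Hi|Hello} {world|{planet|earth}}' by
    repeatedly replacing the innermost {a|b|...} group with one
    randomly chosen option until no groups remain."""
    innermost = re.compile(r"\{([^{}]*)\}")  # matches only groups with no nested braces
    while True:
        m = innermost.search(text)
        if m is None:
            return text
        choice = rng.choice(m.group(1).split("|"))
        text = text[:m.start()] + choice + text[m.end():]

print(spin("{Hello|Hi} world, {this|{that|it}} works"))
```

Each call produces one random variant; running it many times gives you the unique article versions SER posts.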

I made a random blog, just to test and play around with until I figure out what I want to focus on ranking.

I think I've covered everything I've done so far above. Thanks to @backlinkaddict & @organiccastle
for replying to my thread when I came back. It seemed like information overload, but now everything is launched/set up it's not so bad after all!

Anyway, everything is running and Xevil keeps throwing these errors and has quite a lot of fails:



And then here's the success rates so far:



I followed the guide in GSASERlists on how to setup Xevil.

Questions:
Why is Xevil throwing those errors?
What's the best way to track SERPs these days? I forgot the tool I used to use that had green/red arrows when moving up or down. Stupid I know, but I really liked that for tracking SERPs.

How can I increase LPM? I use verified lists auto-syncing from Dropbox; SER doesn't scrape or look for targets.
I'm currently getting 6 LPM with 1 project.

Well, I'm glad to be back!

Happy to see it's quite active and still lots of valuable info here. Also happy to see @Sven constantly supportive too after all this time! Awesome stuff you do here, it's greatly appreciated.

Comments

  • Thanks for the reply!

    I will try that now with Xevil and just signed up to Serpcloud.

I left it blasting and just got online; now the success rate is like this:



    I'll try your suggestions and report back. Thanks!
  • edited January 15
    @Anth20

    I've started paying for GSASERlists
Don't ever pay for a list. They are all shit. Learn how to create your own list (not from Google; with Scrapebox).

    I've started paying for catchall emails
    It costs $12 to create unlimited catchall emails on Zoho. Learn that

    I use Solidseo VPS
    Ready-made Windows VPSs are overpriced. Learn how to install Windows on a Linux machine. 

    I paid for Xevil
    Ok

    I added some money to OpenAI and get articles from there into SER
    Ok

    I just paid for Spinrewriter and I'm now using their API to spin the AI articles.
    Ok

    I've bought 30 proxies from buyproxies (20 for SER, 10 for Xevil) good idea or not?
You did what? You bought private IPv4 proxies for Xevil? Who gave you that advice? Oh my god.
There are thousands of posts here stating that you need IPv6 proxies for Xevil. The IPv4 proxies you bought from buyproxies will be immediately banned by Google.

    I got the MOZ chrome extension for PA/DA.
    PA/DA is a shit metric, let alone MOZ. The truth is, you don't really need PA/DA data for the websites where you want to place backlinks. Supply is limited anyway.





    Thanked by 1Anth20
  • Make sure to use IPv6 proxies for Xevil. I believe reproxy.network is the provider everyone is using.

    I am using Xevil 5 as the primary captcha solver, capmonster.cloud as a backup and for hCaptchas.



Some sites have not implemented captchas correctly, hence the errors you are seeing. Below is a screenshot of the capmonster.cloud log.




    VPS: Take a look at the Hetzner VPS where you can install Windows with minimal effort.

Spintax: If you own a license for Scrapebox and the Article Scraper plugin, you can generate spintax with it through the OpenAI API. This also works very well for non-English content.

    Link Lists: You can scrape your own using the footprints in SER or - if you have Xrumer - extract footprints. @royalmice has a great instruction on how-to: https://asiavirtualsolutions.com/use-xrumer-to-find-foot-prints-for-gsa-search-engine-ranker/. Buying link lists seems to be a religion although these all overlap. Worst I have experienced is "SER Verified Lists" which does not even provide identified targets but pure rubbish.

Catchalls: I suggest you try the IPv6 proxies in Xrumer first to see how this improves your efforts. I am using my own catchalls. It takes a few minutes of work to set up your own mail server on a small Linux VPS: https://www.linuxbabe.com/mail-server/postfixadmin-ubuntu
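For context on why a self-hosted catchall is enough here: once a domain has a catchall mailbox, any local part delivers to it, so every SER project can be handed a fresh throwaway address. A tiny illustrative sketch (the function name and the `example.com` domain are placeholders, not part of any real setup):

```python
import secrets

def make_catchall_addresses(domain, n=10):
    """Generate n random throwaway addresses for one catchall domain.
    With a catchall mailbox, every local part @domain is deliverable,
    so each project can use its own fresh address."""
    return [f"{secrets.token_hex(6)}@{domain}" for _ in range(n)]

for addr in make_catchall_addresses("example.com", 3):
    print(addr)
```

The mail-server side is just a catchall alias (e.g. Postfix virtual alias `@example.com -> mailbox`); the linked guide covers that part.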

    Metrics: Do these matter to Google? I doubt it. If you are really into metrics, https://seo-rank.my-addr.com/ gives you MOZ, Semrush and ahrefs figures at a low price.

    Indexer: It is pointless to build links if these are not getting indexed at all. Consider an indexer at least for your T1 links to see results in organic traffic. 
    Thanked by 1Anth20

@organiccastle how do you get this picture? I can't find it in GSA.
  • Jason_88 said:

@organiccastle how do you get this picture? I can't find it in GSA.
Double-click on the letter "C" at the bottom of GSA.
  • SvenSven www.GSA-Online.de
    or on STATS label in captcha settings.
  • malcom said:
    @Anth20

    I've started paying for GSASERlists
    Don't ever pay for a list. They are all shit. Learn how to create your own list ( Not from Google, with Scrapebox)

    I've started paying for catchall emails
    It costs $12 to create unlimited catchall emails on Zoho. Learn that

    I use Solidseo VPS
    Ready-made Windows VPSs are overpriced. Learn how to install Windows on a Linux machine. 

    I paid for Xevil
    Ok

    I added some money to OpenAI and get articles from there into SER
    Ok

    I just paid for Spinrewriter and I'm now using their API to spin the AI articles.
    Ok

    I've bought 30 proxies from buyproxies (20 for SER, 10 for Xevil) good idea or not?
    You did what? You bought private ipv4 proxies for Xevil? Who gave you that advice? Oh my god. 
    There are thousands of posts here stating that you need IPv6 proxies for Xevil. The IPv4 proxies you bought from buyproxies will be immediately banned by Google.

    I got the MOZ chrome extension for PA/DA.
    PA/DA is a shit metric, let alone MOZ. The truth is, you don't really need PA/DA data for the websites where you want to place backlinks. Supply is limited anyway.





    Thanks for your reply!

    I do have scrapebox, so I will look into re-learning how to get my own. I assumed these verified lists were better, but it's been a long time since I did any of this.

    I'll look into Zoho!

For now, I'll just stick with the VPS I got. I can think about changing it or whatever in future.

Ok, thanks. I'll keep the 30 proxies for SER only then, and look into getting IPv6 for Xevil!

    So PA doesn't matter anymore when passing juice through dofollow links? What do you mean supply is limited?

    Thanks @malcom!
  • Make sure to use IPv6 proxies for Xevil. I believe reproxy.network is the provider everyone is using.

    I am using Xevil 5 as the primary captcha solver, capmonster.cloud as a backup and for hCaptchas.



Some sites have not implemented captchas correctly, hence the errors you are seeing. Below is a screenshot of the capmonster.cloud log.




    VPS: Take a look at the Hetzner VPS where you can install Windows with minimal effort.

Spintax: If you own a license for Scrapebox and the Article Scraper plugin, you can generate spintax with it through the OpenAI API. This also works very well for non-English content.

    Link Lists: You can scrape your own using the footprints in SER or - if you have Xrumer - extract footprints. @royalmice has a great instruction on how-to: https://asiavirtualsolutions.com/use-xrumer-to-find-foot-prints-for-gsa-search-engine-ranker/. Buying link lists seems to be a religion although these all overlap. Worst I have experienced is "SER Verified Lists" which does not even provide identified targets but pure rubbish.

Catchalls: I suggest you try the IPv6 proxies in Xrumer first to see how this improves your efforts. I am using my own catchalls. It takes a few minutes of work to set up your own mail server on a small Linux VPS: https://www.linuxbabe.com/mail-server/postfixadmin-ubuntu

    Metrics: Do these matter to Google? I doubt it. If you are really into metrics, https://seo-rank.my-addr.com/ gives you MOZ, Semrush and ahrefs figures at a low price.

    Indexer: It is pointless to build links if these are not getting indexed at all. Consider an indexer at least for your T1 links to see results in organic traffic. 

    Thanks for the reply! Your solve rates look good, definitely going to get ipv6 proxies. Ahh that makes sense with the errors.

I do own Scrapebox; it didn't occur to me to use it to spin articles. Don't you think it's easier/simpler just using OpenAI and letting SER spin the articles with my Spin Rewriter API? It's then all automatic, especially in future when working with multiple projects.

I will set up my own catchalls then, makes sense. Do you see them burn out often/at all?

I have been running email marketing for the past 2-3 years, so I can easily make catchalls, but I assumed I'd be switching them out often, so I thought of using a service.

I did consider indexing services, I used to use them a lot, but I recently saw contradicting information on them, so I thought I'd wait.

Thanks for your replies guys; I just dove in to ensure I get the ball rolling. Now I can tweak/change and get closer to the perfect setup, especially with your advice.

I asked this years ago, and the answer depends on a lot of variables... But how long does it take you to rank these days? Picture your own strategies now: if implementing them on a new site, what time scale would you expect to get yourself into the top 10? Too many variables, I know; just a figure would help give me an idea for, say, low-medium competition keywords.

    Thanks!
  • edited January 16
    Sven said:
    or on STATS label in captcha settings.
    Can they be reset so I can better track if my changes improve %?

Edit: nvm, right-clicking did it!
  • Anth20 said:

    Thanks for the reply! Your solve rates look good, definitely going to get ipv6 proxies. Ahh that makes sense with the errors.

I do own Scrapebox; it didn't occur to me to use it to spin articles. Don't you think it's easier/simpler just using OpenAI and letting SER spin the articles with my Spin Rewriter API? It's then all automatic, especially in future when working with multiple projects.

I will set up my own catchalls then, makes sense. Do you see them burn out often/at all?

I have been running email marketing for the past 2-3 years, so I can easily make catchalls, but I assumed I'd be switching them out often, so I thought of using a service.

I did consider indexing services, I used to use them a lot, but I recently saw contradicting information on them, so I thought I'd wait.

Thanks for your replies guys; I just dove in to ensure I get the ball rolling. Now I can tweak/change and get closer to the perfect setup, especially with your advice.

I asked this years ago, and the answer depends on a lot of variables... But how long does it take you to rank these days? Picture your own strategies now: if implementing them on a new site, what time scale would you expect to get yourself into the top 10? Too many variables, I know; just a figure would help give me an idea for, say, low-medium competition keywords.

    Thanks!
    ReCaptcha: Give it a try with the IPv6 proxies and you will see a huge difference. It's $5 a month for the smallest package with 50 threads. My SER is running at 200+ threads.

    Spintax: Spinrewriter is probably better (for English content). I don't have a subscription so I am happy to pay per use through Scrapebox / OpenAI.

Catchalls: I've never experienced any being blocked / blacklisted. It is a one-time effort to set up the postfix thing and it just runs.

Indexing: I did wait, too. Too long. I used GSA Indexer plus various workarounds with sitemaps, more tiers, etc. Now I am happy to pay less than a cent to get my links indexed. Still running GSA Indexer and tiers though.

    Results: I can see results within a week, 10 days for easy to medium keywords. Quite impressive. The content on your site makes a huge impact. GSA Keyword Research with its Article Writer is a beast to create & improve it. Take the suggestions + your original content, task OpenAI to improve it accordingly, give it a final touch and you'll rank way better and for so many more keywords. Yes, there is an additional cost for the license plus the APIs, but it will pay back in no time.
    Thanked by 2Anth20 Jason_88
  • edited January 17
    Anth20 said:

    Thanks for the reply! Your solve rates look good, definitely going to get ipv6 proxies. Ahh that makes sense with the errors.

I do own Scrapebox; it didn't occur to me to use it to spin articles. Don't you think it's easier/simpler just using OpenAI and letting SER spin the articles with my Spin Rewriter API? It's then all automatic, especially in future when working with multiple projects.

I will set up my own catchalls then, makes sense. Do you see them burn out often/at all?

I have been running email marketing for the past 2-3 years, so I can easily make catchalls, but I assumed I'd be switching them out often, so I thought of using a service.

I did consider indexing services, I used to use them a lot, but I recently saw contradicting information on them, so I thought I'd wait.

Thanks for your replies guys; I just dove in to ensure I get the ball rolling. Now I can tweak/change and get closer to the perfect setup, especially with your advice.

I asked this years ago, and the answer depends on a lot of variables... But how long does it take you to rank these days? Picture your own strategies now: if implementing them on a new site, what time scale would you expect to get yourself into the top 10? Too many variables, I know; just a figure would help give me an idea for, say, low-medium competition keywords.

    Thanks!
    ReCaptcha: Give it a try with the IPv6 proxies and you will see a huge difference. It's $5 a month for the smallest package with 50 threads. My SER is running at 200+ threads.

    Spintax: Spinrewriter is probably better (for English content). I don't have a subscription so I am happy to pay per use through Scrapebox / OpenAI.

Catchalls: I've never experienced any being blocked / blacklisted. It is a one-time effort to set up the postfix thing and it just runs.

Indexing: I did wait, too. Too long. I used GSA Indexer plus various workarounds with sitemaps, more tiers, etc. Now I am happy to pay less than a cent to get my links indexed. Still running GSA Indexer and tiers though.

    Results: I can see results within a week, 10 days for easy to medium keywords. Quite impressive. The content on your site makes a huge impact. GSA Keyword Research with its Article Writer is a beast to create & improve it. Take the suggestions + your original content, task OpenAI to improve it accordingly, give it a final touch and you'll rank way better and for so many more keywords. Yes, there is an additional cost for the license plus the APIs, but it will pay back in no time.

Yes, not long after your post I bought them, after trying multiple times (Russian/rouble card not working, etc.), and it's running at 50 threads, although at the time I tested it was using just 12 or so. They seem much better though, thanks!

Yes I agree; for now anyway, Spin Rewriter is going to help make that easier.

    That's great about catchalls, going to make my own now!

    Less than a cent? That's awesome. 

    Yeah that's really impressive and happy to hear it! No special t1 stuff, just all through GSA and scraping your own links etc?

    I'll definitely have to get the Keyword Research tool, does that also work better than Keyword Planner? (forgive me if that's stupid, like I said, it's been years!) haha

    Thanks for your input, much appreciated!
The proxy [5] bFrame proxy error in XEvil actually means it's not working for v3 ReCaptcha, and that I'd need XEvil 6 for that to work. Typical, when I just bought 5.

So do you guys who use 5 just ignore those errors? Or scrape sites likely not to have v3? (The above info is according to reproxy.network; I asked them because I thought it was a proxy error.)
  • With XEvil 6, do you know which cores are recommended now to use with GSA SER?
  • Anth20 said:
    The proxy [5] bFrame proxy error in XEvil actually means it's not working for v3 Recaptcha and that I'd need XEvil 6 for that to work. Typical when I just bought 5.

    So do you guys who use 5, just ignore those errors? Or scrape sites likely to not have v3? (above info is according to reproxy.network) I asked them because I thought it was proxy error.
These errors also show in capmonster.cloud, so I am just ignoring them and continuing with Xevil 5. An overall success rate of around 80% is fine for me.
    Thanked by 1Anth20
  • Anth20 said:
    The proxy [5] bFrame proxy error in XEvil actually means it's not working for v3 Recaptcha and that I'd need XEvil 6 for that to work. Typical when I just bought 5.

    So do you guys who use 5, just ignore those errors? Or scrape sites likely to not have v3? (above info is according to reproxy.network) I asked them because I thought it was proxy error.
These errors also show in capmonster.cloud, so I am just ignoring them and continuing with Xevil 5. An overall success rate of around 80% is fine for me.
    Ok great, yes that makes sense! I'd love around 80% right now haha.

So I started using reproxy.network; they advised the UKR IPv6 pool, so that's what I'm using. Check this:



    No idea why/what's going on. I have these selected in cores:



    Is this the issue?


  • These are the modules I have enabled:


    Thanked by 1Anth20
Yes, I only have Xevil, and it can only solve about 50% of captchas at best. I asked them to solve the issues and they suggested I buy the Xevil 6 beta because it won't show these issues. I think they are lying to customers.
Deeeeeeee (the Americas)
    edited January 17
    malcom said:
    @Anth20

    I've started paying for GSASERlists
    Don't ever pay for a list. They are all shit. Learn how to create your own list ( Not from Google, with Scrapebox)

    So you don't think paid lists have any use value at all, then?  Why?
    I understand that making my own lists would work better.
  • Anth20 said:

    Thanks for the reply! Your solve rates look good, definitely going to get ipv6 proxies. Ahh that makes sense with the errors.

I do own Scrapebox; it didn't occur to me to use it to spin articles. Don't you think it's easier/simpler just using OpenAI and letting SER spin the articles with my Spin Rewriter API? It's then all automatic, especially in future when working with multiple projects.

I will set up my own catchalls then, makes sense. Do you see them burn out often/at all?

I have been running email marketing for the past 2-3 years, so I can easily make catchalls, but I assumed I'd be switching them out often, so I thought of using a service.

I did consider indexing services, I used to use them a lot, but I recently saw contradicting information on them, so I thought I'd wait.

Thanks for your replies guys; I just dove in to ensure I get the ball rolling. Now I can tweak/change and get closer to the perfect setup, especially with your advice.

I asked this years ago, and the answer depends on a lot of variables... But how long does it take you to rank these days? Picture your own strategies now: if implementing them on a new site, what time scale would you expect to get yourself into the top 10? Too many variables, I know; just a figure would help give me an idea for, say, low-medium competition keywords.

    Thanks!
    ReCaptcha: Give it a try with the IPv6 proxies and you will see a huge difference. It's $5 a month for the smallest package with 50 threads. My SER is running at 200+ threads.

    Spintax: Spinrewriter is probably better (for English content). I don't have a subscription so I am happy to pay per use through Scrapebox / OpenAI.

Catchalls: I've never experienced any being blocked / blacklisted. It is a one-time effort to set up the postfix thing and it just runs.

Indexing: I did wait, too. Too long. I used GSA Indexer plus various workarounds with sitemaps, more tiers, etc. Now I am happy to pay less than a cent to get my links indexed. Still running GSA Indexer and tiers though.

    Results: I can see results within a week, 10 days for easy to medium keywords. Quite impressive. The content on your site makes a huge impact. GSA Keyword Research with its Article Writer is a beast to create & improve it. Take the suggestions + your original content, task OpenAI to improve it accordingly, give it a final touch and you'll rank way better and for so many more keywords. Yes, there is an additional cost for the license plus the APIs, but it will pay back in no time.

So what method are you using to index backlinks? I use Indexer and build tiers for projects. Google indexing is so slow; I submitted backlinks 18 days ago and only a few were indexed.
Deeeeeeee, in my opinion a majority of the time a shared list or bought GSA list will be spammed to shit and may not offer the value you're looking for (depending on goals).

Plus, at that scale you'll eventually run out of good-quality postable targets that will be placed, indexed and stick anyway (it's just a waste of resources to spend time on a majority of them).

Good practice to grow your own list (even niche/project-specific).

I think it's also worth noting that some people forget, or may not know how, to use the tools to clean up their own global site lists.

You can dedupe domains and/or URLs, then clean up / remove non-working targets. This will probably take at least a few hours, but it's definitely worth doing once in a while.

You may be amazed how much faster things go when you set up a new project and use your newly cleaned and deduped sitelist just for posting.

You can also view statistics of sitelists per engine in Tools, and/or compare lists.

You can even go a step further and get metrics for a list, remove entries below whatever metric threshold you like to go by, and then you know every URL in your global sitelist you're posting to has a certain "value" to it.
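The domain-dedupe step described above (which SER does natively in its Tools menu) is simple to picture in code. A rough sketch, purely to illustrate the idea; the function name is made up:

```python
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Keep the first URL seen per hostname, treating 'www.' and bare
    domains as the same site, similar to a 'remove duplicate domains'
    cleanup pass on a sitelist."""
    seen, kept = set(), []
    for url in urls:
        host = (urlparse(url).hostname or "").lower().removeprefix("www.")
        if host and host not in seen:
            seen.add(host)
            kept.append(url)
    return kept

urls = [
    "https://www.blog.example/post1",
    "http://blog.example/post2",      # same domain as above, so dropped
    "https://forum.other/thread/9",
]
print(dedupe_by_domain(urls))
```

Deduping by domain rather than URL is the aggressive variant: one post per site instead of one per page.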

It's harder to imagine what's going on in SER, it being so complex in comparison to some other software, but this is where good templates will make the difference.

If you have good templates and a global sitelist full of URLs of a certain "quality", you can use that to create advanced templates with links that will actually power up your other links/projects.

Also, it's easy to test like this and measure/see what's working and what's not, and adjust templates accordingly.

Maybe this will help; it's just my experience/opinion.

How do I find more high-quality lists to post backlinks to?
  • Jason_88 said:

    Indexing: I did wait, too. Too long. Used GSA Indexer plus various workarounds with sitemaps, more tiers, etc.. Now, I am happy to pay less than a cent to get my links indexed. Still running GSA Indexer and tiers though.


So what method are you using to index backlinks? I use Indexer and build tiers for projects. Google indexing is so slow; I submitted backlinks 18 days ago and only a few were indexed.
    Check the Buy / Sell section, get a free trial and test it.
  • edited January 18
    Anth20

I don't see how that solve rate can possibly be that bad for Xevil 5.

I ran a test a few months back and had 65,000 captchas solved with fewer than a handful of errors in Xevil 5.

Something in your settings must need to be adjusted.

I have had mine set like yours above, without Hotmail, but also checked the one before and after it. That gave me the best results in my testing. It seemed to identify and choose the correct network, and lets you know on the front end which one it used to solve. Maybe start there and change as needed.

I don't think that error has anything to do with upgrading to "6" either, unless they are "moving features" to a new version again, which really would be a dishonest enough move to make me stop using it. They seem to have been doing this for some time now: you buy it, then features get "moved" to the next version, and then there's a 10 dollar monthly fee which the video says is optional, but it does not seem so?

I was using a low number of nothing-special proxies, too. I was just playing around with different neural nets and changing the proxies in the list with the given options until I had a consistent solve rate. It's always like that for me with these systems working together. It's a bitch at first, but then you hit a "sweet spot" with your settings/integrations and things start going much smoother from there.

I think maybe your issue is settings, and it seems misleading to have GSA CB set first and then Xevil second. I don't know if you're randomly selecting services, or using a certain order, or telling certain services to solve certain captchas? Retries? Skips? etc. Did you set Xevil as your "send captchas to 1st service", or move Xevil to the first service or similar, and try? Otherwise the solve rate is going to be higher for the first service, as it gets all the captchas first and sends the rest off to the other services (depending on settings).

I only have Xevil 5 as well. You should not need Xevil 6 for ReCaptcha, but rather for the "GPU accelerated module" that was coming out 2 years ago, and for "h", "fun" and "Cloudflare", as their marketing suggests. I recently contacted them to ask how "6" was coming along; they took 1.5 weeks to answer, and the response was just "log into the tech forum and then click the banner on your Xevil and purchase it". Well, no shit, but that's what I asked. I wanted to know the state of version "6".

And the forum seems as dead as the "new" Money Robot FB one, with pinned posts from 2015 and well-known "fake profiles" trying to boost the confidence of the random users who pop in.

Capcloud was the same way: I asked what the state of things was for solving "h" and "fun" and got a shit response back. In my experience 2captcha is best for solving these, and they provide client libraries showing how to solve every type if you want to do it on your own. I've been using them for 10+ years. Everything has been great, so I'm not going to fix something that's not broken with something that is.

I don't know if you remember, but I just thought of when we were all in here trying to solve the "double blobs" with EVE OCR. Takes me back. lol

In my experience CB to Xevil to 2captcha gives the best results, or just Xevil to 2captcha. I like to use CB for its extra functionality, which is great for testing.


Scrapebox - yes, they have added another premium plugin that uses the OpenAI API and have made a lot of recent changes to it. They updated the batch poster and added a few more premium plugins too, I noticed. I haven't had tons of time to play with it yet, but it seems some time has gone into developing it further, which is nice to see. There was always the option to scrape, spin and batch-post articles to WordPress and a few other sites if you had the premium article plugin. I think the expired domain finder plugin was probably my most-used premium plugin before this, along with a few others for checking web 2.0 availability.

Still, I think people don't understand the power of SER and Scrapebox and all their features. I'll admit it is intimidating at first, and the learning curve is too high for the average person. High enough that for most people, weighing the time and cost of learning against paying someone else, it would probably be easier to outsource the work. You will not get a friendly interface with either of these tools. That being said, a friendly interface is not needed, especially when just ripping through automation tasks, and it would only slow things down.

You can tell someone with a hackerish, analytical mind was behind the development of these tools, not a UI/UX person. Probably why they run so well once set up properly.

These are tools I would expect to find in a Kali Linux build.

Anyways,

Let me know if you find a solution to your solve rate. I'm actually going to go test Xevil now, so maybe I'll find something helpful.

I'll share if I do.

Thanks, I'll keep playing around with settings. I have it set to "ask all services to fill captchas". I'll try with the engines selected as organiccastle showed in his screenshot and see how that goes.

I will also try with XEvil first and CB second, etc., like you say... gotta find a sweet spot :)

    I'm playing around for now, just blasting a blog I made getting used to the systems again.

    I'll definitely allocate some time to learn scraping my own again, more relevant to niches etc.

    I left it blasting since my last post and have results like this:



Maybe all I need is the extra neuronets; time will tell, and when I finally get to a better % I'll update how.

    Here's my settings for XEvil in SER:



    I had retries at 2, just changed to 1.

    Also, I moved XEvil above CB now, see how it goes.
  • edited January 19
    It seems to be a little better since I added more neuronets, put XEvil first, and changed retries to 1.



    Maybe it could also be the sitelists? The types of sites it's trying to solve captchas on?

    This is an improvement which is good, but would love 60%+  :D

    Edit: I just noticed in my previous post, I have Recaptcha v3 selected. If reproxy.network support said I need XEvil 6 for that, should I untick Recaptcha v3 then?

    I'll do that, reset stats and check again later.
SerpCloud is the service you're looking for with the red and green arrows that integrates with SER.

My memory was vague; it was Serprobot I loved, because it tells you the first rank seen, its best rank, and the latest, like this:



So it was great to see a new URL pop in at 86, for example, and then see it change to 12; if it moves back to 19, it will keep the best at 12, so we know the best position it's been at.
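The first/best/latest behaviour described above is easy to model. A small sketch of the bookkeeping (the function and field names are made up for illustration, not Serprobot's API):

```python
def update_rank(history, keyword, position):
    """Track first-seen, best (lowest number), and latest rank per
    keyword, the way the tracker above displays them."""
    entry = history.setdefault(
        keyword, {"first": position, "best": position, "latest": position}
    )
    entry["latest"] = position
    entry["best"] = min(entry["best"], position)
    return entry

# The example from the post: enters at 86, climbs to 12, slips to 19.
history = {}
for pos in (86, 12, 19):
    update_rank(history, "my keyword", pos)
print(history["my keyword"])  # first stays 86, best stays 12, latest is 19
```

"Best" only ever improves, so a temporary slip never hides the peak position reached.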

    They have updated UI since I used it, but it's still the same so I'm happy :smile:

    Had to check emails from 2013/2014 :D
I agree, I have XEvil 5. Since I read your comment I unticked hCaptcha and re-enabled v3, and the success rate is 80% now!

hCaptcha has probably been the issue, and I had 50 threads too.

I'm using reproxy.network proxies, which rotate, and it looks like there are 999 IPv6 proxies in there for $5/mo.
  • Yeah! Happy with it now, thanks for your help!

Oh I see. Yeah, I visit the site for info, but it's nostalgic to be using this; I like the info it provides and it's really cheap, like $5/mo for 75 keywords.

    Hmm ok I'll keep that in mind, thanks.
Deeeeeeee (the Americas)
@backlinkaddict: Yes; this was very helpful!

backlinkaddict said: "Deeeeeeee in my opinion a majority of the time a shared list or bought GSA list will be spammed to shit and may not offer the value you're looking for. (depending on goals)"

Hmmm... OK... what about using them at the lower tiers (higher-numbered tiers)? Just to add overall "weight" to the "better" links the list-derived backlinks point to? Or should I avoid lists altogether because they are somehow toxic/malware-ridden/etc. targets? Or, maybe less likely but who knows: honeypots for SEs to identify artificial ranking-improvement attempts?! :anguished:
  • Anth20 said:
    I agree, I have XEvil5. Since I read your comment I unticked H captcha and re-enabled v3 and success rate is 80% now!

    Probably H captcha has been the issue, and I had 50 threads too.

    I'm using reproxy.network proxies which rotate and it looks like there is 999 ipv6 proxies in there for $5/mo.
    This is good news!

    I believe you can further optimize load on your machine by putting GSA 1st for image captchas. But since I don't have a license, this is just a guess.

    Add a commercial captcha solver for the HCaptchas to get another 5% solved. @backlinkaddict is happy with 2captcha, I am with capmonster.cloud.
    Thanked by 1Anth20
Deeeeeeee said:
@backlinkaddict: Yes; this was very helpful! backlinkaddict said: "Deeeeeeee in my opinion a majority of the time a shared list or bought GSA list will be spammed to shit and may not offer the value you're looking for. (depending on goals)" Hmmm... OK... what about using them at the lower tiers (higher-numbered tiers)? Just to add overall "weight" to the "better" links the list-derived backlinks point to? Or should I avoid lists altogether because they are somehow toxic/malware-ridden/etc. targets? Or, maybe less likely but who knows: honeypots for SEs to identify artificial ranking-improvement attempts?! :anguished:
Of course, these link lists are being shared among various customers: serverifiedlists has 200 subscribers, as their support confirmed; other vendors might have different figures. Plus the sharing by non-paying subscribers, or people reselling the lists as their own. From my experience, you will find many target sites in verified lists are not verified at all, but just identified.

You have the tool in your hand. Consider the theoretical idea of testing the links in the lists with your competitors' sites. Obviously, you don't want their rankings to improve, just to test the lists. So, take the content of your competitors' websites as input and spread it to all the target sites the lists give you. Google honors duplicate content a lot ;). As said, just a theoretical idea.

    I personally rely on the various filters within SER for my self-scraped lists as well as for bought lists. OBL, words, language, countries, etc. for T1 links, less restrictive settings for T2 up. DA, DR, spam score are irrelevant to me, only rankings and thus traffic matters.
    Thanked by 1Anth20
You like the reproxy network you mentioned? Is it working well now? Have you stress-tested it or cranked up threads to see if success is still high?

Sounds really affordable, so I wonder; maybe I'll try it, or at least look into it. Thanks!
    I found their service for Xevil IPv6 proxies very stable and reliable.

    It seems they have also changed their IPv4 setup recently. I am using these for GSA Website Contact now and can't complain about anything but the proxy speed.

    For scraping, I am using webshare proxies with the Google-option, for SER submissions cheaper webshare proxies without that option but more traffic volume.