
GSA SER still works amazingly in 2023, but pay heed to the following. (Update: July 2023)

Note before reading: This is not an attempt to take a cheap shot at "experts", nor is it meant to put anyone down. Everyone has their own experiences. These opinions are just my own, and I wish to share them so others don't have the same doubts that I did and can save their time.

Search engine marketing forums (black hat, grey hat, white hat, etc.) are full of alleged experts insisting that you cannot use automated tools to build links to your money site. I have found this to be nonsense, and I wish to share what I have learned.

GSA is amazingly powerful if you know how to use it right. Even without going nuclear, I am getting 70+ LPM (links per minute) while targeting high-quality sites, and the links are getting indexed and staying live. One common myth is that it's easy to get Google-slapped for having some bad links.
It is not; otherwise, think how lucrative the market for taking out the competition would be if it were so easy to get sites banned from the search engines. It may be true that low-quality links give you zero benefit, but they are unlikely to get you de-indexed. There is a difference between getting zero benefit from something and getting penalized for it. Get as many backlinks to your site indexed as possible, and in general the risks to your rank are to the upside.

So how can you make these backlinks get indexed and give you juice? Three mistakes people make:

1.) Emails.
2.) Content.
3.) Indexing.

Do not use generic throw-away emails. Your accounts will get banned and all your work will have been in vain. This is the most crucial part, before anything else. I have found the following methods very effective:

1.) Buying gmail accounts.
2.) Buying cheap domains and setting up your own catch-all. (This is the most economical and it works.)
3.) Buying old yahoo accounts.

Not only do the accounts get registered more smoothly, but they do not get banned, so you can reuse the article, social network, and bookmark accounts over and over. This not only saves you money on captchas but also significantly increases your LPM. Give it a try; you will see a 10x increase once you have your accounts all made.
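To illustrate option 2: once the catch-all is live, every address on the domain delivers to the same mailbox, so you can generate as many fresh-looking accounts as you need. Here is a minimal sketch (the domain and output file name are placeholders, not my actual setup):

import random
import string

DOMAIN = "your-catchall-domain.com"  # placeholder: a cheap domain with catch-all enabled

def random_mailbox(length=10):
    # start with a letter so the address looks like a normal mailbox
    first = random.choice(string.ascii_lowercase)
    rest = "".join(random.choices(string.ascii_lowercase + string.digits, k=length - 1))
    return first + rest

# placeholder output file: one address per line, ready to import
with open("catchall_emails.txt", "w") as fh:
    for _ in range(100):
        fh.write(random_mailbox() + "@" + DOMAIN + "\n")

Since every generated address lands in one inbox, a single mailbox login is enough to verify all of them.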

Use at least decent-quality content and you can get it indexed. It is that simple. It may be more expensive, but it works. The most important thing I have realized is the value of spinners. Spin Rewriter and WordAi work wonders and have a far higher index rate than the generic built-in junk, which will not get indexed.

Use the filters that GSA offers to fine-tune your verified list. If you are targeting UK traffic, build your links on UK-hosted sites; GSA gives you the option to do this. I know this sounds petty, but trust me, it makes a difference. There is a dating term that has 10x more UK traffic than US traffic, and I have already noticed a significant gain in google.co.uk just from this tuning. I'm sitting just outside the top 10 and hope to break onto the first page for this 20k+ searched term in the next month.

Now the final part: how do you index? A link that isn't indexed has no value; we all know this song and dance. Indexing is harder now than it was in 2015, but in all cases it starts with getting your link in front of the web crawlers. That is the goal of this next part.

Well, I have a really neat and almost free trick (it will cost something if you use paid hosting, but still a fraction of the so-called indexing services, which do not seem to work anymore) that I have been trying for the last two weeks, and it has worked better than any of the indexing services I have used. (Yes, I was bored enough and had enough free time to run multiple controlled tests.)

Here is all you need to do:
1.) Get some hosting (it can even be free with a free subdomain).
2.) Install the free Yourls PHP script (it is a URL shortener and redirector).
3.) Batch upload your created links into it using a custom PHP script. I'm happy to post the code I use here if the admins allow it (a rough sketch of the idea follows this list).
4.) Generate a sitemap from this script.
5.) Use IndexNow (which takes two minutes to set up) to batch submit the URLs. (I have noticed there is a 10k limit per submission, which you can get around by using a loop to do multiple submissions; see the sketch a little further down.)
6.) You will see the bots crawling your URLs within minutes and following the redirects. Note: despite what people say, Google always does this.
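For step 3, since my PHP code isn't posted here yet, here is a minimal sketch of the same idea in Python against the standard Yourls API (action=shorturl). The install URL, signature token, and input file name are placeholders, not my actual values:

import requests

YOURLS_API = "https://short.example.com/yourls-api.php"  # placeholder: your Yourls install
SIGNATURE = "your-secret-signature"  # placeholder: from the Yourls admin Tools page

def shorten(long_url):
    # one API call per verified link; Yourls creates the short URL / redirect
    resp = requests.post(YOURLS_API, data={
        "signature": SIGNATURE,
        "action": "shorturl",
        "url": long_url,
        "format": "json",
    }, timeout=30)
    return resp.json().get("shorturl")

if __name__ == "__main__":
    # placeholder input file: one GSA-verified URL per line
    with open("verified_links.txt") as fh:
        for line in fh:
            link = line.strip()
            if link:
                print(link, "->", shorten(link))

Each short URL then 301-redirects to the backlink, and the sitemap from step 4 simply lists the short URLs.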


Sit back and enjoy the results. If the link created is decent enough (content matters more than the domain in 95% of cases), it will be indexed within a few days.
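On the 10k limit from step 5: the loop workaround is just slicing the URL list into batches and POSTing each one separately. A minimal sketch, assuming the standard IndexNow JSON body (host, key, keyLocation, urlList); the host and key values are placeholders:

import requests

INDEXNOW_ENDPOINT = "https://www.bing.com/indexnow"
HOST = "https://mysite.com"                     # placeholder
KEY = "your-indexnow-key"                       # placeholder
KEY_LOCATION = HOST + "/your-indexnow-key.txt"  # placeholder
BATCH_SIZE = 10000  # the per-submission limit I ran into

def submit_in_batches(urls):
    # slice the full list into 10k-URL chunks and submit each one separately
    for start in range(0, len(urls), BATCH_SIZE):
        payload = {
            "host": HOST,
            "key": KEY,
            "keyLocation": KEY_LOCATION,
            "urlList": urls[start:start + BATCH_SIZE],
        }
        resp = requests.post(INDEXNOW_ENDPOINT, json=payload,
                             headers={"Content-Type": "application/json; charset=utf-8"},
                             timeout=60)
        print("batch", start // BATCH_SIZE + 1, "->", resp.status_code)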

I hope you all benefit from this post, as I feel it adds far more value than all the naysayers on BHW, who I am now convinced have no idea what they are talking about in 99% of cases.

GSA does not get enough credit. It is amazing and has:
  1. The friendly community.
  2. The constant updates.
  3. The one-off price for the software.
  4. The ease of use. It literally is like a well-oiled machine and can just keep going.

Disclaimer: I have zero affiliation with any of the products mentioned and am simply a customer/user. I am sharing this as a way of giving back to this community that has helped me a lot in the past.

Disclaimer 2: As I said, I'm happy to post open source code here for the above, but won't do so without the green light from the forum owners. I want to respect protocol.

Comments

  • edited August 2022
    Thanks for the share. I totally agree with you on all fronts, esp about the (laughable) gurus on BHW. Every single person that says SER dOeSnT wOrK has a pbn backlink service in their signature, OR is a marketplace seller. If everyone still used SER then the marketplace wouldn’t generate as much money. That’s why tools like this are frowned upon because they have turned link building into a PayPal money generator (the marketplace)

    Post the code please!
  • edited July 2023
    CODE DELETED
  • edited July 2023
    CODE DELETED
  • edited July 2023
    CODE DELETED
  • Thanks for the share. I totally agree with you on all fronts, esp about the (laughable) gurus on BHW. Every single person that says SER dOeSnT wOrK has a pbn backlink service in their signature, OR is a marketplace seller. If everyone still used SER then the marketplace wouldn’t generate as much money. That’s why tools like this are frowned upon because they have turned link building into a PayPal money generator (the marketplace)

    Post the code please!
    Gotten crawled yet?
  • Momo said:
    Thanks for the share. I totally agree with you on all fronts, esp about the (laughable) gurus on BHW. Every single person that says SER dOeSnT wOrK has a pbn backlink service in their signature, OR is a marketplace seller. If everyone still used SER then the marketplace wouldn’t generate as much money. That’s why tools like this are frowned upon because they have turned link building into a PayPal money generator (the marketplace)

    Post the code please!
    Gotten crawled yet?
    I have not implemented this yet. I am programming an algorithm today. I will implement this as soon as I have time. I will post when I do. Thank you for this share. It will save $!
  • Momo said:
    Thanks for the share. I totally agree with you on all fronts, esp about the (laughable) gurus on BHW. Every single person that says SER dOeSnT wOrK has a pbn backlink service in their signature, OR is a marketplace seller. If everyone still used SER then the marketplace wouldn’t generate as much money. That’s why tools like this are frowned upon because they have turned link building into a PayPal money generator (the marketplace)

    Post the code please!
    Gotten crawled yet?
    I have not implemented this yet. I am programming an algorithm today. I will implement this as soon as I have time. I will post when I do. Thank you for this share. It will save $!
    If ever you wish to collaborate on any projects, please let me know. I'm enjoying being back in the space and trying new things.
  • In my opinion, Bing's IndexNow does not work. (It can be done automatically on your own site with a plugin or through Cloudflare.)
    I am not able to get my sites indexed on Bing, even with 1.5k words of unique, well-formatted content on their homepages.
    Yes, it sends a bot, but it simply refuses to index sites for weeks, with weird errors in the inspection report (Bing Index / Live URL screenshots not included).
    Also, I noticed that Bing deindexed my old sites, which had backlinks and traffic, when I started making 10-20 contextual backlinks (referring domains) daily with GSA.



  • Smeklinis said:
    In my opinion, Bing's IndexNow does not work. (It can be done automatically on your own site with a plugin or through Cloudflare.)
    I am not able to get my sites indexed on Bing, even with 1.5k words of unique, well-formatted content on their homepages.
    Yes, it sends a bot, but it simply refuses to index sites for weeks, with weird errors in the inspection report (Bing Index / Live URL screenshots not included).
    Also, I noticed that Bing deindexed my old sites, which had backlinks and traffic, when I started making 10-20 contextual backlinks (referring domains) daily with GSA.



    From the report above, it seems their crawler cannot access your site. I'm sure it is not a robots.txt block or something that silly, as I've read some of your comments and you do know what you are doing. What server setup are you using? I once had an issue with the fail2ban plugin on Plesk blocking search engine crawlers.
  • Momo said:
    From the report above, it seems their crawler cannot access your site. I'm sure it is not a robots.txt block or something that silly, as I've read some of your comments and you do know what you are doing. What server setup are you using? I once had an issue with the fail2ban plugin on Plesk blocking search engine crawlers.
    The server is fine and I am getting Bingbot visits every day with 200 OK in the server's log file on the current domain. I am not blocking Bing. From what I heard, Bing f*cked something up on their side about 3 months ago.
    In Site Explorer I can see this (screenshot not included): the "Last crawled" date is from before the site was even launched.
    But as I said before, Bingbot visits the site every day.

    I saw many threads on BHW saying that after some changes at Bing (3 months ago), everyone is having trouble indexing there.

  • Smeklinis said:
    Momo said:
    From the report above, it seems their crawler cannot access your site. I'm sure it is not a robots.txt block or something that silly, as I've read some of your comments and you do know what you are doing. What server setup are you using? I once had an issue with the fail2ban plugin on Plesk blocking search engine crawlers.
    The server is fine and I am getting Bingbot visits every day with 200 OK in the server's log file on the current domain. I am not blocking Bing. From what I heard, Bing f*cked something up on their side about 3 months ago.
    In Site Explorer I can see this (screenshot not included): the "Last crawled" date is from before the site was even launched.
    But as I said before, Bingbot visits the site every day.

    I saw many threads on BHW saying that after some changes at Bing (3 months ago), everyone is having trouble indexing there.

    That is strange. I have been indexing new pages with them all week long, even with a domain that was registered on 21 July 2022.
  • Momo said:
    Momo said:
    Thanks for the share. I totally agree with you on all fronts, esp about the (laughable) gurus on BHW. Every single person that says SER dOeSnT wOrK has a pbn backlink service in their signature, OR is a marketplace seller. If everyone still used SER then the marketplace wouldn’t generate as much money. That’s why tools like this are frowned upon because they have turned link building into a PayPal money generator (the marketplace)

    Post the code please!
    Gotten crawled yet?
    I have not implemented this yet. I am programming an algorithm today. I will implement this as soon as I have time. I will post when I do. Thank you for this share. It will save $!
    If ever you wish to collaborate on any projects, please let me know. I'm enjoying being back in the space and trying new things.
    Glad to hear you’re doing things that you enjoy. If I can think of anything in the future I’ll be sure to drop a line about collab.
  • Thanks for this encouraging thread.
    I'm back to GSA after a while and just followed your steps.
    I hope it works.

    For the bulk upload, I just found a plugin you can add to your Yourls website:
    https://github.com/vaughany/yourls-bulk-import-and-shorten

    Just to confirm, you're building Tier 1 with GSA to your money site directly?
    Did you use any verified list service, or are you crawling your own?

    Thanks 
  • "Just to confirm you're building Tier 1 with GSA to your money site directly ? "

    I am.

    "Did you use any verified list service or you are crawling your own ?"

    A combination of both (mostly bought), as I found scraping not to be worth it. I then import the lists into each project tier. I narrow down which sites I choose using Scrapebox's page authority checker.
  • edited August 2022
    @Momo

    Thanks for openly sharing your experience !

    From what I see (on indexnow.org), Google is not supporting IndexNow?

  • coral99 said:
    @Momo

    Thanks for openly sharing your experience !

    From what I see (on indexnow.org), Google is not supporting IndexNow?

    As of November 2021, rumours started that they were experimenting with it. I also see their crawlers (mobile versions) crawling links that I have submitted this way. But none of that matters.

    Most importantly, I see the pages submitted this way indexed by Google, and that's the most important thing. It costs literally nothing and is 1,580% better, as per my tests, than services that charge people hundreds to thousands of dollars per year for junk that hasn't worked for years. Naturally, none of them will admit it; they earn their bread and butter off such ignorance, like the same people on BHW who say GSA SER does not work, yet sell services for hundreds of dollars per month by using their GSA SER programs to build links.
  • Momo said:
    coral99 said:
    @Momo

    Thanks for openly sharing your experience !

    From what I see (on indexnow.org), Google is not supporting IndexNow?

    As of November 2021, rumours started that they were experimenting with it. I also see their crawlers (mobile versions) crawling links that I have submitted this way. But none of that matters.

    Most importantly, I see the pages submitted this way indexed by Google, and that's the most important thing. It costs literally nothing and is 1,580% better, as per my tests, than services that charge people hundreds to thousands of dollars per year for junk that hasn't worked for years. Naturally, none of them will admit it; they earn their bread and butter off such ignorance, like the same people on BHW who say GSA SER does not work, yet sell services for hundreds of dollars per month by using their GSA SER programs to build links.
    Thanks, I'll check this out and report
  • Momo said:
    "Just to confirm you're building Tier 1 with GSA to your money site directly ? "

    I am.

    "Did you use any verified list service or you are crawling your own ?"

    Combination of both (most bought) as I found scraping to not be worth it. I then import lists into each project tier. I break down which sites I choose after using Scrapebox's page authority checker. 
    Do you mind sharing the engine types you use for Tier 1, 2, and 3? :)
  • Do you have any feedback on the indexing after your test, @the_other_dude?
  • May I know which hosting company is the best (in terms of value for money) for hosting our own catch-all email accounts?
  • May I know which hosting company is the best (in terms of value for money) for hosting our own catch-all email accounts?
    You don't need a powerful VPS for this; a small, cheap machine will do. I am running several of my own and customers' catch-all accounts on a small dedicated VPS, and it is stable and performant.
  • May I know which hosting company is the best (in terms of value for money) for hosting our own catch-all email accounts?
    You don't need a powerful VPS for this; a small, cheap machine will do. I am running several of my own and customers' catch-all accounts on a small dedicated VPS, and it is stable and performant.
    What software do you run on the VPS for the email server?
  • May I know which hosting company is the best (in terms of value for money) for hosting our own catch-all email accounts?
    You don't need a powerful VPS for this; a small, cheap machine will do. I am running several of my own and customers' catch-all accounts on a small dedicated VPS, and it is stable and performant.
    What software do you run on the VPS for the email server?
    I use cheap cPanel shared hosting from BHW for the catch-all. It's about $10 a year, unlimited domains :)

  • Thanks for sharing. Can you tell us which service that is at $10 per year, please? @the_other_dude
  • googlealchemist Anywhere I want
    Momo said:
    Now that we finally have the links in the database with the redirects created and the sitemap made, we can summon the crawlers. For this, you first need to go here: https://www.bing.com/indexnow#implementation
    Generate an API key, download the text file, and upload it to your hosting. Make a note of the name.
    Now you need to create and run the following in Python (a .py file). (If you don't have it, install PyCharm Community Edition, which is free.) Note: this is a basic script for one sitemap, but you can easily use it to loop through and submit multiple sitemaps. If there is enough interest, I'll make such a script and release it.
    import requests
    import advertools as adv  # pip install advertools; used to parse the sitemap
    
    
    key = "12367346914527865247865429"  # this is the key you generated
    keyLocation = "https://mysite.com/12367346914527865247865429.txt"  # this is where you uploaded it
    
    sitemaps = ["https://mysite.com/sitemap.php"]  # this is where your sitemap is
    
    for sitemap in sitemaps:
        # pull every <loc> entry from the sitemap into a plain list of URLs
        sitemap_urls = adv.sitemap_to_df(sitemap)
        urls = sitemap_urls["loc"].to_list()
    
        host = "https://www.bing.com/indexnow"
        website = "https://mysite.com"
    
        headers = {"Content-Type": "application/json; charset=utf-8"}
        payload = {
            "host": website,
            "key": key,
            "keyLocation": keyLocation,
            "urlList": urls,
        }
        x = requests.post(host, json=payload, headers=headers)
        print(x.status_code)  # 200 means the batch was accepted

    If done correctly, the code will return a 200 status and you will see the search engine bots (Bing, Yandex, Google) visiting your robots.txt files and submitted URLs in less than a minute. Hope you all benefit from this. Please reach out if you have any questions.
    Thanks for the post.

    I get lost with any sort of coding/DB stuff.

    What's the point of putting all the links into the redirect thing vs. just posting them all in a blog post, getting that post into the sitemap, and using that sitemap to submit to IndexNow?
  • Momo said:
    Now that we finally have the links in the database with the redirects created and the sitemap made, we can summon the crawlers. For this, you first need to go here: https://www.bing.com/indexnow#implementation
    Generate an API key, download the text file, and upload it to your hosting. Make a note of the name.
    Now you need to create and run the following in Python (a .py file). (If you don't have it, install PyCharm Community Edition, which is free.) Note: this is a basic script for one sitemap, but you can easily use it to loop through and submit multiple sitemaps. If there is enough interest, I'll make such a script and release it.
    import requests
    import advertools as adv  # pip install advertools; used to parse the sitemap
    
    
    key = "12367346914527865247865429"  # this is the key you generated
    keyLocation = "https://mysite.com/12367346914527865247865429.txt"  # this is where you uploaded it
    
    sitemaps = ["https://mysite.com/sitemap.php"]  # this is where your sitemap is
    
    for sitemap in sitemaps:
        # pull every <loc> entry from the sitemap into a plain list of URLs
        sitemap_urls = adv.sitemap_to_df(sitemap)
        urls = sitemap_urls["loc"].to_list()
    
        host = "https://www.bing.com/indexnow"
        website = "https://mysite.com"
    
        headers = {"Content-Type": "application/json; charset=utf-8"}
        payload = {
            "host": website,
            "key": key,
            "keyLocation": keyLocation,
            "urlList": urls,
        }
        x = requests.post(host, json=payload, headers=headers)
        print(x.status_code)  # 200 means the batch was accepted

    If done correctly, the code will return a 200 status and you will see the search engine bots (Bing, Yandex, Google) visiting your robots.txt files and submitted URLs in less than a minute. Hope you all benefit from this. Please reach out if you have any questions.

    Thank you so much for sharing! Loop code would be very helpful
  • About "DISCOVERED BUT NOT CRAWLED":

    I didn't read all of the other information that goes with this, on purpose, because with "discovered but not crawled" the info Bing and Google give is sometimes contradictory, so you will be tearing your hair out trying to resolve it based on what you see in front of you when it is not always accurate. But I have had "DISCOVERED BUT NOT CRAWLED", and it was because of semi-duplicate content. I kept taking out some of the duplicate content, but the pages were not indexing; it was only when I took out all of the duplicate content that this changed and the rest of the non-duplicate pages got indexed.
    Google Search Console will give you various data/messages, but some of these are not accurate at all. So if you have the "DISCOVERED BUT NOT CRAWLED" message, consider whether Google/Bing might be seeing some (any) of your pages as duplicate content or not good enough in Google's eyes, because it won't just stop those pages from being indexed; it might stop Google/Bing from indexing the rest of your pages too. I don't know if this helps, and any other messages you see might not be accurate or relevant (I had contradictory messages), but when I removed all of the duplicate pages it started to index the non-duplicate pages as well.

    Another thing: I think it depends what market you are in and how big it is. In smaller, less competitive markets (such as local markets) you can get away with more, but if you apply the same things to national/international sites, sometimes Google does not index the pages.


  • Momo said:
    Now that we finally have the links in the database with the redirects created and the sitemap made, we can summon the crawlers. For this, you first need to go here: https://www.bing.com/indexnow#implementation
    Generate an API key, download the text file, and upload it to your hosting. Make a note of the name.
    Now you need to create and run the following in Python (a .py file). (If you don't have it, install PyCharm Community Edition, which is free.) Note: this is a basic script for one sitemap, but you can easily use it to loop through and submit multiple sitemaps. If there is enough interest, I'll make such a script and release it.

    If done correctly, the code will return a 200 status and you will see the search engine bots (Bing, Yandex, Google) visiting your robots.txt files and submitted URLs in less than a minute. Hope you all benefit from this. Please reach out if you have any questions.
    @momo Are you actually seeing any evidence from Bing and/or Google?

    As far as my installation is concerned, only Yandex has crawled in the last several days.
    There has been no Bing or Google, and from what my research shows, Google is not a participant in IndexNow; only Microsoft Bing, Yandex, and Seznam are.
  • Kaine thebestindexer.com
    edited December 2022
    Interesting thread. If you are interested, we index on Google, Bing, and Yandex simultaneously, and our plans are unlimited.
    We are going to open a thread with a 1-day trial offered on TheBestIndexer.com, free for all members of the GSA forum, to celebrate the new year.
    This will be effective today or tomorrow. Stay around to enjoy it!
  • Kaine said:
    Interesting thread. If you are interested, we index on Google, Bing, and Yandex simultaneously, and our plans are unlimited.
    We are going to open a thread with a 1-day trial offered on TheBestIndexer.com, free for all members of the GSA forum, to celebrate the new year.
    This will be effective today or tomorrow. Stay around to enjoy it!
    That is great, yet another thread hijack attempt to promote your services or lists, when people are discussing their own solutions. You are really adding value..... Is anyone else getting bored of this type of behaviour?

    rastarr said:
    Momo said:
    Now that we finally have the links in the database with the redirects created and the sitemap made, we can summon the crawlers. For this, you first need to go here: https://www.bing.com/indexnow#implementation
    Generate an API key, download the text file, and upload it to your hosting. Make a note of the name.
    Now you need to create and run the following in Python (a .py file). (If you don't have it, install PyCharm Community Edition, which is free.) Note: this is a basic script for one sitemap, but you can easily use it to loop through and submit multiple sitemaps. If there is enough interest, I'll make such a script and release it.

    If done correctly, the code will return a 200 status and you will see the search engine bots (Bing, Yandex, Google) visiting your robots.txt files and submitted URLs in less than a minute. Hope you all benefit from this. Please reach out if you have any questions.
    @momo Are you actually seeing any evidence from Bing and/or Google?

    As far as my installation is concerned, only Yandex has crawled in the last several days.
    There has been no Bing or Google, and from what my research shows, Google is not a participant in IndexNow; only Microsoft Bing, Yandex, and Seznam are.
    Would like to show you something (Search Console screenshot not included):
    Here is a site I started juicing with GSA, using this 301 redirection sitemap method, in the second week of May. Googlebot has already discovered 140k+ links and has crawled them as directed by the 301. Now, some may ask why it does not index on my domain. The reason is simple:

    The sitemap is there to feed tier links to Googlebot. It then crawls them and indexes them. Magic, free, and easy, as you can see below (screenshot not included).
    Using this method you can literally throw more new links in front of the SERPs than the competition. I already have some top 10s in competitive niches with 100k monthly search volume, competing with the "kings" who have been there 20+ years, and all in a few months.

    I hope to soon combine this with a scraper I am building that will target platforms only used by GSA SER and then rank them based on social signals. Picking the best of these, I hope to throw out millions of links per month in lots of new niches and see how far I can go.

    Core message: Don't listen to people who spread fear and doubt. GSA works very well. All you need is GSA SER, a good list to build links on, and the free script above to get those links in front of the search engine bots so they get indexed. It really is that simple (assuming you actually have content on your site). Don't be discouraged by the average black hat "expert" on certain forums; they literally have no idea what they are talking about and earn their bread and milk money hustling Fiverr gigs with Scrapebox and SEnuke spam. It took me ages to accept that fact. Once I did, I started getting results.