Don't believe the naysayers. This actually works amazingly in 2022. Try these suggestions

Note before reading: This is not an attempt to take a cheap shot at "experts", nor is it meant to put anyone down. Everyone has their own experiences. These opinions are just my own, and I share them so that others don't have the same doubts I did and can save their time.

Search engine marketing forums (black hat, grey hat, white hat, etc.) are full of alleged experts insisting that you cannot use automated tools to build links to your money site. I have learned this is nonsense, and I wish to share what I have learned.

GSA is amazingly powerful if you know how to use it right. Even without going nuclear, I am getting 70+ LPM while targeting high-quality sites, and the links are getting indexed and staying live. One common myth is that it's easy to get Google-slapped for having some bad links.
It is not. If it were that easy to get a site banned from the search engines, think how lucrative the market would be for taking out all the competition. It may be true that low-quality links give you zero benefit, but they are unlikely to get you de-indexed. There is a difference between getting zero benefit from something and getting penalized for it. Get as many backlinks to your site indexed as possible; in general, the risks to your rank are to the upside.

So how do you get these backlinks indexed and passing juice? People make three mistakes:

1.) Emails.
2.) Content.
3.) Indexing.

Do not use generic throw-away emails. Your accounts will get banned and all your work will have been in vain. This is the most crucial part, before anything else. I have found the following methods very effective:

1.) Buying Gmail accounts.
2.) Buying cheap domains and setting up your own catch-all. (This is the most economical and it works.)
3.) Buying old Yahoo accounts.

Not only do the accounts register more smoothly, they don't get banned, so you can reuse the article, social network, and bookmark accounts over and over. This not only saves you money on captchas but also significantly increases your LPM. Give it a try; you will see a 10X increase once your accounts are all made.
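The catch-all approach works because every local-part at your domain delivers to the same mailbox, so fresh-looking registration addresses can be generated on the fly. A minimal sketch (the domain here is a placeholder, not from the thread):

```python
import secrets
import string

def make_catchall_emails(domain, n=10):
    # With a catch-all mailbox, any local-part at the domain is delivered,
    # so random local-parts give each project a unique, disposable address.
    alphabet = string.ascii_lowercase + string.digits
    return [
        "".join(secrets.choice(alphabet) for _ in range(10)) + "@" + domain
        for _ in range(n)
    ]

emails = make_catchall_emails("example.com", n=5)
```

Each generated address can then be pasted straight into a project's email settings.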

Use at least decent-quality content and you can get it indexed. It is that simple. It may be more expensive, but it works. The most important thing I have realized is the value of spinners. Spin Rewriter and WordAi work wonders and have a far higher index rate than the generic built-in junk, which will not get indexed.

Use the filters that GSA offers to fine-tune your verified list. If you are targeting UK traffic, build your links on UK-hosted sites; GSA gives you the option. I know this sounds petty, but trust me, it makes a difference. There is a dating term with 10x more UK traffic than US, and I have noticed a significant gain in google.co.uk already just from this tuning. I'm sitting just outside the top 10 and hope to break the first page for this 20k+-search term in the next month.

Now the final part: how do you index? A link that isn't indexed has no value; we all know this song and dance. Indexing is harder now than it was in 2015, but in all cases it starts with getting your link in front of the web crawlers. That is the goal of this next part.

I have a really neat and almost free trick (it will cost something if you use paid hosting, but still a fraction of the so-called indexing services, which do not seem to work anymore) that I have been trying for the last two weeks, and it has worked better than any of the indexing services I have used. (Yes, I was bored enough, and had enough free time, to run multiple controlled tests.)

Here is all you need to do:
1.) Get some hosting (it can even be free, with a free subdomain).
2.) Install the free Yourls PHP script (it is a URL shortener and redirector).
3.) Batch upload your created links into it using a custom PHP script. I'm happy to post the code I use here, if the admins allow it.
4.) Generate a sitemap from this script.
5.) Use IndexNow (which takes two minutes to set up) to batch submit the URLs. (There is a 10k limit per submission, which you can get around with a loop doing multiple submissions.)
6.) You will see the bots crawling your URLs within minutes and following the redirects. Despite what people say, Google always does this.


Sit back and enjoy the results. If the link created is decent enough (content matters more than the domain in 95% of cases), it will be indexed within a few days.

I hope you all benefit from this post, as I feel it adds far more value than all the naysayers on BHW, who I am now convinced have no idea what they are talking about in 99% of cases.

GSA does not get enough credit. It is amazing and has:
  1. A friendly community.
  2. Constant updates.
  3. A one-off price for the software.
  4. Ease of use. It literally is like a well-oiled machine and can just keep going.

Disclaimer: I have zero affiliation with any of the products mentioned and am simply a customer/user. I am sharing this as a way of giving back to this community that has helped me a lot in the past.

Disclaimer 2: As I said, I'm happy to post open source code here for the above, but won't do so without the green light from the forum owners. I want to respect protocol.

Comments

  • edited August 2
    Thanks for the share. I totally agree with you on all fronts, especially about the (laughable) gurus on BHW. Every single person who says SER dOeSnT wOrK has a PBN backlink service in their signature, or is a marketplace seller. If everyone still used SER, the marketplace wouldn't generate as much money. That's why tools like this are frowned upon: they have turned link building into a PayPal money generator (the marketplace).

    Post the code please!
    Thanked by 1hardcorenuker
  • edited August 2
    The first thing you will need to do is edit the Yourls database table 'yourls_url' so it has the following structure:



    Then make a txt file e.g. "urls.txt" and upload your URLs into it.

    Then create a PHP file (call it whatever you want. e.g. "crawlme.php") with the following code. Fill in the server information with your details:
    <?php

    ini_set('max_execution_time', '0'); // 0 so the script doesn't time out; adjust to your server capacity/preferences

    // Go through the file line by line; if the URL isn't already in the database, add it
    $handle = fopen("urls.txt", "r"); // change this to whatever you called your .txt file
    $x = 0;
    if ($handle) {
        while (($line = fgets($handle)) !== false) {
            $line = trim($line); // strip the trailing newline so lookups match
            if ($line !== "" && !doesExist($line)) {
                $x++;
                createRecord($line);
            }
        }
        fclose($handle);
    }

    // Finally, output how many links have been added
    echo $x . " urls added to sitemap.";

    // Check whether the record already exists
    function doesExist($line) {
        $dbname = "";
        $dbuser = "";
        $password = "";
        $host = "";
        $mysqli = new mysqli($host, $dbuser, $password, $dbname);
        $stmt = $mysqli->prepare("SELECT * FROM yourls_url WHERE url = ? LIMIT 1");
        $stmt->bind_param("s", $line);
        $stmt->execute();
        $result = $stmt->get_result();
        return $result->num_rows >= 1;
    }

    // Create the new record
    function createRecord($line) {
        $dbname = "";
        $dbuser = "";
        $password = "";
        $host = "";
        $keyword = md5($line); // the md5 hash becomes the short-URL keyword
        $mysqli = new mysqli($host, $dbuser, $password, $dbname);
        $stmt = $mysqli->prepare("INSERT INTO yourls_url (keyword, url, ip) VALUES (?, ?, '1.1.1.1')");
        $stmt->bind_param("ss", $keyword, $line);
        $stmt->execute();
    }
    ?>

    In the next post, I'll share the code to generate the sitemap.
  • edited August 2
    To generate the sitemap, make another .php file, e.g. "sitemap.php" (call it whatever you want), and use the following code:

    <?php
    header('Content-Type: application/xml');
    echo '<?xml version="1.0" encoding="UTF-8"?>';
    echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">';
    $dbname = "";
    $dbuser = "";
    $password = "";
    $host = "";
    $url = ''; // set this to your domain, e.g. 'https://mysite.com/'; the trailing slash is important
    $conn = new mysqli($host, $dbuser, $password, $dbname);
    $sql = "SELECT * FROM yourls_url ORDER BY row ASC LIMIT 10000";
    /* Limited to 10000: for multiple sitemaps, start the query above at a different
       offset (e.g. LIMIT 10000 OFFSET 10000) but keep the 10000 cap, or IndexNow
       will not function. 'row' here is the auto-increment column from the modified
       table structure above. */
    $result = $conn->query($sql);
    while ($row = $result->fetch_assoc()) {
        echo "<url><loc>" . $url . $row['keyword'] . "</loc>";
        echo "<lastmod>" . gmdate('Y-m-d\TH:i:s+00:00', strtotime($row['timestamp'])) . "</lastmod>";
        echo "<changefreq>daily</changefreq><priority>0.8</priority>";
        echo "</url>";
    }
    echo "</urlset>";
    ?>

    Now you have your links in the database and a sitemap generated. This is the sitemap you will use to summon IndexNow to crawl the site.
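    As a quick sanity check, you can parse the generated sitemap and confirm it stays under the 10,000-URL cap before submitting it. A minimal sketch in Python, using only the standard library (the example XML mirrors what the PHP script above emits):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_locs(xml_text):
    """Return the <loc> entries of a sitemap so the URL count can be checked."""
    if isinstance(xml_text, str):
        # ElementTree rejects str input that carries an encoding declaration
        xml_text = xml_text.encode("utf-8")
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", SITEMAP_NS)]

# example: a two-entry sitemap like the PHP script above generates
xml = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    '<url><loc>https://mysite.com/abc</loc></url>'
    '<url><loc>https://mysite.com/def</loc></url>'
    '</urlset>'
)
locs = sitemap_locs(xml)
assert len(locs) <= 10000  # IndexNow's per-submission limit
```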
  • Now that we finally have the links in the database with the redirects created and the sitemap made, we can summon the crawlers. For this, you first need to go here: https://www.bing.com/indexnow#implementation
    Generate an API key, download the text file, and upload it to your hosting. Make a note of the name.
    Now create and run the following Python script (.py file). (If you don't have Python set up, install PyCharm Community Edition, which is free.) Note this is a basic script for one sitemap, but you can easily adapt it to loop through and submit multiple sitemaps. If there is enough interest, I'll make such a script and release it.
    import requests
    import advertools as adv


    key = "12367346914527865247865429"  # the key you generated
    keyLocation = "https://mysite.com/12367346914527865247865429.txt"  # where you uploaded the key file

    sitemaps = ["https://mysite.com/sitemap.php"]  # where your sitemap lives

    for sitemap in sitemaps:
        # pull every <loc> out of the sitemap into a flat list of URLs
        sitemap_urls = adv.sitemap_to_df(sitemap)
        urls = sitemap_urls["loc"].to_list()

        endpoint = "https://www.bing.com/indexnow"
        website = "mysite.com"  # the IndexNow spec expects the bare host name here

        headers = {"Content-Type": "application/json", "charset": "utf-8"}
        payload = {
            "host": website,
            "key": key,
            "keyLocation": keyLocation,
            "urlList": urls,
        }
        response = requests.post(endpoint, json=payload, headers=headers)
        print(response.status_code)

    If done correctly, the code will return a 200 status and you will see the search engine bots (Bing, Yandex, Google) visiting your robots.txt and the submitted URLs in less than a minute. Hope you all benefit from this. Please reach out if you have any questions.
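    The 10k-per-submission cap mentioned in step 5 can be handled with a loop, as suggested. Here is a sketch that only builds the JSON payloads; each one can then be POSTed exactly as in the script above (the key and URLs are the placeholder values from this thread):

```python
def indexnow_payloads(urls, host, key, key_location, batch_size=10000):
    # IndexNow rejects submissions with more than 10,000 URLs,
    # so split the list into payloads of at most batch_size each.
    return [
        {
            "host": host,
            "key": key,
            "keyLocation": key_location,
            "urlList": urls[i:i + batch_size],
        }
        for i in range(0, len(urls), batch_size)
    ]

# e.g. 25,000 URLs become three payloads: 10,000 + 10,000 + 5,000
payloads = indexnow_payloads(
    ["https://mysite.com/" + str(i) for i in range(25000)],
    host="mysite.com",
    key="12367346914527865247865429",
    key_location="https://mysite.com/12367346914527865247865429.txt",
)
```

    Each element of `payloads` is a complete request body for `requests.post(endpoint, json=payload, headers=headers)`.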
  • Thanks for the share. [...]
    Post the code please!
    Gotten crawled yet?
  • Momo said:
    [...]
    Gotten crawled yet?
    I have not implemented this yet. I am programming an algorithm today. I will implement this as soon as I have time. I will post when I do. Thank you for this share. It will save $!
  • Momo said:
    [...]
    I have not implemented this yet. I am programming an algorithm today. I will implement this as soon as I have time. I will post when I do. Thank you for this share. It will save $!
    If ever you wish to collaborate on any projects, please let me know. Im enjoying being back in the space and trying new things.
    Thanked by 1the_other_dude
  • In my opinion, Bing's IndexNow does not work. It can be done automatically on your own site with a plugin or Cloudflare.
    I am not able to index my sites on Bing, even with 1.5k words of unique, well-formatted content on their homepages.
    Yes, it sends a bot, but it simply refuses to index the sites for weeks, with weird errors such as:
    Bing Index:
    Live URL:
    Also, I noticed that Bing deindexed my old sites with backlinks and traffic when I started making 10-20 contextual backlinks (referring domains) daily with GSA.



  • Smeklinis said:
    [...]
    Also, I noticed that Bing deindexed my old sites with backlinks and traffic when I started making 10-20 contextual backlinks (referring domains) daily with GSA.

    From the report above, it seems their crawler cannot access your site. I'm sure it is not a robots.txt block or something that silly, as I've read some of your comments and you know what you are doing. What server setup are you using? I once had an issue with the fail2ban plugin on Plesk blocking search engine crawlers.
  • Momo said:
    From the report above, it seems their crawler cannot access your site. I'm sure it is not a robots.txt block or something that silly, as I've read some of your comments and you know what you are doing. What server setup are you using? I once had an issue with the fail2ban plugin on Plesk blocking search engine crawlers.
    The server is fine and I am getting Bing bot visits every day with 200 OK in the server's log file on the current domain. I am not blocking Bing. From what I heard, Bing f*cked something up on their side about 3 months ago.
    In the Site Explorer I can see this:
    The "Last crawled" date says the site wasn't even launched at that time.
    But as I said before, the Bing bot visits the site every day.

    I saw many threads on BHW saying that after some changes at Bing (3 months ago), everyone has trouble indexing on it.

  • Smeklinis said:
    [...]
    Saw on BHW many threads that after some changes on Bing (3 months ago) everyone has trouble indexing on it.

    That is strange. I have been indexing new pages with them all week, even on a domain registered on 21 July 2022.
  • Momo said:
    [...]
    If ever you wish to collaborate on any projects, please let me know. I'm enjoying being back in the space and trying new things.
    Glad to hear you're doing things that you enjoy. If I can think of anything in the future, I'll be sure to drop a line about a collab.
  • Thanks for this encouraging thread.
    I'm back to GSA after a while.
    I just followed your steps.
    I hope it works.

    For the bulk upload,
    I just found a plugin you can add to your Yourls website:
    https://github.com/vaughany/yourls-bulk-import-and-shorten

    Just to confirm, you're building Tier 1 with GSA to your money site directly?
    Did you use any verified list service, or are you crawling your own?

    Thanks
  • "Just to confirm you're building Tier 1 with GSA to your money site directly ? "

    I am.

    "Did you use any verified list service or you are crawling your own ?"

    Combination of both (most bought) as I found scraping to not be worth it. I then import lists into each project tier. I break down which sites I choose after using Scrapebox's page authority checker. 
    Thanked by 1hardcorenuker
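    The list-narrowing step described above can be scripted once the authority checker's results are exported to CSV. A minimal sketch; the `url` and `pa` column names and the threshold are assumptions about the export format, not from this thread:

```python
import csv
import io

def filter_by_pa(csv_text, min_pa=20.0):
    # Keep only URLs whose page authority meets the threshold.
    # The 'url' and 'pa' column names are hypothetical.
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["url"] for row in reader if float(row["pa"]) >= min_pa]

sample = "url,pa\nhttps://a.example/,35\nhttps://b.example/,12\nhttps://c.example/,20\n"
kept = filter_by_pa(sample)
```

    The surviving URLs can then be imported into each project tier as target lists.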
  • edited August 4
    @Momo

    Thanks for openly sharing your experience!

    From what I see (on indexnow.org), Google is not supporting IndexNow?

  • coral99 said:
    [...]
    From what I see (on indexnow.org), Google is not supporting IndexNow?

    As of November 2021, rumours started that they were experimenting with it. I also see their crawlers (the mobile versions) crawling links I have submitted this way. But none of that matters.

    Most importantly, I see the pages submitted this way getting indexed by Google, and that is what counts. It costs literally nothing and, per my tests, is 1,580% better than services that charge people hundreds to thousands of dollars per year for junk that hasn't worked for years. Naturally, none of them will admit it; they earn their bread and butter off such ignorance, like the same people on BHW who say GSA SER does not work yet sell link-building services for hundreds of dollars per month run on their own GSA SER installs.
    Thanked by 1coral99
  • Momo said:
    [...]
    Thanks, I'll check this out and report
  • Momo said:
    "Just to confirm you're building Tier 1 with GSA to your money site directly ? "

    I am.

    "Did you use any verified list service or you are crawling your own ?"

    Combination of both (most bought) as I found scraping to not be worth it. I then import lists into each project tier. I break down which sites I choose after using Scrapebox's page authority checker. 
    Do you mind sharing the engine types you use for Tiers 1, 2, and 3? :)
  • Do you have any feedback on the indexing after your test, @the_other_dude?
  • May I know which hosting company is the best (in terms of value for money) to host our own catch-all email accounts?
  • May I know which hosting company is the best (in terms of value for money) to host our own catch-all email accounts?
    You don't need a powerful VPS for this; a small, cheap machine will do. I am running several of my own and customers' catch-all accounts on a small dedicated VPS and it is stable and performant.
  • [...]
    What software do you run off a VPS for the email server?
  • [...]
    What software do you run off a VPS for the email server?
    I use cheap cPanel shared hosting from BHW for the catch-all. It's about $10 a year, unlimited domains :)

    Thanked by 1draculax