
GSA SER still works amazingly in 2023. But pay heed to the following: - Update July 2023


Comments

  • I am new to this forum as of today. I really want to try this. But how many links can you build daily to your money site?

    I don't have a paid version of GSA; is it still working in 2023?
  • rastarr Thailand
    Momo said:

    rastarr said:
    Momo said:
    Now that we finally have the links in the database with the redirects created and the sitemap made, we can summon the crawlers. For this, you first need to go here: https://www.bing.com/indexnow#implementation
    Generate an API key, download the text file and upload it to your hosting. Make a note of the name.
    Now you need to create and run the following in Python (a .py file; if you don't have Python, install PyCharm Community Edition, which is free). Note this is a basic script for one sitemap, but you can easily use it to loop through and submit multiple sitemaps. If there is enough interest, I'll make such a script and release it.
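    A minimal sketch of such a submission, with a placeholder domain and key (not necessarily the original script), looks like this:

    import requests

    # Assumptions: example.com is your YOURLS domain and the IndexNow key file
    # your-indexnow-key.txt has been uploaded to its web root.
    payload = {
        "host": "example.com",
        "key": "your-indexnow-key",
        "keyLocation": "https://example.com/your-indexnow-key.txt",
        "urlList": ["https://example.com/sitemap_index.xml"],
    }

    # Bing's IndexNow endpoint; participating engines share submissions.
    r = requests.post("https://www.bing.com/indexnow", json=payload, timeout=30)
    print(r.status_code)  # 200 means the submission was accepted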

    If done correctly, the code will return a 200 status and you will see the search engine bots (Bing, Yandex, Google) visiting your robots.txt files and submitted URLs in less than a minute. Hope you all benefit from this. Please reach out if you have any questions.
    @Momo Are you actually seeing any evidence from Bing and/or Google?

    As far as my installation is concerned, only Yandex has crawled in the last several days.
    There's been no Bing or Google - and from what research shows, Google is not a participant in IndexNow; only Microsoft Bing, Yandex, and Seznam are.
    Would like to show you something:

    Here is a site I started juicing with GSA, using this 301-redirect sitemap method, in the second week of May. Already Googlebot has discovered 140k+ links and has crawled them as directed by the 301s. Now some may ask why it does not index on my domain; well, the reason is simple:

    The sitemap is there to feed tier links to the Googlebot. It then crawls them and indexes them. Magic, free and easy, as you can see below.



    Using this method you can literally throw more new links in front of the SERPs than the competition. I already have some top 10s in competitive niches with 100k monthly search volume, competing with the "Kings" who have been there 20+ years, and all in a few months.

    I hope to soon combine this with a scraper I am building that will target platforms only used by GSA SER and then rank them based on social signals. Picking the best of these, I hope to throw out millions of links per month in lots of new niches and see how far I can go.

    Core message: Don't listen to people who spread fear and doubt. GSA works very well, and all you need is GSA SER, a good list to link build on, and the above free script I made to get those links in front of the search engine bots to be indexed. It really is that simple (assuming you actually have content on your site). Don't be discouraged by the average black-hat expert on certain forums; they literally have no idea what they are talking about and earn bread-and-milk money hustling Fiverr gigs with Scrapebox and SEnuke spam. It took me ages to accept that fact. Once I did, I started getting results.
    Excellent, and thanks for the update. Great to see this is working for you.
    My installation is still running.
    There's very, very little Googlebot activity on my installation.

    [1] Have you made any changes to your original script though?
    [2] How are you getting Google to crawl your Yourls installation so well?
    [3] Have you changed the script to include multiple sitemaps? I see mention of multiples in your screenshot, is all.

    I'd love to get these same results. It sounds like you've made changes since your original post a year ago, though.
  • I've just tried to run the scripts but both are giving errors:

    [Tue Jul 04 06:41:41.039095 2023] [php:error] [pid 5550] [client 95.91.221.5:7164] PHP Parse error:  syntax error, unexpected variable "$line", expecting ")" in /var/www/html/crawlme.php on line 47

    [Tue Jul 04 06:41:47.234973 2023] [php:error] [pid 5548] [client 95.91.221.5:7128] PHP Fatal error:  Uncaught mysqli_sql_exception: Unknown column 'row' in 'order clause' in /var/www/html/sitemap.php:13\nStack trace:\n#0 /var/www/html/sitemap.php(13): mysqli->query()\n#1 {main}\n  thrown in /var/www/html/sitemap.php on line 13

    Anyone having success and/or suggestions?
  • I've just tried to run the scripts but both are giving errors: [...] Anyone having success and/or suggestions?
    Regarding:

    [Tue Jul 04 06:41:47.234973 2023] [php:error] [pid 5548] [client 95.91.221.5:7128] PHP Fatal error:  Uncaught mysqli_sql_exception: Unknown column 'row' in 'order clause' in /var/www/html/sitemap.php:13\nStack trace:\n#0 /var/www/html/sitemap.php(13): mysqli->query()\n#1 {main}\n  thrown in /var/www/html/sitemap.php on line 13

    You didn't add the row column to the SQL database as instructed above (a sketch of the fix is below this post). Please see the very first sentence written:

    "The first thing you will need to do is edit the database table for Yourls, this one 'yourls_url' so it has the following structure"

    Regarding:

    [Tue Jul 04 06:41:41.039095 2023] [php:error] [pid 5550] [client 95.91.221.5:7164] PHP Parse error:  syntax error, unexpected variable "$line", expecting ")" in /var/www/html/crawlme.php on line 47

    Thanks for pointing that out. It is missing a comma. I have updated it.

    Good luck!
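    For anyone who missed that step, a minimal sketch of adding the column (placeholder credentials; any MySQL client will do, pymysql shown for illustration):

    import pymysql

    conn = pymysql.connect(host="localhost", user="dbuser",
                           password="dbpass", database="yourls")
    with conn.cursor() as cur:
        # Add the auto-incrementing `row` column that sitemap.php sorts by
        cur.execute("ALTER TABLE yourls_url "
                    "ADD COLUMN `row` INT NOT NULL AUTO_INCREMENT UNIQUE")
    conn.commit()
    conn.close()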
  • Momo said:
    You didn't add the row column to the SQL database as instructed above. [...] Thanks for pointing that out. It is missing a comma. I have updated it. Good luck!
    Thanks. I also spotted some differences in the DB settings (dbpassword / password).

    Sitemap has been submitted, so I am waiting for the crawlers now :)
  • Thanks. I also spotted some differences in the DB settings (dbpassword / password). Sitemap has been submitted, so I am waiting for the crawlers now :)
    That is great. Glad you got it working. Good luck!
  • hi,
    thanks for all the work. I do have a couple of questions, if you could help.
    Do you submit every link in the database, or just the new additions, when using the Python script to send new links to IndexNow?
    Right now I set it up to only send the new entries by adding a WHERE row > x clause in the dynamic sitemap generator PHP script (a sketch of this idea follows below).
    I also made a robots.txt file that refers to a full sitemap.xml holding every entry in the database.
    Python doesn't work on my hosting, so I run the PHP scripts on the hosting but run the Python script on another computer,
    and I am getting the 200 success response.
    Does what I am doing sound reasonable?
    thanks again,
    george
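    A minimal sketch of that "only new entries" idea, assuming placeholder credentials and the yourls_url schema described earlier in the thread (pymysql is just one possible client):

    import pymysql

    last_row = 123456  # placeholder: persisted from the previous run

    conn = pymysql.connect(host="localhost", user="dbuser",
                           password="dbpass", database="yourls")
    with conn.cursor() as cur:
        # yourls_url stores the short keyword; the short URL is domain + keyword
        cur.execute("SELECT keyword FROM yourls_url WHERE `row` > %s ORDER BY `row`",
                    (last_row,))
        new_links = ["https://example.com/" + kw for (kw,) in cur.fetchall()]
    conn.close()
    # new_links can now be fed to the IndexNow submitter sketched earlier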
  • hi,
    one other question: do you open a Google webmaster account on the site hosting the redirects and then submit that full sitemap?
    thanks again,
    george
  • hi, thanks for all the work. I do have a couple of questions, if you could help. Do you submit every link in the database, or just the new additions, when using the Python script to send new links to IndexNow? [...] george
    Hello George. I just submit the sitemaps. I found it works and it saves AGES of time.

    Regarding a full sitemap.xml, bear in mind that search engine bots stop reading a sitemap after a certain size, so you are better off dividing it into multiple smaller ones when you have tens of millions of links. I usually go with sitemaps of 5000 links, mapped into an index of another 5000, and so on. I found that this works (a small sketch of the splitting follows below).

    If you are getting the 200 success response, it means the submission worked. The best way to verify would be via your logs, though; look for evidence of the crawlers coming.
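    A small sketch of that splitting, over a plain Python list of links (names are illustrative):

    # Split a link list into 5000-URL chunks, one sitemap file per chunk
    def chunks(urls, size=5000):
        for i in range(0, len(urls), size):
            yield urls[i:i + size]

    # e.g. for n, chunk in enumerate(chunks(all_links), start=1): write sitemap_n.xml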
  • edited July 2023
    CODE DELETED
  • thank you so much,
    I will drop the webmaster account,
    I really appreciate your response,
    thanks,
    george
  • rastarr Thailand
    @Momo said:
    Regarding a full sitemap.xml, bear in mind that search engine bots stop reading a sitemap after a certain size, so you are better off dividing it into multiple smaller ones when you have tens of millions of links. [...]
    Would it be possible for you to re-post your Yourls script, as it is today? While I have some rudimentary Python skills ( https://forum.gsa-online.de/discussion/30601/free-recaptcha-v2-and-v3-breaker#latest ), getting your working multiple-5000-entry sitemap code would be a great help, since I found a few bugs in your original version.
  • edited July 2023
    CODE DELETED
  • rastarr Thailand
    Momo said:

    I don't do it in Yourls; that's purely server side. I use a different Python script for it locally. For example, the following method will suffice if you have 25m links (5000*5000). Then you can zip the files, upload them to a directory on your server (e.g. httpsdocs/sitemaps_1/) and unzip them. This will save your database from constantly being smashed every time the bots crawl (and they crawl A LOT). It also means you save costs on hosting, as you won't need a beast of an SQL server. The method is very simple.


    # Call this method with the text file (list of links to index) and the
    # domain hosting the sitemaps (e.g. example.com)
    import math

    def createXMLSitemap(textfile, domain):
        urls = []
        with open(textfile) as f:
            for line in f:
                urls.append(line.strip('\n'))
        # Number of 5000-link sitemaps needed (rounded up so leftovers get a file)
        numberToDo = math.ceil(len(urls) / 5000)
        print("Sitemaps to make: " + str(numberToDo))
        for x in range(1, numberToDo + 1):
            with open("sitemap_" + str(x) + ".xml", 'w') as fp:
                fp.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
                # Write only this file's 5000-link slice, not the whole list
                for link in urls[(x - 1) * 5000 : x * 5000]:
                    fp.write('<url>')
                    fp.write('<loc>' + link + '</loc>')
                    fp.write('<lastmod>2023-07-08T15:15:27+00:00</lastmod>')
                    fp.write('<changefreq>daily</changefreq>')
                    fp.write('<priority>0.9</priority>')
                    fp.write('</url>')
                fp.write('</urlset>')
        # The index file lists the per-chunk sitemaps in <sitemapindex> format
        with open("sitemap_index.xml", 'w') as fp:
            fp.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
            for x in range(1, numberToDo + 1):
                fp.write('<sitemap>')
                fp.write('<loc>https://' + domain + '/sitemaps_1/sitemap_' + str(x) + '.xml</loc>')
                fp.write('<lastmod>2023-07-08T15:15:27+00:00</lastmod>')
                fp.write('</sitemap>')
            fp.write('</sitemapindex>')


    Calling the method as an example:
    textfile = "myT2urls.txt"
    domain = "example.com"
    createXMLSitemap(textfile, domain)

    Depending on your PC specs, you may get memory errors if the size of the list is too big. If that is the case, split the list up and call the method a few times, renaming the output files between runs so they aren't overwritten.
    So you don't actually shorten the URLs, if I'm reading this new code correctly. Right?
    You are just filling the sitemap with links from another domain.
    Won't they get ignored by Google if the sitemap contains non-domain links?
  • No. I shorten them with Yourls; this is how I generate the sitemaps. Hence you need to enter your domain and the shortened URLs created.
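    For reference, a minimal sketch of bulk-shortening through the standard YOURLS API (the domain and signature token are placeholders for your own install, and the file names are illustrative):

    import requests

    YOURLS_API = "https://example.com/yourls-api.php"  # placeholder install
    SIGNATURE = "your-yourls-signature-token"          # placeholder token

    def shorten(long_url):
        # action=shorturl is YOURLS's standard API call; format=json for parsing
        r = requests.get(YOURLS_API, params={
            "signature": SIGNATURE,
            "action": "shorturl",
            "url": long_url,
            "format": "json",
        }, timeout=30)
        return r.json().get("shorturl")

    # Shorten a file of tier links, writing the short URLs for the sitemap maker
    with open("myT2urls_raw.txt") as f, open("myT2urls.txt", "w") as out:
        for line in f:
            short = shorten(line.strip())
            if short:
                out.write(short + "\n")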
  • I think it would be better for everyone if I just release a Python package for this. That way everyone can benefit from it with minimal confusion. I'll write one later today or tomorrow when I have an hour to spare.
  • One suggestion I have: if you need help with code, ask ChatGPT to make it for you, or show it the code you're having problems with and give it the error. I've generated a bunch of code like this that I wouldn't have been able to make myself.
  • OK. I've released a Python package that literally makes it easy for even newbies to understand and use. Very simple.

    Requirements:
    Python.

    Installation:
    pip install momositemaps

    Code:

    import momositemaps as sitemapmaker

    host = ""  # Database Host
    user = ""  # Database Username
    password = ""  # Database Username
    database = ""  # Database Username
    domain = ""  # Domain hosting the sitemaps
    directory = ""  # Directory of maps on domain (folder where you will upload the sitemaps)
    links = "links.txt"  # List of links to be submitted then made into sitemaps

    # This method will add all the links to the database and then recreate all sitemaps
    sitemapmaker.submitAndBuildSitemaps(host, user, password, database, domain, directory, links)


    Enjoy everyone.


  • Thank you for contributing, man. I have been reading this forum for a few years and just registered to say thank you! Best post here in the last few years, for sure.
  • Jsilva said:
    Thank you for contributing, man. I have been reading this forum for a few years and just registered to say thank you! Best post here in the last few years, for sure.
    Thank you. Really happy to hear this. You made my day. I wish you success sir!
  • Hey Momo, thanks for updating this with more valuable information. I am setting up catch-all email domains now and getting ready to use SER after a long break.
    For some reason I thought that Google had made it so we can't use SER with Gmail a year or so ago..? Are Gmails working again?
  • I haven't had any issues with them, but I actually started just using my own hosted catch-alls a while back; it was more economical.
    It could be possible that Google did something for newer accounts, or accounts which aren't verified by mobile. I always buy aged accounts for things like Twitter, Gmail, etc. I find they raise fewer flags.
  • Momo said:
    I haven't had any issues with them, but I actually started just using my own hosted catch-alls a while back; it was more economical.
    It could be possible that Google did something for newer accounts, or accounts which aren't verified by mobile. I always buy aged accounts for things like Twitter, Gmail, etc. I find they raise fewer flags.
    I agree with catch-alls being more economical. My own catch-alls are all I've used for years, so I'll just stick with that, since my Google accounts are expensive and I don't want to get them banned or something. Thanks.
  • Thanks for this thread
  • I have deleted the repository and code. It was being abused by unsavory people on BHW to sell services to others. That is 100% against my intentions when I made this freely available, and I want to stop it getting any worse. I hope you all understand my wishes in this regard.
  • I hate people; they ruin it for everyone lol.

    I read this post a few days ago and sat down tonight to get this sorted, as indexing was a problem for me.

    Would appreciate a PM with the code @Momo. Thanks in advance for all the knowledge sharing!

  • rastarr Thailand
    Momo said:
    I have deleted the repository and code. It was being abused by unsavory people on BHW to sell services to others. That is 100% against my intentions when I made this freely available, and I want to stop it getting any worse. I hope you all understand my wishes in this regard.
    Wait, what? I'm so glad I got this working yesterday, after some code modification to get it working with Yourls v1.91.
    There will always be some people who spoil it for others.
    I've certainly appreciated your contributions @Momo
    By the way, if you want the modifications I made to your code, please ask. Happy to share those back with you.
  • cherub SERnuke.com
    Which service was it? So I know not to give them any business (I've probably tried every single indexing service out there)
  • thanks again, I have been trying to crack the indexing problem all year and you started me off on the right track. Great post! By the way, Google Cloud has an indexing API meant for job postings that ties into the webmaster console, e.g. if the Yourls shortener is on a site, and that site is coupled into the webmaster console, you can submit links that you don't own which are shortened, but this adds another level of complication.
    thank you again!
  • This is pretty great. I have taken @Momo's path and successfully got this working. I am planning on adding the Google Indexing API to this and having it run continuously, submitting 200 links a day.
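    For anyone attempting the same, a minimal sketch of that Indexing API call (assumptions: a Google Cloud service-account key with the Indexing API enabled, and the service account added as an owner of the property in Search Console; the key file name is a placeholder):

    from google.oauth2 import service_account
    from google.auth.transport.requests import AuthorizedSession

    SCOPES = ["https://www.googleapis.com/auth/indexing"]
    ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

    # Placeholder key file downloaded from the Google Cloud project
    creds = service_account.Credentials.from_service_account_file(
        "service_account.json", scopes=SCOPES)
    session = AuthorizedSession(creds)

    def notify(url):
        # URL_UPDATED tells Google the page is new or has changed
        r = session.post(ENDPOINT, json={"url": url, "type": "URL_UPDATED"})
        return r.status_code  # 200 on success; the default quota is 200 URLs/day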


