Why don't I see any benefit from GSA?
I've been using this program for a year now. Over the course of this year, I have become well versed in all the functions and settings.
I use 1,000 good proxies, GSA Captcha Breaker, XEvil, and paid link indexers.
I probably have the largest database of sites, since I bought every database that is sold, and also ran my own database of several billion links through the program.
Now the question! What am I doing wrong???
1) I take an optimized website and build normal, natural links to it through link exchanges and crowd marketing.
2) Through GSA, another 5,000 article links point to those hundred natural links. Not some kind of spam, but normal articles written by OpenAI (and then spun).
3) I send all these articles to paid indexers.
All these stages have been carried out systematically for several months in a row,
and the result is zero!
Comments
Hello @Zeto
It sounds like you've put a lot of effort into mastering GSA and incorporating various tools and strategies in your process. I'd like to point out that if GSA has successfully built backlinks for you, then it is indeed fulfilling its intended purpose as a link-building tool. It's important to remember that GSA specializes in creating backlinks, but it's not a tool designed to directly influence your website's ranking in search engine results.
As you know, backlinks are just one of the many elements search engines consider when ranking sites. While automated link building can be a part of your strategy, it might be beneficial to also focus on other critical aspects. One key area is enhancing the quality of your website itself. Ensuring that your site offers valuable, engaging, and useful content to visitors can make a significant difference. This approach not only improves user experience but can also contribute positively to your site's organic ranking over time.
Hope this helps, and best of luck with your website!
Maybe try gtmetrix.com and test your website speed, then select compare and add some competitors in there.
If you see that competing websites are all faster than yours then maybe try to optimize for speed.
Are you spinning openai content on your main websites?
If so personally I would not do that.
Good AI content is unique and ready to use; for better results it should stay unique and not be reused in link-building campaigns via spinning.
Look at it this way: if you drive a car and crash into a tree, you don't blame the car. It is not the car that drove into the tree on purpose (well, maybe if it is a Tesla :-) ); the driver steered it into the tree, so the fault lies with the driver for not steering correctly.
You can use the available filters to tell GSA SER where you don't want it to post.
You mentioned your GSA SER is posting to WordPress sites with 3,000 other posts. To avoid that, use the filter below. For example, you can set it to tell GSA not to post to sites that have more than X outgoing links on the page. Set it to whatever you are comfortable with; for me, 50 - 100 would be a good number.
Take some time and study all of the available filters; if you are unsure exactly what each does, then ask.
He is also talking about "WordPress articles", not just blog posts. Technically everything is a "post", since you are sending or retrieving something from a server via POST and GET requests.
Outbound links are page-specific, so the fewer outbound links on the page, the more juice can be passed to your link, as you are sharing the "link juice" with fewer sites/URLs. For dofollows, at least.
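The outbound-link filter discussed here is built into SER itself, but as a rough illustration of what the check amounts to, here is a minimal sketch using only the Python standard library. The sample HTML, the host names, and the threshold of 75 are assumptions for the example, not anything from SER's actual implementation:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkCounter(HTMLParser):
    """Counts <a href> links that point to a different host."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Only absolute links to another host count as outbound;
        # relative links and same-host links are internal.
        if host and host != self.own_host:
            self.outbound += 1

def count_outbound(html, own_host):
    parser = OutboundLinkCounter(own_host)
    parser.feed(html)
    return parser.outbound

# Example page: one internal link, two outbound links.
sample = """
<a href="https://example.com/about">about</a>
<a href="https://other-site.net/">partner</a>
<a href="https://spam-farm.org/offer">offer</a>
"""
MAX_OUTBOUND = 75  # illustrative threshold, as discussed in the thread
print(count_outbound(sample, "example.com"))  # 2
print(count_outbound(sample, "example.com") <= MAX_OUTBOUND)  # True
```

The point of the filter is simply this comparison: skip any target page whose outbound count exceeds your threshold, since every extra outbound link dilutes the juice passed to yours.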
Think of it like this: someone can go onto Facebook, keep creating accounts, and make badly spun, spammy posts stuffed with affiliate links all year long. This can be in the form of posts, articles, status updates, videos, shared links, etc.
That does not make facebook a bad place to get links from or use for SEO purposes because of this.
You may use settings to avoid posting in certain places that may not provide results you are after.
There are plenty of people who sign up for free website-builder sites like wordpress.com and manually create shitty-looking websites with shitty articles, because it's the best they can do.
Personally, I can automate the process of building these sites better than what's mostly there, using SER and quality data.
Do you have screenshots of your projects you could share? That might shed some light and let users here give you better tips on which settings to change.
As mentioned above, 50 - 100 outbound links is a setting some find useful. I find 75 (avg) to be okay, which is basically in the middle of what royalmice said. If set too low, you will get very few or no sites.
If you're using SER the way some YouTube videos show you (a no-thought, super-spammy 5-minute setup), it's most likely that plus your settings/tier structure, I would assume.
I heard someone say SER is just for spamming and that's its main point; nothing could be further from the truth.
It's a tool that takes user inputs, and campaigns should be thought out for good results.
Here's an example: I purchased an expired domain in a certain niche, no crazy metrics but related to the niche I'm rebuilding content for. I set up 12 service pages and about 7 blog posts, all with on-page done first. Then I find all the old links that have power and would return a 404, using Ahrefs, Moz, or Semrush.
Now I 301-redirect those "lost" pages to my new related URLs.
Just build a few links in high quality articles related to your niche. They should be dofollow and work with tier structures to make those dofollow indexable links worth more to your main site.
I would not suggest putting your main site in SER and just hitting start without being really meticulous with the settings. Especially if this is a new site and you create many backlinks, you will probably only rank for your brand name, if you're lucky.
Within a few weeks, I have service-related search queries popping up in Search Console at spot 50 and above to start.
If you don't see this, there is most likely an issue with the domain or on-page, I find.
My example is in high competition/commercial searches as well.
If you provide some screenshots and more detail, I'd be happy to point out where I think you can optimize, and I'm sure other users here would do the same.
The 75 has nothing to do with backlinks built; it's how many links are on the page your link will be placed on. So if you skip pages with 50 outbound links, SER won't post your data on sites that have 50 outbound links on the page.
Fewer links on a page generally means that if you now build links to that page, the link authority being passed is shared with fewer links, which should be better SEO-wise.
I don't focus on links per day or LpM; that's not relevant in my world.
I don't really build links in volume until tier 3 and above.
My advice: you can download your backlinks from somewhere like Ahrefs and power them up as a test campaign before you go ruining a domain.
Expired domains and redirecting old 404 pages is a more advanced topic, I guess.
I mean recouping the backlinks by rebuilding the old URLs that have backlinks (which are now 404 errors) and redirecting them properly.
Has it dropped frequently? Are the metrics being manipulated? For example, links that webmasters are going to delete; or a domain may look good (like old fake PR) because someone is 301-redirecting a few good domains into it, and after you buy, they point those redirects elsewhere, leaving the domain you bought worthless.
This may be where your issue lies.
If it's a new site, I will try to build links naturally first, like natural social media profiles, and get good on-page going, hoping that when it's indexed and seen for the first time it's in good condition. Like a first impression. If you build a ton of links and crawlers come to see a bad site, that is its first impression forever, and if you're using exact-match anchors you may run into issues.
Good sites/businesses will start out with branded links and naked URL links. They may even have social media profiles and a few posts linking back, maybe a few social bookmarks. If your on-page is okay, a crawler can tell what a page is about anyway, so there is no need to use money terms early, and definitely not frequently, as that's a dead spam giveaway.
Recreate a unique, related, and helpful site, even for first-tier links, and results should be better.
So you re-register the domain; you now own domain.com. But before you bought it, it had a bunch of older pages with links pointing to them:
domain.com/keyword1
domain.com/keyword2
domain.com/keyword3
domain.com/keyword4
It's good practice to use Moz, Ahrefs, or Semrush to find all the links that will now return a 404 error because the pages no longer exist at the same URLs.
So get this list of the old URLs/pages the site had.
Use a redirect plugin to send generic URLs to the homepage and related URLs to newly created content that is highly related to the new pages you built. Keep checking to make sure that when you click the links they all redirect and don't show 404 errors on your domain.
Eventually, webmasters will delete all these links if they are not fixed, and you may get no value from the expired domain link-wise (well, maybe age).
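The redirect mapping described here can be sketched roughly as follows. The slugs and target URLs are made up for illustration; in practice a WordPress redirect plugin or an .htaccess file does the same job, and this just shows the mapping logic (related old URLs to new related pages, everything else to the homepage):

```python
def build_redirects(old_paths, related_map, homepage="/"):
    """Return (old_path, new_target) pairs for 301 redirects.
    Old pages with a known related new page go there; the rest
    fall back to the homepage."""
    return [(path, related_map.get(path, homepage)) for path in old_paths]

def to_htaccess(rules):
    """Render the pairs as Apache mod_alias Redirect 301 lines."""
    return "\n".join(f"Redirect 301 {old} {new}" for old, new in rules)

# Hypothetical example based on the keyword pages above.
old_paths = ["/keyword1", "/keyword2", "/keyword3", "/keyword4"]
related_map = {
    "/keyword1": "/services/new-keyword1-page",
    "/keyword3": "/blog/new-keyword3-post",
}
rules = build_redirects(old_paths, related_map)
print(to_htaccess(rules))
```

After deploying the rules, you would still want to click through (or curl) each old URL to confirm it returns a 301 to a live page rather than a 404, as advised above.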
I think we may be mixing up a few topics here. Can you provide project details or screenshots?
Is it traffic, conversions, rankings, moz/ahrefs/semrush/majestic figures?
Are the pages on your website indexed at all? Are the links built with SER being indexed? Sending to an indexer is one thing; getting indexed is another. If your desired result is organic traffic, links discovered by Moz/Ahrefs/Semrush/Majestic don't matter; only indexation in the search engine counts. Use SER and/or Scrapebox to verify.
I had been at the same point of frustration but learned a lot from the forum posts here made by SER veterans @sickseo, @cherub, @royalmice, and others.
I would also suggest not relying on SER T1 links based on link lists only. Scrape your own lists, buy T1 links, buy T2/T3 links, and expand your verified list with these. Below is a screenshot from one of my sites where I was running SER only for quite some time. Beautiful articles, indexed, but no significant change in rankings and traffic. At the end of October, I bought 5 T1 links from a vendor in a Facebook group, $1 each, and used SER to get them indexed and powered up. Organic traffic has improved immediately since then. $5 vs. the expenses for VPS, proxies, captchas, content, time ...
You can see that rankings overall have not improved, but impressions and clicks have grown a lot.
This is for a site in English, targeting US traffic (AdSense). The situation is different for websites in other languages, I've learned. Most of my sites are in German. What I am doing for these is a) adapting the SER footprints for specific platforms with German translations, b) scraping target sites with the site:.de (.ch/.at) operator, c) building lots of relevant blog comments directly to my site, and d) building only a few links from sites in other languages with generic anchors. This does help traffic-wise but still needs to be improved.
The next step for me is the content of my site(s). I used to rely on Semrush for keywords. Recently, I stumbled on a comment from @Sven about GSA KR being underestimated. Having evaluated it with the trial, I bought the license and am more than happy with it. Semrush is nice for getting KW ideas and difficulty estimates, but you need to spend lots of manual time on competitor research, etc. It works nicely for English KWs, but it's the same poor situation for my language. So, with KR I can automate this process and use my time for other things than staring at a screen, waiting, exporting, etc. And I avoid having to pay a monthly subscription or the hassle of getting another 14-day trial for free or a few dollars. Time is money.
Keep on going with SER and use it to your benefits!
This is how I found these:
1) Join the Facebook group "Guest Posting" (or another group with a decent number of members and activity). (https://www.facebook.com/groups/3007238042823581)
2) Look out for the various offerings and ask the respective sellers to send you a site list. Be aware of the numbers in the FB posts: many vendors post prices in PKR, not USD (1 USD = 280 PKR).
3) Check the site list in your favorite tool.
- I used Semrush to filter and sort by traffic. You don't want to push DA/DR but to make your site's content rank; thus backlinks need to come from a site where content + links are being indexed.
- Check anchors to this site (Semrush)
- Check number of pages indexed using the site: query (Google)
4) Pick the site(s) you want backlinks from and place your order.
5) Once the link is active, wait a few days and then use GSA to power it up. Not too much; a couple of links to it should be sufficient.
6) Check that the bought backlinks are being indexed. I had no issue with any of the 250 PKR links, even without GSA links to them. The 150 PKR links (my greed greater than my reason) did not index easily or at all.
Google Search Console is showing this:
Semrush is showing this (for US):
The initial results from the cheap backlinks motivated me to do some Onpage-SEO, showing significant improvements in the number of keywords and positions.
My site has 800 blog posts published, all pure AI content (GPT-3.5). I am now reworking the blog posts one by one: formatting, external links (mainly to scientific, governmental, and NGO sites), adding (AI) images. Internal links are built automatically with the ILJ free version (https://wordpress.org/plugins/internal-links/). For blog posts with significant traffic potential, I am using GSA Keyword Research and its article builder + GPT-4 Turbo to optimize the content. I am in no hurry, nor aiming to go mad doing this work, so I am doing just 5 posts a day, 7 days a week. Google seems to appreciate it, plus the patience of not making massive changes in a day or two.
Whilst doing this, SER is working in the background to build new links to the homepage. Next step will be to build links to the inner 800 URLs.
In summary, I appreciate that building SER links for a year without significant improvements can be frustrating. Giving up is not an option, though, in particular after a year of work.
Give it another try with some link sources other than those in the link lists used by hundreds of people for thousands of projects. (I found that the list I subscribed to, after I could not resist the Black Friday offer, is being shared with 70 accounts. A good number of these will resell the list, so even more users. Google is a machine, but it is not too stupid to identify link patterns.) 250 PKR is less than 1 dollar!
And scrape your own list, specific to your site, and build links on it. Even blog comments will help. Create your own blog comments and profiles rather than using the default settings in SER. OpenAI is your friend for creating these in bulk and matching your site + content. Use generic anchors and URLs rather than 100% exact-match keywords. If your site is in a, let's call it difficult, niche, use NLPCloud or another AI platform with no harsh content filters to generate individual and specific content. Create a Facebook page for your site, instruct AI to create posts about your site's content, create Reels. You'll see a massive reach and some traffic coming through it. Google seems to like it, too, and indexes your content in no time.
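As a rough sketch of the anchor-text mix this advice implies: mostly branded and naked URL anchors, some generics, and only the occasional exact match. The 50/25/15/10 split, the brand name, and the URL below are my own illustrative assumptions, not figures from the thread:

```python
import random

def anchor_mix(brand, url, keyword, n=10, seed=42):
    """Build an anchor list weighted toward branded and naked URL
    anchors, with generics mixed in and only rare exact matches.
    The 50/25/15/10 split is an illustrative assumption, not a rule."""
    generics = ["click here", "this site", "read more", "visit website"]
    rng = random.Random(seed)  # seeded for reproducible output
    anchors = []
    for _ in range(n):
        r = rng.random()
        if r < 0.50:
            anchors.append(brand)                 # branded anchor
        elif r < 0.75:
            anchors.append(url)                   # naked URL anchor
        elif r < 0.90:
            anchors.append(rng.choice(generics))  # generic anchor
        else:
            anchors.append(keyword)               # rare exact match
    return anchors

# Hypothetical site used for illustration only.
print(anchor_mix("Acme Tools", "https://acme-tools.example", "buy drill bits"))
```

In SER itself you would express the same idea through the anchor-text percentage settings in the project options rather than generating the list externally.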
It's not saturated either, so you can pretty much be the first to "own" highly searched #hashtags, which will provide more visibility, and more and more as the platform gains traction.
Caveat: you can only add your main link in the profile of a connected Instagram account if you download the app on a phone (without trickery).
@organiccastle You are right; I see the same thing with Reels, Stories, Lives, and news-type content.
You get free traffic for newly posted content, even if it's just an initial boost while they "test" the engagement rate of the new content.
These methods of getting referral traffic rolling in will greatly help improve everything else organically.
I always like to have my on-page on point and social network profiles created and interlinked, with site content posted out to these profiles; then, once that's all set up correctly, I will start to build links more heavily.
It's key to have these basics done first.
I think a lot of people are in such a rush they end up ruining the domain for any organic traffic.
Example: you have 500 backlinks but no traffic, no social media visibility, and no good content on a new domain; that's a dead giveaway that it's self-promotion spam.
I think about it in the same "real world" terms when creating marketing campaigns.
Soon they will want a DNA sample for verification.
Each one has a whole bunch of "extra" steps to complete/verify, and they are constantly changing/rolling out new features; it's hard to ignore them or even keep up.
I remember when you could just buy a bunch of tweets and it would impact rankings.
These days not so much.
I recently took a very short video on my phone showing someone where to find the wire harness and ground wire on a Saab alternator, for a car-related forum.
Until the other day, I was unaware that YouTube had somehow grabbed it and posted it as a reel under a generic name with some numbers at the end.
So I go on YouTube to look something up from the same phone, and I have a reel with thousands of views and comments, 15 seconds of something I didn't even want there in the first place and definitely didn't optimize.
I am not a fan of social media in that way either. I learned how creepy it gets doing PPC advertising. I think Facebook alone keeps 155,000 data points per profile. It's good for advertisers but creepy AF, as they know more about most people than they know about themselves.
They know when you're at home, at work, whether you're looking at your phone, when you're most ready to purchase, and on and on.
And that's just one network!
I hate Facebook most! I made a new account years ago, privately, for business purposes only, and it started connecting me with people from my area whose friend requests I definitely would not have accepted.
But yes they are great for links, indexing purposes, and getting referral traffic rolling in!
I doubt that's going to change; maybe just changes in specific platforms and features, and in which strategies are new or currently working best.
Do you think you would have had the same results, without GSA blasting the links you bought on that FB group?
PA of the $1 links to my site is growing nicely.