
Need a paid consultation with a GSA professional




I need help with setting up GSA. I already understand the program and its basic functions quite well, but my results are still weak.

Over a few days, I received a maximum of 15 links per project.

I want to get at least 50-70 contextual links per day for my Tier 2 projects.

I use:

XEvil
Good proxies
Spintax content generated with ChatGPT
Powerful VPS

I do not use ready-made site lists; I want GSA to find suitable engines for placing links on its own.

If necessary, I will buy additional resources.

Please write to me in PM only if you are sure you can provide a consultation after which I will start getting a large number of contextual links (at least 50 per day).

I am not interested in junk like blog comments, indexers, or pingbacks. Only contextual links, and without using SER lists.
Thanked by 1Deeeeeeee

Comments

  • Well, maybe find the engine types you do want and try to make some new footprints by scanning the verifieds that you did find. For getting more SER targets, using Footprint Studio is the best way, I think. Also parsing links from certain verified URLs, though it may be hard starting out with just those few.

    It takes time to create a fresh list you can post to. You've got the right idea not going with recycled old lists, though, IMO.


    Thanked by 1Deeeeeeee
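
    A minimal sketch of that idea in Python: count recurring path segments across your verified URLs and print the most common ones as candidate inurl: footprints to try in Footprint Studio or a scraper. The file name verified_urls.txt and the three-hit cut-off are just placeholders, not anything SER exports by default:

    ```python
    # Count recurring path segments across verified URLs and print the most
    # common ones as candidate inurl: footprints (illustrative sketch only).
    from collections import Counter
    from urllib.parse import urlparse

    def candidate_inurl_footprints(urls, min_hits=3):
        counts = Counter()
        for url in urls:
            path = urlparse(url.strip()).path
            for segment in path.split("/"):
                # Skip tiny or purely numeric segments (IDs, page numbers).
                if len(segment) > 3 and not segment.isdigit():
                    counts[segment.lower()] += 1
        return [f"inurl:{seg}" for seg, hits in counts.most_common()
                if hits >= min_hits]

    if __name__ == "__main__":
        # Placeholder file: one verified URL per line.
        with open("verified_urls.txt", encoding="utf-8") as fh:
            urls = [line for line in fh if line.strip()]
        for footprint in candidate_inurl_footprints(urls):
            print(footprint)
    ```

    Any pattern it prints still needs a manual sanity check against a search engine before it goes into a project.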
  • sickseo (London, UK)
    Looks like you need to scrape more sites. Paying for a consultation is not going to put new sites in your list.

    Contextual do-follow links are in GNUBoard, WordPress, BuddyPress, SMF, Joomla K2, Moodle, and XpressEngine. These are the main engines to focus on when scraping (a rough sketch of how such scrape queries could be assembled follows below).

    There are plenty of other engines to scrape, but they will yield either do-follow profile links or no-follow contextuals.
    Thanked by 1Deeeeeeee
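
    A rough sketch of how such scrape queries might be assembled, crossing one footprint per platform with a few niche keywords. The footprint strings below are generic, commonly cited examples, not SER's own engine definitions, and should be tested before use:

    ```python
    # Cross per-platform footprints with niche keywords to build queries for
    # an external scraper (sketch only; footprint strings are generic examples).
    import itertools

    FOOTPRINTS = {
        "wordpress":    '"powered by wordpress"',
        "buddypress":   '"powered by buddypress"',
        "smf":          '"powered by smf"',
        "moodle":       '"powered by moodle"',
        "gnuboard":     'inurl:"/bbs/board.php"',
        "xpressengine": '"powered by xpressengine"',
    }

    KEYWORDS = ["gardening", "home improvement", "diy tools"]  # niche terms

    def build_queries(footprints, keywords):
        """Return one search query per (footprint, keyword) pair."""
        return [f'{fp} "{kw}"' for fp, kw in
                itertools.product(footprints.values(), keywords)]

    if __name__ == "__main__":
        for query in build_queries(FOOTPRINTS, KEYWORDS):
            print(query)
    ```

    Swapping in footprints built from your own verifieds (see the earlier sketch) keeps the queries from overlapping with what everyone else scrapes.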
  • Deeeeeeee (the Americas)
    edited August 9
    I'm actually FINALLY up to this step... actually getting my years-old to-do list completed!!
    I am paying way more attention to e
    backlinkaddict said:
    Well, maybe find the engine types you do want and try to make some new footprints by scanning the verifieds that you did find. For getting more SER targets, using Footprint Studio is the best way, I think. Also parsing links from certain verified URLs, though it may be hard starting out with just those few.

    As I finally get through my Work To-Do Task List, including SEO tasks, I find there are a number of GSA-related tasks I need to get to. It's probably good to make a new GSA Learning and Doing Task List for the tasks I need to think about but that never actually made it onto the list!

    What you say about scraping and lists, @backlinkaddict, is definitely something I must get to next. And take seriously! And have fun with!! I appreciate you reminding us all. :relieved:

    Scraping, and really using Footprint Studio successfully (testing how long it takes to get results), is another goal. I've played with this tool, but never really got results, probably because I never did much beyond learning how it works. :| I'm sure I need to spend more time playing around with Footprint Studio.

    But there are other tasks, like the proxy stacking. Maybe playing with new purchasable engines when I have extra to spend on that?

    Making lists and using SER in a far more targeted way if necessary. There are more steps for me than this, though this is all I can think of for now. Oh, yes, and get a new text captcha solver... I think...
    Thanked by 1backlinkaddict
  • Thank you for your comments, my friends.

    I am currently working on the footprints; I hope this will give some results.

    Now I have a question: why aren't more footprints added to GSA by default? It would be convenient if you didn't have to configure footprints yourself, and scraping via GSA would be more efficient out of the box.
  • cherub (SERnuke.com)
    Erni_bro said:
    Now I have a question: why aren't more footprints added to GSA by default? It would be convenient if you didn't have to configure footprints yourself, and scraping via GSA would be more efficient out of the box.
    Because anything that is added by default is then used by every user of SER by default. This results in everyone finding the same URLs, submitting to the same places, and turning everything into a spam dump. Working out your own footprints that bring back different sets of URLs means you have a better chance of finding sites that are less frequently posted to.

    If you want to make a success of automated link builders, you need to put in the work yourself, and not expect everything to be handed to you on a plate.
  • cherub said:
    Erni_bro said:
    Now I have a question: why aren't more footprints added to GSA by default? It would be convenient if you didn't have to configure footprints yourself, and scraping via GSA would be more efficient out of the box.
    Because anything that is added by default is then used by every user of SER by default. This results in everyone finding the same URLs, submitting to the same places, and turning everything into a spam dump. Working out your own footprints that bring back different sets of URLs means you have a better chance of finding sites that are less frequently posted to.

    If you want to make a success of automated link builders, you need to put in the work yourself, and not expect everything to be handed to you on a plate.
    Thank you! That makes sense
  • Tell me: suppose I have a large list of footprints that are not sorted by engine, say 300 lines.

    Where can I add it so that GSA can determine, while parsing search engines, whether there is a match to any engine?
  • cherub (SERnuke.com)
    Erni_bro said:
    Tell me: suppose I have a large list of footprints that are not sorted by engine, say 300 lines.

    Where can I add it so that GSA can determine, while parsing search engines, whether there is a match to any engine?
    Use the 'Import URLs (identify platform and sort in)' function from Options > Advanced > Site Lists > Tools. I also recommend enabling 'Use engine filter' and only selecting the platforms you're interested in.

    You might also need to find and update the 'page must have' footprints on the relevant engines. More info here: https://docu.gsa-online.de/search_engine_ranker/script_manual?s[]=page must have
    Thanked by 1Deeeeeeee
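
    For context, the 'page must have' entries are text markers a page has to contain before SER treats it as a given engine. A hedged sketch of the same idea outside SER, pre-filtering scraped URLs before importing them, could look like the following; the marker strings and the scraped_urls.txt file are assumptions for illustration, not values taken from SER's engine files, and the requests library must be installed:

    ```python
    # Pre-filter scraped URLs by checking each page for a required marker
    # string, similar in spirit to SER's "page must have" check (sketch only).
    import requests

    # Assumed markers per platform; real values must come from inspecting the
    # platforms' HTML yourself, they are not copied from SER's engine files.
    MUST_HAVE = {
        "gnuboard": ["bbs/board.php", "gnuboard"],
        "moodle":   ["moodle", "login/index.php"],
    }

    def matches_platform(html, markers):
        """True if the page source contains any of the marker strings."""
        lowered = html.lower()
        return any(marker.lower() in lowered for marker in markers)

    def prefilter(urls, platform):
        kept = []
        for url in urls:
            try:
                resp = requests.get(url, timeout=15)
            except requests.RequestException:
                continue  # unreachable target, skip it
            if matches_platform(resp.text, MUST_HAVE[platform]):
                kept.append(url)
        return kept

    if __name__ == "__main__":
        # Placeholder file: one scraped URL per line.
        with open("scraped_urls.txt", encoding="utf-8") as fh:
            urls = [line.strip() for line in fh if line.strip()]
        for url in prefilter(urls, "gnuboard"):
            print(url)
    ```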
  • edited August 9
    @Erni_bro Also be aware that these platforms grow/change over time, and some even die, so the URL patterns you find and build your footprints from will change over time as well. I think it's good to have at least 50 to 100 verifieds, then make sure the engine is selected in the drop-down and import and analyze the verified domains (it pulls 50 by default, I think).

    Once it's done analyzing, you can select patterns to create and then test "footprints". You can select a search engine to test against as well; I think this part is key (a rough sketch of such a test follows below this post). Personally, I wouldn't just source random footprints and add them; I would build them from your recent verifieds. And don't rely only on the patterns SER finds for you: I like to look through the URLs on the left to find things like inurl:/some/random-pattern/. These very specific inurl: patterns are the key to bringing back new URLs for that engine fairly quickly when they are available. Checking through the URLs on the left is helpful in addition to building custom footprints and testing them against search engines before adding them.

    @Sven I don't know if it happens for everyone, but I notice that Footprint Studio sometimes adds quotes around the clicked footprints and sometimes it does not. So there are times I click the patterns and, after it adds them, I still have to go in and add the " " around the footprint pattern. Small bug maybe, IDK. Either way, Footprint Studio is a great tool! Maybe some day it will find its way to the front tools pane where it can shine ;)

    Someone said it best years ago: "Think of SER as a one-time-payment, 'open source'-like program you can do whatever you want with," or something similar. This makes everyone's SER copy and methods unique to them.

    @cherub Can you also identify and sort in footprints somehow? I have used that for URLs, but never even tried footprints. I also agree that adding to the page must have= param is another great way to optimize SER engines, not to mention it probably avoids wasting captchas and resources, since this param filters out unsuitable URLs you can't post to anyway.
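
    A rough sketch of that testing step, scoring each candidate footprint by how many previously unseen domains it returns. fetch_serp_urls() is a hypothetical placeholder for whatever scraper or search API is actually available, and the candidate footprints are only examples:

    ```python
    # Score candidate footprints by how many previously unseen domains each
    # query brings back (sketch; fetch_serp_urls() is a hypothetical helper).
    from urllib.parse import urlparse

    def fetch_serp_urls(query, limit=50):
        """Hypothetical stand-in for your scraper or search API.

        Should return up to `limit` result URLs for `query`; the stub below
        returns an empty list so the sketch runs without a real backend.
        """
        return []

    def score_footprints(footprints, known_domains):
        scores = {}
        for fp in footprints:
            domains = {urlparse(u).netloc.lower() for u in fetch_serp_urls(fp)}
            scores[fp] = len(domains - known_domains)
        # Best footprints first: the ones surfacing the most new domains.
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    if __name__ == "__main__":
        candidates = ['inurl:"/bbs/board.php"', '"powered by xpressengine"']
        known = {"example.com"}  # placeholder for domains already in your lists
        for footprint, new_domains in score_footprints(candidates, known):
            print(f"{new_domains:4d} new domains  {footprint}")
    ```

    Counting new domains rather than raw results keeps a footprint that returns 500 pages from one forum from looking better than one that surfaces 30 fresh sites.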
  • @Deeeeeeee I didn't forget you. Thanks for the kind words :)
    Thanked by 1Deeeeeeee
  • cherub (SERnuke.com)
    @cherub Can you also identify and sort in footprints somehow? I have used that for URLs, but never even tried footprints.
    Ah, I see he was talking about a list of footprints rather than a list of URLs. I'd say if you have a list of 300 footprints that aren't sorted by platform, it's probably useless, outdated data. The best you can do with it is to perform a big scrape using it, then use the identify/sort-in procedure I highlighted.
    Thanked by 2Deeeeeeee Erni_bro
  • @cherub Gotcha. I wasn't sure if there was a method I didn't know about. I wouldn't expect much from a scrape of an old, outdated footprint list either, but I agree the procedure you highlighted, with the engine filter selected, would likely grab whatever useful targets the scrape did find.
    Is there a way in SER to search through the HTML to get page must have= "patterns", or does it really only look at the plain text? I thought there was a way using some debugging tool, but I can't remember at the moment.
    Thanked by 1Deeeeeeee
  • cherub (SERnuke.com)
    I'm doing it manually; after 10 years of trawling through platforms with this stuff, I've kinda developed an eye for it. Start off with fragments of the platform name and see if they appear as part of standard class/ID tags, JavaScript snippets, "powered by" links, etc. Some platforms are virtually impossible to detect or tie together :(
    Thanked by 1Deeeeeeee
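
    A rough sketch of doing that hunt semi-automatically: fetch a few pages already known to run one platform and keep only the class/id values and generator meta entries that appear on every one of them. The seed URLs are placeholders, and requests plus beautifulsoup4 are assumed to be installed:

    ```python
    # Look for markup markers (class/id values, generator meta) that recur
    # across pages known to run the same platform (illustrative sketch only).
    from collections import Counter
    import requests
    from bs4 import BeautifulSoup

    def page_markers(url):
        """Collect class names, id values and the generator meta of one page."""
        soup = BeautifulSoup(requests.get(url, timeout=15).text, "html.parser")
        markers = set()
        for tag in soup.find_all(True):
            markers.update(tag.get("class") or [])
            if tag.get("id"):
                markers.add(tag["id"])
        gen = soup.find("meta", attrs={"name": "generator"})
        if gen and gen.get("content"):
            markers.add(gen["content"])
        return markers

    def common_markers(urls):
        counts = Counter()
        for url in urls:
            counts.update(page_markers(url))
        # Keep only markers present on every sampled page.
        return [m for m, n in counts.items() if n == len(urls)]

    if __name__ == "__main__":
        # Placeholder seed pages known to run the platform you are analyzing.
        seeds = ["https://example.com/board1", "https://example.org/board2"]
        for marker in sorted(common_markers(seeds)):
            print(marker)
    ```

    Whatever survives the intersection still needs a human look; generic class names like "container" recur everywhere and make useless footprints.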
  • cherub said:
    Some platforms are virtually impossible to detect or tie together

    This is true. The JS snippets, or sometimes specific platform plugins or themes, can be helpful, but these days you definitely have to go digging around. I thought maybe there was a way to scan the HTML and have it find footprints there rather than just the text. I don't know of a better way to do it in SER either, currently. If I find a way or a helpful script, I'll share ;)

    Thanked by 1Deeeeeeee
  • Deeeeeeee (the Americas)
    Erni_bro said:
    Thank you for your comments, my friends.

    I am currently working on the footprints; I hope this will give some results.

    Now I have a question: why aren't more footprints added to GSA by default? It would be convenient if you didn't have to configure footprints yourself, and scraping via GSA would be more efficient out of the box.

    I think the out-of-the-box setup for GSA-SER is really just a template intended to be highly customized by the user. The footprints are the work for each of us to do in our own way, I would guess. That's yet another area that can be approached in different ways.
  • cherub said:
    @cherub Can you also identify and sort in footprints somehow? I have used that for URLs, but never even tried footprints.
    Ah, I see he was talking about a list of footprints rather than a list of URLs. I'd say if you have a list of 300 footprints that aren't sorted by platform, it's probably useless, outdated data. The best you can do with it is to perform a big scrape using it, then use the identify/sort-in procedure I highlighted.
    Yes, I was talking about the list of footprints.

  • @Sven I don't know if it happens for everyone, but I notice that Footprint Studio sometimes adds quotes around the clicked footprints and sometimes it does not.
    I have the same issue
  • Sven (www.GSA-Online.de)
    Quotes get added if there is a space in the phrase.
  • Erni_bro (Turkey)
    edited August 12
    By the way, I would still be happy to pay for a consultation if there is someone who can help me set up getting more links. $15-20 for a couple of sessions, if that price suits someone.

    Perhaps you can share your footprints if that is the only problem.

    I can send you my settings in advance.
  • $15 for a few consultations??? I don't know of any consultation or service in this price range.

    I believe this is really something you must learn by trial and error. 

    Either way, you still need content, proxies, captcha solving, emails, etc., which are not cheap. From what I've heard, it sounds like you won't really have the money to keep this running. Is hiring someone else to do it an option, or are you determined to learn on your own?