
I need some clarification (excuse me if my questions are newbie)

edited June 6 in Need Help
Thank you for accepting me into the forum.
My site has DA 3 and PA 5 according to Moz.
I also have a good presence on social media.
My niche has around 8 related niches.

I want to use the software very moderately.

Excuse me if my questions are newbie.

1- I understand that it is better to scrape for the niche I am targeting, then import those links into the system before activating the software. Is that correct?
2- How many emails do I have to input into the system?
3- I am aiming to create 12 quality backlinks a month. What should the setup look like?
4- Can I choose how many backlinks to build?
5- Which proxies do I use, and how many?
6- Do I add my site to the software, or is that a bad idea?
7- Can I create backlinks to my article on Medium?
8- Can I create backlinks to a Facebook page or an Instagram profile, or will Facebook ban my account?


Comments

  • No problem, everyone starts somewhere, and there's a lot to be aware of.

    1. Building links from related-niche sites is always best, but not always possible. If you can't control that, at least have the backlink sit in contextually related content (Majestic is good for this, for looking at link topics).

    2. Variety is always best, but it really depends on volume, which you said will be low, so a handful should be fine.

    3. If you're looking for 12 a month, you would want to set strict settings and pause the project after a few verified links each week.

    4. Of course. You can literally tell SER how many links to build; all of its tasks have many options.

    5. If you're scraping for targets, you'll need to pay attention here: get good proxies that pass the Google test, and set a delay between searches, which can be LONG since you are in no hurry.

    6. This is a bad idea, especially as a beginner testing with no real "control" campaign to compare against. Start lightly with a few of those do-follow social media profiles and monitor your efforts after they get indexed.

    7. Sure, but sparingly; a large influx of links in a short time can get the article deleted.

    8. Try it, but if we're talking 12 links a month, I don't see you running into any problems.

    Just my quick take; feel free to add to or expand on it. :)
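To expand on point 5: the "pass the Google test, long delay between searches" idea looks roughly like this as plain Python. A rough sketch only, with made-up proxy addresses and delay values; SER exposes equivalent settings itself.

```python
import random
import time
import urllib.request

PROXIES = ["203.0.113.10:8080", "198.51.100.7:3128"]  # made-up proxy addresses
SEARCH_DELAY = (60, 180)  # long random pause between searches, in seconds

def passes_google_test(proxy, timeout=10):
    """True if Google answers a plain request routed through this proxy."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    )
    try:
        with opener.open("https://www.google.com/", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False  # proxy dead, blocked, or timed out

def scrape_slowly(queries):
    """Rotate working proxies and wait a long time between searches."""
    working = [p for p in PROXIES if passes_google_test(p)]
    for query in queries:
        proxy = random.choice(working)
        # ... run the search for `query` through `proxy` here ...
        time.sleep(random.uniform(*SEARCH_DELAY))  # no hurry at 12 links/month
```

At 12 links a month you can afford delays of minutes between queries, which is exactly why slow scraping works here.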
  • Thank you for your reply, I appreciate it.
    I will go for 12 to 15 backlinks per month.

    What do you think? Because I will only target a limited number of backlinks, can I use public filtered GSA proxies and GSA Captcha Breaker?
  • What do you think? Because I will only target a limited number of backlinks, can I use public filtered GSA proxies and GSA Captcha Breaker?
    Well, maybe build those 12 links a month (3 a week) by hand on some popular domains with indexable do-follow backlinks and great content, then put those in GSA as URLs. Some of these sites will require solving hCaptcha and reCAPTCHA, which GSA CB won't solve by default, so you'll either need an API for a service that solves them or have to fill them in manually. I don't think you will have much luck using free scraped proxies to scrape Google, so if it's a must, maybe try scraping Yahoo/Bing.
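If you go the hand-built route, a tiny script can tidy the list before you import it into SER as target URLs. A sketch with hypothetical URLs:

```python
from urllib.parse import urlparse

def clean_url_list(raw_lines):
    """Dedupe and keep only well-formed http(s) URLs, preserving order."""
    seen, cleaned = set(), []
    for line in raw_lines:
        url = line.strip()
        parts = urlparse(url)
        if parts.scheme in ("http", "https") and parts.netloc and url not in seen:
            seen.add(url)
            cleaned.append(url)
    return cleaned

# Hypothetical hand-built links; SER can import a plain list like this as URLs.
raw = [
    "https://example-profile-site.com/user/me",
    "https://example-profile-site.com/user/me",   # duplicate, dropped
    "not a url",                                  # malformed, dropped
]
print(clean_url_list(raw))  # ['https://example-profile-site.com/user/me']
```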

    It seems the higher the quality of a site, the harder it is to automate these days, even for the common CMS platforms. I would build some by hand and then maybe use @cherubs' engines (https://sernuke.com/) and/or check what @AliTab (https://gsaserlists.com/gsa-ser-custom-engines/) has available for fresh targets.

    Use those to power up the backlinks as well as help indexing. You will have to monitor the results somehow after the backlinks get indexed to see what's working. In my experience rankings often drop and come back stronger each time, so don't get nervous instantly if they dip.
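A crude way to monitor hand-built links after indexing is to re-fetch each backlink page and confirm it still mentions your domain. A sketch, with the domain as a placeholder:

```python
import urllib.request

MY_DOMAIN = "example.com"  # placeholder: your money site's domain

def backlink_still_live(page_url, timeout=15):
    """Fetch a backlink page and check that it still mentions our domain."""
    req = urllib.request.Request(page_url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except OSError:
        return False  # page gone or unreachable
    return MY_DOMAIN in html

def dropped_links(backlink_urls):
    """Return the backlink pages that no longer carry our link."""
    return [u for u in backlink_urls if not backlink_still_live(u)]
```

At 12-15 links a month this is small enough to run weekly by hand.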

    It's called the Google Dance; it has been this way for as long as I can remember.

    You can use GSA Keyword Researcher for monitoring this: https://www.gsa-online.de/product/keyword_research/

    But you will likely still need a decent set of passing proxies to accurately parse "G" and other APIs if you want more data.

    I would also set some very tight filters for the links you build to those links!

    Just my experience and what has worked, and still works, for me.

    The results are pretty consistent if you do it this way.

    It's also a great idea to make sure the content you create for your target keyword, in the URL, the H1 tag, and on the page itself, is better than what's currently ranking there, and to throw some internal links at that page from other pages on your site with descriptive anchor text.

    Get it indexed and see where it starts within the first 100 results, i.e. the first 10 pages. Also make sure the strength of your domain is somewhat close to the average of what's currently on the first page. Don't pick a hard keyword if your site is roughly "PR 2" and the SERPs show an average of "PR 7" with 20-year-old domains; that's a useless waste of resources.
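That "PR 2 vs. average PR 7" sanity check is just arithmetic. A toy sketch with made-up strength scores:

```python
def keyword_realistic(my_strength, first_page_strengths, margin=2.0):
    """True if our domain strength is within `margin` of the first-page average."""
    avg = sum(first_page_strengths) / len(first_page_strengths)
    return my_strength >= avg - margin

# Made-up 0-10 strength scores (think old-school PR or a Moz-style metric):
print(keyword_realistic(2, [7, 8, 6, 7, 7]))  # avg 7.0 -> False, too hard
print(keyword_realistic(2, [3, 4, 2, 3, 3]))  # avg 3.0 -> True, worth a shot
```

The margin is a judgment call; tighten it if your domain is young.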

    Try starting with long-tail and easier keywords to get some traffic rolling in first, so search engines can use Core Web Vitals and the extra impressions to decide to show your site more. Then, once you have a trusted website and some easier keywords ranking, start going after harder ones.

    I learned at the school of hard knocks, but my approach has always been more grey hat.

    Hopefully this is helpful...

  • I don't think it's worth creating Tier 1 with automation, mainly because it will not last long, as those accounts get created on low-quality sites. I am still using RankerX for Tier 1; it's better than trying those with GSA itself. By building solid Tier 1, you don't have to run those links through an alive checker to see whether the URLs still exist. Build Tier 2 with contextual engines. You can semi-automate Tier 1 link building with a stealth browser.

    I found Tier 2 projects work best and provide more authority to the root domain. Make sure to use a wide set of unique referring domains to pass through all the signals. Reddit and GSA work. Grab some link lists and run them on contextual engines while waiting for a boost. If you have SCM, you can now run Llama 3 for unlimited unique content.

    There are nice captcha solvers on the Black Hat World marketplace. You can request a trial and test what's best for you. My setup is quite big, but I recently changed a lot of suppliers.