
Scraping/Filtering sites for Tier 1

cipherpodliq (Bulgaria)
edited November 2023 in SERLib
Hello everyone, I know this might have been discussed in the forum before.

Basically, can you share some threads/methods, or maybe your own method, of doing this? How can I scrape high-quality websites that I can use for tier 1 link building with GSA SER? I assume it is just a matter of testing footprints + different keywords. I am not aiming for anything crazy, maybe 3-4 links a day that are high quality and niche relevant. How can I do that? Please share threads from the forum or any other place; I am interested in good reads.

Thanks in advance!

Comments

  • You can set SER to post only on, say, do-follow sites, or sites with fewer than, say, 50 outbound links.

    Since PageRank is no longer available, there is the DomDetailer API, which lets you pull Moz and Majestic metrics into SER and filter by them.

    For example, use Moz rank instead of PR and only post to sites with a rank of 3 or better.

    DomDetailer is a paid alternative to buying Majestic and Moz accounts and pulling the data into software yourself. I think it is around $35 for a batch of credits, but they last a long time.

    There is also the GSA SER Lists web 2.0 addon, with extra blogs and profiles for $22 monthly. These are great for tier 1 as long as you use good content and don't spam the sites, since you want the links to stick and not get deleted.
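The outbound-link filter mentioned above can be sanity-checked outside SER. A minimal sketch in Python using only the standard library (the function names and the 50-link threshold here are illustrative, not SER's internals):

```python
from html.parser import HTMLParser

class OutboundLinkCounter(HTMLParser):
    """Counts <a href> links pointing away from the page's own domain."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "") or ""
        # Naive check: absolute link that does not mention our own domain.
        if href.startswith("http") and self.own_domain not in href:
            self.outbound += 1

def passes_obl_filter(html, own_domain, max_outbound=50):
    """True if the page stays under the outbound-link limit."""
    parser = OutboundLinkCounter(own_domain)
    parser.feed(html)
    return parser.outbound <= max_outbound

page = '<a href="https://example.com/a">x</a><a href="https://other.net/b">y</a>'
print(passes_obl_filter(page, "example.com"))  # one outbound link -> True
```

A real filter would resolve relative URLs and compare registrable domains, but this shows the shape of the check.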
  • Thanks for the reply! I will be testing this right away!
    One thing I read is that the do-follow/no-follow radio button only works on WordPress websites? Do you know if this is actually true?

    Also, what do you think of adding something like <rel = dofollow> to the Scrapebox footprints, or perhaps having it search the HTML of a page for the string match?
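One note on that footprint idea: HTML has no rel="dofollow" attribute, so searching a page for that literal string will match nothing. A link is "do-follow" simply when its rel attribute does not contain nofollow. A standard-library sketch of checking that (a hypothetical helper, not a SER or Scrapebox feature):

```python
from html.parser import HTMLParser

class DofollowChecker(HTMLParser):
    """Collects hrefs of links whose rel attribute lacks 'nofollow'."""
    def __init__(self):
        super().__init__()
        self.dofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower()
        if "nofollow" not in rel:
            self.dofollow.append(attrs.get("href"))

html = ('<a href="/a" rel="nofollow">no</a>'
        '<a href="/b">yes</a>'
        '<a href="/c" rel="ugc nofollow">no</a>')
checker = DofollowChecker()
checker.feed(html)
print(checker.dofollow)  # ['/b']
```

So rather than a footprint, this is something you would run on already-scraped pages to keep only the ones whose comment links pass juice.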
     
  • I'm not sure about the radio button not working?

    I think, as usual, the settings of the specific engine script would matter for this.

    So if the engine script says dofollow=1 (or true) rather than 0 (or false), it will use that.

    If the creator of a given engine script did not define the dofollow parameter, I think that is what matters most; if it is not set, it may default to a value (this can be checked in the macro guide and script manual).

    You could also go into SER's engines pane, right-click and select none, then right-click again and select only dofollow engines, and see which ones end up checked. Those should be all the engine types marked as dofollow.

    You can scrape URLs with footprints and find domains with Scrapebox, yes, but you should add that list into the Identified folder and run it to see which URLs end up as submitted, meaning SER can post to those sites.

    You can also import the list directly into a project; one method may work better than the other.

    I will get back to you shortly.
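For the curious: SER's engine definitions are plain .ini files, so, assuming a dofollow=1 key works the way described above, a throwaway script could list which engine files declare it. The directory path and key name here are assumptions to illustrate the idea, not documented SER behavior:

```python
import os
import re

def engines_marked_dofollow(engine_dir):
    """Scan *.ini engine files in engine_dir for a 'dofollow=1' (or
    'dofollow=true') line, case-insensitively, and return their names."""
    marked = []
    pattern = re.compile(r"^\s*dofollow\s*=\s*(1|true)\s*$", re.I | re.M)
    for name in sorted(os.listdir(engine_dir)):
        if not name.lower().endswith(".ini"):
            continue
        path = os.path.join(engine_dir, name)
        with open(path, encoding="utf-8", errors="ignore") as f:
            if pattern.search(f.read()):
                marked.append(name)
    return marked
```

Pointing this at SER's Engines folder and comparing the output against the "select only dofollow engines" trick above would confirm (or refute) how the flag behaves.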

  • @backlinkaddict
    I am new to this. I am trying to find where these options are, and where do you enter the API key for this PR tracker?
  • DomDetailer is too expensive, and it's not practical at all. When it comes to automation, you can't be that neat in reality. Instead, use high-quality data fields with GSA SER. It has OpenAI integration, so you can generate high-quality articles and fill all the data fields with quality content. Also, add images and videos to an acceptable level, use anchor texts, and build backlinks to contextual-only engines. Never use indexer or exploit-type engines for tier 1. Using premium link lists is more cost-effective: I own 3 link lists with overlap, and it's still cheaper and more worthwhile than using DomDetailer or similar credit-based services.

    If you only want to build 3-4 links a day, you can build them manually; I don't think it's necessary to go beyond that. Cover the basics, throw in 50-100 GSA verified links, and you are good to go. I am an OCD person, but I'm still managing everything with GSA SER. Don't focus too much on super-quality, niche-targeted backlinks. Instead, I create good backlinks with good content using GSA. Don't overkill the process.


  • @APOBLower Thanks, I totally agree and really appreciate your reply! So what I am trying to do is scrape a bunch of keywords in Scrapebox, merged with some footprints for blog comments. Out of all the URLs gathered, I will put them into GSA SER and tick only do-follow links + outgoing links lower than 50-60 or something like that.

    Do you think this will be a valid strategy? Thanks in advance!
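The keyword + footprint merge described above is just a Cartesian product; a quick sketch of generating the queries to paste into Scrapebox (the keywords and footprints are placeholder examples):

```python
import itertools

keywords = ["dog training", "puppy obedience"]                # your niche keywords
footprints = ['"leave a comment"', '"powered by wordpress"']  # blog-comment footprints

# Pair every footprint with every keyword, ready to use as search queries.
queries = [f"{fp} {kw}" for fp, kw in itertools.product(footprints, keywords)]
for q in queries:
    print(q)
```

Two footprints times two keywords yields four queries; with realistic list sizes (dozens of each) the query count grows fast, which is why Scrapebox's own merge feature exists.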
  • @APOBLower "DomDetailer is too expensive and it's not practical at all."

    As opposed to paying for monthly Majestic and Moz subscriptions?

    The free software it comes with is great for pasting in links and filtering out the best ones to use in a project.

    It's also great for getting metrics on expired domains for your PBNs.

    Do you mean it's not practical to use inside SER?




  • @Grio43 I answered your question, but somehow it was erased. Basically, you find it under Options → Advanced → Misc, where you change the Yandex setting; there are options there, and a popup will ask for the DD API key.
  • Of course, it's cheaper than Moz/Majestic. But the fact that you get 25K links for $35 is still expensive, because after processing 25K GSA verified links with DomDetailer, you won't be left with many links. On the other hand, it's very easy to manipulate those so-called metrics. Instead, it's far better to build more generic links with high-quality content. I would rather focus on how I'm going to fill the GSA SER fields: I always maintain a good anchor text ratio, I always make sure to mix in other competitors' links, etc. By practicing these, it's not only cost-effective but also does the job without the risk of penalization. At least, that's the case for me. I've used DomDetailer and web 2.0s with no luck. For me, I focus on Google-indexed root domain backlinks rather than those metrics. Spending on Open API
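Focusing on unique root domains, as described above, is easy to script once you have a list of backlink URLs. A naive sketch that only strips a leading www. (a real tool would consult the Public Suffix List for registrable domains):

```python
from urllib.parse import urlparse

def root_domains(urls):
    """Reduce a list of backlink URLs to their unique host domains,
    stripping a leading 'www.' so www/non-www count as one domain."""
    seen = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host:
            seen.add(host)
    return sorted(seen)

urls = ["https://www.example.com/post/1",
        "http://example.com/post/2",
        "https://blog.other.net/x"]
print(root_domains(urls))  # ['blog.other.net', 'example.com']
```

Running a verified-links export through something like this gives the unique-domain count that matters more than raw link totals.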



  • I can't say anything honestly. But for sure your strategy is safer for anyone who likes to stay on the safe side. I am not that good, lol. I build backlinks directly to the money site and all the tiers, 100-200 daily, with no filters except junk engines. Works for me. I guess it all depends on the niche market.

  • I feel that there are APIs on RapidAPI that are better suited for this.
  • @APOBLower I agree, and I use OpenAI as well for filling data fields and other projects.

    However, filtering a list by metrics is another topic.

    Using the DD API to filter massive lists is just going to eat up credits, which makes it not a great option for working with large lists.

    I just use the software to paste in expired Scrapebox domains from an already filtered list, and then filter further using DD.

    Then I can check the backlink profile, anchor text, spam score, etc. of what I'm left with, so I'm now working only with the best URLs and not wasting time researching domains that will be crap anyway. It's great for finding PBN or even money site URLs to build off of.

    It's also useful for pasting in a scraped list of, say, "social bookmark" sites from the SER site list and then removing all the ones with no "rank" at all (as they won't pass any juice anyway). This list can then be imported into projects, so you have "bookmarks with authority" you can post to right away with SER in any project (you can use those to juice up your project links nicely).

    I find this method quite useful.

    All metrics can be manipulated, so they are just helpful hints; this is why it's nice to pull data from a few sources into one dashboard and check further from there, depending on the goal and project needs.

    I guess for people filtering large lists it's not a good option, but for fine-tuning or "detailing" your domain URL list it does its job and saves me money. I'm really just using it as a more refined filter for smaller lists.

    I'm not sure how well that method works with expired web 2.0s, but I know I used Scrapebox to find a bunch of Tumblrs, checked them in DomDetailer, and was able to re-register the top 3 with the best metrics.

    It's something I was wondering about/testing myself, and I may create a script for it if results seem promising on a test project.
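The "remove everything with no rank" filter described above is easy to script once the metrics are exported to CSV. A sketch with made-up column names and thresholds (DomDetailer's actual export format may differ):

```python
import csv
import io

# Hypothetical metrics export: domain plus a couple of metric columns.
raw = """domain,moz_da,majestic_tf
goodsite.com,23,12
deadsite.org,0,0
oksite.net,8,3
"""

def drop_unranked(csv_text, min_da=1):
    """Keep only domains whose moz_da column meets the minimum."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [r["domain"] for r in rows if int(r["moz_da"]) >= min_da]

print(drop_unranked(raw))  # ['goodsite.com', 'oksite.net']
```

The surviving domains would then be imported back into a project as the "bookmarks with authority" list.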




