I'm only going to use blog comments. Please show me how.

I bought about 70,000 URLs, and I imported them by pressing the Tools button in Settings.
But I don't know whether these URLs have been applied properly.
Where can I check this?
I've tried watching a lot of YouTube videos, but I still can't work it out. FYI, the list was a .txt file (a quick way to sanity-check the file itself is sketched right below).
And since I'm only going to post comments, how do I use the URL list I bought for commenting?
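
As a starting point, here is a minimal sketch of that sanity check, assuming the purchased list is a plain text file with one URL per line ("urls.txt" and "urls_clean.txt" are hypothetical paths). It counts the unique valid URLs, so you know how many targets SER should show after the import:

```python
# Sanity-check a purchased URL list before importing it into SER.
# Assumes one URL per line in a plain .txt file; "urls.txt" is a hypothetical path.
from urllib.parse import urlparse

seen = set()
bad = 0
with open("urls.txt", encoding="utf-8", errors="replace") as f:
    for line in f:
        url = line.strip()
        if not url:
            continue
        parts = urlparse(url)
        if parts.scheme in ("http", "https") and parts.netloc:
            seen.add(url)
        else:
            bad += 1

print(f"unique valid URLs: {len(seen)}")
print(f"malformed lines:   {bad}")

# Write the cleaned, deduplicated list back out for import.
with open("urls_clean.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(sorted(seen)) + "\n")
```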

Comments

  • Sven www.GSA-Online.de
    If you want a project to post to these sites, simply add them to the project directly via right-click on it -> Import Target URLs.
    Make sure that you have Blog Comments enabled in the project options, in the engine box on the very left.
  • minsol korea

    Blog comments are active,
    but SER doesn't post comments to the URLs in the list I bought.
  • cherub SERnuke.com
    Are you sure the list of blog comments you bought is compatible with SER? What did the vendor of the list say? Did they give you a usage guide?
  • minsol korea
    I bought it at hitman.agency.

    There's no manual; the seller didn't provide one.

    It's really hard.
  • sickseo London, UK
    minsol said:
    I bought it at hitman.agency.

    There's no manual; the seller didn't provide one.

    It's really hard.
    I bought that same list. Most targets don't work. It was a complete waste of money. No idea why they still have an active thread on BHW.
  • sickseo London, UK
    I don't normally fall for these types of sales pages, but it was professionally done and had lots of "proof" of verified links, with screenshots from GSA SER. So it all looked quite legit, like he'd built the list and tested it with GSA SER. Plus I wanted to boost my site list for blog comments quickly. But of course, just like all the other sellers, the list was absolute poop lol

    I'll stick to scraping my own lists - it's the best way. Blog comments are easy to find and there are literally millions of them out there, though I've personally only found just over 10k so far.
  • sickseo London, UK
    I've been using GSA PI to do that. It extracts external links from every URL it checks, so as I scrape comment links it automatically extracts the URLs and builds another site list from them. That whole setup is automated (the first sketch below shows the basic idea).

    Plus, once you've got one blog comment URL, you can extract other URLs from the site's sitemap, or even use the site: command to grab other indexed URLs (second sketch below).
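
A minimal sketch of that external-link idea, assuming plain HTML pages (this is not GSA PI itself, just the concept it automates; the page URL is a placeholder, and real scraping needs politeness, retries, and error handling):

```python
# Sketch of the "extract external links" idea that GSA PI automates:
# fetch one page and collect links that point to other domains.
# Stdlib only; the page URL is a placeholder.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

page = "https://example.com/some-blog-post/"
req = Request(page, headers={"User-Agent": "Mozilla/5.0"})
html = urlopen(req, timeout=15).read().decode("utf-8", errors="replace")

parser = LinkParser()
parser.feed(html)

# Keep only links whose host differs from the page's own host.
host = urlparse(page).netloc
external = {urljoin(page, href) for href in parser.hrefs
            if urlparse(urljoin(page, href)).netloc not in ("", host)}
for url in sorted(external):
    print(url)
```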
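
And a sketch of the sitemap trick, assuming the site exposes a standard /sitemap.xml (the domain is a placeholder; many sites use sitemap index files or other paths, so treat this as a starting point only):

```python
# Sketch of the sitemap trick: given one known blog, pull its sitemap
# and harvest the other URLs it lists. The domain is a placeholder.
import xml.etree.ElementTree as ET
from urllib.parse import urljoin
from urllib.request import Request, urlopen

site = "https://example.com/"
req = Request(urljoin(site, "/sitemap.xml"),
              headers={"User-Agent": "Mozilla/5.0"})
data = urlopen(req, timeout=15).read()

# Standard sitemaps namespace their <url>/<loc> elements.
root = ET.fromstring(data)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
for loc in root.findall(".//sm:loc", ns):
    if loc.text:
        print(loc.text.strip())
```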
  • sickseo London, UK
    I bought that list months ago. It's the 1st list I've bought in maybe a year, and it was also the last list I bought lol

    I tend to dabble in some of these paid lists from time to time. They can contain sites that I don't have, simply because the list seller scraped something that I haven't. So I'll still do it, although scraping the targets myself is the way to go.

    Take those new engines that @cherub released. I doubt any list seller has sites that support those engines. Even with his previous releases for gnu and dwqa, no list seller had sites for them when I tested. By scraping myself I've literally got thousands of sites across these engines alone.