How to work with a scraped/verified URL list in GSA SER?

There is something about this procedure that I just don't get, and I'm hoping someone here can help. I've got a list (from a reliable source) of scraped, verified URLs covering a mixture of site types where I can add a link, i.e. article sites (articleMS, Article Beach, Article Friendly, etc.), blogs (Blogspot, BlogEngine, WordPress, etc.), directories (indexu, myengines, otwarts mini, phplinks, etc.), forums (Aska BBS, Burning Board, IPBoard, miniBB, etc.), guestbooks (various), social bookmarks (various); you get the idea.

I think I know how to get them into GSA SER (Advanced > Import), but then what? Do I import them one type at a time, or just import them ALL and let GSA SER figure it out? And how do I tell GSA SER not to query the search engines but to use only the imported URLs?

Some help and direction here would be greatly appreciated.

Comments

  • gooner SERLists.com
    edited July 2014
    To tell SER not to use search engines, you can uncheck all search engines in the project options.
    To import your list, simply import the URLs into a project. SER knows what to do with them.
  • s4nt0s Houston, Texas
    I see you imported via advanced settings, but you can also right-click on a project > Import Target URLs. As gooner mentioned, you can disable the search engines in your project and it will only post to the imported list.

    Disable search engines by right-clicking in the search engine selection box > check none.
  • loopline autoapprovemarketplace.com
    Does it matter, for most targets in SER, whether you import the domains trimmed to the last folder or the end URL? In my tests it "seemed" to work better to import the end URL of the page where a link is created, rather than the last folder. But I just wondered what the general consensus was.
  • Hi @loopline, I just import them as-is. I'm sure SER will "know" what to do with them, trim or whatever. Sven is a very talented programmer...
  • edited July 2014
    If the link itself is dead, however, and there are no footprints on the target page, then SER will have difficulty posting and will report "no engine matches". This holds at least for profile links (you can test it yourself by giving SER a working link vs. a link with garbage appended to "emulate" a 404). So it is still good practice to have some engine types' links trimmed.
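    To make the point above concrete: you can pre-filter a list for dead targets before importing it. Below is a minimal Python sketch of such a check; it is not part of SER, and the function name, example URL, and User-Agent header are illustrative assumptions only.

        from urllib.request import Request, urlopen
        from urllib.error import URLError, HTTPError

        def is_live(url, timeout=10):
            """Return True if the target page still answers with a 2xx/3xx status."""
            # A browser-like User-Agent (an illustrative choice) avoids some trivial blocks.
            req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
            try:
                with urlopen(req, timeout=timeout) as resp:
                    return resp.getcode() < 400
            except (HTTPError, URLError, OSError):
                # 4xx/5xx responses, DNS failures and timeouts all count as dead targets.
                return False

        # Keep only targets that still resolve before handing the list to SER.
        urls = ["http://example.com/some-engine/profile.php?id=123"]
        live = [u for u in urls if is_live(u)]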
  • Yes, it's certainly good practice to trim the URLs. It's an additional step (or steps), and if you have to do it every day, it becomes a chore. I'm hoping SER is clever enough to trim the URLs if required. So @Sven, do we have to trim the URLs to the last folder?...
  • Sven www.GSA-Online.de

    You can leave the URL as it is and SER should know what to do with it. If you trim to the root or a folder, it might pick up content that is not part of the engine itself in cases where the engine was installed in a subfolder.
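    To make "trim to last folder" concrete: it means cutting off the page's file name and query string while keeping the enclosing directory, which also preserves subfolder installs as described above. A minimal Python sketch, assuming a standard URL structure; the example URL is made up.

        from urllib.parse import urlsplit, urlunsplit

        def trim_to_last_folder(url):
            """Drop the final path segment plus any query/fragment; keep the folder."""
            parts = urlsplit(url)
            folder = parts.path.rsplit("/", 1)[0] + "/"
            return urlunsplit((parts.scheme, parts.netloc, folder, "", ""))

        # An engine installed in a subfolder keeps that folder rather than the root:
        print(trim_to_last_folder("http://example.com/forum/profile.php?id=123"))
        # -> http://example.com/forum/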


  • Thanks Sven...
  • loopline autoapprovemarketplace.com
    Thanks as well, Sven, and thanks for the great tool.
  • yea, what they said....   ;-)
