
Isn't it time to expand GSA SER's very hidden, almost invisible scraper?

Why isn't there support for GSA SER's scraper?

Wouldn't it be cool to scrape sites using different keywords than the ones you want to rank for?

That way I could set GSA SER to scrape my selected engines for Nintendo, even though I only want Super Mario keyword backlinks / anchors / brand names, etc.

Right now you are forced to use the same keywords both for scraping and for building links.


Everything else in SER is expandable in great detail, yet the scraping GSA SER performs in the background is more mythical than ancient ninjas, with near-zero options and possibilities.

Comments

  • Why isn't there support for GSA SER's scraper?
     Because anyone serious about scraping uses dedicated scraping software like Scrapebox.

    Wouldn't it be cool to scrape sites using different keywords than the ones you want to rank for?
    SER doesn't use the keywords you put in the "keywords" field for most site types. It uses them mostly for comment engines, or if you tick the box "always use keywords when searching for targets". Otherwise it goes by the footprints and scrapes everything it can (see the sketch at the end of this comment for the general idea).

    If you want to learn more about how the built-in scraper works, read the forum; plenty of questions about it have been asked and answered over the years.
    In case you would like to ask Sven about building a standalone GSA scraper, there is already one built into GSA Proxy Scraper. It's in the tools and is called "Search engine parser".
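
    To illustrate the idea: here is only a rough sketch of how a footprint-based scraper can build its search queries. It is not SER's actual code, and the function and option names are made up for this example.

        # Rough illustration of footprint-based target scraping.
        # All names here are assumptions for the example, not SER internals.
        def build_queries(footprints, keywords, always_use_keywords=False):
            queries = []
            for fp in footprints:
                if always_use_keywords and keywords:
                    # Pair each engine footprint with each scrape keyword,
                    # e.g. '"Powered by WordPress" nintendo'
                    for kw in keywords:
                        queries.append(f'{fp} {kw}')
                else:
                    # Most engines get searched by their footprints alone
                    queries.append(fp)
            return queries

        # Scrape for one niche while the project builds links for another:
        print(build_queries(['"Powered by WordPress"'], ['nintendo'],
                            always_use_keywords=True))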

  • s4nt0s Houston, Texas
    @matzedoon - Why not use the scraper in the advanced menu? 



    Options > Advanced tab > Tools button > Search online for URLs.

    Maybe I misunderstood what you meant, but there is a separate scraper you can use within SER located in the tools menu. 

  • Deeeeeeee the Americas
    edited September 2017
    @Sven or @s4nt0s or anyone else who might know...

    Can the keywords only be added via file? If so, what file formats are supported? I just used .txt and it worked, but what other options are there?

    Also, does "Save to Custom file" save the present [Footprints] only that are in the top-left window, regardless of how they got there (user input or selected in GSA-SER)? Thanks!

    I'm running this module for the first time right now!!!!! :) Very cool!!





  • Deeeeeeee the Americas
    edited October 2017
    If I abort (as opposed to just PAUSE) a running task, will I lose the data collected so far, or will I be prompted to save? Thanks!

    (I have something running right now and don't want to lose the URLs it has found over the many hours it's been on, but I also don't want to have to wait for it to complete!)
  • Deeeeeeee the Americas
    edited October 2017
    I finally decided to hit ABORT; I had chosen a broad set of targets and keywords to scrape for, and it still wasn't done after all this time.

    I was NOT prompted with an option to save the data. GSA-SER also closed along with the scraper window as soon as I hit ABORT.

    Have the URLs been saved all along to a file that gets appended with each new URL? If so, where is the file and what would its name be?


    * * * * * * * * * * * * * * *

    The answer is that the "Save to custom file" checkbox cannot be ignored. I didn't set it, so I have no files to find. lol

    Hmm... I just went back to look over the module and re-test, and GSA-SER indicated in an information bubble that it was updating resources... (I've never seen this message before.)

    And when I hit ABORT this time, a window popped up with "Added 0/0 URLs to site list." at the top, and SAVE and OK buttons at the bottom. That makes sense, because I only ran it for a few seconds.

    So there IS a way to save when you abort a scrape in progress, apparently even when you don't have the scraper set to save to a custom file (the sketch at the end of this comment sums up that behavior).

    It just didn't display that black-background menu the first time. :* I don't know why, but GSA-SER also closed when I hit ABORT, so I never saw that menu.

    And I didn't set GSA-SER's scraper to save the URLs to a file, either.

    Now I know. :)
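
    A rough sketch of the behavior described above (an assumption about how it could work internally, not SER's actual code): with "Save to custom file" ticked, each found URL could be appended to the file as it arrives; without it, the URLs only live in memory until the save prompt on abort or completion.

        # Illustration only; the class and method names are assumptions,
        # not SER internals.
        class ScrapeSession:
            def __init__(self, custom_file=None):
                self.custom_file = custom_file  # None = checkbox unticked
                self.found = []                 # in-memory buffer of found URLs

            def add_url(self, url):
                self.found.append(url)
                if self.custom_file:
                    # Append immediately, so a hard close loses nothing
                    with open(self.custom_file, "a", encoding="utf-8") as f:
                        f.write(url + "\n")

            def abort(self):
                # With no custom file set, this prompt is the only chance to save
                print(f"Added 0/{len(self.found)} URLs to site list.")
                return self.found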







