How to import a SB scraped list into "failed" in GSA?
So I did a scrape with ScrapeBox today and exported it as a .txt file. I had 4k URLs and wanted to import them into GSA, which I did via Options > Advanced > Tools > "Import URLs (identify platform + sort in)". Once it finished, most of the 4k URLs were added to the "identified" list.
However, I'd actually like to keep my ScrapeBox-scraped list separate from what GSA found on its own. In several threads on this forum I've seen people describe importing a list as "failed" and then setting up a dummy project that only uses the global "failed" list. (@AlexR, great tip.)
But when I try to import my SB .txt export via "Import site lists > failed", GSA expects a .sl file, not a .txt.
How can I import my SB scraped list only into the "failed" global site list in GSA, and still have GSA identify and sort by platform as well?
Thanks.
Comments
I'd appreciate it if someone who's doing this right now could comment.
However, I'm not sure this will work with a plain .txt file rather than a list (.sl) file.
Another option would be to:
go to SER > Options > Advanced > Tools > Import URLs (identify platform and sort in) > From file,
and additionally cut all the previously identified URLs out of the identified folder and put them in another folder. Afterwards you can cut out the newly identified results (those from your SB list) and go back to your previous routine. I hope this makes sense.
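The "cut the identified URLs out" step above can be scripted. Here's a minimal sketch, assuming the identified site list is stored as per-engine text files named `sitelist_*.txt` (the exact folder and naming scheme on your install is shown under Options > Advanced in SER, so adjust the paths accordingly):

```python
import shutil
from pathlib import Path

def stash_identified(identified_dir: Path, backup_dir: Path) -> int:
    """Move every per-engine site-list file out of the identified
    folder into a backup folder, so that whatever lands in the
    identified folder afterwards (e.g. from a ScrapeBox import)
    can be collected separately. Returns the number of files moved."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    moved = 0
    # "sitelist_*.txt" is an assumed naming pattern; check your folder.
    for f in identified_dir.glob("sitelist_*.txt"):
        shutil.move(str(f), str(backup_dir / f.name))
        moved += 1
    return moved
```

After the SB import finishes, the same function can move the freshly identified files somewhere else, and you can then restore the backed-up files to get back to your previous routine.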
I understand what you mean with your 2nd suggestion; that's actually what I figured out around the same time you replied, lol. Thanks, man.