
how do I make a .sl file

So, using Scrapebox, I have a ton of scraped URLs. How do I turn that urls.txt list into an .sl file so I can import it into the folder where I want it to go (Identified)?

Or am I jumping the gun, and do I need to break it up into platforms first?

Comments

  • No way to do that. Put the links in SER and, when you verify them, you can export them as an .sl file.
  • OK thanks - my problem, if it even is one, is that I use SERLists, so I'm pulling those from my Identified folder for current submissions - what's the workaround?
  • ronron SERLists.com
    edited June 2014

    @jaybee - I would never do what you are planning. You'll end up screwing up the good list (you never want to mix a raw scrape in with a golden verified list - ever). Always keep that list separate from these other URLs.

    I think what you want to do is ultimately get everything into the proper folder. So you dump the Scrapebox scrape into junk tiers/lower-tier projects with no limits, and simply let it process. It will then go into Verified. You can/should always dedupe Verified at the URL and domain level. Whatever you do, before using Verified as a sitelist, please dedupe first.
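    For anyone who wants to dedupe the sitelist .txt files outside of SER, here is a minimal Python sketch of what URL-level versus domain-level dedupe means. The folder path is just a placeholder, the files are assumed to hold one URL per line, and SER's own dedupe options cover the same ground - this is only an illustration.

```python
# Minimal sketch of URL-level vs. domain-level dedupe for sitelist .txt files.
# Assumptions: SITELIST_DIR is a placeholder path, and each .txt file holds one
# URL per line. SER can dedupe its own lists; this just shows what that means.
from pathlib import Path
from urllib.parse import urlparse

SITELIST_DIR = Path(r"C:\GSA\site_lists\verified")  # placeholder: point at your own folder

def dedupe_file(path: Path, by_domain: bool = False) -> None:
    seen, kept = set(), []
    for line in path.read_text(encoding="utf-8", errors="ignore").splitlines():
        url = line.strip()
        if not url:
            continue
        # URL-level keys on the full URL; domain-level keys on the host only
        key = urlparse(url).netloc.lower() if by_domain else url.lower()
        if key and key not in seen:
            seen.add(key)
            kept.append(url)
    path.write_text("\n".join(kept) + "\n", encoding="utf-8")

for txt in SITELIST_DIR.glob("*.txt"):
    dedupe_file(txt, by_domain=False)   # URL-level dedupe
    dedupe_file(txt, by_domain=True)    # domain-level dedupe (keeps one URL per domain)
```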

    When the Scrapebox URLs are done processing, they are fed into Verified. You can then export the verified list. At that point (when you have successfully exported), you have a .sl file, which you can then import into Identified if that is where you keep your list. So you can merge .sl files at any time. Just remember that in order to even get an .sl file, SER literally has to process URLs and write them into a folder. That is where they are separated by engine type. Only in that format can you use the export function to create a .sl file.
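    To make the merge part concrete: since a sitelist folder is just per-engine .txt files, combining one folder into another by hand boils down to appending the matching engine files and deduping, which is roughly what importing a verified .sl into Identified achieves. A rough Python sketch, with placeholder folder paths and assuming plain one-URL-per-line files (not SER's own import code):

```python
# Rough sketch: merge one sitelist folder into another by combining the matching
# per-engine .txt files and deduping. Folder paths are placeholders; files are
# assumed to hold one URL per line. This only approximates what importing a
# verified .sl into Identified achieves.
from pathlib import Path

SRC = Path(r"C:\GSA\site_lists\verified")     # placeholder: freshly verified list
DST = Path(r"C:\GSA\site_lists\identified")   # placeholder: where you keep your list

DST.mkdir(parents=True, exist_ok=True)

for src_file in SRC.glob("*.txt"):
    dst_file = DST / src_file.name            # same engine file name on both sides
    urls, seen = [], set()
    for f in (dst_file, src_file):            # keep existing entries first, then add new ones
        if not f.exists():
            continue
        for line in f.read_text(encoding="utf-8", errors="ignore").splitlines():
            url = line.strip()
            if url and url not in seen:
                seen.add(url)
                urls.append(url)
    if urls:
        dst_file.write_text("\n".join(urls) + "\n", encoding="utf-8")
```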

    I hope you just understood what I wrote, lol.

  • Hey Ron... no, not really lol - or actually maybe sorta, now that I've typed my stuff out while referring back to yours :D

    Here's what I did:

    1. exported my current verifieds to its own .sl file (also have a backup of the SERLists .sl file)

    2. imported my Scrapebox'd URLs > identify > sort platform

    3. deleted all current .txt files in my 'verified' folder, so only new verifieds will now get written there (see the backup sketch after this list)

    What I *think* is now gonna happen is that my identified list *within* SER (which is about 900k, not sure if that still includes the imported serlists.sl or not) will get posted to (or attempted) by SER, and the verifieds get spit out, so that I end up with a cleaned list to export and then import into Identified (along with my current backup.sl and serlists.sl). And in order to achieve that I need to

    4. turn off 'use global site lists' for each junk/tier project

    is that about right or am I still getting confused?
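    On step 3, before deleting the .txt files in the verified folder, it is worth keeping a copy of the whole folder. A small Python sketch of that (the path is a placeholder for your own setup):

```python
# Small sketch for step 3: copy the verified sitelist folder to a timestamped
# backup before clearing its .txt files. The path is a placeholder.
import shutil
from datetime import datetime
from pathlib import Path

VERIFIED = Path(r"C:\GSA\site_lists\verified")  # placeholder: your verified folder

backup = VERIFIED.parent / f"verified_backup_{datetime.now():%Y%m%d_%H%M%S}"
shutil.copytree(VERIFIED, backup)               # keep a full copy first

for txt in VERIFIED.glob("*.txt"):              # then clear, so only new verifieds land here
    txt.unlink()
```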




  • Thanks Ron! That exactly answers my question, as I've been cleaning my scrapes for a few weeks now. So next I will export as .sl, and it probably wouldn't be a bad idea to create a new verified folder for this link build and keep my other verified list separate, because that one has been built from paid lists...