
Adding a list - Trim to Root or Leave full URL

I plan to create a new project using only articles and blogs.
When importing a Scrapebox list to GSA SER, should I:
a) trim everything to root and let SER find its own way to post articles to it, or

b) upload the full list containing the complete URLs? I created the list by reverse engineering my competitors, so it contains all the URLs they have created in the past.

Related question:
Is it possible to import URLs for a new project without SER trying to identify the platform? I have already sorted them, so there is no need to do that again. However, I do not see any such option in Advanced > Tools > Import URLs > From File. Am I missing something?

Comments

  • 2Take2 UK
    edited October 2013
    @Rayban, upload the list as is, don't trim to root, just de-dupe it first in Scrapebox for best results.
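    The de-dupe step mentioned above can also be approximated outside of Scrapebox; here is a minimal Python sketch of the same idea (remove exact duplicate URLs, and optionally duplicate domains, before importing into SER):

```python
from urllib.parse import urlparse

def dedupe_urls(urls):
    """Drop exact duplicate URLs, keeping the first occurrence."""
    seen = set()
    return [u for u in urls if not (u in seen or seen.add(u))]

def dedupe_domains(urls):
    """Keep only one URL per domain (like Scrapebox's 'remove duplicate domains')."""
    seen = set()
    out = []
    for u in urls:
        domain = urlparse(u).netloc.lower()
        if domain not in seen:
            seen.add(domain)
            out.append(u)
    return out
```

    Whether to de-dupe per URL or per domain depends on the platform: for blog comments you may want many pages per domain, while for registrations one URL per domain is enough.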
  • @2take2 - does this mean it is easier for SER to use complete URLs?

  • @sven is probably the best person to ask about this, but my take on it is:

    If you trim to root, you may come across situations where the SER footprint is not on the page, so the program can't identify it, and there may also be extra clicks required to submit, which SER won't be able to handle.

    For example, some sites have a blog in a subfolder; if SER lands on the home page, there will be no way to submit from that page, and the submission will fail.

    If you use SER footprints to scrape, then you want to feed it URLs where that footprint has previously been found.
  • Sven www.GSA-Online.de
    @2Take2 is right...don't trim even though for most platforms it would not matter.
  • @sven - could you please also give some comments regarding the 2nd question:
    "Is it possible to import URLs for a new project without SER trying to identify the platform? I have already sorted them, so there is no need to do that again. However, I do not see any such option in Advanced > Tools > Import URLs > From File. Am I missing something?"
  • You can just right-click on the project in the SER GUI > Import Target URLs > From File and select the file with the sorted/identified URLs you want to use.
  • Sven www.GSA-Online.de

    what @cherub said ;) If you want to skip identification of the engine to use, you have to import them with...

    http://someurl.com#engine_name
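
    For a pre-sorted list, that format can be produced with a short script. A minimal sketch (the engine name passed in is a placeholder; use the exact engine name as it appears in SER):

```python
def tag_with_engine(urls, engine):
    """Append '#engine' to each URL so SER imports it without re-identifying the platform."""
    return [f"{url}#{engine}" for url in urls]

# Example with placeholder values:
tagged = tag_with_engine(["http://someurl.com"], "engine_name")
# tagged == ["http://someurl.com#engine_name"]
```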

  • nawshale Sales & Tech Support at www.SERVerifiedLists.com
    @RayBan

    @2Take2

    Can you tell me how many keywords you guys are using for the merge (in Scrapebox)?

    I copied the GSA platform footprints to Scrapebox and merged them with scraped keywords (1,000+), but SB got stuck, and the same thing happens with GSA SER. Can you tell me how you manage to build a million-URL, duplicate-free list with SB or GSA SER, and how many keywords you use? :(


    Sorry for hijacking this thread, but I felt I should ask here since it's related. Also, I'm sorry for my English; I'm a girl from Sri Lanka, so my English is bad. :)
  • @nawshale, you will always get duplicates when you scrape, but you can lower the percentage by using varied keywords.

    Generally, I use as many keywords as I can merge with the footprints without crashing the program, but even if you don't have a powerful computer you can still build a nice list with a small amount of keywords - it just takes longer.

    There is a nice guide to Scrapebox by Jacob King on his blog; maybe check it out for some tips.
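
    The footprint/keyword merge described above is just a cross product, which is also why query counts blow up with big keyword lists. A minimal Python sketch (the footprints and keywords below are placeholders):

```python
from itertools import product

def merge_queries(footprints, keywords):
    """Cross every footprint with every keyword, like Scrapebox's merge feature."""
    return [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

footprints = ['"powered by wordpress"', 'inurl:blog']   # placeholders
keywords = ["gardening", "fitness", "travel"]           # placeholders
queries = merge_queries(footprints, keywords)
# 2 footprints x 3 keywords = 6 queries; 1,000 footprints x 1,000 keywords
# would already be 1,000,000 queries, which is why huge merges choke.
```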


  • nawshale Sales & Tech Support at www.SERVerifiedLists.com
    @2Take2

    Thank you so much for your help. I'm a girl from Sri Lanka with very limited resources, trying to earn some money like you guys! 100% understood. Ty!