Best way to import urls with projects
OK, just looking to discuss how we import URLs, to find the optimal way to import URLs for projects.
I use GScraper to scrape my URLs (using their proxies).
When I have my list, I import it into GScraper's list, tell it to "remove duplicate top domain", then export the results for importing into SER.
In SER I go to options > advanced > import URLs (identify platform & sort in) > from file.
This process can take many hours.
In SER I tell all my projects (T1, T2, T3, TA1, etc.) to use the global list, and I tick "identified" & "submitted" only.
This is how I do it. Anyone have a better solution or recommendations?
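For anyone who wants to do the "remove duplicate top domain" step outside GScraper (e.g. before importing into SER), here is a minimal sketch in Python. It keeps the first URL seen per root domain. The root-domain extraction here is a naive approximation (last two host labels); a proper tool would use the Public Suffix List (e.g. the `tldextract` package) to handle domains like `co.uk` correctly.

```python
# Sketch: dedupe a URL list by root domain, keeping the first URL per domain.
# NOTE: root_domain() is a naive approximation (last two host labels);
# real "top domain" logic needs the Public Suffix List.
from urllib.parse import urlparse

def root_domain(url: str) -> str:
    host = urlparse(url).netloc.lower().split(":")[0]
    labels = host.split(".")
    return ".".join(labels[-2:]) if len(labels) >= 2 else host

def dedupe_by_domain(urls):
    seen = set()
    kept = []
    for url in urls:
        dom = root_domain(url)
        if dom not in seen:
            seen.add(dom)
            kept.append(url)
    return kept

urls = [
    "http://blog.example.com/post1",
    "http://example.com/post2",   # same root domain as above, dropped
    "http://other.org/page",
]
print(dedupe_by_domain(urls))
```

For big scraped lists this runs in one pass, so it is quick even on files with millions of lines; just read the file, run it through `dedupe_by_domain`, and write the result out for SER.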
Comments
It really is better to put a little effort into finding answers to your questions first before you just ask one that has already been answered several (or many) times.