Can I speed up the import/identify function?
googlealchemist
Does the import function run on a set number of threads of its own? Can I speed it up somehow to import large lists faster?
Comments
If it were me importing a big list, I wouldn't bother with the identify step.
I add the list to a T4, and that way it gets sorted into the submitted and verified site lists.
How did you figure this out and how much time per day do you spend working with SER options? Thanks a ton for all the helpful information!
I'm known for being a SER geek, and a lot of the tricks I use have been shared on here.
On a good day I can do 300,000 submissions with SER. That's been shown on these forums.
A lot of my tricks are being recommended by others and they're getting good results; some have jumped from 20k submissions a day to over 100k.
Simple things like keeping search engine choices to a minimum, streamlining the engines you submit to, and only using engines CB (Captcha Breaker) works with.
I'm constantly trying to find ways to cut down on time and push SER to its limits.
If you import the way you were, the links end up in the identified site list.
Something I don't use; I only use the submitted and verified lists.
By doing it my way, you cut out the middleman.
If you don't run a T4, use a T3. Import the list once and have those links put where they can be pulled from quicker, so you get results quicker.
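For anyone curious what that sorting actually looks like: SER keeps its site lists as plain text files, one per engine. A minimal sketch of checking what a T4 has sorted into the verified list (the AppData folder location and the "sitelist_" filename prefix are assumptions based on a default install; adjust to yours):

```python
# Count verified URLs per engine in a GSA SER site list folder.
# ASSUMPTION: default site list location and "sitelist_<Engine>.txt"
# naming; point SITELIST_DIR at your own install if it differs.
from pathlib import Path

SITELIST_DIR = Path(r"C:\Users\you\AppData\Roaming\GSA Search Engine Ranker\site_list-verified")

totals = {}
for f in SITELIST_DIR.glob("sitelist_*.txt"):
    engine = f.stem.replace("sitelist_", "")
    # One URL per line; skip blanks and de-dupe within the file.
    urls = {line.strip() for line in f.read_text(errors="ignore").splitlines() if line.strip()}
    totals[engine] = len(urls)

for engine, count in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{count:7d}  {engine}")
```

Watching those per-engine counts grow is a quick way to see which engines your imported list is actually feeding.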
In all honesty, I'm a lazy sob.
I just import the list under normal running.
@LeeG - That's a great tip. Let a T4 figure out which sites can actually be posted to, and then only use submitted and verified.
I take back that thing about you running around with your pants on fire
How do some of you lot possess the ability to breathe at times :O
Come on: scrape a list with the GScraper free version, use the free tool by Santos to extract the footprints.
Leave that running overnight with its 100,000 URL limit and bang, the following day add them to a T4.
Easily sorted into the two main site list areas.
At worst you have a load of junk links on a lower tier.
No doubt given time you will find a method to complicate a simple task by adding your quantum physics girly tool methods
GScraper for the win.
Can someone point me to the tool Santos created for extracting footprints?
Ozz, don't give them any more methods. They will be desperately confused and trying to overcomplicate things by adding g-force calculus and pi-times-infinity type calculations to work out how many times a sparrow flaps its wings, and then adding that to the final method of getting the best footprints to use.
Not saying Ron does that, but I might be implying it.
A little trick with GScraper free to max out your 100k results is to remove duplicate domains at scraping time (see the sketch below).
I'm testing the free version at the moment to see how good it is.
It's more basic than ScrapeBox. Less confusing for us old wise ones.
I'm also running a proxy scraper 24/7 now to feed it fresh proxies on each run.
If anyone asks, it's Proxy Multiply. 7-day free trial.
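To show what that duplicate-domain removal buys you: GScraper does it during the scrape, but the same filter is trivial to run over any URL list after the fact. A minimal sketch (the file names are placeholders):

```python
# Keep only the first URL seen per domain - the same effect as
# GScraper's remove-duplicate-domains option, applied after the fact.
from urllib.parse import urlparse

seen = set()
with open("scraped_urls.txt", encoding="utf-8", errors="ignore") as src, \
     open("unique_domains.txt", "w", encoding="utf-8") as dst:
    for line in src:
        url = line.strip()
        if not url:
            continue
        domain = urlparse(url).netloc.lower()
        # Treat www.example.com and example.com as the same domain.
        if domain.startswith("www."):
            domain = domain[4:]
        if domain and domain not in seen:
            seen.add(domain)
            dst.write(url + "\n")
```

Doing it during the scrape is still better, since the 100k cap then fills up with unique domains instead of a thousand pages from the same site.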
Why use the tool for the footprints when you can just use "search online for URLs" and get your footprints from there?
Oh, BTW: trying to get my auto IP changer script working for my FritzBox (so I can scrape more with GScraper)...
Glad I found this SER tool... so I can finally fix my computer games addiction... and spend my free time somewhere else...
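For anyone chasing the same FritzBox trick: I can't speak for that exact script, but the usual way to force a Fritz!Box to drop its connection and pull a fresh IP is a single UPnP SOAP call (ForceTermination). A minimal sketch, assuming UPnP is enabled on the box and it answers on the default fritz.box:49000:

```python
# Force a Fritz!Box to reconnect (new WAN IP) via its UPnP IGD interface.
# ASSUMPTION: UPnP is enabled on the box and it is reachable as
# fritz.box on port 49000 (the defaults on most Fritz!OS versions).
import urllib.request

URL = "http://fritz.box:49000/igdupnp/control/WANIPConn1"
ACTION = "urn:schemas-upnp-org:service:WANIPConnection:1#ForceTermination"

BODY = b"""<?xml version="1.0" encoding="utf-8"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"
            s:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <s:Body>
    <u:ForceTermination xmlns:u="urn:schemas-upnp-org:service:WANIPConnection:1"/>
  </s:Body>
</s:Envelope>"""

req = urllib.request.Request(
    URL,
    data=BODY,
    headers={
        "Content-Type": 'text/xml; charset="utf-8"',
        "SOAPAction": ACTION,
    },
)
urllib.request.urlopen(req, timeout=10)
print("Reconnect triggered; the box should come back with a new IP shortly.")
```

Run it between scrape batches; the connection drops for a few seconds and comes back on a new IP.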
Why use your home IP with GScraper when I just pointed out a proxy scraper with a fully working 7-day trial?
I'm sure there used to be a discount thread on BHW for it at one time.
Santos' tool will save the footprints to a file, plus it does some other magic that makes the footprints work with ScrapeBox.
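I don't know exactly what Santos' tool does under the hood, but the core idea is simple enough to sketch: SER's engine definitions are plain .ini files that carry their search footprints in "search term=" lines, so you can pull those out and dump them to a file that GScraper or ScrapeBox will take as keywords. A rough sketch under that assumption (the install path will vary):

```python
# Pull search footprints out of GSA SER engine definition files.
# ASSUMPTION: engine .ini files live in SER's Engines folder and keep
# their footprints in "search term=" lines; adjust the path to taste.
from pathlib import Path

ENGINE_DIR = Path(r"C:\Program Files (x86)\GSA Search Engine Ranker\Engines")

footprints = set()
for ini in ENGINE_DIR.glob("*.ini"):
    for line in ini.read_text(errors="ignore").splitlines():
        if line.lower().startswith("search term="):
            # In case a line holds several footprints separated by "|".
            for term in line.split("=", 1)[1].split("|"):
                term = term.strip()
                if term:
                    footprints.add(term)

Path("footprints.txt").write_text("\n".join(sorted(footprints)), encoding="utf-8")
print(f"Saved {len(footprints)} unique footprints to footprints.txt")
```

Whatever extra massaging Santos' tool does for ScrapeBox compatibility is on top of this; the footprints-to-file part is the bulk of it.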
If you have access to a proxy scraper (the one I mentioned above has a full 7-day trial), you run that for a few hours and then import the list of scraped proxies.
I had a look last night and they're doing a 50% off deal on BHW.
I use the wrong type of proxies for that, but you might get away with private proxies you can authorize to your IP, so no login and password needed.