
Can I speed up the import/identify function?

googlealchemist Anywhere I want
edited February 2013 in Need Help
Is the import function limited to a set number of threads in and of itself? Can I speed it up somehow to import large lists faster?

Comments

  • Brandon Reputation Management Pro
    The import function uses your thread count. I would be interested in another way to increase this as well as I do a lot of importing.
  • LeeG Eating your first bourne

    If it was me importing a big list, I wouldn't bother.

    I add the list to a T4, and that way it gets sorted into submitted and verified

  • @LeeG - Just to clarify, the way you import your URLs is by right-clicking your Tier 4 campaign and clicking Import Target URLs, not by going into Options -> Advanced -> Tools -> Import URLs? By doing it this way, it's automatically sorted for you?

    How did you figure this out and how much time per day do you spend working with SER options? Thanks a ton for all the helpful information!
  • LeeG Eating your first bourne

    I'm known for being a SER geek, and a lot of the tricks I use have been shared on here.

    On a good day I can do 300,000 submissions with SER. That's been shown on these forums.

    A lot of my tricks are being recommended by others and they are getting good results; some have jumped from 20k submissions a day to over 100k.

    Simple things like minimal search engine choices, streamlining the engines you submit to, and only using engines CB works with.

    I'm constantly trying to find ways to cut down on time and push SER to its limits.


    If you import the way you were, the links will end up in the identified site lists.

    That's something I don't use; I only use the submitted and verified lists.

    By doing it my way, you cut out the middle man.

    If you don't run a T4, use a T3. Import the list once and have those links put where they can be pulled from quicker, and get the results quicker.

  • To force SER to use your imported URLs on that tier, do you not choose any search engines under the Options tab for that tier?
  • yes, also "analyse and post to competitor links" needs to be unchecked.
  • Excellent, thanks for the help @LeeG and @Ozz
  • I guess I would also need to un-check "Use URLS from Global Site List if enabled"?
  • LeeG Eating your first bourne

    In all honesty, I'm a lazy sob.

    I just import the list under normal running :D

  • @scp: yes, the project should stop then once SER has gone through your imported list

    @LeeG: me too :)
  • ron SERLists.com

    @LeeG - That's a great tip. Let a T4 figure out what sites are postable to, and then only use submitted and verified.

    I take back that thing about you running around with your pants on fire :)

  • LeeG Eating your first bourne

    How do some of you lot possess the ability to breathe at times :O

    Come on, scrape a list with the Gscraper free version, and use the free tool by Santos to extract the footprints.

    Leave that running overnight with its 100,000 URL limit and bang, the following day add them to a T4.

    Easily sorted into the two main site list areas.

    At worst you have a load of junk links on a lower tier.

    No doubt, given time, you will find a method to complicate a simple task by adding your quantum physics girly tool methods

  • ron SERLists.com

    Gscraper for the win.

  • scp
    edited February 2013
    #AnotherGreatLeeGTip - Thanks as always, LeeG. I think you might start trending!

    Can someone point me to the tool Santos created for extracting footprints?
  • Search for "GSA footprint extractor" or something like that in Google.
    But it has a bug where it doesn't convert "Umlauts", IIRC.

    What you should do, IMO, is click
    Options -> Advanced -> Tools -> Search Online for URLs -> Add Predefined Footprints -> Add All From XXX

    Do this for each platform you want to scrape for, and copy/paste the footprints into a text file or directly into your scraper.
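    The per-platform copy/paste step above can be tedious if you scrape many platforms. A minimal sketch of merging several exported footprint text files into one deduplicated keyword list for your scraper (the filenames are hypothetical examples, not files SER produces):

    ```python
    # Merge per-platform footprint exports into one deduped file
    # for Gscraper/Scrapebox. Dedupes case-insensitively but keeps
    # the original casing of the first occurrence.
    from pathlib import Path

    def merge_footprints(files, out_path):
        seen = set()
        merged = []
        for f in files:
            for line in Path(f).read_text(encoding="utf-8").splitlines():
                fp = line.strip()
                if fp and fp.lower() not in seen:
                    seen.add(fp.lower())
                    merged.append(fp)
        Path(out_path).write_text("\n".join(merged) + "\n", encoding="utf-8")
        return len(merged)

    # Example (hypothetical filenames):
    # merge_footprints(["articles.txt", "wikis.txt"], "all_footprints.txt")
    ```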
  • I like LeeG's idea to use a lower tier to import it. But as for Gscraper: you can't let it run overnight without proxies; your IP will get banned fast.


  • LeeG Eating your first bourne

    Ozz, don't give them any more methods. They will be desperately confused and trying to over-complicate things by adding g-force calculus and pi x infinity type calculations to work out how many times a sparrow flaps its wings, and then adding that to the final method of getting the best footprints to use.

    Not saying ron does that, but I might be implying it.

    A little trick with Gscraper free to make the most of your 100k result limit is to remove duplicate domains at scraping time.

    I'm testing the free version at the moment to see how good it is.

    It's more basic than Scrapebox. Less confusing for us old wise ones.

    I'm also running a proxy scraper 24/7 now to feed it fresh proxies on each run.

    If anyone asks, Proxy Multiply. 7-day free trial.
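    For anyone curious what "remove duplicate domains" does to a scraped list, here's a naive sketch of the idea (my own illustration, not Gscraper's actual code): keep only the first URL seen per domain, treating www and non-www as the same site. It does not collapse subdomains to registered domains.

    ```python
    # Keep one URL per domain from a scraped list, mirroring the
    # "remove duplicate domains" option in scrapers like Gscraper.
    from urllib.parse import urlparse

    def dedupe_by_domain(urls):
        seen = set()
        kept = []
        for url in urls:
            domain = urlparse(url).netloc.lower()
            # Treat "www.example.com" and "example.com" as one site.
            if domain.startswith("www."):
                domain = domain[4:]
            if domain and domain not in seen:
                seen.add(domain)
                kept.append(url)
        return kept
    ```

    Since each platform target is usually identified per site rather than per page, this is why deduping at scrape time stretches the free version's 100k cap further.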

  • Back to the no-rocket-science approach:

    Why use the tool for the footprints when you can just use "Search Online for URLs" and get your footprints from there?
  • scp
    edited February 2013
    @thisisalex - You can increase your LPM by focusing SER on posting, not scraping.
  • edited February 2013
    How do I add proxies to Gscraper? Does the basic version not allow proxies?

    Oh, btw: trying to get my auto IP changer script working for my Fritzbox (so I can scrape more with Gscraper)...

    Glad I found this SER tool... so I can finally fix my computer games addiction and spend my free time somewhere else.
  • LeeG Eating your first bourne

    Why use your home IP with Gscraper when I just pointed out a proxy scraper with a fully working 7-day trial?

    I'm sure there used to be a discount thread on BHW at one time for it.

    Santos's tool will save the footprints to file, plus it does some other magic which makes the footprints work with Scrapebox.

  • ron SERLists.com
    edited February 2013
    @LeeG, thanks for only implying that I might have a structured approach lol.
  • I get it... you can't use private proxies in Gscraper, only "open" proxies.
  • LeeG Eating your first bourne

    If you have access to a proxy scraper (the one I mentioned above has a full 7-day trial), run it for a few hours and then import the list of scraped proxies.

    I had a look last night and they are doing a 50% off deal on BHW.

    I use the wrong type of proxies, but you might get away with private proxies you can lock to your IP, so no login and password are needed.
