
Another "is that normal" question - about site list

edited February 2013 in Need Help
My site list numbers:
identified: 209424
successful: 32609
verified: 4886
failed: 88163
So, is that normal? How is yours?
Don't be furious; here are some settings I think could be related to the results:
All URLs scraped by SER
Dedicated proxies used everywhere, checked daily
Use CB & DBC against all captchas, no PR filter
Different engines used depending on T1, T2, T3; altogether, only video & document sharing are NOT used.
100K keywords
Use only do-follow URLs for multi-tier > unchecked
Only verify URLs with a PR of x > unchecked
When to verify > auto
try to always place an URL with anchor text in description/comments > checked
Continuously try to post to a site even if it failed before > checked
Always use keywords to find target sites > unchecked
Use URLs from sitelist > yes to all 4
Analyse & post to competitors backlinks > unchecked
Filter URLs: only "avoid posting URLs on same domain twice" & "put URL to places where it is clearly seen as spam" CHECKED.

Tried to do a verified/identified analysis like @LeeG, but that's going to leave me nowhere to post with this site list I've got.
Is it the submitted/identified ratio that needs improving? Or verified/submitted? Or does the only thing that needs to be done is waiting for a larger sample?
Please help. Any other info needed?

Comments

  • Ozz
    edited February 2013
    what kind of captcha solving service are you using?

    apart from that, everything "is normal" because that's how you've set your projects up. You may want to use the "submitted" list only, though, until your sample size has increased over time.

    and make sure your proxies are working as expected.

    PS: there is no magic button to make everything work. All you need is a good understanding of how things work. LeeG was kind enough to share his tips; now it's up to everyone to implement them properly in their own projects.
  • Forgot to mention: I remove duplicate URLs & domains daily, but after reading this today https://forum.gsa-online.de/discussion/2395/removing-duplicates-from-sitelist-feature#Item_17, now only URLs (a small sketch of the URL-vs-domain difference follows this post).

    @Ozz, thanks for the fast reply. I use CB as the 1st solver, DeathByCaptcha as the 2nd.
    Will try pulling URLs only from the submitted list. Anything else I can do? 210k identified is not such a small size imho; if more submissions/verifications could be made out of it (of course I will keep enlarging the identified list), I will be glad.
    Damn, writing this reply on my 3.7 inch mobile is definitely a nightmare.
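    (Side note on the dedup point above, not from the original post: the practical difference between removing duplicate URLs and removing duplicate domains is easy to see in a few lines. Removing duplicate domains keeps only one entry per host, so additional target pages on the same host are thrown away, which can matter for engines where each page is a separate posting opportunity. The URLs below are made up for illustration.)

    from urllib.parse import urlparse

    # Hypothetical site-list entries (made up for illustration).
    urls = [
        "http://example-blog.net/post/1",
        "http://example-blog.net/post/2",        # same domain, different target page
        "http://example-blog.net/post/1",        # exact duplicate URL
        "http://example-wiki.org/wiki/Main_Page",
    ]

    # Remove duplicate URLs: only the exact repeat disappears.
    dedup_by_url = list(dict.fromkeys(urls))

    # Remove duplicate domains: one entry per host, so /post/2 is lost as well.
    seen_hosts, dedup_by_domain = set(), []
    for u in urls:
        host = urlparse(u).netloc
        if host not in seen_hosts:
            seen_hosts.add(host)
            dedup_by_domain.append(u)

    print(len(urls), len(dedup_by_url), len(dedup_by_domain))  # 4 3 2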
  • As LeeG has pointed out, it's not about looking at the totals for submitted and verified. It's about looking at each engine's submitted and verified counts to get a percentage for each one. For example: if a wiki has 10 submissions and 9 verified, well hell, that's 90%. However, if a web 2.0 has 15 submitted and 1 verified, that's less than 8%. So it might be in your best interest to stop submitting to that web 2.0. It's up to you to figure out what an acceptable percentage is per engine, not as a total of ALL submissions and verifications (sketched below).
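    A minimal sketch of that per-engine arithmetic (my own illustration, not a SER feature; engine names and counts are hypothetical, taken from the example above plus one made-up blog-comment engine):

    # Per-engine verified/submitted percentages with a cut-off threshold.
    engine_stats = {
        # engine: (submitted, verified) -- hypothetical numbers
        "Wiki": (10, 9),
        "Web 2.0": (15, 1),
        "Blog Comment": (5000, 400),
    }

    THRESHOLD = 0.10  # e.g. only keep engines with at least 10% verified/submitted

    for engine, (submitted, verified) in sorted(engine_stats.items()):
        ratio = verified / submitted if submitted else 0.0
        verdict = "keep" if ratio >= THRESHOLD else "consider disabling"
        print(f"{engine:15s} {verified:>5}/{submitted:<5} = {ratio:6.1%} -> {verdict}")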
  • @scp Yeah, I already got that, and I see some misunderstanding.
    What I'm saying is: with these TOTAL numbers, 1.9% verified/identified overall means that really only a few individual engines are working well for now (it's simple math). If I break these numbers down per engine like you said, I'd love to ask: why does EVERY single engine have an awful result? After roughly going through the list, I can confirm that fewer than 10 engines have a ratio higher than 10%. If that is the right conclusion, all I need to do is kick out something like 95% of the engines. That is a waste, and I know there must be something wrong.

    I suspect that either:
    1. my verified sample is too small to draw a conclusion from, and nobody can tell from a 210K identified list until it's tripled or 10x bigger, or
    2. I'm doing something wrong with my settings, or something else, and that causes the low submitted/verified rate.
    Either way, I can't get the expected result from the engine performance analysis; the first thing that needs to be done is to get a good/normal sample (a quick sample-size sanity check is sketched at the end of this post).
    I just want to know which one it is; maybe sharing your list numbers could help with that first step.

    @LeeG @ron am I missing something, could you please help? thx
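    (One rough way to judge point 1 above, my own suggestion rather than anything from this thread: put a simple confidence interval around an engine's verified/submitted rate. With only a handful of submissions the interval is so wide that the percentage means very little, so disabling the engine on that basis would be premature. The numbers below are made up for illustration.)

    import math

    def rate_interval(verified: int, submitted: int, z: float = 1.96):
        """Rough 95% bounds for the true verification rate (normal approximation)."""
        if submitted == 0:
            return (0.0, 1.0)
        p = verified / submitted
        margin = z * math.sqrt(p * (1 - p) / submitted)
        return (max(0.0, p - margin), min(1.0, p + margin))

    # 1/15 and 100/1500 are the same 6.7% rate, but only the second is a firm
    # basis for deciding to disable an engine.
    for submitted, verified in [(15, 1), (1500, 100)]:
        low, high = rate_interval(verified, submitted)
        print(f"{verified}/{submitted}: {verified / submitted:.1%} "
              f"(95% CI roughly {low:.1%} to {high:.1%})")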

  • LeeG Eating your first bourne
    edited February 2013

    A lot of poor results can also be caused by the content you use

    Pull anything that is standard in SER

    Sven includes blog comments to help people start off

    How many of those comments are used time and time again by a lot of SER users?

    Again, the standard of the articles you use.

    Look at it from the other side: if you were running a site and had the article posted to it, would you read it, think "WTF, what is this bollocks being posted", then delete it and add the email to a spam database?

    Some engines get high submissions, so submitted vs verified will look poor. But "poor" will still be high numbers. Blog comments are one example of that.

    Directory submissions are another iffy one. These can take months to get verified, if you're lucky.

    Build those on lower tiers and chances are that, by the time a link has been approved by the owner, the link it's aimed at will have been deleted.

    Only use the submitted and verified global site lists. I know Ozz only uses submitted.

  • Thx @LeeG
    Switched to the submitted list only; it's been running for an hour. Will keep it for 11 more, then change to identified & verified and keep that for 12 hrs to compare how the lists go.

    Speaking of content, I was like "WTF, how come that hasn't come to my mind until now?! Seriously?!" :D
    I use the default comments; thinking of replacing them with scraped, manually chosen & spun comments.
    Same for microblogs and social bookmarks: I should check some good ones, then get "creative" with my own.

    In contrast, I remember @ron uses Kontent Machine generated articles in tier 1; does that look like the "WTF, what is this bollocks being posted" stuff? COME ON, it's not fair! How could you have done it with such poor content!
    :-))  Anyway, what's on your mind?
  • LeeG Eating your first bourne

    I use AutoContentWriter for articles

    But I have no idea how it compares to other article generators on the market

    I know it gets me a lot of wiki links

  • What is the best way to focus on getting the wiki links? I don't see an actual platform for it.
  • On the Article platforms (Mediawiki, moinmoin and tikiwiki).
    Make sure you select Filter URLs > "Put URL to places where it is clearly seen as spam".
  • articles -> mediawiki, tikiwiki, moodle

    Also activate the "post to sites where it's clearly seen as SPAM" option in your project settings to post articles to those sites.
  • 1 minute faster  B-)
  • :D
    You forgot moodle and I forgot moinmoin, though.
  • LeeG Eating your first bourne

    But at least one is not listed in CB ;)

    I looked at the question and thought "you're avin a laugh"

    He must be looking with his eyes wide shut :D

  • AlexR Cape Town
    When sorting your platforms to decide which to use based on successful submission rates, what ratio and threshold are you using?
  • Thanks guys, gonna test that to see if I can get more contextual links, even if they are only from wikis. Just been scared to push that "post where it is clearly seen as spam" button up to this point, lol.
  • Mediawiki is lots of nofollow, Tiki is lots of profiles... Moodle is OK.

    Look at the mediafire links:

    https://forum.gsa-online.de/discussion/1630/links-per-minute/p5
  • edited February 2013
    @GlobalGoogler personally, I'm thinking at least 10% verified/identified. Found out something interesting: the daily ratio of that number hits 10%-20% most of the time (from the status bar), but when it comes to the list, it's only 2%. Yeah, now I guess deleting duplicate URLs/domains the wrong way is something I can blame \:D/ Thanks to you and @LeeG for that. Gonna keep the projects running and only delete some duplicate URLs to see if it gets better.
  • Update:
    Chose only the submitted list and ran for 12 hrs, then switched to active (verify only) for 20 mins once the verified numbers seemed stable.
    10980 submitted / 1146 verified (roughly 10.4%)

    Update coming soon.