What quality images should I use on my site or blog?
Helllllllllo GSA pplz! Hope all of you are well!
My question is this: what's most appropriate for the sites or blogs I'm building backlinks to: clearer images that load slower, or blurrier images that load faster?
I know page load speed is important, but aren't clear photos and images important too, as marketing aids?
Blurry photos look cheap, I feel. But if clearer photos destroy the page load time, then what?
Comments
Of course you want faster page loads, but you need to find that sweet spot where load times and quality meet. This is subjective for most people obviously.
However, it is good practise to always ensure that elements you use on a page are properly optimised.
BUT... for SEO... clearer images kill page load speed.
I know different image sizes can be specified. I've done some of that, have way more to go.
Was wondering if anyone has ever chosen which compression level of image to serve based on testing the visitor's connection speed, or something else?
There are various page measurement tools to help you find direction:
https://developers.google.com/speed/pagespeed/insights/
http://www.webpagetest.org/
https://gtmetrix.com/
If you control the page and can add your own CSS, then perhaps use media queries to deliver images based on viewport size. It's pointless delivering an 800px image to a device that can only display 400px.
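For example, something along these lines (a minimal sketch; the .hero class and file names are just placeholders):

```css
/* Small background image by default; .hero and the file names
   are placeholders for illustration. */
.hero {
  background-image: url("hero-400.jpg");
}

/* Swap in the larger version only when the viewport can actually use it. */
@media (min-width: 800px) {
  .hero {
    background-image: url("hero-800.jpg");
  }
}
```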
Trying to base delivery on the end user's connection speed is, in my opinion, a futile task. You'll use just as much time, if not more, in latency, IF you were able to send a query down the user's line to retrieve their connection speed. I say IF, because I don't know of any way to retrieve the user's connection speed without them first allowing you to - think speedtest-type sites.
**WISDOM**
Thanks, I actually need to do just this right now and move on to other matters. I'll keep it in mind for a later date...
Also, srcset can serve, let's say, JPEGs at different sizes AND levels of compression.
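Roughly like this (file names are placeholders; each candidate could be exported at a different JPEG quality, with heavier compression on the bigger files):

```html
<!-- The browser picks the best candidate for the viewport and pixel density. -->
<img src="photo-400.jpg"
     srcset="photo-400.jpg 400w,
             photo-800.jpg 800w,
             photo-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 50vw"
     alt="Example product photo">
```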
@sharonsmith884: thank you for the suggestions. I will have to check them out.
Thought more about the idea I had, tho.
To go faster, could check every 20th row/column. Then, once an edge is found, step back by 10, then 5, then 2, to home in on a fine edge to clip.
Once it's found, back off another 10 columns/rows, so the image data doesn't sit RIGHT on the clipping edge.
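A rough TypeScript sketch of that coarse-to-fine scan; isClear() is assumed to be whatever "this row/column is all background" test gets worked out below:

```ts
// Coarse-to-fine edge scan: jump 20 rows/columns at a time while the
// scan line is still clear, then refine with smaller steps (10, 5, 2).
// Note: sparse sampling can jump over very thin content; that's the
// built-in trade-off of this approach.
function findEdge(count: number, isClear: (i: number) => boolean): number {
  let pos = 0;
  for (const step of [20, 10, 5, 2]) {
    while (pos + step < count && isClear(pos + step)) pos += step;
  }
  // pos is now (within ~2 lines of) the last clear row/column before
  // the content; back off 10 so the clip doesn't land RIGHT on the edge.
  return Math.max(0, pos - 10);
}
```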
How to find if a row/column is clear in the first place?
Color table (& image data?).
OK, this is supposed to be on-the-fly, right? So make this process faster. Same as above: simply skip every xth color value within a row/column and keep a tally of the most-found colors. You can even set a tolerance for colors within a range, based on a sample of five or ten colors.
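A sketch of that test, assuming raw RGBA pixel data laid out like a canvas ImageData array; the tolerance and sampling step are arbitrary placeholder values:

```ts
// A row is "clear" if every sampled pixel stays within `tolerance`
// of the background color on all three channels. `data` holds RGBA
// bytes, 4 per pixel, row-major.
function rowIsClear(
  data: Uint8ClampedArray,
  width: number,
  row: number,
  bg: [number, number, number],
  tolerance = 16, // per-channel slack; placeholder value
  step = 8        // check every 8th pixel to keep it fast
): boolean {
  for (let x = 0; x < width; x += step) {
    const i = (row * width + x) * 4;
    if (
      Math.abs(data[i] - bg[0]) > tolerance ||
      Math.abs(data[i + 1] - bg[1]) > tolerance ||
      Math.abs(data[i + 2] - bg[2]) > tolerance
    ) {
      return false; // a sampled pixel is too far from the background
    }
  }
  return true;
}
```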
With a totally clear background, you could use only the single most-found color, #000000 or #FFFFFF, etc., because images sometimes do use pure colors for backgrounds. But more often, they don't.
Backgrounds are USUALLY repeating textures. But if you also have to analyse image data, this is getting sloooooooooooooow. lol
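The most-found-color idea might look like this; it assumes a reasonably flat background, since a repeating texture would spread its votes over many bins:

```ts
// Estimate the dominant background color by sparse sampling: bucket
// sampled pixels into coarse color bins and take the most-found bin.
function dominantColor(
  data: Uint8ClampedArray,
  pixelCount: number,
  step = 97 // sample sparsely; a prime step avoids aligning with patterns
): [number, number, number] {
  const counts = new Map<number, number>();
  for (let p = 0; p < pixelCount; p += step) {
    const i = p * 4;
    // quantize each channel to 16 levels so near-identical colors share a bin
    const key =
      ((data[i] >> 4) << 8) | ((data[i + 1] >> 4) << 4) | (data[i + 2] >> 4);
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  let bestKey = 0, bestCount = -1;
  for (const [key, count] of counts) {
    if (count > bestCount) { bestCount = count; bestKey = key; }
  }
  // expand the winning bin back to a representative RGB (bin centers)
  return [
    ((bestKey >> 8) << 4) + 8,
    (((bestKey >> 4) & 0xf) << 4) + 8,
    ((bestKey & 0xf) << 4) + 8,
  ];
}
```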
The user could indicate whether to clip horizontally or vertically by putting a _V/_R suffix in the filename itself.
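Picking that up from the filename could be as simple as this; the _V/_R convention is only the one suggested here, and mapping _R to horizontal is a guess:

```ts
// Read the clip direction from a filename suffix, per the suggested
// _V/_R convention (nothing standard; _R assumed to mean row-wise).
function clipDirection(filename: string): "vertical" | "horizontal" | null {
  const stem = filename.replace(/\.[^.]+$/, ""); // drop the extension
  if (stem.endsWith("_V")) return "vertical";
  if (stem.endsWith("_R")) return "horizontal";
  return null; // no suffix: fall back to automatic detection
}
```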
A first analysis of the image, sampling every 100th pixel (or a step calculated from the total length/width, divided into set percentages), can yield a rough horizontal and vertical plot of the image; the finer process can then be applied only to specific areas, saving calculations. It would also make H/V detection automatic.
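That first pass might look something like this; the step of 100 and the isClearAt() pixel test are assumptions carried over from above:

```ts
// First pass: sample the image on a coarse grid to get a rough map of
// which horizontal and vertical bands look clear, then run the fine
// edge scan only where it's needed. Covers both H and V in one go.
function roughBands(
  isClearAt: (x: number, y: number) => boolean,
  width: number,
  height: number,
  step = 100 // or derive it from the dimensions, e.g. width / 20
): { rows: boolean[]; cols: boolean[] } {
  const rows: boolean[] = [];
  for (let y = 0; y < height; y += step) {
    let clear = true;
    for (let x = 0; x < width; x += step) {
      if (!isClearAt(x, y)) { clear = false; break; }
    }
    rows.push(clear);
  }
  const cols: boolean[] = [];
  for (let x = 0; x < width; x += step) {
    let clear = true;
    for (let y = 0; y < height; y += step) {
      if (!isClearAt(x, y)) { clear = false; break; }
    }
    cols.push(clear);
  }
  return { rows, cols };
}
```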