To improve my site's performance, I'm trying to lazy load images only when they enter the viewport, while also making sure Googlebot indexes them.
I've implemented lazy loading with the following steps:
- Created a component that uses the Intersection Observer API to lazy load images as they enter the viewport (a simplified sketch of the setup follows this list).
- Added an Intersection Observer polyfill to support lazy loading in all browsers.
- Used a <noscript> tag that loads the images in a non-lazy way, so Googlebot can index them in browsers/crawlers that don't support JavaScript.
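For reference, here's a simplified sketch of what the component does (illustrative only; the data-src attribute, the lazy class, the image path, and the rootMargin value are placeholders, not my exact code):

```html
<!-- The real image URL lives in data-src; the <noscript> copy is the
     non-lazy fallback for crawlers/users without JavaScript -->
<img class="lazy" data-src="/images/hero.jpg" alt="Hero image">
<noscript>
  <img src="/images/hero.jpg" alt="Hero image">
</noscript>

<script>
  // Swap data-src into src once an image enters (or nears) the viewport.
  // The polyfill covers browsers without native IntersectionObserver support.
  const lazyImages = document.querySelectorAll('img.lazy');

  const observer = new IntersectionObserver((entries, obs) => {
    entries.forEach(entry => {
      if (entry.isIntersecting) {
        const img = entry.target;
        img.src = img.dataset.src;   // start loading the real image
        img.classList.remove('lazy');
        obs.unobserve(img);          // stop watching once triggered
      }
    });
  }, { rootMargin: '200px' });       // begin loading a bit before the image is visible

  lazyImages.forEach(img => observer.observe(img));
</script>
```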
Now I want to test the lazy-loaded images to make sure SEO is not impacted. As Google recommends, I ran Google's Puppeteer verification script to check whether the lazy-loaded images would be properly picked up by Googlebot. But when I run the script, it fails with this output:

    Lazy images loaded correctly: Failed
    Found 160557 pixels differences.
    Dimension image A: 300x719
    Dimension image B: 300x719
I'm doing this for the first time and I can't figure out why the script is failing or how to fix it.
Here are the things I am stuck on in particular:
- Do I still need to worry about SEO indexing even after using the Intersection Observer API with the polyfill?
- Do I really need to load images inside a <noscript> tag for Googlebot to index them in browsers that don't support JavaScript?
- I'm lazy loading the images but I don't have a placeholder to display while an image is loading. Could that be the reason the Puppeteer script fails?
- Is there anything else I'm missing that could cause the script to fail?
Edit
I also noticed that the Puppeteer script waits a fixed amount of time (say 2000 ms) and captures two screenshots: one without scrolling the page and one after scrolling it. If that's how it works, how can I make sure my images load within that window? A simplified sketch of the pattern as I understand it is below.
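To illustrate what I mean, this is roughly the scroll-and-screenshot pattern I believe the script follows (my own simplified Puppeteer sketch, not Google's actual script; the URL and the 2000 ms delay are placeholders):

```js
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com/', { waitUntil: 'networkidle0' });

  // Screenshot A: the page as loaded, without any scrolling.
  const imageA = await page.screenshot({ fullPage: true });

  // Scroll to the bottom to trigger the lazy-loaded images...
  await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));
  // ...then wait a fixed amount of time for them to finish loading.
  await new Promise(resolve => setTimeout(resolve, 2000));

  // Screenshot B: the page after scrolling. Google's script then diffs
  // the two screenshots pixel by pixel, which is where my test fails.
  const imageB = await page.screenshot({ fullPage: true });

  console.log('Captured', imageA.length, 'and', imageB.length, 'bytes');
  await browser.close();
})();
```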