From Score 79 to 94: How I Optimized My Static Blog for Speed

A static site generator like Hexo gives you a head start on speed, but “fast” isn’t automatic.

Yesterday, I ran a Google Lighthouse report on this site and was humbled. Despite the site being static and served from CloudFront (a globally distributed CDN), my performance score was a mediocre 79.

The culprit? Huge, unoptimized images.

I was serving 5MB PNG files directly to the browser. Here is how I used a bit of Python and DevOps to fix it, dropping my page load size from 20MB to under 3MB and boosting my score to 94 (with 0ms Total Blocking Time).

1. The Image Problem

I had several high-definition screenshots and AI-generated cover images that were PNGs. Some were over 5MB.

Minifying HTML and CSS (hexo-all-minifier) helped slightly, but it was like trimming the fingernails of a giant. The weight was all in the visuals.
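
If you want to try it, the plugin is a single switch in _config.yml once the package is installed via npm (it also exposes finer-grained options such as html_minifier and css_minifier):

all_minifier: true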

2. The Solution: Automated WebP Conversion

Instead of manually converting every image, I wrote a Python script to scan my directory, identify the “heavy hitters” (>400KB), and automatically:

  1. Resize them to a max width of 1200px (retina ready, but not 8K).
  2. Convert them to WebP, a modern format that offers superior compression.

Here is the script I used (generalized for you to use):

import os
from PIL import Image

# Configuration
# UPDATE THIS PATH TO YOUR OWN
IMAGE_DIR = "/path/to/your/blog/source/images" 
MAX_WIDTH = 1200
SIZE_THRESHOLD_KB = 400

def optimize_images():
    print(f"Scanning {IMAGE_DIR} for images larger than {SIZE_THRESHOLD_KB}KB...")

    for filename in os.listdir(IMAGE_DIR):
        if not filename.lower().endswith(('.png', '.jpg', '.jpeg')):
            continue

        filepath = os.path.join(IMAGE_DIR, filename)
        file_size_kb = os.path.getsize(filepath) / 1024

        if file_size_kb > SIZE_THRESHOLD_KB:
            print(f"\nOptimization Candidate: {filename} ({file_size_kb:.2f} KB)")
            
            try:
                with Image.open(filepath) as img:
                    # Resize if too wide
                    if img.width > MAX_WIDTH:
                        ratio = MAX_WIDTH / img.width
                        new_height = int(img.height * ratio)
                        img = img.resize((MAX_WIDTH, new_height), Image.Resampling.LANCZOS)

                    # Convert to WebP (Quality 75 is the sweet spot)
                    webp_filename = os.path.splitext(filename)[0] + '.webp'
                    webp_path = os.path.join(IMAGE_DIR, webp_filename)
                    
                    img.save(webp_path, 'WEBP', quality=75)
                    
                    print(f"  -> Converted to {webp_filename}")

            except Exception as e:
                print(f"  -> Error optimizing {filename}: {e}")

if __name__ == "__main__":
    optimize_images()
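
A couple of notes on the script: it writes each .webp alongside the original rather than replacing it, so after running it you still need to point the image references in your posts at the new .webp files (and can then delete or archive the old PNGs). Quality 75 worked well for screenshots; photographic covers may deserve a slightly higher setting.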

3. Enabling Lazy Loading

Even with smaller images, loading 20 images at once slows down the “First Contentful Paint” (FCP). I installed hexo-lazyload-image so that images only load when you scroll them into view.

In _config.yml:

lazyload:
  enable: true
  onlypost: false
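
Roughly speaking, the plugin rewrites each image at build time so the real URL sits in a data attribute behind a tiny placeholder, and a small script swaps it back in once the image scrolls near the viewport. (Modern browsers offer a similar native mechanism via the loading="lazy" attribute, but the plugin works without touching the theme's templates.)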

4. Cache Control Headers

Finally, Lighthouse complained that I wasn’t using browser caching efficiently.

Since I deploy to AWS S3 using GitHub Actions, and the default aws s3 sync command doesn’t set any Cache-Control headers, long-term caching simply wasn’t happening. I updated my workflow to explicitly tell CloudFront and browsers to cache images for 1 year:

- name: Sync files to S3
  run: |
      # Sync assets with long cache headers (1 year)
      aws s3 sync public/images/ s3://your-bucket/images/ --cache-control "max-age=31536000,public"
      aws s3 sync public/css/ s3://your-bucket/css/ --cache-control "max-age=31536000,public"
      
      # Sync everything else
      aws s3 sync public/ s3://your-bucket
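
One caveat worth noting: the HTML files themselves shouldn’t get a one-year cache, because their names don’t change between deploys. A common companion step (sketched below; the distribution ID is a placeholder) is to invalidate the CloudFront distribution after each sync so new posts show up immediately:

- name: Invalidate CloudFront cache
  run: |
      aws cloudfront create-invalidation --distribution-id YOUR_DISTRIBUTION_ID --paths "/*"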

5. The Final Mile: Lazy Loading JavaScript

After fixing images, Lighthouse still complained about “Unused JavaScript” from Twitter and Google Analytics blocking the main thread (approx. 500KB).

I didn’t want to lose my analytics, but I didn’t want to slow down my users’ initial load either.

The Fix: I explicitly disabled the default “eager” tracking in my _config.yml:

google_analytics: false
share: false

And replaced it with a custom lazy loader (scripts/lazy_load.js) that only injects those scripts once the user actually interacts with the page (scrolls or moves the mouse), with a 3.5-second timeout as a fallback:

/* global hexo */
'use strict';

hexo.extend.injector.register('body_end', `
<script>
document.addEventListener('DOMContentLoaded', () => {
    let fired = false;

    const loadScripts = () => {
        if (fired) return;
        fired = true;

        // Lazy Load Google Analytics
        const gaId = 'G-V8E3MHGHZQ';
        const gaScript = document.createElement('script');
        gaScript.src = 'https://www.googletagmanager.com/gtag/js?id=' + gaId;
        gaScript.async = true;
        document.head.appendChild(gaScript);
        
        // ... init GTAG ...
        
        console.log('Lazy loaded 3rd party scripts');
    };

    // Trigger on interaction or delay
    window.addEventListener('scroll', loadScripts, { once: true });
    window.addEventListener('mousemove', loadScripts, { once: true });
    setTimeout(loadScripts, 3500); // 3.5s fallback
});
</script>
`);
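
Two details do the heavy lifting here: the { once: true } option removes each listener after it fires, and the fired flag keeps the setTimeout fallback from injecting the scripts a second time. Nothing third-party executes until the user interacts (or the timeout elapses), so no long task sits on the main thread during the initial load.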

This dropped my Total Blocking Time (TBT) to 0ms.

Conclusion

Optimization isn’t always about complex code changes. Sometimes it’s just about managing your assets intelligently. By automating image compression and configuring proper cache headers, I cut the site’s weight by more than 75% without changing a line of feature code.

Bonus: Hitting 100% on SEO

After pushing these performance fixes, I realized I was missing one key element: Meta Descriptions. My SEO score was lagging because search engines didn’t have a clear summary of my content.

I added a custom description field to every post’s frontmatter and updated my global _config.yml with a site-wide summary. The result?

A perfect 100 score in SEO and Best Practices.
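
For anyone replicating this, the per-post change is a single extra key in each post’s front-matter (the values below are illustrative), plus the site-wide description: key in _config.yml:

---
title: From Score 79 to 94
description: How I cut my static blog's page weight with WebP conversion, lazy loading, and cache headers.
---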

Now, back to building.

