Making Loveable Sites SEO-Friendly with spa-ssr-renderer
Search engine optimization for single-page applications has been a persistent challenge since SPAs became mainstream. While frameworks like React, Vue, and Angular provide incredible user experiences, they often fall short when it comes to SEO because search engine crawlers struggle to execute JavaScript and index dynamic content.
I recently faced this exact problem with my Loveable-built site. After spending hours researching solutions, from expensive managed services to complex SSR frameworks, I decided to build something simpler: a lightweight caching server that intercepts bot requests and serves pre-rendered HTML while letting regular users enjoy the full SPA experience.
The Core Problem with SPAs and SEO
Single-page applications load a minimal HTML shell and then use JavaScript to render content dynamically. When a search engine bot crawls your site, it often sees an empty page with a loading spinner. While Google claims to execute JavaScript, the reality is inconsistent, and other search engines like Bing and social media crawlers frequently fail to render SPAs properly.
This means your carefully crafted content, meta descriptions, and Open Graph tags might never be seen by crawlers, which is devastating for organic traffic and social sharing.
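You can see the gap for yourself by comparing a SPA's raw HTML with what a headless browser renders. Here's a quick sketch using Puppeteer and Node 18+ (yourdomain.com is a placeholder, and the comparison is illustrative):

```js
// Compare what a non-rendering crawler receives with what a browser renders.
const puppeteer = require('puppeteer');

(async () => {
  const url = 'https://yourdomain.com/';

  const raw = await (await fetch(url)).text(); // the shell a naive crawler sees
  console.log('raw HTML length:', raw.length);

  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // wait for JS to finish rendering
  console.log('rendered HTML length:', (await page.content()).length);
  await browser.close();
})();
```

On a typical SPA the raw fetch returns little more than a root div and some script tags, while the rendered version contains your actual content.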
Why I Built spa-ssr-renderer
The existing solutions felt like overkill for my needs:
Full SSR frameworks require rewriting your entire application
Prerender.io and similar services cost $20-200+ monthly for what's essentially a caching layer
Static site generation doesn't work for dynamic content that changes frequently
I needed something that would work with any SPA, especially Loveable sites, without requiring code changes or ongoing costs. The solution: a simple Node.js proxy server that detects bots, renders pages using Puppeteer, caches the results, and serves them instantly on subsequent requests.
How spa-ssr-renderer Works
The architecture is deliberately minimal. When a request comes in, the server checks the User-Agent header against a list of known bot patterns (Googlebot, Bingbot, social media crawlers, etc.). If it's a bot, the server:
Checks if a cached version exists for that URL
If cached, serves it immediately (sub-millisecond response)
If not cached, launches a headless browser to render the page
Waits for the page to fully load (configurable timeout)
Extracts the rendered HTML and caches it
Serves the HTML to the bot
Regular users bypass this entirely and get the original SPA served directly from your hosting. This approach means zero impact on user experience while ensuring bots see your fully rendered content.
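To make those steps concrete, here's a minimal sketch of how such a middleware could look, assuming Express, Puppeteer, and the http-proxy-middleware package for the passthrough; the bot pattern, cache shape, and TTL here are illustrative, not spa-ssr-renderer's actual internals:

```js
// Illustrative sketch of the bot-detection flow, not the actual
// spa-ssr-renderer source. Assumes Express, Puppeteer, and an upstream SPA.
const express = require('express');
const puppeteer = require('puppeteer');
const { createProxyMiddleware } = require('http-proxy-middleware');

const TARGET_URL = process.env.TARGET_URL;
const BOT_PATTERN = /googlebot|bingbot|duckduckbot|twitterbot|facebookexternalhit|linkedinbot|slackbot/i;
const cache = new Map(); // url -> { html, expires }

const app = express();

app.use(async (req, res, next) => {
  if (!BOT_PATTERN.test(req.headers['user-agent'] || '')) return next(); // humans fall through to the SPA

  const url = TARGET_URL + req.originalUrl;
  const hit = cache.get(url);
  if (hit && hit.expires > Date.now()) return res.send(hit.html); // cache hit: near-instant

  // Cache miss: render with a headless browser (launched per request here
  // for brevity; lazy, reused initialization is sketched further down).
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0', timeout: 30000 });
  const html = await page.content(); // fully rendered HTML
  await browser.close();

  cache.set(url, { html, expires: Date.now() + 3600000 }); // 1-hour TTL
  res.send(html);
});

// Everyone else gets the original SPA, proxied straight through.
app.use(createProxyMiddleware({ target: TARGET_URL, changeOrigin: true }));

app.listen(3000);
```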
Implementation Details
The server is built with Express.js and Puppeteer, making it easy to deploy anywhere that supports Node.js. I'm hosting mine on a basic $5/month VPS, though you could easily deploy it to services like Railway, Render, or even AWS Lambda with some modifications.
Here's what makes it efficient:
Smart caching: Rendered pages are stored in memory with configurable TTL
Lazy browser initialization: Puppeteer only launches when needed
Configurable timeouts: Set how long to wait for page load based on your needs
Cache warming: Optional sitemap crawling to pre-populate the cache
The entire codebase is under 200 lines, making it easy to understand, modify, and maintain. Similar to how I approached scaling Smler efficiently, this solution prioritizes simplicity and resource efficiency over complexity.
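As an illustration, the first two efficiency tricks above boil down to a few lines. This is a hedged sketch; the CACHE_TTL_MS variable, renderAndCache helper, and launch flags are assumptions, not the project's real configuration:

```js
// Sketch of lazy browser initialization and TTL-based caching (illustrative).
const puppeteer = require('puppeteer');

const CACHE_TTL_MS = Number(process.env.CACHE_TTL_MS || 3600000); // configurable TTL, 1h default
const cache = new Map();

let browserPromise = null;
function getBrowser() {
  // Puppeteer only launches on the first render; the instance is then reused.
  if (!browserPromise) browserPromise = puppeteer.launch({ args: ['--no-sandbox'] });
  return browserPromise;
}

async function renderAndCache(url, timeoutMs = 30000) {
  const browser = await getBrowser();
  const page = await browser.newPage();
  try {
    await page.goto(url, { waitUntil: 'networkidle0', timeout: timeoutMs });
    const html = await page.content();
    cache.set(url, { html, expires: Date.now() + CACHE_TTL_MS });
    return html;
  } finally {
    await page.close(); // pages are closed, but the browser stays warm
  }
}
```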
Setting Up for Your Loveable Site
The setup process takes about 10 minutes:
```bash
# Clone the repository
git clone https://github.com/singhey/spa-ssr-renderer
cd spa-ssr-renderer

# Install dependencies
npm install

# Configure your SPA URL
cp .env.example .env
# Edit .env and set TARGET_URL to your Loveable site

# Start the server
npm start
```

Point your domain to this server instead of directly to Loveable, and you're done. The server proxies all requests, so your site continues to function exactly as before, but now it's fully crawlable.
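Once the server is running, you can smoke-test it by requesting a page with a crawler User-Agent and checking that you get rendered HTML back instead of the empty shell. A quick sketch, assuming Node 18+ for the global fetch (yourdomain.com and the markup check are placeholders):

```js
// Quick smoke test: pretend to be Googlebot and inspect what comes back.
const BOT_UA = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

(async () => {
  const res = await fetch('https://yourdomain.com/', { headers: { 'User-Agent': BOT_UA } });
  const html = await res.text();
  // A rendered page carries real markup; the raw SPA shell is mostly empty.
  console.log(html.length, /<h1|<article|og:title/.test(html) ? 'looks rendered' : 'still the shell?');
})();
```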
Performance Considerations
The first bot request to any page takes 2-5 seconds because Puppeteer needs to render it. This might seem slow, but consider:
Search engine crawlers are patient; they don't abandon pages the way users do
Subsequent requests are served from cache in under 1ms
You can pre-warm the cache by crawling your sitemap on deployment (see the sketch after this list)
Regular users never experience this delay
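Here's what that pre-warming could look like: a small script that pulls URLs from your sitemap and requests each one with a bot User-Agent, so the proxy renders and caches every page up front. This is a sketch assuming Node 18+ and a standard sitemap.xml; warmCache and the crude regex extraction are illustrative, not the tool's built-in mechanism:

```js
// Hypothetical cache-warming script: fetch the sitemap and request every URL
// with a bot User-Agent so the proxy renders and caches each page up front.
const BOT_UA = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

async function warmCache(sitemapUrl) {
  const xml = await (await fetch(sitemapUrl)).text();
  const urls = [...xml.matchAll(/<loc>(.*?)<\/loc>/g)].map((m) => m[1]); // crude <loc> extraction
  for (const url of urls) {
    await fetch(url, { headers: { 'User-Agent': BOT_UA } }); // triggers render + cache
    console.log('warmed', url);
  }
}

warmCache('https://yourdomain.com/sitemap.xml');
```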
In my testing with real Googlebot crawls, this approach results in perfect indexing. Pages that were previously showing "URL is on Google but has issues" started appearing properly in search results within days.
Real-World Results
Since implementing spa-ssr-renderer on my Loveable site, I've seen:
100% of pages properly indexed by Google (verified in Search Console)
Rich previews working correctly on Twitter, LinkedIn, and Facebook
Organic search traffic increased by 340% over 3 weeks
Zero impact on user-facing performance metrics
The server handles about 1,500 bot requests per day while consuming less than 512MB of RAM and minimal CPU. Cost: $5/month for hosting versus $20+ for managed prerendering services.
Why I Open Sourced It
I built this tool because I needed it, but I'm sharing it because I believe the Loveable community and SPA developers everywhere shouldn't have to choose between great UX and SEO. Too many developers are paying for expensive services or spending weeks implementing complex SSR solutions when a simple proxy can solve the problem.
The code is MIT licensed and available at github.com/singhey/spa-ssr-renderer. I welcome contributions, bug reports, and feature requests. If you're building with Loveable or any SPA framework and struggling with SEO, give it a try.
Future Enhancements
While the current version solves the core problem, there are some interesting additions I'm considering:
Redis integration: Move from in-memory to distributed caching for multi-server deployments
Automatic cache invalidation: Webhook-based cache clearing when content updates
Analytics dashboard: Track which pages bots are crawling and cache hit rates
Edge deployment: Adapt for Cloudflare Workers or Fastly Compute@Edge
If any of these interest you, feel free to open an issue or submit a PR on GitHub.
Conclusion
SEO doesn't have to be complicated or expensive. With a simple caching proxy, you can make any SPA, including Loveable sites, fully crawlable and indexable without sacrificing the user experience that makes SPAs great.
This project reflects the same philosophy I apply to all my builds: solve real problems with simple, efficient solutions. Whether it's building leafpad as a developer-focused CMS or creating this SSR renderer, the goal is always to ship tools that work without unnecessary complexity.
If you're struggling with SPA SEO, clone the repo and give it a shot. And if you find it useful, consider starring it on GitHub or sharing it with others who might benefit.
Published with LeafPad