Streaming HTML: Client-Side Rendering Made Easy with Any Framework

You’ve built a beautiful frontend. It works. It’s snappy on localhost. But when you ship it? Users wait. They click. Nothing happens. And by the time your app wakes up, they’re already gone.

Today, we’re breaking down what actually causes slow frontend load times—and how you can fix it with one powerful technique: streaming HTML from the server. We’ll walk through a real example using Node.js and plain JavaScript, explain the problem with Client-Side Rendering, and show you how to get serious performance gains with just a few lines of code.

Why Frontend Load Times Still Suck

Here’s the deal. Most modern apps use JavaScript-heavy frameworks like React, Vue, or Angular. These are awesome for building complex UIs—but there’s a catch: by default, they use Client-Side Rendering (CSR).

That means:

  • When someone visits your site, they don’t get a full HTML page.
  • They get an empty HTML shell (as sketched below) and a big fat JavaScript bundle.
  • Only after the JS is downloaded and executed does your app make its API calls, build the UI, and finally show the user something.
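
In practice, that first response is little more than this (file and element names are illustrative):

<!DOCTYPE html>
<html>
  <head>
    <title>My App</title>
  </head>
  <body>
    <!-- Nothing to show yet: just an empty mount point -->
    <div id="root"></div>
    <!-- ...and the big fat bundle the browser now has to fetch and run -->
    <script src="/bundle.js"></script>
  </body>
</html>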

Now imagine your user’s on a slow connection. Or using a low-end device. Or your API server is just having a bad day.

That whole process? Takes forever. Feels clunky. They bounce.

And here’s the kicker: it’s not their fault. It’s not their phone. It’s not even your React code. It’s your architecture.

Let me show you exactly what’s going wrong.

The Problem: Death by Waterfall

Let’s take a common example—a simple admin dashboard.

When someone visits the page, this is what happens behind the scenes:

  • The browser requests the HTML from the server.
  • The server sends a minimal HTML file—just a shell, really.
  • The browser sees script tags and starts downloading the JS bundle.
  • After the JS is downloaded, it starts running.
  • Then, the JS makes an API call to get data—maybe feature flags or user info.
  • Only after that data is fetched does JS kick in and render the page.
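
In code, the last stretch of that waterfall usually looks something like this (a simplified sketch; the endpoint and markup are placeholders):

// Runs only after the bundle has been downloaded and parsed
async function init() {
  // Step 5: the API call starts only now
  const response = await fetch('/api/dashboard-config'); // hypothetical endpoint
  const config = await response.json();

  // Step 6: only after the data arrives does the user see anything
  document.getElementById('root').innerHTML =
    `<h1>Dashboard</h1><pre>${JSON.stringify(config, null, 2)}</pre>`;
}

init();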


Every single one of these steps depends on the one before it. It’s a waterfall of delays.

One slow step = a slow experience.

Now let’s say your JavaScript bundle is big. Maybe it’s 2MB. That takes a few seconds on a good connection. Now add an API call that takes another second or two. That’s three to five seconds before your user sees anything useful.

And in web performance terms? That’s death.


Enter: Server-Side Rendering (SSR)

One obvious solution is Server-Side Rendering.

SSR flips the script. Instead of sending an empty HTML shell, your server actually renders the full HTML content on the server and sends it down. That means the user sees something immediately, even before the JS bundle arrives.
That’s a huge win for perceived performance.
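
A bare-bones, framework-free version of the idea looks like this in Express (a sketch only, assuming Node 18+ for the built-in fetch). Notice that nothing is sent until all the data is ready:

const express = require('express');
const app = express();

app.get('/ssr', async (req, res) => {
  // Wait for ALL the data first...
  const users = await fetch('https://jsonplaceholder.typicode.com/users')
    .then((r) => r.json());

  // ...then send one complete, fully rendered HTML document.
  res.send(`<!DOCTYPE html>
    <html>
      <body>
        <h1>Employees</h1>
        <ul>${users.map((u) => `<li>${u.name}</li>`).join('')}</ul>
      </body>
    </html>`);
});

app.listen(3000);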

Note: Turning a big CSR app into an SSR setup isn’t a quick flip of a switch. It’s a serious refactor. You’ll have to rethink how you handle state, manage server-side logic, and deal with tricky hydration bugs. If you’re starting fresh, go for it. But if you’re working with an existing app? Be ready—it’s a heavy lift.

The Solution: Streaming HTML (CSR + STREAM)

With HTML streaming, we don’t wait for everything to be ready before sending a response. Instead, we start streaming chunks of HTML to the browser as soon as we have them.

Let’s break this down with an example:

  • The user visits your site.
  • Your server immediately sends the basic HTML skeleton, with headers, styles, maybe even a loading spinner.
  • While the browser’s busy parsing that first chunk of HTML, the server quietly makes a non-critical API call in the background, then slips that data right into the rest of the HTML as it streams it down.
  • As soon as that data comes in, the server sends more chunks of HTML with the actual content.
  • Meanwhile, the browser is already downloading the JavaScript bundle in parallel.
  • By the time your JS is ready, it can read the data already in the HTML and render the page.


You’re parallelising what used to be sequential. It’s like cooking dinner while doing laundry instead of one after the other. Time saved. Experience improved.


Let’s look at some actual code.

A Real-World Example Using Express

Here’s a real Node.js Express server that streams HTML.

const express = require('express');
const fs = require('fs');
const app = express();
const PORT = process.env.PORT || 3000;

// Serve static files from the public directory
app.use(express.static('public'));

const getEmployee = async () => {
  const response = await fetch('https://jsonplaceholder.typicode.com/users');
  const data = await response.json();
  return data;
}
// Split the page template at a placeholder marker.
// (Assumes public/index.html contains an HTML comment like <!--SERVER_DATA-->
// marking where the server-fetched data should be injected.)
const [START_HTML, END_HTML] = fs
  .readFileSync('./public/index.html', 'utf8')
  .split('<!--SERVER_DATA-->');

// Route for the streamed page
app.get('/server', async (req, res) => {
  // Start the API call, but don't await it yet
  const api = getEmployee();
  res.setHeader('Content-Type', 'text/html; charset=utf-8');
  // Send the top of the page immediately
  res.write(START_HTML);
  try {
    const employees = await api;
    // Inject the data as an inline script, then finish the page
    res.write(
      `<script>serverEmployees = ${JSON.stringify(employees)}; console.log('SERVER =>', serverEmployees);</script>${END_HTML}`
    );
  } catch (error) {
    console.error('Error fetching employees:', error);
    res.write(END_HTML);
  }
  res.end();
});

// Start the server
app.listen(PORT, () => {
  console.log(`Server running on http://localhost:${PORT}`);
}); 
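
For the split above to work, public/index.html needs a placeholder where the streamed data gets injected. The file itself isn’t shown in the post, but it would look roughly like this (the <!--SERVER_DATA--> marker and file names are assumptions that match the sketch above):

<!DOCTYPE html>
<html>
  <head>
    <title>Streaming HTML Demo</title>
    <!-- "defer" lets the bundle download in parallel but run only after parsing,
         by which point the inline data script streamed by the server has executed -->
    <script src="/app.js" defer></script>
  </head>
  <body>
    <div id="root">Loading…</div>
    <!-- The server splits the file here and writes the data between the two halves -->
    <!--SERVER_DATA-->
  </body>
</html>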

Let’s break this down.

What’s Happening

  • The HTML skeleton is sent immediately to the browser.
  • JavaScript files start downloading right away.
  • Meanwhile, the server calls an API for employee data.
  • It streams the employee data into the rest of the HTML as soon as the API responds, as an inline script the client code can read.
  • Once everything’s rendered, it closes the response.

What This Means

  • The browser doesn’t wait for the entire backend process.
  • It starts rendering right away.
  • JS and API fetching happen in parallel.
  • No unnecessary client-side API call to get the same data again.
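
On the client, the bundle only has to check for that injected global before falling back to its usual fetch. A minimal sketch (the global name matches the server code above; the element id and fallback call are illustrative):

// Client-side entry point (e.g. public/app.js)
async function getEmployees() {
  // Data already streamed into the page by the server? Use it.
  if (window.serverEmployees) {
    return window.serverEmployees;
  }
  // Otherwise fall back to the normal client-side API call (the CSR route "/")
  const response = await fetch('https://jsonplaceholder.typicode.com/users');
  return response.json();
}

getEmployees().then((employees) => {
  document.getElementById('root').innerHTML =
    `<ul>${employees.map((e) => `<li>${e.name}</li>`).join('')}</ul>`;
});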

The Result?

Faster rendering. Happier users. Lower bounce rates.

All without switching frameworks or adding bloat.

It's Not a Silver Bullet

  • Not every API call can be made on the server before rendering.
  • Some APIs need input from the client (user interaction, browser state, and so on).

When Should You Use This?

Streaming HTML is perfect when:

  • You have static or semi-static data on initial load (like feature flags or config).
  • You want better perceived performance without a full SSR setup.
  • You’re not ready (or allowed) to migrate to a meta-framework like Next.js or Nuxt.

And if you are using a meta-framework, great—they already use streaming under the hood. But if you're not, this is your lightweight way to get similar gains.

Bonus Tips

  • Always set proper cache headers for static resources.
  • Compress responses with Gzip or Brotli.
  • Minify your JS and CSS.
  • Use preload tags for critical assets.
  • Track and test your Time To First Byte (TTFB) and Largest Contentful Paint (LCP) in tools like Lighthouse or WebPageTest.
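
A couple of those tips translate directly into Express middleware. A rough sketch (the compression package and cache lifetime are choices, not requirements):

const express = require('express');
const compression = require('compression'); // npm install compression

const app = express();

// Compress responses; when streaming, call res.flush() after each res.write()
// so compressed chunks aren't buffered until the response ends.
app.use(compression());

// Long-lived cache headers for fingerprinted static assets
app.use(express.static('public', { maxAge: '30d', immutable: true }));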

Performance isn’t just a dev metric—it’s a business metric.

Wrap-Up

So here’s what we learned:

  • Client-side rendering can hurt your initial load times.
  • Server-Side Rendering helps, but can be heavy.
  • Streaming HTML gives you the best of both worlds.
  • With one simple pattern and some basic Express code, you can start sending content faster and make your users happier.

Demo Link: here

Original Post: here

CSR route: “/”
HTML stream route: “/server”