Next.js Middleware – Full Node.js Access


TL;DR: Next.js middleware runs in a limited runtime that exposes only a small subset of Node.js APIs. If you need the full Node.js API, you must use a workaround or implement a custom server for your Next.js app.

Problem

Let's say you want to implement rate limiting for your Next.js app, store data in Redis, and also use the same connection pool for your API route. It seems like a perfect case for using Next.js middleware, right?

// middleware.ts
import { NextResponse, type NextRequest } from "next/server";
import Redis from "ioredis";

export const redis = new Redis("redis://localhost:6379");

const rateLimitCheck = (ip: string): boolean => {
  // some logic to check the rate limit
  const valid = true; // placeholder: replace with real rate-limit logic
  return valid;
};

export async function middleware(req: NextRequest) {
  const res = NextResponse.next();
  const ip = req.headers.get("x-forwarded-for") ?? "unknown";
  if (rateLimitCheck(ip)) {
    return res;
  }
  return new Response("Too many requests", { status: 429 });
}

This code looks totally fine, but it won't work. When you run it, you will see an error explaining that middleware cannot handle the Redis import because some APIs are missing in the middleware runtime. ioredis opens raw TCP connections through Node.js modules like net and tls, which Vercel's Edge runtime doesn't support, so there is no way to use ioredis in Next.js middleware. That was very frustrating to realize.

The community has been complaining about this for a long time; one of the most upvoted and reacted-to issues is here. The Next.js team knows about it but has buried it deep in the backlog, offering only promises like "we're working on it" and "it's on our roadmap." But trust me, it's been years and nothing has changed.

So, we must save ourselves and find a workaround.

Workaround

Implement your logic in an API route, and then call that API route from middleware. It may seem like a crazy, anti-pattern solution, but it works.

// pages/api/rate-limit.ts
import type { NextApiRequest, NextApiResponse } from "next";
import Redis from "ioredis";

export const redis = new Redis("redis://localhost:6379");

const rateLimitCheck = (ip: string): boolean => {
  // some logic to check the rate limit
  const valid = true; // placeholder: replace with real rate-limit logic
  return valid;
};

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  // Get the IP from the request query params
  const ip = String(req.query.ip ?? "unknown");
  if (rateLimitCheck(ip)) {
    return res.status(200).json({ success: true });
  }
  return res.status(429).json({ success: false });
}
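The rateLimitCheck placeholder above can be filled in with a simple fixed-window counter. Below is a runnable sketch of that logic; an in-memory Map stands in for Redis so the example runs standalone, but in the real API route you would back it with redis.incr and redis.expire instead (the limit and window size here are arbitrary):

```typescript
// Fixed-window rate limiter sketch. The in-memory Map stands in for Redis;
// swap it for redis.incr/redis.expire in the actual API route.
const LIMIT = 3;          // max requests per window (arbitrary for the demo)
const WINDOW_MS = 60_000; // 60-second window

type Window = { count: number; resetAt: number };
const windows = new Map<string, Window>();

function rateLimitCheck(ip: string, now: number = Date.now()): boolean {
  const current = windows.get(ip);
  if (!current || now >= current.resetAt) {
    // first request in a fresh window: reset the counter and the expiry clock
    windows.set(ip, { count: 1, resetAt: now + WINDOW_MS });
    return true;
  }
  current.count += 1;
  return current.count <= LIMIT;
}
```

With Redis, the equivalent is incrementing a per-IP key and setting its TTL on the first hit of each window, so multiple server instances share the same counters.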
 
// middleware.ts
import { NextResponse, type NextRequest } from "next/server";

export async function middleware(req: NextRequest) {
  const res = NextResponse.next();
  const ip = req.headers.get("x-forwarded-for") ?? "unknown";
  const response = await fetch(`${process.env.YOUR_DOMAIN}/api/rate-limit?ip=${ip}`);
  const data = await response.json();
  if (data.success) {
    return res;
  }
  return new Response("Too many requests", { status: 429 });
}
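One caveat: as written, the middleware also runs for requests to /api/rate-limit itself, and since the middleware fetches that very endpoint, each check can recursively trigger another middleware run. A matcher in the middleware config keeps it away from the rate-limit endpoint and static assets (a sketch; adjust the pattern to your own routes):

```typescript
// middleware.ts (same file, alongside the middleware function)
export const config = {
  // run on everything except the rate-limit endpoint itself,
  // Next.js internals, and static assets
  matcher: ["/((?!api/rate-limit|_next/static|_next/image|favicon.ico).*)"],
};
```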

Long-term solution

The best solution is to run your Next.js app on a full Node.js server so you can use the entire Node.js API. You can use a custom server with Next.js, but it requires some additional setup.

⚠️ Custom servers have some undocumented issues. Vercel does not prioritize supporting custom servers as much as the community would like, so be prepared for some headaches in production when converting from next start to node server.ts.

Below is an example of a custom server for Next.js with Express:

// server.ts
import express from 'express';
import next from 'next';
import Redis from 'ioredis';

const app = next({ dev: process.env.NODE_ENV !== 'production' });
const handle = app.getRequestHandler();
// Works normally, without the weird edge runtime limitations
const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');

app.prepare().then(() => {
  const server = express();
  server.all('*', (req, res) => {
    if (req.path.startsWith('/api/')) {
      // add your rate limit logic here
      // redis...
    }
    return handle(req, res);
  });
  server.listen(3000, () => {
    console.log('> Ready on http://localhost:3000');
  });
});
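To actually run the custom server instead of next start, point your package.json scripts at it. A sketch, assuming you execute the TypeScript file with tsx (any TypeScript runner, or compiling to JS first, works just as well):

```json
{
  "scripts": {
    "dev": "tsx server.ts",
    "build": "next build",
    "start": "NODE_ENV=production tsx server.ts"
  }
}
```

Remember to run next build before starting in production; the custom server only serves the app, it doesn't build it.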

Happy coding! 🚀