HTTP Tunnels Core
Expose your local HTTP server to the internet with a single command. NFLTR creates a secure, encrypted tunnel from your localhost to a public HTTPS URL — no port forwarding, no DNS configuration, no firewall changes required.
How It Works
When you run nfltr http 3000, the agent establishes an outbound gRPC connection to the NFLTR server. Incoming HTTPS requests to your public URL are relayed through this encrypted tunnel to your local service. The connection is always initiated from your machine — no inbound ports needed.
Quick Start
1. Start your local server
# Any HTTP server on any port
node server.js # Express on :3000
python -m http.server # Python on :8000
go run main.go # Go on :8080
2. Create the tunnel
nfltr http 3000
That's it. You'll see output like:
Agent ID: alice.cranky-fox
Forwarding: https://nfltr.xyz/browse/alice.cranky-fox/ → http://localhost:3000
Share URL: https://swift-bay.nfltr.xyz/
3. Access your service
Anyone on the internet can now reach your local server via the share URL or the stable browse URL. All traffic between clients and the NFLTR server is encrypted with TLS.
Multi-Route Support
Route different URL paths to different local ports. Perfect for microservice architectures where your frontend and API run on separate ports:
# Frontend on :5173, API on :8080, WebSocket on :4000
nfltr http 5173 \
--route /api=8080 \
--route /ws=4000
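Conceptually, each incoming request is matched against the configured routes by path prefix, with the base port as the fallback. A minimal sketch of that resolution logic (the route table and function here are illustrative, not NFLTR's actual implementation):

```python
# Illustrative sketch of path-prefix route resolution.
# NFLTR's real matching logic may differ in details.
ROUTES = {"/api": 8080, "/ws": 4000}  # from --route flags
BASE_PORT = 5173                      # the positional port argument

def resolve_port(path: str) -> int:
    """Pick the local port for a request path: longest matching prefix wins."""
    best = None
    for prefix, port in ROUTES.items():
        if path == prefix or path.startswith(prefix + "/"):
            if best is None or len(prefix) > len(best[0]):
                best = (prefix, port)
    return best[1] if best else BASE_PORT
```

With this table, /api/users resolves to 8080, /ws to 4000, and anything else falls through to the base port 5173.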
Named Endpoints
Give your tunnel a stable, human-readable name that persists across restarts:
nfltr http 3000 --name my-api
Your service is always available at https://nfltr.xyz/browse/alice.my-api/ — the URL never changes, even if you restart the agent. If the agent is temporarily offline, the server returns 503 Service Unavailable with a Retry-After header instead of 404, so clients know the endpoint exists and will come back.
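Clients can take advantage of that behavior by honoring Retry-After instead of treating the 503 as fatal. A minimal parsing sketch (the helper below is ours, not part of NFLTR; Retry-After may be delta-seconds or an HTTP-date per RFC 9110):

```python
# Sketch: decide how long to wait when the endpoint returns 503.
# parse_retry_after is an illustrative helper, not an NFLTR API.
from email.utils import parsedate_to_datetime
from datetime import datetime, timezone

def parse_retry_after(value: str, default: float = 5.0) -> float:
    """Return seconds to wait; accepts delta-seconds or an HTTP-date."""
    if value.isdigit():
        return float(value)
    try:
        when = parsedate_to_datetime(value)
        return max(0.0, (when - datetime.now(timezone.utc)).total_seconds())
    except (TypeError, ValueError):
        return default
```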
Streaming & SSE Support
NFLTR transparently proxies all HTTP response types:
- Server-Sent Events (SSE) — text/event-stream responses stream continuously
- Chunked Transfer — Large file downloads, streaming APIs
- WebSocket Upgrade — Via --route /ws=4000
- Long-Polling — No timeout interference, configurable keep-alive
AI chat interfaces, live log tails, real-time dashboards — they all work transparently through the tunnel.
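On the client side, consuming an SSE stream through the tunnel is ordinary text/event-stream parsing. A minimal sketch of the wire format, following the WHATWG EventSource rules (the function name is ours):

```python
def parse_sse(stream: str):
    """Yield the data payload of each event in a text/event-stream body.

    Illustrative client-side sketch: events are separated by blank lines,
    and multiple data: lines within one event are joined with newlines.
    """
    data_lines = []
    for line in stream.splitlines():
        if line.startswith("data:"):
            value = line[5:]
            if value.startswith(" "):
                value = value[1:]  # spec: strip at most one leading space
            data_lines.append(value)
        elif line == "" and data_lines:
            yield "\n".join(data_lines)
            data_lines = []
```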
Security
🔒 TLS Everywhere
All public URLs use HTTPS. TLS termination happens at the server — your local service can run plain HTTP.
🔑 Authentication
Tunnels require an API key. Incoming requests can be further protected with share URL auth, IP allowlists, or bearer tokens.
📋 Request Inspection
Every proxied request is logged with method, path, status code, and timing. View in the dashboard or via the API.
CLI Options
| Flag | Description | Example |
|---|---|---|
| --name | Stable name for the agent (persistent URL) | --name my-api |
| --route | Map a URL path to a different local port | --route /api=8080 |
| --no-share | Disable automatic share URL generation | --no-share |
| --share-auth | Password-protect the share URL | --share-auth secret123 |
| --share-ip-allowlist | Restrict share URL by IP | --share-ip-allowlist 1.2.3.4 |
| --share-bearer | Require bearer token for share URL | --share-bearer mytoken |
| --labels | Metadata labels for fleet management | --labels env=prod,region=us |
Use Cases
- Local development — Share your dev server with teammates for review
- Webhook testing — Receive GitHub, Stripe, Razorpay callbacks locally
- OAuth flows — Get a public HTTPS callback URL for OAuth testing
- Demo & presentations — Show a client your work without deploying
- CI/CD integration — Run integration tests against live local code with Newman/Postman
- Homelab — Expose Jellyfin, Home Assistant, or Nextcloud behind CGNAT
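For the webhook use case, the local side can be as small as a stock-library HTTP handler: point the provider at your public NFLTR URL and the POSTs arrive on localhost. A minimal receiver sketch (the port and handler behavior are illustrative, not specific to any provider):

```python
# Minimal local webhook receiver; expose it with `nfltr http 9000`.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        try:
            payload = json.loads(body or b"{}")
        except json.JSONDecodeError:
            self.send_error(400, "invalid JSON")
            return
        print(f"webhook {self.path}: {payload}")
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

def serve(port: int = 9000) -> None:
    """Block and serve webhooks on localhost; tunnel this port with NFLTR."""
    HTTPServer(("127.0.0.1", port), WebhookHandler).serve_forever()
```

Run serve(), start nfltr http 9000, and register the resulting public URL as the webhook endpoint with the provider.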
The agent connects outbound over gRPC (port 443). No inbound ports, no static IP, no router configuration needed. Works behind CGNAT, hotel Wi-Fi, and corporate firewalls.
Ready to try HTTP tunnels?
Download the agent and expose your first service in under 30 seconds.