Calculate the exact byte size of JSON payloads in UTF-8, UTF-16, and compressed (gzip) formats before sending them over the wire. Free browser-based tool.
Drop any JSON payload — objects, arrays, nested structures all work.
Get instant byte sizes in UTF-8, UTF-16, minified, and gzip formats.
See how long your payload takes to deliver over 3G, 4G, WiFi, and Fiber.
The JSON Size Estimator calculates the exact byte size of your JSON payload in multiple encodings and compression formats — instantly. The UTF-8 and UTF-16 calculations run entirely in your browser; only the gzip and minified estimates involve a lightweight server call that immediately discards your data.
UTF-8 is the standard wire encoding for HTTP APIs. UTF-16 is used in some Windows and Java environments. Gzip compression dramatically reduces payload size for large responses.
HTTP APIs almost universally use UTF-8 encoding. The UTF-8 byte count is what actually travels over the network, so that's the most relevant number for API optimization.
The gzip estimate uses maximum compression (level 9), computed by a lightweight server-side call to PHP's gzencode function. Real-world gzip results may vary slightly by server configuration, but the estimate is typically within 1–5%.
Minification (removing whitespace and newlines) helps most when gzip is NOT enabled on your server. If gzip is active, minification provides minimal additional benefit since gzip already eliminates repetitive whitespace patterns very efficiently.
UTF-16 uses a minimum of 2 bytes per character, so ASCII characters that only need 1 byte in UTF-8 take 2 bytes in UTF-16. For ASCII-heavy JSON, UTF-16 is roughly 2× larger.
UTF-8 encoding and character counting happen entirely in JavaScript in your browser — no data leaves your machine for those numbers. The gzip and minified estimates use a lightweight server-side call that processes your JSON and immediately discards it.
For mobile-first REST APIs, aim for under 50 KB per response without compression. With gzip, responses under 200 KB are generally acceptable. For real-time websocket payloads, under 10 KB is a good target.
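These targets are rules of thumb rather than a standard, but they are easy to encode as a pre-flight check. A minimal sketch (the thresholds are the guidelines above; the function name is illustrative):

```javascript
// Rough payload-budget check using the rule-of-thumb limits above:
// 50 KB raw, 200 KB gzipped, 10 KB for websocket payloads.
function checkJsonBudget(json, { gzipped = false, websocket = false } = {}) {
  const bytes = new TextEncoder().encode(json).length; // UTF-8 wire size
  const limit = websocket ? 10 * 1024 : gzipped ? 200 * 1024 : 50 * 1024;
  return { bytes, limit, withinBudget: bytes <= limit };
}
```

A call like checkJsonBudget(responseBody, { gzipped: true }) can run in a test suite to catch payloads that drift past the budget before they ship.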
A JSON Size Estimator is a developer tool that calculates exactly how many bytes your JSON payload occupies in memory and over the network. When you build APIs, WebSocket connections, or any system that exchanges JSON data, knowing the payload size upfront helps you prevent performance bottlenecks before they hit production.
Every byte you send over HTTP costs real money and real time. For a high-traffic API receiving a million requests per day, shaving 1 KB off each response saves 1 GB of bandwidth daily. On mobile networks where 3G latency is common, a bloated 500 KB JSON response can add several seconds to a user's perceived load time.
The JSON Size Estimator shows you the size in four formats simultaneously:
UTF-8 is the de facto standard for JSON over HTTP. It encodes ASCII characters (a–z, 0–9, most punctuation) in a single byte, making it extremely efficient for typical JSON keys and English-language values. The RFC 8259 JSON specification requires UTF-8 encoding for JSON transmitted over networks.
UTF-16, commonly seen in Windows file systems and Java's internal string representation, uses a minimum of 2 bytes per character. For JSON payloads that are mostly ASCII, UTF-16 encoding is roughly twice the size of UTF-8. The only scenario where UTF-16 approaches UTF-8 efficiency is content dominated by CJK (Chinese, Japanese, Korean) characters, which require 3 bytes in UTF-8 but only 2 in UTF-16.
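Both relationships are easy to verify in a browser console or Node: TextEncoder gives exact UTF-8 bytes, and string.length * 2 gives the UTF-16 size (exact for text without surrogate pairs):

```javascript
// Compare UTF-8 and UTF-16 sizes for ASCII-heavy vs CJK text.
const payload = JSON.stringify({ id: 42, name: "Ada", active: true });

const utf8Bytes = new TextEncoder().encode(payload).length;
const utf16Bytes = payload.length * 2; // 2 bytes per UTF-16 code unit

console.log(utf8Bytes, utf16Bytes); // ASCII-only JSON: UTF-16 is exactly 2x

const cjk = "東京"; // BMP CJK: 3 bytes each in UTF-8, 2 bytes each in UTF-16
console.log(new TextEncoder().encode(cjk).length, cjk.length * 2);
```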
For JSON payloads larger than ~1 KB, enabling gzip on your web server almost always pays off. Gzip compression typically reduces JSON size by 60–80% because JSON has high structural repetition — key names repeat across array items, brackets and braces appear frequently, and whitespace compresses to nearly nothing.
To enable gzip in Nginx, add gzip on; gzip_types application/json; to your server block. For Apache, enable mod_deflate with a filter for application/json. Most modern CDNs like Cloudflare and Fastly compress JSON automatically.
The tradeoff: gzip adds a few milliseconds of CPU time to compress (server) and decompress (client). For very small payloads under 150 bytes, the overhead can exceed the savings. This tool shows you the break-even point clearly.
If your server already sends gzip-compressed responses, minifying your JSON first provides only marginal additional benefit — typically 0–5% smaller after gzip. Gzip is so effective at eliminating whitespace patterns that pre-minification doesn't add much.
However, minification matters in two specific scenarios: when gzip is unavailable (embedded systems, certain IoT protocols) and when JSON is stored in databases or caches where storage cost matters more than compression CPU overhead.
As a general rule: use gzip for HTTP transport, use minification for storage. Never skip both.
The transfer time section helps you reason about user experience at different connection speeds. A few benchmarks to keep in mind:
Note that these are theoretical maximums. Real transfer times are dominated by latency (round-trip time), not throughput, for small payloads. A 1 KB JSON response on a 3G connection might take 300ms — but most of that is the 200ms RTT, not the 5ms transfer time.
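The latency-versus-throughput point can be made concrete with a back-of-the-envelope model. The RTT and bandwidth figures below are illustrative assumptions, not measurements:

```javascript
// Back-of-the-envelope delivery model: one round trip plus raw transfer time.
// Ignores TCP slow start, TLS handshakes, and congestion; bandwidth in bits/s.
function transferMs(bytes, { bandwidthBps, rttMs }) {
  return rttMs + (bytes * 8 * 1000) / bandwidthBps;
}

// Illustrative connection profiles (assumed values, not benchmarks).
const profiles = {
  "3G":    { bandwidthBps: 1.6e6, rttMs: 200 },
  "4G":    { bandwidthBps: 12e6,  rttMs: 70 },
  "WiFi":  { bandwidthBps: 50e6,  rttMs: 20 },
  "Fiber": { bandwidthBps: 200e6, rttMs: 5 },
};

// A 1 KB response on 3G: ~5 ms of transfer dwarfed by the 200 ms round trip.
for (const [name, p] of Object.entries(profiles)) {
  console.log(name, transferMs(1024, p).toFixed(1), "ms");
}
```

Under this model, shrinking a small payload barely moves the total; cutting round trips (or moving servers closer to users) is what pays off.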
Beyond measuring, here are practical strategies to reduce your JSON payload sizes:
"id" vs "user_identifier" adds up across millions of objects.?fields= query params) so clients only request what they need.The UTF-8 byte count uses JavaScript's TextEncoder API, which accurately encodes your string to UTF-8 bytes in the browser without any server round-trip. The UTF-16 calculation uses the fact that JavaScript internally stores strings as UTF-16, so string.length * 2 gives the UTF-16 byte count (this is an approximation that's exact for BMP characters).
The gzip estimate sends your JSON to a lightweight server endpoint that runs PHP's gzencode() at maximum compression level (9) and returns the byte count. This gives you a realistic production-quality compression estimate. The minified size uses json_encode(json_decode()) to strip all non-essential whitespace.