How to Read JSON File in JavaScript: A Complete Step by Step Guide
Imad Uddin
Full Stack Developer

I still remember my first week as a junior developer when I was tasked with building a dynamic dashboard. The data was sitting in a simple data.json file, and I spent hours trying to figure out why I could not just "open" it like a normal variable. I tried everything from basic script tags to weird hacks before I finally understood the fundamental difference between the browser environment and the server. That was the day I realized that learning how to read a JSON file in JavaScript is not just a syntax exercise; it is about understanding how data flows through the modern web.
Today, JSON (JavaScript Object Notation) is the undisputed language of data exchange. Whether you are fetching weather data from a public API, reading configuration settings in a Node.js backend, or processing user uploaded files, you will find yourself reaching for JSON operations constantly. In this definitive guide, I am going to walk you through every reliable method to read and parse JSON files, covering both the browser and the Node.js ecosystems. We will also dive into the "pro" territory, handling things like memory safe streams for massive datasets and modern ES6 import assertions.
Why JSON Has Become the Industry Standard
Before we get into the technical "how to," it is worth taking a moment to understand why we use JSON so heavily. If you have ever worked with XML, you know how verbose and difficult it can be to parse manually. JSON, by contrast, is lightweight and maps perfectly to JavaScript objects.
Common scenarios where you will need to read JSON include:
- API Integration: Fetching dynamic content from RESTful services (like product lists or user profiles).
- Configuration Management: Storing application settings (like API keys or theme preferences) in a structured format.
- Data Migration: Reading exported data from databases to transform or display it.
- Local Development: Using mock data files to build frontends before the backend is ready.
By the end of this article, you will be able to handle any JSON file—no matter where it is located or how large it is—with absolute confidence.
Understanding the Environment: Browser vs. Node.js
The very first thing you must identify is where your code is running. This is the part that trips up most beginners.
- The Browser (Client Side): For security reasons, the browser cannot just reach into your "C: Drive" or "Documents" folder and read a file. It can only fetch files via a web server (using a URL) or read files that a user explicitly uploads via an input field.
- Node.js (Server Side): Since Node.js runs directly on your operating system, it has full permissions to read, write, and delete files on your hard drive.
I have seen many developers try to use Node.js modules like fs in a React or Vue project, only to be met with a "Module not found" error. Always remember: if it is for the user's browser, use fetch. If it is for a script running on your machine, use fs.
Method 1: Reading JSON in the Browser Using the Fetch API
The Fetch API is the modern, standard way to request resources in the browser. It is built into every modern browser and has effectively replaced the older, clunkier XMLHttpRequest API.
The Basic Implementation
If you have a JSON file sitting on your server (e.g., data.json), here is the most robust way to read it using async/await:
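A minimal sketch (data.json is a placeholder path here; point it at wherever your file is actually served):

```javascript
// Assumes data.json is served from the same origin as the page.
async function loadData() {
  try {
    const response = await fetch('./data.json');

    // fetch() only rejects on network failure, so check the HTTP status.
    if (!response.ok) {
      throw new Error(`HTTP error! Status: ${response.status}`);
    }

    // .json() reads the body and parses it in one step.
    const data = await response.json();
    console.log(data);
    return data;
  } catch (error) {
    console.error('Could not read the JSON file:', error);
  }
}

loadData();
```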
Deep Insight: Why response.ok Matters
One mistake I often see is developers skipping the response.ok check. If the browser gets a 404 Not Found or a 500 Server Error, the fetch() promise still technically "resolves." This means your code will try to parse an error message (which is often HTML) as JSON, leading to the dreaded "Unexpected token < in JSON at position 0" error. Always verify the status before parsing.
Method 2: Modern ES6 JSON Modules (Importing JSON)
As of 2024 and 2025, modern JavaScript supports JSON Modules. This allows you to treat a JSON file almost like a regular JavaScript file by using the import statement. This is perfect for static configuration files or small data sets.
How to Use Import Assertions
To use this, you add a with clause (the feature was previously called "import assertions" and used the assert keyword) to tell the runtime that the file you are importing is specifically a JSON file.
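A minimal sketch (config.json is a hypothetical static file bundled with your app; exact syntax support varies by browser and Node.js version):

```javascript
// The `with { type: 'json' }` attribute tells the runtime to parse the file
// as JSON. Older engines used `assert { type: 'json' }` for the same thing.
import config from './config.json' with { type: 'json' };

// The file's contents arrive as a plain, ready-to-use JavaScript object.
console.log(config);
```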
Pros:
- Loads before your script runs, making the data immediately available.
- Clean syntax without needing fetch or await inside the main logic.
Cons:
- The data is cached. If the JSON file changes on the server, the browser will not re-fetch it unless the page is reloaded.
- Not suitable for fetching data from a dynamic API endpoint.
Method 3: Reading Local User Files (The File Input Method)
Sometimes you want to let a user upload their own JSON file to your website (for example, if you are building a tool like our JSON Flattener). In this case, you use the FileReader API.
I recommend this approach if you are building utility tools. It is fast, secure, and works entirely on the user's machine without ever sending data to your server.
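A sketch of the pattern, assuming a hypothetical <input type="file" id="fileInput"> element on the page:

```javascript
// Assumes an <input type="file" id="fileInput"> element exists in the DOM.
const input = document.getElementById('fileInput');

input.addEventListener('change', (event) => {
  const file = event.target.files[0];
  if (!file) return;

  const reader = new FileReader();

  // FileReader is asynchronous: onload fires once the file is in memory.
  reader.onload = () => {
    try {
      const data = JSON.parse(reader.result);
      console.log('Parsed JSON:', data);
    } catch (err) {
      console.error('The uploaded file is not valid JSON:', err);
    }
  };

  reader.readAsText(file); // read the file's contents as text
});
```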
Method 4: Reading JSON in Node.js (Primary Backend Method)
When you are working on the server-side, you have a few different ways to handle JSON. If you are building a CLI tool or a backend server, the fs (File System) module is your best friend.
Option A: Using fs/promises (The Modern Standard)
This is the non-blocking, promise-based way to read files. It is the method I recommend for almost all modern Node.js applications.
Option B: Using require() (Legitimate Shortcut)
If you are still using CommonJS (the traditional Node.js modules) and you have a small static configuration file, you can "cheat" by using require.
Warning: I generally advise against using require() for large data files. Why? Because require() caches the file in memory. If you update the JSON file on the disk, require() will still return the old data until you restart the server.
Comparison: Which Method is Best for You?
| Scenario | Technology | Best Method | Why? |
|---|---|---|---|
| Web Page (External Data) | Browser | fetch() | Native, non-blocking, and powerful. |
| Web Page (Bundled Config) | Browser | import | Clean and easy for build time data. |
| User Uploaded Data | Browser | FileReader | Securely reads local user files. |
| Node.js Core Backend | Node.js | fs/promises | Professional, async, and memory safe. |
| Quick Scripts/Tools | Node.js | require() | Fastest to write, but beware of caching. |
Deep Insight: The "Unexpected Token" Nightmare
If you have spent any time writing JavaScript, you have likely encountered an error like SyntaxError: Unexpected token ' in JSON at position 0.
During my years of managing data at scale, I have found that the problem is rarely your code. It is almost always one of these "invisible" issues:
- The BOM (Byte Order Mark): Some text editors add an invisible marker at the start of a UTF-8 file. JavaScript's JSON.parse will choke on it. To fix it, ensure your file is saved as "UTF-8 without BOM."
- Single Quotes: This is the #1 mistake. JSON requires double quotes for keys and string values ("key": "value"). Single quotes ('key': 'value') are valid JavaScript, but they are invalid JSON.
- Trailing Commas: JSON does not allow a comma after the last item in an array or object. My advice is to use an online JSON Formatter to clean your data before trying to read it.
- The Encoding Flag: In Node.js, if you forget to pass 'utf8' as the second argument to readFile, it will return a Buffer (raw binary data) instead of the string you were expecting, which leads to confusing failures downstream. Always pass the encoding explicitly.
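The BOM pitfall in particular has a simple defensive fix: strip the marker before parsing. A minimal sketch (parseJsonSafe is a hypothetical helper name):

```javascript
// JSON.parse rejects a leading BOM (U+FEFF) because it is not valid JSON
// whitespace. Stripping it first makes parsing tolerant of either form.
function parseJsonSafe(text) {
  if (text.charCodeAt(0) === 0xfeff) {
    text = text.slice(1);
  }
  return JSON.parse(text);
}

console.log(parseJsonSafe('\uFEFF{"ok": true}')); // { ok: true }
console.log(parseJsonSafe('{"ok": true}'));       // { ok: true }
```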
Handling Large JSON Files (The Professional Way)
What happens if you need to read a JSON file that is 5GB? If you try to use JSON.parse() on a 5GB file, your computer will likely run out of memory and the program will crash. This is because JSON.parse() tries to load the entire object into memory at once.
For large files, you must use Streaming. Instead of reading the whole file, you read it piece by piece.
In Node.js, you would typically reach for a package like stream-json, whose streamers (such as StreamArray) emit parsed values one at a time instead of building the whole object in memory.
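To show the core "piece by piece" idea without any dependencies, here is a minimal sketch that incrementally parses newline-delimited JSON (NDJSON) from arbitrary chunks, buffering partial lines between chunks the way a streaming parser does (createNdjsonParser is a hypothetical helper name):

```javascript
// Parses NDJSON one record at a time, so the full file never has to fit
// in memory at once. Feed it chunks as they arrive from a read stream.
function createNdjsonParser(onRecord) {
  let buffer = '';
  return function write(chunk) {
    buffer += chunk;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the trailing partial line for the next chunk
    for (const line of lines) {
      if (line.trim()) onRecord(JSON.parse(line));
    }
  };
}

// Usage: in real code, chunks would come from fs.createReadStream(...).
const records = [];
const write = createNdjsonParser((rec) => records.push(rec));
write('{"id": 1}\n{"id"'); // second record arrives split across chunks
write(': 2}\n');
console.log(records.length); // 2
```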
If you are dealing with a large file right now and just need a quick way to handle it, you can also use our JSON File Splitter to break it into smaller pieces that are easier to read.
Best Practices for Reading JSON in Production
After years of building data-intensive applications, I have developed a set of "Golden Rules" for reading JSON:
1. Always Use Try/Catch
Never assume a JSON file is perfect. Even if it was valid yesterday, a bug in another service could corrupt it today. Always wrap your parsing logic in a try...catch block to prevent your entire application from crashing.
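A minimal sketch of the pattern (safeParse is a hypothetical helper name):

```javascript
// Wrap JSON.parse so corrupt input degrades gracefully instead of crashing.
function safeParse(text) {
  try {
    return JSON.parse(text);
  } catch (err) {
    console.error('Invalid JSON:', err.message);
    return null;
  }
}

console.log(safeParse('{"ok": true}')); // { ok: true }
console.log(safeParse('{oops}'));       // null
```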
2. Validate Your Data (Zod or Joi)
Just because a file is valid JSON does not mean it has the data you expect. If your code expects user.email but the file only contains user.username, your app will crash. Use a validation library like Zod to define a schema and verify the data as soon as you parse it.
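Libraries like Zod let you declare this check as a schema; as a dependency-free sketch of the same idea, here is a hand-rolled validator (validateUser and the email field are illustrative, not from the original):

```javascript
// Verify the shape right after parsing, before the data reaches your logic.
function validateUser(data) {
  if (typeof data !== 'object' || data === null) {
    throw new Error('Expected a user object');
  }
  if (typeof data.email !== 'string') {
    throw new Error('Missing or invalid "email" field');
  }
  return data; // now safe to treat as a user record
}

const user = validateUser(JSON.parse('{"email": "a@b.com"}'));
console.log(user.email); // a@b.com
```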
3. Be Mindful of CORS
When using fetch() in the browser to read a JSON file from a different domain, you will run into CORS (Cross-Origin Resource Sharing) restrictions. Ensure the server hosting the JSON file sends the correct header (Access-Control-Allow-Origin: *, or better, your specific origin).
4. Encoding is Everything
Always use UTF-8 for your JSON files. It is the universal standard. If you are seeing weird symbols or "gibberish" characters where your text should be, it is almost certainly an encoding mismatch. You can use our guide on how to format JSON in Notepad++ to verify and fix encoding issues.
Troubleshooting Common JSON Reading Errors
"Request has been blocked by CORS policy" This is a browser security feature. You cannot fetch a JSON file from domainA.com if you are currently on domainB.com unless the server explicitly allows it. If you're developing locally, use a local server (like the "Live Server" extension in VS Code) rather than opening the HTML file directly as a "file:///" URL.
"Unexpected token < in JSON at position 0" This almost always means you tried to fetch a file that does not exist. The server responded with a 404 Not Found page (which starts with
"Cannot find module" (in Node.js) If you are using require('./data'), make sure you included the .json extension or that the path is correct relative to the current file. Remember that require paths are relative to the file itself, while fs paths are usually relative to the folder where you started the terminal.
Related Tools and Guides
If you found this guide helpful, check out some of our other developer focused tools and articles:
- JSON to Excel Converter : Instantly turn complex JSON into a clean spreadsheet.
- How to Merge JSON Files : A guide on consolidating data from multiple sources.
- JSON Flattener : Simplify deeply nested JSON objects into a single level.
- YAML to JSON Converter : Convert configuration files between formats seamlessly.
- How to Format JSON in Notepad++ : Make your raw data readable and easy to debug.
- JSON File Splitter : Break down large files that are crashing your browser.
Final Thoughts
Mastering how to read a JSON file in JavaScript is a rite of passage for every modern developer. Whether you are using the simplicity of fetch(), the power of Node’s fs module, or the modern elegance of ES6 imports, you now have the tools to handle data safely and efficiently.
Next time you are faced with a massive JSON file or a complex API response, remember to check your environment, handle your errors, and keep your memory usage in check. The insight you can gain from well processed data is limitless.
Happy coding!