Top Node.js Interview Questions (2025)
Question: What is async/await in Node.js?
Answer:
async/await in Node.js is syntactic sugar built on top of Promises that allows you to write asynchronous code in a more readable, synchronous-like manner. It simplifies the process of working with Promises and makes asynchronous code easier to manage and understand.
- async: Declares a function as asynchronous, which means it will always return a Promise.
- await: Pauses the execution of the asynchronous function until a Promise is resolved or rejected, and returns the resolved value of the Promise.
Together, async and await provide a way to handle asynchronous operations sequentially, without the need for chaining .then() methods or dealing with deeply nested callbacks.
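For a concrete sense of the difference, here is the same flow written first with .then() chaining and then with async/await. This is only an illustrative sketch: getUser and getOrders are hypothetical promise-returning helpers, not part of any library.

// Hypothetical helpers (assumed): getUser(id) and getOrders(user) each return a Promise.

// Promise chaining
function showOrdersWithThen(id) {
  return getUser(id)
    .then(user => getOrders(user))
    .then(orders => console.log(orders))
    .catch(err => console.error(err));
}

// The same logic with async/await
async function showOrdersWithAwait(id) {
  try {
    const user = await getUser(id);
    const orders = await getOrders(user);
    console.log(orders);
  } catch (err) {
    console.error(err);
  }
}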
Key Concepts:
1. Async Functions:
- An async function always returns a Promise. Even if you return a non-Promise value from an async function, it is wrapped in a resolved Promise.
- The async keyword is placed before the function definition.

async function myAsyncFunction() {
  return "Hello, World!";
}
myAsyncFunction().then(result => console.log(result)); // Outputs: Hello, World!

The code above returns a resolved Promise automatically, even though we return a simple string from the function.
2. Await Expression:
- The await keyword is used inside an async function to wait for a Promise to resolve (or reject) before proceeding with the next line of code.
- It can only be used inside async functions. The await expression pauses the execution of the async function until the Promise it is waiting on is settled.

async function example() {
  const result = await someAsyncFunction();
  console.log(result); // Executes after someAsyncFunction resolves
}

3. Error Handling with try/catch:
- When using await, it is important to handle potential errors because Promises might be rejected. This can be done with a try/catch block.

async function fetchData() {
  try {
    const data = await fetch('https://api.example.com/data');
    const json = await data.json();
    console.log(json);
  } catch (error) {
    console.error("Error fetching data:", error);
  }
}
Example of async/await:
Let’s look at a more detailed example to illustrate how async and await can be used for handling asynchronous tasks in Node.js:
const fs = require('fs').promises;
// Using async/await to read a file
async function readFile() {
  try {
    const data = await fs.readFile('example.txt', 'utf8');
    console.log('File content:', data);
  } catch (error) {
    console.error('Error reading file:', error);
  }
}
readFile();
Explanation:
- The readFile function is asynchronous and uses await to pause execution until the file reading operation is complete.
- If the file reading operation is successful, the content of the file is logged. If there’s an error (e.g., file not found), it’s caught and logged in the catch block.
Benefits of async/await in Node.js:
- More Readable and Synchronous-Like: async/await makes asynchronous code look and behave more like synchronous code, improving readability and reducing the need for .then() chains or callback nesting.
- Simplified Error Handling: Unlike Promises with .then() and .catch(), async/await allows you to handle errors using standard try/catch blocks, making error handling simpler and more intuitive.
- Improved Debugging: Code written with async/await is easier to debug because it behaves like synchronous code. The execution flow is more predictable, and stack traces are easier to follow.
- Sequential Execution of Asynchronous Tasks: With await, you can easily execute asynchronous tasks in sequence, ensuring that each task completes before the next one begins, without using nested callbacks or chained .then() methods.
Example of Sequential Asynchronous Operations:
async function sequentialTasks() {
  const task1 = await task1Function();
  console.log('Task 1 done:', task1);
  const task2 = await task2Function();
  console.log('Task 2 done:', task2);
  const task3 = await task3Function();
  console.log('Task 3 done:', task3);
}
sequentialTasks();
- Here, task2 will only start once task1 has completed, and task3 will only start once task2 is done.
Example of Parallel Asynchronous Operations:
In cases where asynchronous operations don’t depend on each other, you can use Promise.all to run them in parallel and wait for all of them to complete.
async function parallelTasks() {
  try {
    const [task1Result, task2Result, task3Result] = await Promise.all([
      task1Function(),
      task2Function(),
      task3Function()
    ]);
    console.log('All tasks done:', task1Result, task2Result, task3Result);
  } catch (error) {
    console.error('Error in one of the tasks:', error);
  }
}
parallelTasks();
Promise.all allows multiple asynchronous operations to run concurrently, improving performance when tasks are independent of each other.
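Note that Promise.all rejects as soon as any single task rejects. If you want every task to settle and then inspect each outcome individually, Promise.allSettled (available since Node.js 12.9) is one alternative; the sketch below reuses the same hypothetical taskNFunction helpers from the examples above.

async function parallelTasksSettled() {
  const results = await Promise.allSettled([
    task1Function(),
    task2Function(),
    task3Function()
  ]);
  results.forEach((result, index) => {
    if (result.status === 'fulfilled') {
      console.log(`Task ${index + 1} succeeded:`, result.value);
    } else {
      console.error(`Task ${index + 1} failed:`, result.reason);
    }
  });
}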
Key Points:
- Asynchronous Code: async/await simplifies the writing of asynchronous code by making it look like synchronous code.
- Syntactic Sugar: async/await is built on top of Promises, so understanding Promises is essential for working with async/await.
- Error Handling: With async/await, you can use try/catch blocks for error handling, which is more natural than using .catch() with Promises.
- Avoid Callback Hell: It reduces the complexity of nested callbacks and .then() chains.
Conclusion:
async/await is a powerful tool in Node.js for handling asynchronous operations in a more readable and manageable way. By allowing you to write asynchronous code that looks and behaves like synchronous code, it simplifies error handling and improves the overall clarity of your application’s logic. While it’s built on top of Promises, async/await provides an elegant and efficient solution for working with asynchronous code.
Question: What are Streams in Node.js?
Answer:
In Node.js, Streams are a powerful and efficient way to handle reading and writing data in a continuous and event-driven manner. Instead of loading all data into memory at once (which can be inefficient for large files or large amounts of data), Streams allow you to process data piece by piece, as it becomes available. This is especially useful for handling large datasets, such as files, network responses, or real-time data.
Streams are a core concept in Node.js and are widely used in I/O operations, allowing for efficient memory usage and handling of asynchronous data.
Types of Streams in Node.js:
Node.js provides four types of streams:
- Readable Streams:
  - A Readable Stream allows you to read data from a source, such as a file, HTTP response, or database query.
  - Data from a readable stream is available in chunks, and you can consume the data piece by piece.
  - Examples include fs.createReadStream() (for reading files) and http.IncomingMessage (for reading HTTP requests).
- Writable Streams:
  - A Writable Stream allows you to write data to a destination, such as a file, HTTP request, or network socket.
  - Data is written in chunks to the destination.
  - Examples include fs.createWriteStream() (for writing to files) and http.ServerResponse (for sending HTTP responses).
- Duplex Streams:
  - A Duplex Stream is both readable and writable, meaning you can read from and write to it.
  - It represents a combination of readable and writable operations, allowing bi-directional communication.
  - Examples include TCP sockets via net.Socket (in the net module).
- Transform Streams:
  - A Transform Stream is a special type of duplex stream that modifies or transforms the data as it is being read and written.
  - It can process the data in real time while it’s being transferred.
  - Examples include zlib.createGzip() (for compressing data) and crypto.createCipheriv() (for encrypting data).
Key Features of Streams:
- Event-Driven: Streams use events to handle data as it becomes available. For example, you can listen for events like data, end, error, and finish to manage the flow of data.
- Non-Blocking: Streams operate asynchronously and are non-blocking, meaning they allow your program to handle other tasks while reading or writing data. This helps improve the efficiency of applications, especially when dealing with large amounts of data.
- Memory Efficient: Streams process data in chunks, which means you don’t need to load the entire dataset into memory at once. This is particularly important for applications that work with large files or real-time data.
- Flow Control: Streams support backpressure, meaning that if the writable stream is not ready to accept more data, the readable stream will slow down its data delivery to avoid overwhelming the system (see the sketch after this list).
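When you call write() directly instead of using pipe(), backpressure surfaces as write() returning false once the internal buffer is full. A minimal sketch of the usual pattern, pausing until the 'drain' event fires, might look like this (the file name and line count are arbitrary):

const fs = require('fs');

async function writeManyLines(path, lineCount) {
  const out = fs.createWriteStream(path);
  for (let i = 0; i < lineCount; i++) {
    // write() returns false when the internal buffer is full
    if (!out.write(`line ${i}\n`)) {
      // Wait for the buffer to empty before writing more
      await new Promise(resolve => out.once('drain', resolve));
    }
  }
  out.end();
}

writeManyLines('big.txt', 1000000);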
Working with Streams:
1. Readable Streams:
Readable streams allow you to read data from a source. You can use the data event to consume the data chunks as they arrive.
const fs = require('fs');
const stream = fs.createReadStream('example.txt', { encoding: 'utf8' });
stream.on('data', (chunk) => {
  console.log('Received chunk:', chunk);
});
stream.on('end', () => {
  console.log('Stream ended');
});
stream.on('error', (err) => {
  console.error('Error occurred:', err);
});
In this example:
- The data event is emitted whenever a chunk of data is available.
- The end event is emitted when the entire file has been read.
- The error event is used to handle any errors during the stream operation.
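Since Node.js 10, readable streams are also async iterable, so the same file can be consumed with for await...of inside an async function, tying streams back to the async/await pattern covered earlier. A brief sketch:

const fs = require('fs');

async function readChunks() {
  const stream = fs.createReadStream('example.txt', { encoding: 'utf8' });
  try {
    for await (const chunk of stream) {
      console.log('Received chunk:', chunk);
    }
    console.log('Stream ended');
  } catch (err) {
    console.error('Error occurred:', err);
  }
}

readChunks();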
2. Writable Streams:
Writable streams allow you to write data to a destination. You can use the write() method to send chunks of data and the finish event to know when all data has been written.
const fs = require('fs');
const writeStream = fs.createWriteStream('output.txt');
writeStream.write('Hello, ');
writeStream.write('World!');
writeStream.end();
writeStream.on('finish', () => {
  console.log('Data has been written to output.txt');
});
writeStream.on('error', (err) => {
  console.error('Error occurred:', err);
});
In this example:
- The write() method is used to send chunks of data.
- The end() method indicates that no more data will be written.
- The finish event is emitted when all the data has been successfully written to the stream.
3. Duplex Streams:
Duplex streams allow you to both read from and write to a stream. They are useful in scenarios like network communication, where data needs to be both sent and received.
const net = require('net');
const server = net.createServer((socket) => {
  socket.write('Hello, Client!');
  socket.on('data', (data) => {
    console.log('Received data:', data.toString());
  });
});
server.listen(8080, () => {
  console.log('Server listening on port 8080');
});
In this example, the server reads from and writes to a TCP socket using the net.Socket stream.
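To exercise both directions, you could connect a small client to the server above. This sketch assumes the server is already listening on port 8080:

const net = require('net');

const client = net.connect(8080, () => {
  // The socket is a duplex stream: write in one direction...
  client.write('Hello, Server!');
});

// ...and read in the other
client.on('data', (data) => {
  console.log('Server says:', data.toString());
  client.end();
});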
4. Transform Streams:
Transform streams are used to modify the data as it is being read and written. These are useful when you want to perform operations like compression or encryption on data while it is being transferred.
const fs = require('fs');
const zlib = require('zlib');
const readStream = fs.createReadStream('example.txt');
const writeStream = fs.createWriteStream('example.txt.gz');
const gzip = zlib.createGzip();
readStream.pipe(gzip).pipe(writeStream);
In this example, the createGzip() method creates a transform stream that compresses data as it is read from the file and written to a new compressed file.
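You can also build your own transform stream by extending stream.Transform and implementing _transform. The sketch below is purely illustrative: it uppercases whatever text passes through, piping standard input to standard output.

const { Transform } = require('stream');

class UppercaseTransform extends Transform {
  _transform(chunk, encoding, callback) {
    // Modify the chunk, then pass it downstream
    this.push(chunk.toString().toUpperCase());
    callback();
  }
}

process.stdin.pipe(new UppercaseTransform()).pipe(process.stdout);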
Piping Streams:
One of the most common patterns in Node.js is piping streams together. This allows you to pass data from a readable stream to a writable stream in a seamless manner.
const fs = require('fs');
const zlib = require('zlib');
fs.createReadStream('example.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('example.txt.gz'));
- In this example, data flows from the readable stream (fs.createReadStream) through the transform stream (zlib.createGzip()) and finally to the writable stream (fs.createWriteStream).
- The pipe() method handles the flow of data automatically, ensuring that data is read, transformed, and written efficiently.
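One caveat with pipe() is that errors are not forwarded along the chain, so each stream needs its own 'error' handler. The stream.pipeline helper, and its promise-based variant in stream/promises (available since Node.js 15), wires up error handling and cleanup for the whole chain. A sketch of the same gzip pipeline using it:

const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream/promises');

async function compressFile() {
  try {
    await pipeline(
      fs.createReadStream('example.txt'),
      zlib.createGzip(),
      fs.createWriteStream('example.txt.gz')
    );
    console.log('Compression finished');
  } catch (err) {
    console.error('Pipeline failed:', err);
  }
}

compressFile();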
Conclusion:
Streams in Node.js are a fundamental concept that allows you to work with data efficiently by processing it in chunks. They are used in many core modules for tasks like reading and writing files, making network requests, and working with real-time data. The key benefits of streams are their non-blocking nature, efficient memory usage, and support for handling large datasets or real-time streams of data. By understanding streams, you can build scalable and performant applications in Node.js.
Tags
- Node.js
- JavaScript
- Backend Development
- Asynchronous Programming
- Event Driven Architecture
- Event Loop
- Callbacks
- Promises
- Async/Await
- Streams
- Require
- Modules
- Middleware
- Express.js
- Error Handling
- Cluster Module
- Process.nextTick
- SetImmediate
- Concurrency
- Non Blocking I/O
- HTTP Module
- File System (fs) Module
- Node.js Interview Questions
- Node.js Advantages
- Node.js Performance
- Node.js Errors
- Callback Hell
- Server Side JavaScript
- Scalable Web Servers
- Node.js Architecture
- Node.js Event Emitters