Throttling Promises by Batching
Throttling promises by batching is a technique for limiting the number of promise executions handled concurrently. It helps manage resources and prevents overwhelming downstream systems with too many simultaneous operations.
What is Throttling by Batching?
Throttling by batching involves limiting the number of promises executed concurrently by dividing them into smaller batches. Once a batch of promises is completed, the next batch starts. This technique is particularly useful in scenarios where APIs have rate limits or when system resources need to be managed carefully.
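The idea can be sketched in a few lines of async/await. This is a minimal illustration, not a library API: `runInBatches` is a hypothetical name, and it assumes the work is passed in as an array of functions that each return a promise, so nothing starts until its batch runs.

```javascript
// Minimal sketch: process async tasks in fixed-size batches,
// waiting for each batch to finish before starting the next.
// `tasks` is an array of functions that return promises.
async function runInBatches(tasks, batchSize) {
  const results = [];
  for (let i = 0; i < tasks.length; i += batchSize) {
    // Start only the tasks in this batch, then wait for all of them.
    const batch = tasks.slice(i, i + batchSize).map((task) => task());
    results.push(...(await Promise.all(batch)));
  }
  return results;
}
```

For example, `runInBatches(ids.map(id => () => fetchUser(id)), 3)` would run at most three hypothetical `fetchUser` calls at a time.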
Real Interview Insights
Interviewers might ask you to:
- Implement a function to throttle promises by batching.
- Explain the benefits and potential pitfalls of batching promises.
- Provide use cases where batching is beneficial.
Implementing Throttling by Batching
Here's how you can implement a function to throttle promises by batching:
```javascript
function throttlePromises(promises, batchSize) {
  return new Promise((resolve, reject) => {
    const results = [];
    let currentBatch = 0;

    function runBatch() {
      const batchPromises = promises
        .slice(currentBatch * batchSize, (currentBatch + 1) * batchSize)
        .map((promise, index) =>
          promise
            .then((result) => {
              // Store each result at its original position.
              results[currentBatch * batchSize + index] = result;
            })
            .catch((error) => {
              // Record the failure in place so one rejection
              // does not abort the whole run.
              results[currentBatch * batchSize + index] = { error };
            })
        );

      Promise.all(batchPromises)
        .then(() => {
          currentBatch++;
          if (currentBatch * batchSize < promises.length) {
            runBatch();
          } else {
            resolve(results);
          }
        })
        .catch(reject);
    }

    runBatch();
  });
}
```
Explanation:
- Parameters: `promises` is an array of promises to be throttled; `batchSize` is the number of promises handled concurrently in each batch.
- Results Array: stores the result or error of each promise at its original index.
- Current Batch: a counter that keeps track of the current batch.
- runBatch Function: waits on the promises in the current batch and stores their results. Once a batch completes, it checks whether more promises remain and proceeds to the next batch.
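One caveat worth knowing for interviews: a JavaScript promise typically starts its work the moment it is created, so an array of already-constructed promises is already running before `throttlePromises` sees it. To throttle the underlying operations themselves, a variant can accept promise-returning functions instead. This is a sketch under that assumption; `throttleTasks` is a hypothetical name, and the `{ error }` result shape mirrors the function above.

```javascript
// Variant that accepts "tasks" (functions returning promises), so
// each operation starts only when its batch runs.
function throttleTasks(tasks, batchSize) {
  return new Promise((resolve, reject) => {
    const results = [];
    let offset = 0; // index of the first task in the current batch

    function runBatch() {
      const batchPromises = tasks
        .slice(offset, offset + batchSize)
        .map((task, index) =>
          Promise.resolve()
            .then(task) // the task is invoked here, not earlier
            .then((result) => {
              results[offset + index] = result;
            })
            .catch((error) => {
              results[offset + index] = { error };
            })
        );

      Promise.all(batchPromises)
        .then(() => {
          offset += batchSize;
          if (offset < tasks.length) {
            runBatch();
          } else {
            resolve(results);
          }
        })
        .catch(reject);
    }

    runBatch();
  });
}
```

With this shape, at most `batchSize` operations are ever in flight at once, because later tasks are not called until their batch begins.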
Practical Example
Consider an example where we fetch data from multiple URLs but limit the number of concurrent fetch operations:
```javascript
const urls = [
  'https://jsonplaceholder.typicode.com/todos/1',
  'https://jsonplaceholder.typicode.com/todos/2',
  'https://jsonplaceholder.typicode.com/todos/3',
  // more URLs...
];

const fetchPromises = urls.map((url) => fetch(url).then((response) => response.json()));

throttlePromises(fetchPromises, 2)
  .then((results) => {
    results.forEach((result, index) => {
      if (result.error) {
        console.error(`Error fetching URL ${urls[index]}:`, result.error);
      } else {
        console.log(`Fetched data from ${urls[index]}:`, result);
      }
    });
  })
  .catch((error) => console.error('Batch execution failed:', error));
```
In this example:
- URLs Array: a list of URLs to fetch data from.
- Fetch Promises: an array of promises created by fetching data from each URL. Note that each `fetch` begins as soon as its promise is created, so the batching here controls how many responses are awaited at a time rather than when the requests start.
- Throttle Promises: we use the `throttlePromises` function to process the fetch results two at a time.
Use Cases for Throttling by Batching
- Rate-Limited APIs: when interacting with APIs that cap concurrent requests, batching keeps you under the cap; for time-based limits (such as requests per second), a pause between batches may also be needed.
- Resource Management: Managing system resources efficiently by preventing too many concurrent operations that could overwhelm the system.
- User Experience: Enhancing user experience by controlling the load on the frontend, avoiding performance issues due to too many simultaneous network requests.
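For the rate-limit use case above, a pause between batches can be combined with the batching loop. The following is a sketch under the same task-array assumption as before; `runBatchesWithDelay`, `sleep`, and the `delayMs` parameter are illustrative names, not part of any library.

```javascript
// Sketch: batching with a fixed pause between batches, useful when
// an API allows roughly N requests per time window.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function runBatchesWithDelay(tasks, batchSize, delayMs) {
  const results = [];
  for (let i = 0; i < tasks.length; i += batchSize) {
    const batch = tasks.slice(i, i + batchSize).map((task) => task());
    results.push(...(await Promise.all(batch)));
    // Pause before the next batch, but not after the last one.
    if (i + batchSize < tasks.length) await sleep(delayMs);
  }
  return results;
}
```

For a limit of two requests per second, `runBatchesWithDelay(tasks, 2, 1000)` would be a reasonable starting point, though a real client should also honor any rate-limit headers the API returns.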
Coding Challenge: Adding Progress Tracking
Challenge: Enhance the `throttlePromises` function to include progress tracking, allowing you to monitor the completion status of the batches.
```javascript
function throttlePromises(promises, batchSize, onProgress) {
  return new Promise((resolve, reject) => {
    const results = [];
    let currentBatch = 0;
    let completedBatches = 0;

    function runBatch() {
      const batchPromises = promises
        .slice(currentBatch * batchSize, (currentBatch + 1) * batchSize)
        .map((promise, index) =>
          promise
            .then((result) => {
              results[currentBatch * batchSize + index] = result;
            })
            .catch((error) => {
              results[currentBatch * batchSize + index] = { error };
            })
        );

      Promise.all(batchPromises)
        .then(() => {
          completedBatches++;
          if (onProgress) {
            // Report progress as (completed batches, total batches).
            onProgress(completedBatches, Math.ceil(promises.length / batchSize));
          }
          currentBatch++;
          if (currentBatch * batchSize < promises.length) {
            runBatch();
          } else {
            resolve(results);
          }
        })
        .catch(reject);
    }

    runBatch();
  });
}
```
```javascript
// Example usage with progress tracking
throttlePromises(fetchPromises, 2, (completed, total) => {
  console.log(`Completed ${completed} out of ${total} batches.`);
})
  .then((results) => {
    results.forEach((result, index) => {
      if (result.error) {
        console.error(`Error fetching URL ${urls[index]}:`, result.error);
      } else {
        console.log(`Fetched data from ${urls[index]}:`, result);
      }
    });
  })
  .catch((error) => console.error('Batch execution failed:', error));
```
In this challenge:
- onProgress Callback: A callback function that receives the number of completed batches and the total number of batches, allowing for progress tracking.
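A pitfall interviewers often probe (per the list near the top): each batch waits for its slowest promise, so a batch with one slow operation leaves the other slots idle. A common follow-up is a sliding-window "pool" that starts a new task as soon as any slot frees up. This is a sketch, again assuming promise-returning tasks; `promisePool` is a hypothetical name.

```javascript
// Alternative to batching: keep up to `limit` tasks in flight at all
// times, starting a new task whenever one finishes, instead of
// waiting for a whole batch to drain.
async function promisePool(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0; // index of the next task to start

  async function worker() {
    // Each worker pulls the next unstarted task until none remain.
    while (next < tasks.length) {
      const index = next++;
      results[index] = await tasks[index]();
    }
  }

  // Spawn at most `limit` workers and wait for all of them to drain.
  await Promise.all(
    Array.from({ length: Math.min(limit, tasks.length) }, worker)
  );
  return results;
}
```

Unlike the batch approach, this keeps throughput steady when task durations vary, at the cost of losing the natural "batch completed" checkpoints used for progress reporting above.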