27. Async Patterns & Advanced Fetch Usage
Why Advanced Async Patterns Matter
Modern apps often run many asynchronous operations at once. Understanding patterns beyond a single basic fetch lets you orchestrate complex flows, improve performance, and keep your code readable and maintainable.
Parallel vs Sequential Fetching
Fetching multiple resources can be done sequentially (one after the other) or in parallel (all at once). Parallel fetching is usually faster, but sequential can be necessary when one call depends on another.
// Sequential fetching
async function fetchSequential() {
  const res1 = await fetch('https://jsonplaceholder.typicode.com/users');
  const users = await res1.json();
  // The second request depends on data returned by the first
  const res2 = await fetch(`https://jsonplaceholder.typicode.com/posts?userId=${users[0].id}`);
  const posts = await res2.json();
  console.log(users[0], posts);
}

// Parallel fetching
async function fetchParallel() {
  // Both requests start immediately; Promise.all waits until both have resolved
  const [usersRes, postsRes] = await Promise.all([
    fetch('https://jsonplaceholder.typicode.com/users'),
    fetch('https://jsonplaceholder.typicode.com/posts')
  ]);
  const users = await usersRes.json();
  const posts = await postsRes.json();
  console.log(users[0], posts.slice(0, 3));
}
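`Promise.all` rejects as soon as any one request fails. When the calls are independent and you want each result (or failure) handled on its own, `Promise.allSettled` is worth knowing. A minimal sketch, reusing the same JSONPlaceholder endpoints (the fetchIndependently name is just for illustration):

// Parallel fetching where each request succeeds or fails independently
async function fetchIndependently() {
  const results = await Promise.allSettled([
    fetch('https://jsonplaceholder.typicode.com/users').then(res => res.json()),
    fetch('https://jsonplaceholder.typicode.com/posts').then(res => res.json())
  ]);
  results.forEach((result, i) => {
    if (result.status === 'fulfilled') {
      console.log(`Request ${i + 1} succeeded:`, result.value.length, 'items');
    } else {
      console.error(`Request ${i + 1} failed:`, result.reason);
    }
  });
}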
Timeouts and AbortController
Sometimes requests take too long. `AbortController` lets you cancel fetch requests if they exceed a timeout or if a user navigates away.
const controller = new AbortController();
// Abort the request if it has not completed within 5 seconds
const timeout = setTimeout(() => controller.abort(), 5000);
try {
  const res = await fetch('https://jsonplaceholder.typicode.com/users', { signal: controller.signal });
  const data = await res.json();
  console.log(data);
} catch (err) {
  if (err.name === 'AbortError') {
    console.error('Fetch aborted due to timeout');
  } else {
    console.error('Fetch error:', err);
  }
} finally {
  clearTimeout(timeout);
}
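If you need this in more than one place, it helps to wrap the pattern in a small helper. The sketch below is one way to do it (the fetchWithTimeout name and the 5000 ms default are our own choices); in newer browsers and Node versions you can also pass `AbortSignal.timeout(ms)` as the signal instead of wiring up the controller yourself.

// A reusable timeout wrapper around fetch (minimal sketch)
async function fetchWithTimeout(url, ms = 5000, options = {}) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    const res = await fetch(url, { ...options, signal: controller.signal });
    return await res.json();
  } finally {
    clearTimeout(timer);
  }
}

// Usage: give up on the users request after 3 seconds
const users = await fetchWithTimeout('https://jsonplaceholder.typicode.com/users', 3000);
console.log(users.length, 'users');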
Rate-Limiting & Throttling API Calls
Some APIs have rate limits. Implement throttling or batching to avoid hitting the limit.
async function fetchWithThrottle(urls, limit = 3) {
  const results = [];
  for (let i = 0; i < urls.length; i += limit) {
    // Start up to `limit` requests at once, then wait for the whole batch before continuing
    const batch = urls.slice(i, i + limit).map(url => fetch(url).then(res => res.json()));
    results.push(...await Promise.all(batch));
  }
  return results;
}
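A quick usage sketch, assuming a handful of JSONPlaceholder post URLs: each batch of three runs in parallel, and the next batch only starts once the previous one has finished. Note that a single failed request rejects the whole `Promise.all` batch, so switching to `Promise.allSettled` inside the loop is one option if you want to keep the other results.

// Usage sketch: six post URLs fetched three at a time
const postUrls = [1, 2, 3, 4, 5, 6].map(id => `https://jsonplaceholder.typicode.com/posts/${id}`);
fetchWithThrottle(postUrls).then(posts => console.log(posts.length, 'posts loaded'));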
Retrying & Backoff Strategies
Retrying failed requests with exponential backoff improves your success rate while spacing the retries out, so you do not pile extra load on an API that is already struggling.
async function fetchWithBackoff(url, retries = 3, delay = 1000) {
  for (let i = 0; i < retries; i++) {
    try {
      const res = await fetch(url);
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return await res.json();
    } catch (err) {
      if (i === retries - 1) break; // Don't wait after the final attempt
      console.warn(`Attempt ${i + 1} failed. Retrying in ${delay}ms...`);
      await new Promise(r => setTimeout(r, delay));
      delay *= 2; // Exponential backoff: 1s, 2s, 4s, ...
    }
  }
  throw new Error('All retries failed');
}
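A usage sketch against the same JSONPlaceholder endpoint. In production you would often add a small random jitter to the delay so that many clients retrying at once do not all hit the API at the same moment.

// Usage sketch: three attempts, starting with a 1-second delay
fetchWithBackoff('https://jsonplaceholder.typicode.com/users')
  .then(users => console.log('Loaded', users.length, 'users'))
  .catch(err => console.error(err.message));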
Streaming & Progressive Data Handling
Sometimes APIs provide streaming data or large payloads. Using streams prevents memory overload and allows processing data as it arrives.
async function streamData(url) {
  const res = await fetch(url);
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let result;
  while (!(result = await reader.read()).done) {
    // Decode each chunk as it arrives instead of waiting for the full payload
    const chunk = decoder.decode(result.value, { stream: true });
    console.log('Chunk received:', chunk);
  }
}
streamData('https://example.com/large-data');
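If the payload has structure, you can go one step further and hand each record to the rest of the app as soon as it is complete. The sketch below assumes a hypothetical endpoint that returns newline-delimited JSON (NDJSON); the URL, the streamJsonLines name, and the onRecord callback are all placeholders.

// Progressive handling sketch: parse newline-delimited JSON records as they stream in
async function streamJsonLines(url, onRecord) {
  const res = await fetch(url);
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffered = '';
  let result;
  while (!(result = await reader.read()).done) {
    buffered += decoder.decode(result.value, { stream: true });
    const lines = buffered.split('\n');
    buffered = lines.pop(); // Keep the last, possibly incomplete line for the next chunk
    for (const line of lines) {
      if (line.trim()) onRecord(JSON.parse(line));
    }
  }
  if (buffered.trim()) onRecord(JSON.parse(buffered)); // Flush anything left in the buffer
}

streamJsonLines('https://example.com/ndjson-feed', record => console.log('Record:', record));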
Mini Challenge
1. Fetch data from three different endpoints in parallel, but handle each one independently.
2. Implement a timeout for each request and cancel it if it takes too long.
3. Add retry logic with exponential backoff for failed requests.
4. Display each piece of data as soon as it arrives using streaming or progressive updates.