Learn generators from the ground up

Foundation

Before Generators

Before generators, if you wanted to make something that could be looped through (like an array), you had to write all the logic yourself.

You needed two main things:

First, you needed a next() function that would give you the next value each time you called it. This function had to remember its place in the sequence and know what value to return next.

Second, you needed some way to know when you were done, i.e. when there were no more values to give back.

Here's what this looked like in code:

function createCounter(max) {
  let count = 0;

  return {
    next() {
      if (count < max) {
        return { value: count++, done: false };
      }
      return { value: undefined, done: true };
    },
  };
}

const counter = createCounter(3);
console.log(counter.next()); // { value: 0, done: false }
console.log(counter.next()); // { value: 1, done: false }
console.log(counter.next()); // { value: 2, done: false }
console.log(counter.next()); // { value: undefined, done: true }

P.S. count works as state here because of closures.

This code shows the core concept: next() returns an object with two properties:

  • value: the current value

  • done: whether we've reached the end

This was the basic pattern, but it had problems:

  • You had to manually track state (count in this example)

  • You had to write a lot of boilerplate code

  • It was easy to make mistakes

  • The code was hard to read

This is why Symbol.iterator and generators were later introduced.

Introduction of Symbol.iterator specification

JavaScript needed a standard way for objects to say "here's how you loop through me." Symbol.iterator was introduced to solve this. It's a special property that tells JavaScript how something should be looped through.

Here's how it works in code:

const myObject = {
  data: [1, 2, 3],
  [Symbol.iterator]() {
    let index = 0;

    return {
      next: () => {
        if (index < this.data.length) {
          return { value: this.data[index++], done: false };
        }
        return { value: undefined, done: true };
      },
    };
  },
};

// Now we can loop through it
for (const value of myObject) {
  console.log(value); // 1, 2, 3
}

The key points:

  • Symbol.iterator is a method that returns an iterator object

  • The iterator object must have a next() method

  • next() returns the same {value, done} pattern we saw before

This standardization allowed arrays, strings, maps, and sets to work in the same way. It also enabled new features like for...of loops to work with any object that used this interface.

Before Symbol.iterator, different objects had their own ways to loop through them. Arrays used one way, strings used another, and custom objects had their own methods too. This standardization made everything clearer and more consistent.
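
You can see the shared interface by grabbing the iterator off a built-in yourself:

const stringIterator = "hi"[Symbol.iterator]();

console.log(stringIterator.next()); // { value: 'h', done: false }
console.log(stringIterator.next()); // { value: 'i', done: false }
console.log(stringIterator.next()); // { value: undefined, done: true }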

The iterator protocol's role in ES6

ES6 formalized how iteration should work in JavaScript. The protocol is actually simple. It's just a contract that says "if you want something to be iterable, here's what you need to do":

// The complete protocol in action
const collection = {
  data: ["a", "b", "c"],

  // This makes it iterable
  [Symbol.iterator]() {
    let index = 0;
    // Capture data here: inside next(), `this` would point at the
    // iterator object, not at collection
    const data = this.data;

    // This is the iterator itself
    return {
      // The next method is the core of the protocol
      next() {
        if (index < data.length) {
          return {
            value: data[index++],
            done: false,
          };
        }

        return {
          value: undefined,
          done: true,
        };
      },
    };
  },
};

The protocol has these key requirements:

  • An iterable must have a Symbol.iterator method

  • This method must return an iterator object

  • The iterator must have a next() method

  • next() must return objects with value and done properties

This protocol is what allows features like for...of, spread operator (...), and destructuring to work. They all look for Symbol.iterator and use the protocol under the hood.
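
For example, with the collection object above, all three of these features just work:

console.log([...collection]); // ['a', 'b', 'c']

const [first, ...rest] = collection;
console.log(first); // 'a'
console.log(rest); // ['b', 'c']

for (const value of collection) {
  console.log(value); // 'a', 'b', 'c'
}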

How for...of uses this protocol

When you write a for...of loop, JavaScript does the following under the hood:

const items = ["a", "b", "c"];

// When you write this:
for (const item of items) {
  console.log(item);
}

// JavaScript actually does this:
const iterator = items[Symbol.iterator]();
let result = iterator.next();

while (!result.done) {
  const item = result.value;
  console.log(item);
  result = iterator.next();
}

This is why for...of is different from for...in. The for...of loop uses the iterator protocol, while for...in just loops over property names.

A practical example showing the difference:

const arr = ["hello"];
arr.customProp = "world";

for (const x of arr) {
  console.log(x); // Only prints: hello
}

for (const x in arr) {
  console.log(x); // Prints: 0, customProp
}

That's actually all there is to how for...of works. It's just a loop that keeps calling next() until done is true.

Basic generator functions

Why they were introduced

Remember the iterator I wrote earlier?

Here it is again:

const collection = {
  data: ["a", "b", "c"],
  [Symbol.iterator]() {
    let index = 0;
    const data = this.data; // capture so next() can see it
    return {
      next() {
        if (index < data.length) {
          return {
            value: data[index++],
            done: false,
          };
        }
        return {
          value: undefined,
          done: true,
        };
      },
    };
  },
};

This code has several problems:

  • We have to manually track state (index)

  • We have to create the value/done objects ourselves

  • The logic for getting next values is mixed with the iteration mechanics

  • If we want to do something more complex than array access, the code gets messy fast

Generators were created to solve these exact problems.

Here's the same code as a generator:

function* createIterator() {
  yield "a";
  yield "b";
  yield "c";
}

That's it. The generator handles all the complexity we had to write manually before. The code is shorter, clearer, and less prone to bugs.
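
And because generator objects implement the iterator protocol themselves, everything built on top of it works out of the box:

for (const value of createIterator()) {
  console.log(value); // 'a', 'b', 'c'
}

console.log([...createIterator()]); // ['a', 'b', 'c']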

Generator function syntax and the function* declaration

The syntax for creating a generator -> just add a * after the function keyword:

// Generator function declaration
function* myGenerator() {
  // generator body
}

// Generator function expression
const myGeneratorExpr = function* () {
  // generator body
};

// In an object
const obj = {
  *myGenerator() {
    // generator body
  },
};

// In a class
class MyClass {
  *myGenerator() {
    // generator body
  }
}

When you call a generator function, it doesn't run the code inside.

Instead, it creates a generator object:

function* gen() {
  console.log("This doesn't run yet");
  yield 1;
}

const generator = gen(); // Nothing is logged
generator.next(); // Now 'This doesn't run yet' is logged

The function* syntax marks that this is a special kind of function that can pause its execution. The execution only starts when you call next() on the generator object.

Understanding the yield keyword

yield is the core of how generators work. It's a way to give a value back and pause the function right there. When you call next() again, the function continues from exactly where it left off:

function* numbers() {
  console.log("Starting");
  yield 1;
  console.log("After first yield");
  yield 2;
  console.log("After second yield");
  yield 3;
  console.log("Done");
}

const gen = numbers();

console.log(gen.next()); // Logs: Starting, then {value: 1, done: false}
console.log(gen.next()); // Logs: After first yield, then {value: 2, done: false}
console.log(gen.next()); // Logs: After second yield, then {value: 3, done: false}
console.log(gen.next()); // Logs: Done, then {value: undefined, done: true}

Some key points about yield:

  • It can only be used inside generator functions

  • Each yield creates a pause point

  • The function remembers all its variables between yields

  • When there are no more yields, done becomes true

This ability to pause and continue is what makes generators powerful. The function retains all its state between calls to next().

The generator object and its methods (next, return, throw)

A generator object has three main methods:

function* example() {
  try {
    yield 1;
    yield 2;
    yield 3;
  } catch (err) {
    console.log("Error caught:", err);
  }
}

const gen = example();

// next() -> moves to the next yield and returns {value, done}
console.log(gen.next()); // {value: 1, done: false}

// return() -> ends the generator early with a value
console.log(gen.return(10)); // {value: 10, done: true}
// After return(), the generator is done

// throw() -> throws an error into the generator
const gen2 = example();
gen2.next(); // Start the generator
gen2.throw(new Error("Something went wrong"));
// Logs: Error caught: Error: Something went wrong

The next() method is what you'll use most often. return() and throw() are special:

  • return() is like forcing the generator to end early

  • throw() lets you send errors into the generator that can be caught with try/catch

Each of these methods follows the same pattern: they all return an iterator result object with value and done properties.

The internal state machine concept

A generator function is like a mini-program inside a program. It goes through different states and remembers its position between each next() call.

function* login() {
  const username = yield "Enter username";
  const password = yield "Enter password";

  if (username === "admin" && password === "pass") {
    yield "Logged in";
  } else {
    yield "Failed";
  }
}

const loginProcess = login();

console.log(loginProcess.next()); // {value: 'Enter username', done: false}
console.log(loginProcess.next("admin")); // {value: 'Enter password', done: false}
console.log(loginProcess.next("pass")); // {value: 'Logged in', done: false}
console.log(loginProcess.next()); // {value: undefined, done: true}

Each yield marks a state:

  • The generator is either running, suspended at a yield, or done

  • Variables keep their values between states

  • You can't go backwards. Only forwards through the states

  • Each next() call moves us to the next state

This state machine behavior makes generators great for handling flows that need to happen in a specific order.

It's important to note that when you call next("admin"), the value "admin" is passed into the generator as the result of the previous yield. This allows for two-way communication between the generator and the outside world.


The key thing is that yield does two things:

  1. It gives a value out (what's after the yield keyword)

  2. It then WAITS/PAUSES to receive a value back (this becomes the result of the yield expression)

Let's break down that login example step by step:

function* login() {
  // 1. First next() starts execution and runs until this yield
  // yield gives out 'Enter username'
  // Then it PAUSES (important change in mental model), waiting for the next next() call
  const username = yield "Enter username";
  // When next('admin') is called, 'admin' becomes the result
  // of the entire `yield 'Enter username'` expression

  // 2. Execution continues until this yield
  // yield gives out 'Enter password'
  // Then it PAUSES again
  const password = yield "Enter password";
  // When next('pass') is called, 'pass' becomes the result
  // of the entire `yield 'Enter password'` expression

  if (username === "admin" && password === "pass") {
    yield "Logged in";
  } else {
    yield "Failed";
  }
}

This is why the first next() can't pass a value. There's no yield waiting to receive it yet. The values you pass always go to the previous yield (the point we paused at).
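
Because of this, a common pattern is to "prime" a generator: run it to the first yield before handing it to the caller. Here's a sketch (primed is just an illustrative helper, not a built-in):

function primed(genFn) {
  return (...args) => {
    const gen = genFn(...args);
    gen.next(); // advance to the first yield, discarding its value
    return gen;
  };
}

const loginFlow = primed(login)();
// The first prompt was consumed by priming, so we can send input immediately:
console.log(loginFlow.next("admin")); // {value: 'Enter password', done: false}
console.log(loginFlow.next("pass")); // {value: 'Logged in', done: false}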

Suspended execution state

When a generator runs, it can be in three states: running, suspended, or done. The key part is the suspended state.

This is unique to generators:

function* example() {
  console.log("Starting"); // State: running
  yield 1; // State: suspended
  console.log("Middle"); // State: running
  yield 2; // State: suspended
  console.log("End"); // State: running
} // State: done

const gen = example();
// Generator is created but hasn't started running

gen.next();
// Logs 'Starting'
// Runs until yield 1, then suspends

gen.next();
// Logs 'Middle'
// Runs until yield 2, then suspends

gen.next();
// Logs 'End'
// Runs to completion, now done

When suspended, the generator:

  • Stops executing

  • Remembers exactly where it stopped

  • Holds onto all its variables and their values

  • Waits for the next next() call to resume

This suspension is what makes generators different from regular functions. They can pause in the middle of execution.

Stack frame preservation

When a regular function runs, it pushes its data (local variables, position in code) onto a stack and pops it off when done. But generators need to keep this data between yields.

Here's how it works:

function* counter() {
  let count = 1;
  let sum = 0;

  while (count <= 3) {
    sum += count;
    yield sum; // Local variables (count, sum) are preserved here
    count++; // When we return, count still has its value
  }
}

const gen = counter();
console.log(gen.next()); // {value: 1, done: false}  (sum = 1, count = 1)
console.log(gen.next()); // {value: 3, done: false}  (sum = 3, count = 2)
console.log(gen.next()); // {value: 6, done: false}  (sum = 6, count = 3)
console.log(gen.next()); // {value: undefined, done: true}

When the generator yields:

  • All local variables stay intact

  • The position in the while loop is remembered

  • Even a chain of suspended frames is preserved when generators delegate to nested generators (with yield*, covered later)

This is different from regular functions where variables are gone once the function returns.

How yield pauses execution

Let's see exactly what happens when a yield statement is hit:

function* process() {
  console.log("Step 1");
  yield "first";

  console.log("Step 2");
  const value = yield "second";

  console.log("Got value:", value);
  yield "third";
}

const gen = process();

// Nothing runs yet -> generator is created but suspended before first line
console.log("Before first next");

// First next -> runs until first yield
gen.next(); // Logs: Step 1
// Returns: {value: 'first', done: false}
// Pauses before Step 2

// Second next -> runs until second yield
gen.next(); // Logs: Step 2
// Returns: {value: 'second', done: false}
// Pauses before logging value

// Third next -> runs until third yield
gen.next("passed in"); // Logs: Got value: passed in
// Returns: {value: 'third', done: false}

When yield happens:

  • The current line finishes executing

  • The value after yield is returned in the value property

  • The code completely stops. No other lines run

  • The generator waits for another next() call

The pause is complete: no background work happens inside the generator, and nothing moves it forward except the next next() call. It's a true pause.

The bidirectional communication channel

The yield keyword creates a two-way communication channel between the code running the generator and the generator itself.

Values can flow both ways:

function* twoWayTalk() {
  // Out: 'hello', In: 'first response'
  const response1 = yield "hello";
  console.log("Got:", response1);

  // Out: 'goodbye', In: 'second response'
  const response2 = yield "goodbye";
  console.log("Got:", response2);
}

const gen = twoWayTalk();

// Generator -> Outside: send 'hello'
console.log(gen.next()); // {value: 'hello', done: false}

// Outside -> Generator: send 'first response'
console.log(gen.next("first response")); // Logs: Got: first response
// Returns: {value: 'goodbye', done: false}

// Outside -> Generator: send 'second response'
console.log(gen.next("second response")); // Logs: Got: second response
// Returns: {value: undefined, done: true}

This two-way flow makes generators powerful for:

  • Processing data where each step needs input from outside

  • Creating dialogues between different parts of code

  • Building state machines that need external input
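
Here's a small sketch putting that two-way flow to work: a running-average generator that receives samples through next() and yields the updated average:

function* runningAverage() {
  let total = 0;
  let count = 0;
  let average;

  while (true) {
    const value = yield average; // receive the next sample
    total += value;
    count++;
    average = total / count;
  }
}

const avg = runningAverage();
avg.next(); // prime it: run to the first yield (no value to pass yet)

console.log(avg.next(10).value); // 10
console.log(avg.next(20).value); // 15
console.log(avg.next(30).value); // 20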

Understanding the done flag

The done flag tells us if a generator has finished running.

It becomes true in two situations:

function* example() {
  yield 1;
  yield 2;
  // After this point, done becomes true
  // because we run out of yields
}

function* example2() {
  yield 1;
  return 2;
  // done becomes true immediately after return
  yield 3; // This will never be reached
}

const gen1 = example();
console.log(gen1.next()); // {value: 1, done: false}
console.log(gen1.next()); // {value: 2, done: false}
console.log(gen1.next()); // {value: undefined, done: true}

const gen2 = example2();
console.log(gen2.next()); // {value: 1, done: false}
console.log(gen2.next()); // {value: 2, done: true}
console.log(gen2.next()); // {value: undefined, done: true}

The done flag works like this:

  • It's false as long as there are more yields to run

  • It becomes true when the generator finishes naturally

  • It becomes true immediately when return is called

  • Once true, it stays true for all future next() calls

Advanced generator concepts

yield* delegation operator

The yield* operator allows one generator to pass control to another generator.

It's a way to combine generators:

function* numbers() {
  yield 1;
  yield 2;
}

function* letters() {
  yield "a";
  yield "b";
}

function* combined() {
  yield* numbers();
  yield* letters();
}

const gen = combined();
console.log(gen.next()); // {value: 1, done: false}
console.log(gen.next()); // {value: 2, done: false}
console.log(gen.next()); // {value: 'a', done: false}
console.log(gen.next()); // {value: 'b', done: false}
console.log(gen.next()); // {value: undefined, done: true}

yield* works by:

  • Taking control of next() calls until the sub-generator is done

  • Passing through all values from the sub-generator

  • Continuing with the main generator when the sub-generator is done

This is useful for breaking down complex generators into smaller pieces that can be combined.

Passing values back to generators

I mentioned this before, but let's explore how you can use these passed values in more practical ways:

function* calculate() {
  const firstNumber = yield "Enter first number";
  const operation = yield "Enter + or -";
  const secondNumber = yield "Enter second number";

  if (operation === "+") {
    yield firstNumber + secondNumber;
  } else if (operation === "-") {
    yield firstNumber - secondNumber;
  } else {
    yield "Invalid operation";
  }
}

const calc = calculate();
console.log(calc.next()); // {value: 'Enter first number', done: false}
console.log(calc.next(10)); // {value: 'Enter + or -', done: false}
console.log(calc.next("+")); // {value: 'Enter second number', done: false}
console.log(calc.next(5)); // {value: 15, done: false}

The key points about passing values:

  • The value you pass in next() becomes the result of the previous yield

  • The first next() call can't receive a value because no yield is waiting

  • If you pass a value but the generator isn't yielding, the value is ignored

This pattern is great for building interactive flows where each step depends on previous input.

Error handling in generators

Generators have a special way of handling errors using try/catch and the throw() method:

function* divideNumbers() {
  try {
    const first = yield "First number";
    const second = yield "Second number";

    if (second === 0) {
      throw new Error("Cannot divide by zero");
    }

    yield first / second;
  } catch (error) {
    // catch and yield the error message
    yield "Error: " + error.message;
  }
}

const div = divideNumbers();
console.log(div.next()); // {value: 'First number', done: false}
console.log(div.next(10)); // {value: 'Second number', done: false}
console.log(div.next(0)); // {value: 'Error: Cannot divide by zero', done: false}

// You can also throw errors from outside:
const div2 = divideNumbers();
div2.next(); // {value: 'First number', done: false}
// Will be caught by the catch block inside the generator
div2.throw(new Error("Oops")); // {value: 'Error: Oops', done: false}

Error handling in generators works like this:

  • You can use try/catch inside the generator

  • throw() lets you send errors into the generator from outside

  • Uncaught errors stop the generator and set done to true

  • The error handling flows through yield* delegation

Clarification with yield* and errors

function* subGenerator() {
  try {
    yield "sub 1";
    yield "sub 2";
  } catch (err) {
    yield "caught in sub: " + err.message;
  }
}

function* mainGenerator() {
  yield "main 1";
  yield* subGenerator();
  yield "main 2";
}

const gen = mainGenerator();
console.log(gen.next()); // {value: 'main 1', done: false}
console.log(gen.next()); // {value: 'sub 1', done: false}
console.log(gen.throw(new Error("boom"))); // {value: 'caught in sub: boom', done: false}

When you use yield*, errors are sent to wherever the execution currently is. In this case, when we call throw(), even though we called it on mainGenerator, the error is caught in subGenerator's try/catch because that's where the execution was at that moment.

This means error handling "flows" through to wherever yield* has delegated to. You don't need special code to make this work -> it just follows where the execution is.

Generator composition patterns

Generator composition is about combining generators in useful ways.

Here are key patterns:

// Pattern 1: Sequential composition
function* firstPart() {
  yield 1;
  yield 2;
}

function* secondPart() {
  yield 3;
  yield 4;
}

function* sequential() {
  yield* firstPart();
  yield* secondPart();
}

// Pattern 2: Nested composition
function* outer() {
  yield "start";
  yield* inner();
  yield "end";
}

function* inner() {
  yield* [1, 2, 3]; // yield* works with any iterable
}

// Pattern 3: Transform composition
function* numbers() {
  yield* [1, 2, 3];
}

function* doubled() {
  for (const num of numbers()) {
    yield num * 2;
  }
}

const gen = doubled();
console.log(gen.next()); // {value: 2, done: false}
console.log(gen.next()); // {value: 4, done: false}
console.log(gen.next()); // {value: 6, done: false}

These patterns let you:

  • Break complex generators into simpler pieces

  • Reuse generator logic

  • Transform values as they flow through generators

yield* on value directly

When you use yield* on a value directly, it only works if that value is iterable.

It essentially means "take this iterable thing and yield each of its values":

function* example() {
  // Arrays are iterable
  yield* [1, 2, 3];
  // Same as:
  // yield 1
  // yield 2
  // yield 3

  // Strings are iterable
  yield* "hi";
  // Same as:
  // yield "h"
  // yield "i"

  // Sets are iterable
  yield* new Set([4, 5]);
  // Same as:
  // yield 4
  // yield 5
}

const gen = example();
console.log(gen.next()); // {value: 1, done: false}
console.log(gen.next()); // {value: 2, done: false}
console.log(gen.next()); // {value: 3, done: false}
console.log(gen.next()); // {value: "h", done: false}
console.log(gen.next()); // {value: "i", done: false}
console.log(gen.next()); // {value: 4, done: false}
console.log(gen.next()); // {value: 5, done: false}

If you try to use yield* on something that isn't iterable, you'll get an error:

function* wrong() {
  yield* 42; // TypeError: 42 is not iterable
}
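
Note that "iterable" includes your own objects too. The collection object from earlier works with yield* because it implements Symbol.iterator:

function* fromCollection() {
  yield* collection; // any object with Symbol.iterator works here
}

console.log([...fromCollection()]); // ['a', 'b', 'c']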

Infinite sequences and lazy evaluation

This is where generators really shine.

They can represent infinite sequences without actually creating all the values at once:

function* infiniteNumbers() {
  let n = 0;
  while (true) {
    yield n++;
  }
}

function* evenNumbers() {
  let n = 0;
  while (true) {
    yield n;
    n += 2;
  }
}

const numbers = infiniteNumbers();
console.log(numbers.next()); // {value: 0, done: false}
console.log(numbers.next()); // {value: 1, done: false}
console.log(numbers.next()); // {value: 2, done: false}
// Could go on forever, but only calculates values when we ask for them

// We can also compose with infinite generators
function* take(n, generator) {
  for (let i = 0; i < n; i++) {
    const result = generator.next();
    if (result.done) return; // stop early if the source runs out
    yield result.value;
  }
}

// Get first 3 even numbers
const first3Even = take(3, evenNumbers());
console.log([...first3Even]); // [0, 2, 4]

The key points about lazy evaluation:

  • Values are only calculated when requested

  • Memory usage stays constant

  • You can work with theoretically infinite sequences

  • You can transform infinite sequences without running forever
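
To make that last point concrete, here's a sketch of a lazy map helper (map is my name for it, not a built-in) layered over the infinite evenNumbers generator:

function* map(iterable, fn) {
  for (const value of iterable) {
    yield fn(value); // computed only when someone asks
  }
}

// Squares of the even numbers -> still fully lazy:
const squaredEvens = map(evenNumbers(), (n) => n * n);
console.log(squaredEvens.next()); // {value: 0, done: false}
console.log(squaredEvens.next()); // {value: 4, done: false}
console.log(squaredEvens.next()); // {value: 16, done: false}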

first3Even and calling .next() on it

const first3Even = take(3, evenNumbers());

console.log(first3Even.next()); // {value: 0, done: false}
console.log(first3Even.next()); // {value: 2, done: false}
console.log(first3Even.next()); // {value: 4, done: false}
console.log(first3Even.next()); // {value: undefined, done: true}

The [...first3Even] syntax used before is just a convenient way to collect all values at once using the spread operator. It works because generators are iterable.

Both approaches do the same thing. The spread operator is just calling next() under the hood until done is true and collecting all the values in an array.
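
Roughly, [...first3Even] expands to something like this (using a fresh generator, since the one above is already exhausted):

const fresh = take(3, evenNumbers());
const values = [];
let result = fresh.next();

while (!result.done) {
  values.push(result.value);
  result = fresh.next();
}

console.log(values); // [0, 2, 4]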

Memory Management Deep Dive

Let's look at how generators handle memory differently from regular functions:

// Regular function -> stack frame is created and destroyed on each call
function regular() {
  const x = 1;
  const y = 2;
  return x + y;
  // All variables are gone after return
}

// Generator -> stack frame is suspended and preserved
function* generator() {
  const x = 1;
  yield x; // Stack frame is suspended here
  const y = 2; // y doesn't exist until we resume
  yield x + y; // Stack frame is suspended again
}

const gen = generator();
console.log(gen.next()); // {value: 1, done: false}
// At this point:
// - x exists and equals 1
// - y hasn't been created yet
// - Stack frame is preserved with this state

console.log(gen.next()); // {value: 3, done: false}
// Now:
// - x still exists and equals 1
// - y exists and equals 2
// - Stack frame preserves both variables

The key differences are:

  • Regular functions create and destroy their whole stack frame each time

  • Generators suspend their stack frame between yields

  • Variables in generators only get created when the code that declares them runs

  • The JavaScript engine needs to store more information for generators to keep this state


Memory Cleanup When Abandoned:

function* longProcess() {
  const bigData = new Array(1000000);
  yield "step 1";

  yield "step 2";
  // bigData stays alive as long as the generator object is reachable,
  // because the suspended frame still references it
}

const gen = longProcess();
gen.next(); // Creates bigData
// If we never call next() again and drop every reference
// (e.g. gen = null), the generator and bigData become
// eligible for garbage collection

Generator Object Lifecycle:

function* generator() {
  try {
    yield 1;
    yield 2;
  } finally {
    // This runs when the generator completes or when return()
    // is called (it does not run on garbage collection)
    // Good when you need to clean up resources
    console.log("Cleanup");
  }
}

const gen = generator();
gen.next(); // {value: 1, done: false}
gen.return(42); // Logs: Cleanup, returns: {value: 42, done: true}

Garbage Collection Rules:

function* example() {
  const state = { data: "important" };
  while (true) {
    yield state;
  }
}

let gen = example();
gen.next(); // Generator is active
gen = null; // Generator becomes eligible for GC

The key points for all three:

  • Generators clean up their resources when they're abandoned or completed

  • The finally block can handle cleanup tasks

  • A generator becomes eligible for garbage collection when:

    • It completes naturally (done: true)

    • It's terminated with return()

    • All references to it are gone (e.g. gen = null)

  • Variables inside the generator stay in memory as long as the generator is alive
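
One detail worth knowing: for...of calls return() on the iterator automatically when you exit the loop early, so finally blocks still run:

function* withCleanup() {
  try {
    yield 1;
    yield 2;
  } finally {
    console.log("Cleanup");
  }
}

for (const value of withCleanup()) {
  break; // triggers return() -> logs: Cleanup
}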

Async generators

Symbol.asyncIterator protocol

Just like Symbol.iterator is for normal iteration, Symbol.asyncIterator is for async iteration.

The key difference is that it works with Promises:

const asyncIterable = {
  [Symbol.asyncIterator]() {
    let count = 0;
    return {
      // This is async because it returns a Promise
      async next() {
        if (count < 3) {
          // Simulate async work
          await new Promise((resolve) => setTimeout(resolve, 1000));
          return { value: count++, done: false };
        }
        return { value: undefined, done: true };
      },
    };
  },
};

// Using it:
async function run() {
  for await (const num of asyncIterable) {
    console.log(num); // Logs 0, 1, 2 with 1 second delay between each
  }
}

The protocol requires:

  • A [Symbol.asyncIterator]() method

  • The next() method must return a Promise

  • That Promise resolves to an object with value and done
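
You can also drive the async iterator by hand, which is exactly what for await...of does for you:

const iterator = asyncIterable[Symbol.asyncIterator]();

iterator.next().then((result) => {
  console.log(result); // after ~1s: { value: 0, done: false }
});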

async function* syntax

Async generators combine both async/await and generator syntax.

They let you yield Promises naturally:

async function* asyncNumbers() {
  yield 1;
  await new Promise((resolve) => setTimeout(resolve, 1000));
  yield 2;
  await new Promise((resolve) => setTimeout(resolve, 1000));
  yield 3;
}

// Using it (inside an async function, or a module with top-level await):
const gen = asyncNumbers();
console.log(await gen.next()); // {value: 1, done: false}
console.log(await gen.next()); // After 1s: {value: 2, done: false}
console.log(await gen.next()); // After 1s: {value: 3, done: false}

// next() returns promises now:
const promise = gen.next(); // Promise<{value: any, done: boolean}>

Key points:

  • next() returns Promises that resolve to {value, done}

  • You can use await inside the generator

  • You can yield both normal values and Promises

  • The values get automatically unwrapped when using for await...of

for await...of loops

for await...of is specifically designed to work with async iterables and async generators.

It handles all the Promise unwrapping for us:

async function* fetchUrls() {
  const urls = ["data1", "data2", "data3"];

  for (const url of urls) {
    // Simulate fetch
    const data = await new Promise((resolve) =>
      setTimeout(() => resolve(`Result from ${url}`), 1000)
    );
    yield data;
  }
}

// Using for await...of
async function processUrls() {
  for await (const result of fetchUrls()) {
    console.log(result);
    // Logs each result with 1s delay:
    // "Result from data1"
    // "Result from data2"
    // "Result from data3"
  }
}

// This won't work as top-level code in a classic script. It must be in
// an async function (or an ES module, which allows top-level await):
// for await (const result of fetchUrls()) {}  // SyntaxError in a script

Key points:

  • Must be used inside an async function

  • Waits for each Promise to resolve before moving to next iteration

  • Works with any async iterable, not just async generators

More code samples

// This will fail -> top-level code in a classic script
// (ES modules are the exception: they allow top-level await)
for await (const result of fetchUrls()) {
} // SyntaxError

// This works -> inside async function
async function doWork() {
  for await (const result of fetchUrls()) {
  }
}

// This works too -> inside async arrow function
const doWork = async () => {
  for await (const result of fetchUrls()) {
  }
};

// This works -> inside async IIFE (Immediately Invoked Function Expression)
(async () => {
  for await (const result of fetchUrls()) {
  }
})();

Combining with Promises

Async generators work well with other Promise-based code.

They can mix regular yields with Promise operations:

async function* fetchInBatches() {
  const urls = ["users", "posts", "comments"];

  for (const url of urls) {
    // Multiple async operations per yield
    const data = await fetch(url);
    const processed = await processData(data); // processData: your own logic, defined elsewhere
    yield processed;

    // You can also yield Promises directly
    yield Promise.resolve("batch done");
  }
}

// Can be combined with Promise.all
async function* parallelFetch() {
  const urls = [
    ["user1", "user2"],
    ["post1", "post2"],
  ];

  for (const batch of urls) {
    // Fetch batch in parallel
    const results = await Promise.all(batch.map((url) => fetch(url)));
    yield results;
  }
}

// Can handle Promise rejections
async function* retryFetch() {
  while (true) {
    try {
      const data = await fetch("url");
      yield data;
      break;
    } catch {
      yield "retrying...";
      await new Promise((r) => setTimeout(r, 1000));
    }
  }
}

Error handling in async contexts

Error handling in async generators combines both async/await error handling and generator error handling:

async function* riskyGenerator() {
  try {
    const data = await fetch("some-url");
    yield data;
  } catch (error) {
    yield "fetch failed";
  }

  try {
    // Generator's throw() method works too
    const value = yield "continue?";
    if (!value) {
      throw new Error("stopped");
    }
  } catch (error) {
    yield "generator stopped";
  }
}

// Using it with error handling
async function run() {
  const gen = riskyGenerator();

  try {
    // Handle Promise rejections
    const result = await gen.next();

    // Handle generator throws
    await gen.throw(new Error("external error"));

    // Both await and yield can throw
    for await (const value of gen) {
      console.log(value);
    }
  } catch (error) {
    console.log("Caught:", error);
  }
}

Key points about error handling:

  • Both await and yield can throw errors

  • try/catch works with both types of errors

  • The throw() method still works like regular generators

  • Unhandled errors in for await...of stop the loop
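
To see that last point in action, an error thrown inside the async generator rejects the pending next() Promise and stops the loop:

async function* failing() {
  yield 1;
  throw new Error("broken");
}

(async () => {
  try {
    for await (const value of failing()) {
      console.log(value); // 1
    }
  } catch (error) {
    console.log("Loop stopped:", error.message); // Loop stopped: broken
  }
})();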

More notes

When you call gen.throw(), the error appears at the current yield point inside the generator.

async function* example() {
  try {
    console.log("Starting");
    const a = yield 1; // Error appears HERE if we throw after the first next()
    console.log("After first yield");
    const b = yield 2; // Error would appear here if we threw after the second next()
    console.log("Never gets here");
  } catch (error) {
    console.log("Caught inside:", error.message);
    yield "recovered";
  }
}

async function run() {
  const gen = example();

  console.log(await gen.next()); // Logs: Starting
  // Returns: {value: 1, done: false}

  console.log(await gen.throw(new Error("boom")));
  // The error is injected at `const a = yield 1`, the current
  // yield point, so "After first yield" never logs.

  // This log comes from the catch block in the generator:
  // Logs: Caught inside: boom
  // Returns: {value: 'recovered', done: false}
}