Section outline

      • 17.1 Why Is Generating Language Hard? (Goal: Understand why getting computers to generate human-like text is more challenging than simply reading or classifying text.)
      • 17.2 Recurrent Neural Networks (RNNs) – Memory in Sequences (Goal: Learn how RNNs enable deep networks to generate sequences by keeping a “memory” of previous inputs.)
      • 17.3 Translating Languages with Deep Networks (Goal: Discover how deep learning approaches (sequence-to-sequence models) revolutionized machine translation, achieving near human-level translation quality.)
      • 17.4 AI Storytelling – Generating Text Creatively (Goal: See how AI can generate creative text by predicting one piece at a time, illustrated by a game of progressive story expansion.)
      • 17.5 The Rise of Powerful Language Models (Goal: Introduce modern advanced language models (like Transformer-based models) that can generate remarkably human-like text, and discuss their impact.)