Embodied AI Lab - EECS 106B

Class Type: Robotic Manipulation and Interaction

Will AI robots take over the world? Who knows! ¯\_(ツ)_/¯ Let's drop the dramatic fiction and see what's actually driving these machines 🤖.

Authors:

Presentations

Project Source

Table of Contents

  1. Introduction
  2. Course Context
  3. Learning Goals
  4. Student Assignment
  5. Instructor Guides
  6. Requirements

Introduction

The idea of AI robots developing their own brains and taking over the world is a dramatic storyline, not a reality. Our project cuts through this fear by giving students direct, hands-on experience with embodied AI, showing them exactly what is happening “under the hood.” We demonstrate that the core technology is an evolution of classic engineering, linking traditional controls and robot research architectures to the new AI landscape. Students start with fundamental techniques like teleoperation (remote control) and simple intelligence built on nearest-neighbor computation, in which the robot finds the closest match to the current situation in its memory to decide its next move. This forms the essential base for understanding how modern robots achieve intelligent behavior without needing independent consciousness.
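To make the nearest-neighbor idea concrete, here is a minimal sketch (not taken from the lab materials; `demo_states`, `demo_actions`, and the toy numbers are illustrative) of a policy that simply replays the action recorded in the most similar remembered state:

```python
import numpy as np

def nearest_neighbor_policy(current_state, demo_states, demo_actions):
    """Return the recorded action of the demo state closest to current_state.

    demo_states:  (N, D) array of states collected during teleoperation
    demo_actions: (N, A) array of the actions taken in those states
    """
    # Euclidean distance from the current state to every remembered state
    distances = np.linalg.norm(demo_states - current_state, axis=1)
    # Act as the demonstrator did in the most similar past situation
    return demo_actions[np.argmin(distances)]

# Toy usage: three remembered 2-D states, each paired with a 1-D action
demo_states = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]])
demo_actions = np.array([[0.1], [0.5], [-0.2]])
print(nearest_neighbor_policy(np.array([0.9, 1.1]), demo_states, demo_actions))  # [0.5]
```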

Moving beyond the basics, we integrate powerful new tools like Vision-Language Models (VLMs), which let the robot understand the world by connecting what it sees with language (e.g., locating an object based on a spoken command). The VLM’s output then drives action through Vision-Language-Action (VLA) models. A key focus is ensuring the safety of VLA outputs: since these systems can generate complex, novel robot actions, we emphasize building robust safety protocols and constraints.
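As a rough illustration of that pipeline (every function name below is a hypothetical placeholder, not a real VLM or VLA API, and the workspace bounds are assumed values), a spoken command might flow through the system like this, with a simple workspace clamp acting as the safety constraint on the generated action:

```python
import numpy as np

# Assumed workspace bounds in meters; an illustrative safety constraint
WORKSPACE_MIN = np.array([-0.5, -0.5, 0.0])
WORKSPACE_MAX = np.array([0.5, 0.5, 0.4])

def clamp_to_workspace(target_position):
    """Project a commanded end-effector position back inside the workspace."""
    return np.clip(target_position, WORKSPACE_MIN, WORKSPACE_MAX)

def execute_command(image, command, query_vlm, vla_policy):
    """Hypothetical VLM -> VLA -> safety-check flow (placeholder callables)."""
    location = query_vlm(image, command)               # ground language in vision
    raw_target = vla_policy(image, command, location)  # propose a robot action
    safe_target = clamp_to_workspace(raw_target)       # never leave the workspace
    if not np.allclose(raw_target, safe_target):
        print("Safety clamp modified the proposed action")
    return safe_target

# Toy stand-ins for real models, just to show the flow
dummy_vlm = lambda image, command: (120, 80)                      # pixel location
dummy_vla = lambda image, command, loc: np.array([0.7, 0.0, 0.2])
print(execute_command(None, "pick up the red block", dummy_vlm, dummy_vla))
```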

Course Context

EECS 106B Robotic Manipulation and Interaction introduces students to advanced topics and research in robotics and intelligent machines, including kinematics & control, obstacle avoidance & computer vision, manipulation, active vision, and reinforcement learning. Students are expected to have previously taken EECS 106A Introduction to Robotics, to have a strong programming background with knowledge of Python and MATLAB, and to have some prior coursework in feedback controls.

Learning Goals

This lab aims to add an introduction to embodied AI to the existing research topics covered in EECS 106B. Through the lab, students will:

  1. Develop an abstracted mental model of imitation learning for a pick-and-place task (see the sketch after this list).
  2. Understand the inputs and outputs of VLMs, and compare different VLM implementations.
  3. Apply VLM intuition to a robot use case, introducing VLAs.
  4. Relate GenAI concepts to traditional robotics, and demonstrate understanding of GenAI limitations and safety.
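
For goal 1, a minimal behavior-cloning sketch (assuming scikit-learn is available; the dataset shapes and variables are illustrative, not the lab's actual data) shows the core idea of fitting a supervised model to demonstration data:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative dataset: states and actions logged during teleoperated
# pick-and-place demonstrations (shapes are placeholders)
states = np.random.rand(500, 6)   # e.g., gripper pose + object position
actions = np.random.rand(500, 3)  # e.g., commanded end-effector velocity

# Behavior cloning: supervised regression from states to actions
policy = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
policy.fit(states, actions)

# At deployment, the learned policy proposes an action for an unseen state
new_state = np.random.rand(1, 6)
print(policy.predict(new_state))
```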

Student Assignment

This lab is divided into two parts. To begin:

  1. Navigate to Part 1, read through the README, and walk through the provided Imitation Learning Jupyter notebook.
  2. Continue to Part 2, read the second README, and walk through the provided Jupyter notebooks, starting with the Introduction to ChatGPT.
  3. Once you’ve explored all of the notebooks, complete the Final Lab Checkoff with a lab TA and your lab partner.

Instructor Guides

⚠️ Access required: The grading guidelines for the lab's final checkoff are available here for course staff. Please make a copy of the document before customizing the guidelines.

Requirements

  • Software requirements: Python / Jupyter Notebook
  • Suggested software: Google Colab
  • Classroom logistics: synchronous lab time with TAs; lab partners (pairs of students); at least one TA per 2-3 pairs of students; an open, collaborative lab space