Project 1: Interactive Pet Robot

Overview

Create an interactive pet robot that responds to touch, displays emotions on the LCD, plays sounds, and performs dance moves. This project combines the skills from all five Jupyter Labs into a cohesive, engaging robot personality.

Learning Objectives

By completing this project, you will:

  • Integrate camera, LCD, sound, and touch sensor systems
  • Create a state machine for robot behavior
  • Design expressive robot emotions and reactions
  • Build a complete interactive experience using Python

Prerequisites

Lab            Topic          Skills Used
Jupyter Lab 1  Camera         Image capture, display
Jupyter Lab 2  LCD Display    Drawing faces, animations
Jupyter Lab 3  Sound          Audio playback, sound effects
Jupyter Lab 4  Touch Sensor   Touch detection, events
Jupyter Lab 5  Dance          Movement APIs, choreography

Hardware Requirements

  • Mini Pupper v2 fully assembled
  • LCD display connected
  • Speaker connected
  • Touch sensors functional

Part 1: System Architecture

┌─────────────────────────────────────────────────────────────────────┐
│                    Interactive Pet System                           │
├─────────────────────────────────────────────────────────────────────┤
│                                                                     │
│  ┌──────────────┐    ┌──────────────┐    ┌──────────────────────┐  │
│  │   Touch      │───►│   State      │───►│   Behavior           │  │
│  │   Sensors    │    │   Machine    │    │   Controller         │  │
│  └──────────────┘    └──────────────┘    └──────────┬───────────┘  │
│                                                      │              │
│                              ┌───────────────────────┼────────────┐ │
│                              │                       │            │ │
│                              ▼                       ▼            ▼ │
│  ┌──────────────┐    ┌──────────────┐    ┌──────────────────────┐  │
│  │   LCD        │    │   Sound      │    │   Movement           │  │
│  │   Emotions   │    │   Effects    │    │   Dance              │  │
│  └──────────────┘    └──────────────┘    └──────────────────────┘  │
│                                                                     │
└─────────────────────────────────────────────────────────────────────┘
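The flow in the diagram, touch events driving a state machine that fans out to display, sound, and movement, can be sketched as a dispatch table. The names below are illustrative stand-ins, not the project's actual classes:

```python
from enum import Enum


class State(Enum):
    HAPPY = "happy"
    SAD = "sad"


# Stand-in output channels; the real project drives the LCD, speaker, and servos
def face(emotion):
    return f"lcd:{emotion}"


def sound(name):
    return f"audio:{name}"


# Behavior controller: each state maps to the outputs it should trigger
BEHAVIORS = {
    State.HAPPY: lambda: [face("happy"), sound("bark")],
    State.SAD:   lambda: [face("sad"), sound("whine")],
}


def express(state: State):
    return BEHAVIORS[state]()


print(express(State.HAPPY))  # ['lcd:happy', 'audio:bark']
```

Keeping the mapping in one table makes it easy to add states later without touching the event-handling code.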

Part 2: Create Project Structure

mkdir -p ~/interactive_pet/{src,assets/sounds,assets/images}
cd ~/interactive_pet

Part 3: LCD Emotion Display

Create src/emotions.py:

#!/usr/bin/env python3
"""
LCD Emotion Display for Interactive Pet
Based on Jupyter Lab 2
"""

from PIL import Image, ImageDraw
from MangDang.mini_pupper.display import Display


class EmotionDisplay:
    """Display emotions on Mini Pupper LCD"""
    
    def __init__(self):
        self.display = Display()
        self.width = 320
        self.height = 240
        
    def create_face(self, emotion: str) -> Image.Image:
        """Create a face image for the given emotion"""
        img = Image.new('RGB', (self.width, self.height), color='black')
        draw = ImageDraw.Draw(img)
        
        if emotion == 'happy':
            self._draw_happy(draw)
        elif emotion == 'sad':
            self._draw_sad(draw)
        elif emotion == 'excited':
            self._draw_excited(draw)
        elif emotion == 'sleepy':
            self._draw_sleepy(draw)
        elif emotion == 'surprised':
            self._draw_surprised(draw)
        elif emotion == 'love':
            self._draw_love(draw)
        else:
            self._draw_neutral(draw)
            
        return img
        
    def show_emotion(self, emotion: str):
        """Display an emotion on the LCD"""
        face = self.create_face(emotion)
        self.display.show_image(face)
        
    def _draw_happy(self, draw):
        """Draw happy face - big smile, curved eyes"""
        # Eyes (curved happy)
        draw.arc([60, 80, 120, 140], 0, 180, fill='cyan', width=5)
        draw.arc([200, 80, 260, 140], 0, 180, fill='cyan', width=5)
        # Mouth (big smile)
        draw.arc([80, 140, 240, 220], 0, 180, fill='cyan', width=5)
        
    def _draw_sad(self, draw):
        """Draw sad face - droopy eyes, frown"""
        # Eyes (droopy)
        draw.arc([60, 100, 120, 140], 180, 360, fill='blue', width=5)
        draw.arc([200, 100, 260, 140], 180, 360, fill='blue', width=5)
        # Tears
        draw.ellipse([90, 145, 100, 170], fill='blue')
        draw.ellipse([220, 145, 230, 170], fill='blue')
        # Mouth (frown)
        draw.arc([100, 180, 220, 230], 180, 360, fill='blue', width=4)
        
    def _draw_excited(self, draw):
        """Draw excited face - wide eyes, open mouth"""
        # Eyes (wide open with sparkles)
        draw.ellipse([60, 70, 130, 140], outline='yellow', width=4)
        draw.ellipse([80, 90, 110, 120], fill='yellow')
        draw.ellipse([190, 70, 260, 140], outline='yellow', width=4)
        draw.ellipse([210, 90, 240, 120], fill='yellow')
        # Sparkles
        draw.polygon([(50, 60), (55, 75), (60, 60), (55, 55)], fill='white')
        draw.polygon([(260, 60), (265, 75), (270, 60), (265, 55)], fill='white')
        # Mouth (open wide)
        draw.ellipse([120, 160, 200, 220], outline='yellow', fill='orange', width=3)
        
    def _draw_sleepy(self, draw):
        """Draw sleepy face - half-closed eyes"""
        # Eyes (half closed)
        draw.line([60, 110, 120, 110], fill='gray', width=5)
        draw.line([200, 110, 260, 110], fill='gray', width=5)
        # Zzz
        draw.text((250, 40), "Z", fill='white')
        draw.text((270, 30), "z", fill='white')
        draw.text((285, 25), "z", fill='gray')
        # Mouth (small o)
        draw.ellipse([145, 170, 175, 200], outline='gray', width=3)
        
    def _draw_surprised(self, draw):
        """Draw surprised face - wide eyes, O mouth"""
        # Eyes (wide circles)
        draw.ellipse([60, 70, 130, 140], outline='white', width=4)
        draw.ellipse([85, 95, 105, 115], fill='white')
        draw.ellipse([190, 70, 260, 140], outline='white', width=4)
        draw.ellipse([215, 95, 235, 115], fill='white')
        # Mouth (big O)
        draw.ellipse([130, 160, 190, 220], outline='white', width=4)
        
    def _draw_love(self, draw):
        """Draw love face - heart eyes"""
        # Heart eyes
        self._draw_heart(draw, 90, 100, 40, 'red')
        self._draw_heart(draw, 230, 100, 40, 'red')
        # Smile
        draw.arc([100, 160, 220, 210], 0, 180, fill='pink', width=4)
        
    def _draw_neutral(self, draw):
        """Draw neutral face"""
        # Eyes (simple dots)
        draw.ellipse([80, 90, 110, 120], fill='white')
        draw.ellipse([210, 90, 240, 120], fill='white')
        # Mouth (straight line)
        draw.line([120, 180, 200, 180], fill='white', width=3)
        
    def _draw_heart(self, draw, x, y, size, color):
        """Draw a heart shape"""
        half = size // 2
        # Two circles for top of heart
        draw.ellipse([x - half, y - half, x, y + half//2], fill=color)
        draw.ellipse([x, y - half, x + half, y + half//2], fill=color)
        # Triangle for bottom
        draw.polygon([
            (x - half, y),
            (x + half, y),
            (x, y + size)
        ], fill=color)
        
    def animate_transition(self, from_emotion: str, to_emotion: str):
        """Animate between two emotions with a quick double blink"""
        import time
        for _ in range(2):
            self.show_emotion('sleepy')
            time.sleep(0.1)
            self.show_emotion(from_emotion)
            time.sleep(0.1)
        self.show_emotion(to_emotion)
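The drawing code can be previewed off-robot by rendering a face straight to a PNG with Pillow alone. The sketch below reuses the happy and neutral geometry from the class above but swaps the `MangDang` display call for a plain file save; the helper name is illustrative:

```python
import os
import tempfile

from PIL import Image, ImageDraw


def preview_face(emotion: str, path: str) -> Image.Image:
    """Render a face the same way EmotionDisplay does, but save it to disk."""
    img = Image.new('RGB', (320, 240), color='black')
    draw = ImageDraw.Draw(img)
    if emotion == 'happy':
        # Same happy-face geometry as _draw_happy above
        draw.arc([60, 80, 120, 140], 0, 180, fill='cyan', width=5)
        draw.arc([200, 80, 260, 140], 0, 180, fill='cyan', width=5)
        draw.arc([80, 140, 240, 220], 0, 180, fill='cyan', width=5)
    else:
        # Neutral fallback, matching _draw_neutral
        draw.ellipse([80, 90, 110, 120], fill='white')
        draw.ellipse([210, 90, 240, 120], fill='white')
        draw.line([120, 180, 200, 180], fill='white', width=3)
    img.save(path)
    return img


out_path = os.path.join(tempfile.gettempdir(), 'happy_face.png')
img = preview_face('happy', out_path)
print(img.size)  # (320, 240)
```

This lets you iterate on face designs on a laptop before copying the drawing code into `src/emotions.py`.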

Part 4: Sound Effects System

Create src/sounds.py:

#!/usr/bin/env python3
"""
Sound Effects System for Interactive Pet
Based on Jupyter Lab 3
"""

import subprocess
from pathlib import Path


class SoundEffects:
    """Play sound effects for the pet"""
    
    def __init__(self, sounds_dir: str = "assets/sounds"):
        self.sounds_dir = Path(sounds_dir)
        self.sounds_dir.mkdir(parents=True, exist_ok=True)
        
        # Sound file mappings
        self.sounds = {
            'happy': 'happy.wav',
            'sad': 'sad.wav',
            'excited': 'excited.wav',
            'bark': 'bark.wav',
            'whine': 'whine.wav',
            'purr': 'purr.wav',
            'yawn': 'yawn.wav',
            'greeting': 'greeting.wav',
        }
        
    def play(self, sound_name: str):
        """Play a sound effect"""
        if sound_name not in self.sounds:
            print(f"Unknown sound: {sound_name}")
            return
            
        sound_file = self.sounds_dir / self.sounds[sound_name]
        
        if sound_file.exists():
            self._play_file(str(sound_file))
        else:
            # Generate a beep pattern as fallback
            self._generate_beep(sound_name)
            
    def _play_file(self, filepath: str):
        """Play an audio file"""
        try:
            subprocess.run(
                ['aplay', '-q', filepath],
                capture_output=True,
                timeout=5
            )
        except Exception as e:
            print(f"Error playing sound: {e}")
            
    def _generate_beep(self, sound_name: str):
        """Generate beep patterns for different sounds"""
        patterns = {
            'happy': [(800, 100), (1000, 100), (1200, 150)],
            'sad': [(400, 200), (300, 300)],
            'excited': [(1000, 50), (1200, 50), (1000, 50), (1200, 50), (1400, 100)],
            'bark': [(600, 100), (800, 150)],
            'whine': [(500, 300), (400, 400)],
            'purr': [(200, 500)],
            'yawn': [(600, 100), (400, 200), (300, 300)],
            'greeting': [(600, 100), (800, 100), (600, 100)],
        }
        
        pattern = patterns.get(sound_name, [(440, 200)])
        
        for freq, duration in pattern:
            self._beep(freq, duration)
            
    def _beep(self, frequency: int, duration_ms: int):
        """Generate a beep using speaker-test"""
        try:
            subprocess.run(
                ['speaker-test', '-t', 'sine', '-f', str(frequency),
                 '-l', '1', '-p', str(duration_ms // 1000 + 1)],
                capture_output=True,
                timeout=duration_ms / 1000 + 1
            )
        except (OSError, subprocess.TimeoutExpired):
            pass  # Skip silently if speaker-test is unavailable or overruns
            
    def speak(self, text: str):
        """Text to speech using espeak"""
        try:
            subprocess.run(
                ['espeak', '-s', '150', text],
                capture_output=True,
                timeout=10
            )
        except Exception as e:
            print(f"Error speaking: {e}")
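Each fallback pattern is a list of (frequency Hz, duration ms) pairs, so the total playback length of a pattern is easy to estimate. A small helper (illustrative, not part of the class):

```python
def pattern_length_ms(pattern):
    """Total duration of a beep pattern in milliseconds."""
    return sum(duration for _freq, duration in pattern)


# The 'happy' pattern from SoundEffects._generate_beep
happy = [(800, 100), (1000, 100), (1200, 150)]
print(pattern_length_ms(happy))  # 350
```

Knowing pattern lengths helps when timing beeps against LCD animations or dance moves.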

Part 5: Touch Response System

Create src/touch_handler.py:

#!/usr/bin/env python3
"""
Touch Response Handler for Interactive Pet
Based on Jupyter Lab 4
"""

import time
from enum import Enum
from dataclasses import dataclass
from typing import Callable, Dict, List


class TouchLocation(Enum):
    """Touch sensor locations"""
    HEAD = "head"
    BACK = "back"
    LEFT = "left"
    RIGHT = "right"


@dataclass
class TouchEvent:
    """Touch event data"""
    location: TouchLocation
    duration: float  # seconds
    is_double_tap: bool


class TouchHandler:
    """Handle touch sensor input"""
    
    def __init__(self):
        self.callbacks: Dict[TouchLocation, List[Callable]] = {
            loc: [] for loc in TouchLocation
        }
        self.last_touch_time: Dict[TouchLocation, float] = {
            loc: 0 for loc in TouchLocation
        }
        self.double_tap_threshold = 0.5  # seconds
        self.running = False
        
    def register_callback(self, location: TouchLocation, callback: Callable):
        """Register a callback for touch events"""
        self.callbacks[location].append(callback)
        
    def simulate_touch(self, location: TouchLocation, duration: float = 0.5):
        """Simulate a touch event (for testing)"""
        current_time = time.time()
        
        # Check for double tap
        is_double = (current_time - self.last_touch_time[location]) < self.double_tap_threshold
        self.last_touch_time[location] = current_time
        
        event = TouchEvent(
            location=location,
            duration=duration,
            is_double_tap=is_double
        )
        
        # Call registered callbacks
        for callback in self.callbacks[location]:
            callback(event)
            
    def read_touch_sensors(self) -> Dict[TouchLocation, bool]:
        """Read touch sensor values from hardware, if available"""
        # Use the hardware driver when present; fall back to all-False
        # (no touches) so the code still runs off-robot
        try:
            from MangDang.mini_pupper.sensors import TouchSensor
            sensor = TouchSensor()
            values = sensor.read()
            return {
                TouchLocation.HEAD: values.get('head', False),
                TouchLocation.BACK: values.get('back', False),
                TouchLocation.LEFT: values.get('left', False),
                TouchLocation.RIGHT: values.get('right', False),
            }
        except ImportError:
            return {loc: False for loc in TouchLocation}
            
    def start_monitoring(self, callback: Callable):
        """Start monitoring touch sensors"""
        self.running = True
        touch_start: Dict[TouchLocation, float] = {}
        
        while self.running:
            states = self.read_touch_sensors()
            current_time = time.time()
            
            for location, is_touched in states.items():
                if is_touched:
                    if location not in touch_start:
                        touch_start[location] = current_time
                else:
                    if location in touch_start:
                        duration = current_time - touch_start[location]
                        is_double = (current_time - self.last_touch_time[location]) < self.double_tap_threshold
                        self.last_touch_time[location] = current_time
                        
                        event = TouchEvent(
                            location=location,
                            duration=duration,
                            is_double_tap=is_double
                        )
                        callback(event)
                        del touch_start[location]
                        
            time.sleep(0.05)  # 50ms polling
            
    def stop_monitoring(self):
        """Stop monitoring touch sensors"""
        self.running = False
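The double-tap rule reduces to comparing consecutive touch timestamps against the 0.5 s threshold, so it can be unit-tested without any sensor hardware. The function below is a stand-alone restatement of that check, not part of `TouchHandler`:

```python
DOUBLE_TAP_THRESHOLD = 0.5  # seconds, matching TouchHandler


def is_double_tap(last_touch_time: float, now: float,
                  threshold: float = DOUBLE_TAP_THRESHOLD) -> bool:
    """True if this touch follows the previous one within the threshold."""
    return (now - last_touch_time) < threshold


# Taps 0.3 s apart register as a double tap; taps 0.8 s apart do not.
print(is_double_tap(10.0, 10.3))  # True
print(is_double_tap(10.0, 10.8))  # False
```

Pulling timing logic into a pure function like this is a common way to keep hardware-dependent classes testable.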

Part 6: Movement Controller

Create src/movement.py:

#!/usr/bin/env python3
"""
Movement Controller for Interactive Pet
Based on Jupyter Lab 5
"""

import socket
import json
import time
from typing import List, Tuple


class MovementController:
    """Control Mini Pupper movements"""
    
    def __init__(self, host: str = '127.0.0.1', port: int = 8001):
        self.host = host
        self.port = port
        self.sock = None
        
    def connect(self):
        """Connect to the movement server"""
        try:
            self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            self.sock.connect((self.host, self.port))
            return True
        except Exception as e:
            print(f"Connection failed: {e}")
            return False
            
    def disconnect(self):
        """Disconnect from the movement server"""
        if self.sock:
            self.sock.close()
            self.sock = None
            
    def send_command(self, command: dict):
        """Send a command to the movement server"""
        if not self.sock:
            if not self.connect():
                return False
                
        try:
            data = json.dumps(command) + '\n'
            self.sock.sendall(data.encode())
            return True
        except Exception as e:
            print(f"Send failed: {e}")
            self.sock = None  # Drop the stale socket so the next call reconnects
            return False
            
    # Basic movements
    def stand(self):
        """Stand up"""
        return self.send_command({'action': 'stand'})
        
    def sit(self):
        """Sit down"""
        return self.send_command({'action': 'sit'})
        
    def lie_down(self):
        """Lie down"""
        return self.send_command({'action': 'lie_down'})
        
    # Expressive movements
    def wag_tail(self, speed: float = 1.0):
        """Wag tail (body sway)"""
        for _ in range(3):
            self.send_command({'action': 'tilt', 'roll': 10})
            time.sleep(0.15 / speed)
            self.send_command({'action': 'tilt', 'roll': -10})
            time.sleep(0.15 / speed)
        self.send_command({'action': 'tilt', 'roll': 0})
        
    def nod_yes(self):
        """Nod head up and down"""
        for _ in range(2):
            self.send_command({'action': 'tilt', 'pitch': 15})
            time.sleep(0.2)
            self.send_command({'action': 'tilt', 'pitch': -10})
            time.sleep(0.2)
        self.send_command({'action': 'tilt', 'pitch': 0})
        
    def shake_no(self):
        """Shake head side to side"""
        for _ in range(2):
            self.send_command({'action': 'tilt', 'yaw': 20})
            time.sleep(0.2)
            self.send_command({'action': 'tilt', 'yaw': -20})
            time.sleep(0.2)
        self.send_command({'action': 'tilt', 'yaw': 0})
        
    def look_up(self):
        """Look up"""
        return self.send_command({'action': 'tilt', 'pitch': 20})
        
    def look_down(self):
        """Look down"""
        return self.send_command({'action': 'tilt', 'pitch': -15})
        
    def tilt_head(self, angle: float = 15):
        """Tilt head curiously"""
        return self.send_command({'action': 'tilt', 'roll': angle})
        
    # Dance moves
    def happy_dance(self):
        """Perform a happy dance"""
        moves = [
            ({'action': 'tilt', 'roll': 15}, 0.2),
            ({'action': 'tilt', 'roll': -15}, 0.2),
            ({'action': 'tilt', 'pitch': 10}, 0.2),
            ({'action': 'tilt', 'pitch': -10}, 0.2),
            ({'action': 'tilt', 'roll': 15}, 0.2),
            ({'action': 'tilt', 'roll': -15}, 0.2),
            ({'action': 'stand'}, 0.3),
        ]
        
        for cmd, delay in moves:
            self.send_command(cmd)
            time.sleep(delay)
            
    def excited_jump(self):
        """Jump up and down excitedly"""
        for _ in range(3):
            self.send_command({'action': 'height', 'z': 0.12})
            time.sleep(0.15)
            self.send_command({'action': 'height', 'z': 0.08})
            time.sleep(0.15)
        self.send_command({'action': 'stand'})
        
    def sleepy_stretch(self):
        """Stretch like waking up"""
        self.send_command({'action': 'tilt', 'pitch': -20})
        time.sleep(0.5)
        self.send_command({'action': 'height', 'z': 0.06})
        time.sleep(0.3)
        self.send_command({'action': 'tilt', 'pitch': 20})
        time.sleep(0.5)
        self.send_command({'action': 'stand'})
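`MovementController` only needs a TCP endpoint that accepts newline-delimited JSON, so it can be bench-tested against a tiny mock server. In this sketch the `{'action': ...}` wire format comes from the code above; the mock binds an ephemeral port for the demo (the real controller targets 127.0.0.1:8001), and everything else is illustrative:

```python
import json
import socket
import threading

# Bind and listen before starting the client so it cannot race the server
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(('127.0.0.1', 0))  # port 0 = pick any free port
srv.listen(1)
port = srv.getsockname()[1]

received = []


def accept_one():
    """Accept a single connection and parse newline-delimited JSON commands."""
    conn, _ = srv.accept()
    data = b''
    while True:
        chunk = conn.recv(1024)
        if not chunk:
            break
        data += chunk
    for line in data.decode().splitlines():
        received.append(json.loads(line))
    conn.close()


t = threading.Thread(target=accept_one)
t.start()

# Client side: the same wire format send_command uses
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(('127.0.0.1', port))
client.sendall((json.dumps({'action': 'stand'}) + '\n').encode())
client.close()
t.join()
srv.close()
print(received)  # [{'action': 'stand'}]
```

Pointing `MovementController(port=...)` at a mock like this lets you verify choreography sequences without powering the servos.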

Part 7: Pet State Machine

Create src/pet_brain.py:

#!/usr/bin/env python3
"""
Pet State Machine - The brain of the Interactive Pet
"""

import time
import random
from enum import Enum
from threading import Thread, Event
from typing import Optional

from emotions import EmotionDisplay
from sounds import SoundEffects
from touch_handler import TouchHandler, TouchLocation, TouchEvent
from movement import MovementController


class PetState(Enum):
    """Pet emotional states"""
    IDLE = "idle"
    HAPPY = "happy"
    SAD = "sad"
    EXCITED = "excited"
    SLEEPY = "sleepy"
    CURIOUS = "curious"
    LOVING = "loving"


class InteractivePet:
    """Main Interactive Pet controller"""
    
    def __init__(self):
        # Initialize components
        self.display = EmotionDisplay()
        self.sounds = SoundEffects()
        self.touch = TouchHandler()
        self.movement = MovementController()
        
        # State
        self.current_state = PetState.IDLE
        self.energy = 100  # 0-100
        self.happiness = 50  # 0-100
        self.last_interaction = time.time()
        
        # Control
        self.running = False
        self.stop_event = Event()
        
    def start(self):
        """Start the interactive pet"""
        self.running = True
        self.movement.connect()
        
        # Show initial state
        self.transition_to(PetState.HAPPY)
        self.sounds.speak("Hello! I'm your Mini Pupper!")
        
        # Start background threads
        Thread(target=self._idle_behavior_loop, daemon=True).start()
        Thread(target=self._energy_decay_loop, daemon=True).start()
        
        # Start touch monitoring
        print("Interactive Pet started! Touch me to interact.")
        print("Press Ctrl+C to exit.")
        
        try:
            self.touch.start_monitoring(self._handle_touch)
        except KeyboardInterrupt:
            self.stop()
            
    def stop(self):
        """Stop the interactive pet"""
        self.running = False
        self.stop_event.set()
        self.sounds.speak("Goodbye!")
        self.display.show_emotion('sleepy')
        self.movement.sit()
        self.movement.disconnect()
        
    def transition_to(self, new_state: PetState):
        """Transition to a new emotional state"""
        if new_state == self.current_state:
            return
            
        print(f"State: {self.current_state.value} -> {new_state.value}")
        
        # Map state names that differ from EmotionDisplay's emotion keys
        emotion_map = {'curious': 'surprised', 'loving': 'love', 'idle': 'neutral'}
        self.display.animate_transition(
            emotion_map.get(self.current_state.value, self.current_state.value),
            emotion_map.get(new_state.value, new_state.value)
        )
        
        self.current_state = new_state
        self._express_state()
        
    def _express_state(self):
        """Express current state through display, sound, and movement"""
        state = self.current_state
        
        if state == PetState.HAPPY:
            self.display.show_emotion('happy')
            self.sounds.play('happy')
            self.movement.wag_tail()
            
        elif state == PetState.SAD:
            self.display.show_emotion('sad')
            self.sounds.play('whine')
            self.movement.look_down()
            
        elif state == PetState.EXCITED:
            self.display.show_emotion('excited')
            self.sounds.play('excited')
            self.movement.excited_jump()
            
        elif state == PetState.SLEEPY:
            self.display.show_emotion('sleepy')
            self.sounds.play('yawn')
            self.movement.lie_down()
            
        elif state == PetState.CURIOUS:
            self.display.show_emotion('surprised')
            self.sounds.play('bark')
            self.movement.tilt_head(20)
            
        elif state == PetState.LOVING:
            self.display.show_emotion('love')
            self.sounds.play('purr')
            self.movement.nod_yes()
            
        else:  # IDLE
            self.display.show_emotion('neutral')
            
    def _handle_touch(self, event: TouchEvent):
        """Handle touch events"""
        self.last_interaction = time.time()
        self.energy = min(100, self.energy + 5)
        
        location = event.location
        
        if location == TouchLocation.HEAD:
            if event.is_double_tap:
                # Double tap on head = excited
                self.happiness = min(100, self.happiness + 20)
                self.transition_to(PetState.EXCITED)
            else:
                # Single pat on head = happy
                self.happiness = min(100, self.happiness + 10)
                self.transition_to(PetState.HAPPY)
                
        elif location == TouchLocation.BACK:
            if event.duration > 1.0:
                # Long pet on back = loving
                self.happiness = min(100, self.happiness + 15)
                self.transition_to(PetState.LOVING)
            else:
                # Short pet = happy
                self.happiness = min(100, self.happiness + 5)
                self.transition_to(PetState.HAPPY)
                
        elif location == TouchLocation.LEFT:
            # Left side = curious look left
            self.transition_to(PetState.CURIOUS)
            self.movement.send_command({'action': 'tilt', 'yaw': -20})
            
        elif location == TouchLocation.RIGHT:
            # Right side = curious look right
            self.transition_to(PetState.CURIOUS)
            self.movement.send_command({'action': 'tilt', 'yaw': 20})
            
    def _idle_behavior_loop(self):
        """Background loop for idle behaviors"""
        while self.running and not self.stop_event.is_set():
            time.sleep(5)  # Check every 5 seconds
            
            # Time since last interaction
            idle_time = time.time() - self.last_interaction
            
            if idle_time > 60:  # 1 minute idle
                if self.energy < 30:
                    self.transition_to(PetState.SLEEPY)
                elif self.happiness < 30:
                    self.transition_to(PetState.SAD)
                elif random.random() < 0.3:  # 30% chance
                    # Random idle action
                    action = random.choice(['look_around', 'stretch', 'yawn'])
                    self._do_idle_action(action)
                    
    def _do_idle_action(self, action: str):
        """Perform an idle action"""
        if action == 'look_around':
            self.movement.tilt_head(15)
            time.sleep(1)
            self.movement.tilt_head(-15)
            time.sleep(1)
            self.movement.tilt_head(0)
            
        elif action == 'stretch':
            self.movement.sleepy_stretch()
            
        elif action == 'yawn':
            self.display.show_emotion('sleepy')
            self.sounds.play('yawn')
            time.sleep(1)
            self.display.show_emotion('neutral')
            
    def _energy_decay_loop(self):
        """Background loop for energy decay"""
        while self.running and not self.stop_event.is_set():
            time.sleep(30)  # Every 30 seconds
            self.energy = max(0, self.energy - 2)
            self.happiness = max(0, self.happiness - 1)


def main():
    pet = InteractivePet()
    pet.start()


if __name__ == '__main__':
    main()
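The decay loop above drops energy by 2 and happiness by 1 every 30 seconds, so the pet's values after `t` seconds of neglect are predictable. A quick closed-form check (the helper name is illustrative):

```python
def decayed(initial: int, rate: int, interval_s: int, elapsed_s: float) -> int:
    """Value after periodic decay, floored at 0 (mirrors _energy_decay_loop)."""
    ticks = int(elapsed_s // interval_s)
    return max(0, initial - rate * ticks)


# Starting energy 100: after 5 minutes (10 ticks) it reads 80,
# and it is fully drained after 25 minutes of no interaction.
print(decayed(100, 2, 30, 300))   # 80
print(decayed(100, 2, 30, 1500))  # 0
```

Tuning `rate` and `interval_s` changes how quickly the pet gets sleepy or sad, which is worth playing with when balancing the idle behaviors.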

Part 8: Main Application

Create src/main.py:

#!/usr/bin/env python3
"""
Interactive Pet Robot - Main Entry Point
Project 1: Combines all Jupyter Lab skills
"""

import sys
import argparse
from pet_brain import InteractivePet, PetState


def demo_mode(pet: InteractivePet):
    """Run a demonstration of all features"""
    import time
    
    print("\n=== Interactive Pet Demo ===\n")
    
    # Show all emotions
    emotions = ['happy', 'sad', 'excited', 'sleepy', 'curious', 'loving']
    
    for emotion in emotions:
        print(f"Showing: {emotion}")
        state = PetState[emotion.upper()]
        pet.transition_to(state)
        time.sleep(3)
        
    # End with happy
    pet.transition_to(PetState.HAPPY)
    print("\nDemo complete!")


def interactive_mode(pet: InteractivePet):
    """Run in interactive mode with touch sensors"""
    pet.start()


def main():
    parser = argparse.ArgumentParser(description='Interactive Pet Robot')
    parser.add_argument('--demo', action='store_true', 
                        help='Run demonstration mode')
    parser.add_argument('--test-touch', type=str,
                        help='Simulate touch: head, back, left, right')
    args = parser.parse_args()
    
    pet = InteractivePet()
    
    if args.demo:
        pet.movement.connect()
        demo_mode(pet)
        pet.movement.disconnect()
    elif args.test_touch:
        from touch_handler import TouchLocation
        pet.movement.connect()
        pet.display.show_emotion('neutral')
        
        location_map = {
            'head': TouchLocation.HEAD,
            'back': TouchLocation.BACK,
            'left': TouchLocation.LEFT,
            'right': TouchLocation.RIGHT,
        }
        
        if args.test_touch in location_map:
            pet.touch.simulate_touch(location_map[args.test_touch])
        else:
            print(f"Unknown location: {args.test_touch}")
            
        pet.movement.disconnect()
    else:
        interactive_mode(pet)


if __name__ == '__main__':
    main()

Deliverables

Deliverable           Description
src/emotions.py       LCD emotion display system
src/sounds.py         Sound effects manager
src/touch_handler.py  Touch sensor handler
src/movement.py       Movement controller
src/pet_brain.py      State machine and main logic
src/main.py           Entry point application
assets/sounds/        Sound effect files
README.md             Setup and usage instructions
Video Demo            Demonstration of interactions

Extension Ideas

  1. More Emotions: Add angry, confused, proud emotions
  2. Voice Recognition: Respond to voice commands
  3. Scheduled Behaviors: Time-based activities (morning stretch, bedtime)
  4. Memory: Remember favorite interactions
  5. Camera Reactions: React to faces or objects seen
  6. LED Effects: Sync LED lights with emotions

Grading Rubric

Component       Points  Criteria
LCD Emotions    20      At least 5 distinct emotions displayed
Sound Effects   15      Appropriate sounds for each state
Touch Response  20      All touch locations trigger responses
Movement        20      Expressive movements match emotions
State Machine   15      Smooth transitions, logical behavior
Code Quality    10      Clean, documented, modular code
Total           100