Impact
- Objective: Build a rover that autonomously navigates an obstacle field and mines regolith within a 10-minute run.
- Role: Autonomy team member (Year 1) → Autonomy team lead (Year 2); owned perception/localization integration and end-to-end autonomy reliability.
- Key systems: Localization pipeline evolution (vision fiducials → UWB), LiDAR perception for navigation, real-time telemetry dashboard.
What I Built
Year 1 — Autonomy Engineer
- Designed the ROS message architecture to route sensor topics into the autonomy stack (positioning + planning inputs).
- Prototyped localization with AprilTag fiducials viewed by USB cameras, estimating rover position and orientation from detected tag poses.
- Built a Python + PyQt telemetry GUI to monitor voltage, current, rover speed, and autonomy health during tests.
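The fiducial localization idea above can be sketched as a frame transform: an AprilTag detector reports the tag's pose in the camera frame, and composing that with the tag's known pose in the field frame recovers the camera (and hence rover) pose. This is a minimal illustration with hypothetical function names, not the team's actual code:

```python
import numpy as np

def yaw_from_rotation(R: np.ndarray) -> float:
    """Extract yaw (rotation about z) from a 3x3 rotation matrix."""
    return float(np.arctan2(R[1, 0], R[0, 0]))

def rover_pose_from_tag(tag_pose_field: np.ndarray,
                        tag_in_camera: np.ndarray) -> np.ndarray:
    """
    tag_pose_field: 4x4 homogeneous transform of the tag in the field frame
                    (surveyed in advance)
    tag_in_camera:  4x4 transform of the tag in the camera frame, as an
                    AprilTag detector would report it
    Returns the camera (~rover) pose in the field frame:
        T_field_camera = T_field_tag @ inv(T_camera_tag)
    """
    return tag_pose_field @ np.linalg.inv(tag_in_camera)
```

A downstream node would feed the translation and `yaw_from_rotation` of the result into the planner as the position/heading estimate.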
Year 2 — Autonomy Team Lead
- Led navigation perception using Hokuyo LiDAR point clouds to support obstacle detection and path planning.
- Integrated Pozyx ultra-wideband (UWB) localization (fixed field anchors + robot-mounted tags) to produce robust field positioning and orientation.
- Owned integration/testing across autonomy nodes and drive control to improve run reliability under competition constraints.
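To show the principle behind the UWB positioning, here is a minimal 2-D multilateration sketch: ranges to fixed anchors are linearized against the first anchor and solved in least squares. The anchor layout and solver are illustrative assumptions; the actual system relied on the Pozyx positioning firmware.

```python
import numpy as np

def uwb_position(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """
    anchors: (N, 2) fixed anchor coordinates, N >= 3
    ranges:  (N,) measured distances to each anchor
    Subtracting the first range equation |x - a_0|^2 = r_0^2 from each
    |x - a_i|^2 = r_i^2 cancels the quadratic term, leaving the linear
    system 2 (a_i - a_0) . x = |a_i|^2 - |a_0|^2 - r_i^2 + r_0^2.
    """
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2)
         - ranges[1:] ** 2 + r0 ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

With four or more anchors the system is overdetermined, so range noise is averaged out rather than propagated directly into the fix.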
System Architecture
- Autonomy runs on an Intel NUC, consuming camera, LiDAR, and IMU data and publishing planned motion commands.
- Drive node runs on a BeagleBone and sends motor commands over UART.
- Low-level obstacle detection runs on an Arduino, while an Android app provides teleop + autonomy status.
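A checked, fixed-size frame keeps the NUC-to-BeagleBone UART link robust to dropped or corrupted bytes. The byte layout below (header, two signed 16-bit wheel speeds, XOR checksum) is a hypothetical sketch of that idea, not the protocol actually used:

```python
import struct

HEADER = 0xAA  # assumed start-of-frame marker

def pack_drive_cmd(left: int, right: int) -> bytes:
    """Pack left/right wheel speeds into a 6-byte big-endian frame."""
    body = struct.pack(">Bhh", HEADER, left, right)
    checksum = 0
    for byte in body:
        checksum ^= byte
    return body + bytes([checksum])

def unpack_drive_cmd(frame: bytes) -> tuple[int, int]:
    """Validate the header and checksum, then decode the wheel speeds."""
    header, left, right = struct.unpack(">Bhh", frame[:-1])
    checksum = 0
    for byte in frame[:-1]:
        checksum ^= byte
    if header != HEADER or checksum != frame[-1]:
        raise ValueError("corrupt drive frame")
    return left, right
```

The drive node can then discard any frame that fails the check and coast until the next valid command arrives, rather than acting on garbage.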