Projects Showcase

Mars Rover Human-in-the-Loop Pick-and-Place

DAM Robotics Club rover arm that lets an operator point to pick and place targets via an ArUco pointer.

About Me

From mental math to building systems

Growing up, I loved numbers and making things. I’d rattle off mental arithmetic to impress my parents’ friends and, as a teen, built a hydraulic arm that became my first real “engineering constraints” lesson. I posted game tutorials for fun, which nudged me into programming and, eventually, engineering.

I care about robotics, automation, AI, and AR, and I’m motivated by building systems that are useful, accurate, and delightful to use.

Outside class and work, I balance heavy workloads with badminton, ballroom dance, and hosting themed events. Teaching is a through-line: designing explanations, tools, and interfaces that help people learn faster.

Robotics & Mechatronics Enthusiast · Hardware & Software Engineering · Builder • Teacher • Teammate · Esports World Champion (2022) · Funny Friend

Speed Math — 60s

Solve as many as you can in 60 seconds.

Outside School/Work

Badminton · Ballroom dance · Hosting themed parties · Friends & hangouts · Photo/Video editing (Adobe) · Traveling · Trying new food spots · Learning new things

OSU Capstone Design Proposal

Four Wheel Steering Integration and Sensor Platform for Woodpecker LSEV is my Oregon State University mechanical engineering capstone design proposal (ENGR 415/416, Team D605). The project focuses on integrating a four-wheel-steering kit and a modular sensor-ready mechanical platform for autonomous LSEV research.

  • Performance target: 30 ft turning diameter (±1.5 ft) with low-speed maneuverability suited to campus/factory environments.
  • Safety goals: emergency-stop integration, an independent braking approach, and controlled 0–25 mph validation criteria.
  • Tracking quality: commanded steering-angle accuracy within ±5°.
  • Systems integration: robust sensor-mounting interfaces for LiDAR/cameras/proximity sensors while minimizing occlusion.
  • Scope discipline: mechanical steering and interface readiness now; full autonomy software intentionally out of scope.
  • Verification plan: turning-circle tests, angle-tracking checks, interference/load checks, and repeatability-focused documentation.

Projects

Mars Rover Human-in-the-Loop Pick-and-Place
  • Scanned the workspace with a gripper-mounted Intel RealSense D405 to build a point cloud for scene understanding.
  • Segmented the ground plane and clustered objects with PCL, then detected ArUco pointer pose in OpenCV and computed ray-to-plane intersection for operator intent.
  • Selected the nearest object cluster to the intersection point and planned/executed approach, grasp, and place trajectories with MoveIt2 on the 6DOF arm.
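The ray-to-plane intersection and nearest-cluster selection described above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the project's actual code; the function names and the example coordinates are made up.

```python
import numpy as np

def ray_plane_intersection(ray_origin, ray_dir, plane_point, plane_normal):
    """Intersect a ray (from the ArUco pointer pose) with the segmented
    ground plane. Returns the 3-D hit point, or None if the ray is
    parallel to the plane or points away from it."""
    ray_dir = ray_dir / np.linalg.norm(ray_dir)
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-6:
        return None  # ray parallel to plane
    t = np.dot(plane_normal, plane_point - ray_origin) / denom
    if t < 0:
        return None  # intersection behind the pointer
    return ray_origin + t * ray_dir

def nearest_cluster(hit_point, cluster_centroids):
    """Index of the object cluster whose centroid is closest to the hit."""
    dists = np.linalg.norm(cluster_centroids - hit_point, axis=1)
    return int(np.argmin(dists))

# Example: pointer 1 m above the ground plane z = 0, aiming down and ahead.
hit = ray_plane_intersection(
    np.array([0.0, 0.0, 1.0]),    # pointer position
    np.array([0.5, 0.0, -1.0]),   # pointing direction
    np.array([0.0, 0.0, 0.0]),    # any point on the ground plane
    np.array([0.0, 0.0, 1.0]))    # plane normal
clusters = np.array([[0.4, 0.0, 0.05], [2.0, 1.0, 0.05]])
print(hit, nearest_cluster(hit, clusters))  # hit ≈ [0.5, 0, 0], cluster 0
```

The chosen cluster's centroid then seeds the MoveIt2 approach/grasp planning.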

ROS 2 · MoveIt2 · PCL · OpenCV · Intel RealSense D405

Open on YouTube ↗
Color Chaos: Block Attack Mode (Hiwonder ArmPi)
  • Built camera-based color detection for red, green, and blue blocks and randomly selected one visible block as the attacker.
  • When two or more blocks were visible, the arm picked the attacker, performed an attacking animation, and dropped it onto another randomly selected target block.
  • When only one block was visible, the arm still performed pickup + attack animation, then yeeted the block to a random off-board location.

Hiwonder ArmPi · Computer Vision · Color Detection · Robot Manipulation

Open on YouTube ↗
HOLOMAT: Touchless ROS 2 UI (Hand + Voice)
  • Built hand_tracking_node (ROS 2 + MediaPipe) → publishes fingertip TF frames + MarkerArray for AR/telemetry.
  • Wrote calibration_node (Python/NumPy) → 5×5 homography, persists M.npy for repeatable cam↔projector alignment.
  • Added voice_command_node + ui_display_node (OpenAI API + Pygame) → hands-free launches with logging/monitoring.

ROS 2 · Python · MediaPipe · OpenAI API · Pygame

Open on YouTube ↗
Cocktail Maker (HW/SW Mechatronics)
  • Modeled + 3D-printed assemblies; firmware & keypad UI for recipe selection + pump sequencing.
  • Ran DFX/FMEA to shrink footprint, speed assembly, and improve reliability.
  • Prototype results: <5% dosing error, ≤6 min dispense, ≤5 min clean; up to 3 ingredients & 4-drink batches.

SolidWorks · Arduino (C/C++) · DFX · FMEA

Open on YouTube ↗
PeopleDetector ACF (MATLAB CV Toolbox)
  • Demoed detector workflow: create detector, read frame, get bboxes + scores, overlay results.
  • Explained pros: turnkey, accurate on upright full-body, integrates with MATLAB tooling.
  • Caveats: partial occlusion/poses degrade accuracy; slower than modern DL detectors—good starter baseline.

MATLAB · Computer Vision Toolbox

Open on YouTube ↗

Contact Me

By submitting, your message will be sent to santosmatthewjohn@gmail.com. If the form doesn’t open your mail app, email me directly.