Steven Swanbeck
stevenswanbeck at gmail dot com

I am pursuing my PhD in Robotics at The University of Texas at Austin, advised by Dr. Mitch Pryor in the Nuclear and Applied Robotics Group (NRG). I am passionate about developing robotic systems that assist humans in solving our greatest challenges. I am fascinated by many facets of robotics and have broad experience in perception, manipulation, and human-robot interaction. Currently, I am most interested in developing generalized robotic skillsets, along with the behavior tree synthesis and validation mechanisms and software deployment infrastructure needed for these skillsets to complete any task on any system.

Previously, I received my BS in Mechanical Engineering from the University of Nevada, Reno, where I was advised by Dr. Jun Zhang in the Smart Robotics Lab.

Aside from robotics, I also enjoy basketball, rock climbing, origami, puzzles, and sci-fi novels.

LinkedIn / GitHub

Major Updates

Projects

A Heterogeneous Robot Team Concept for Fabric Maintenance Applications

Steven Swanbeck, Alex Diaz, Fabian Parra, Mitch Pryor
Research Project     
Aug 2024 - Present  •   Code (coming soon)

  • A heterogeneous team of systems, composed of a wheeled mobile robot with a LiDAR and camera sensor payload, a custom high-reach static manipulator system, a human supervisor with an augmented reality headset, and a quadruped with a manipulator, performs end-to-end fabric maintenance tasks in real-world environments more completely than any single commercially available system.
  • Robot actions are coordinated using behavior trees and identical containerized code deployed on each system (a minimal sketch of this deployment pattern follows below).
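
Below is a minimal sketch of the "identical container, role-specific behavior" deployment pattern described above. It is not the project's actual code: the ROBOT_ROLE variable, the tree filenames, and the bt_executor command are hypothetical placeholders.

```python
# Minimal sketch (not the project's actual code) of role-based deployment:
# every platform runs this same container entry point, and an environment
# variable selects which behavior tree it loads.
import os
import subprocess

ROLE_TO_TREE = {
    "wheeled_surveyor": "trees/survey.xml",
    "high_reach_manipulator": "trees/spray.xml",
    "quadruped_manipulator": "trees/inspect_and_repair.xml",
}

def main() -> None:
    role = os.environ.get("ROBOT_ROLE", "wheeled_surveyor")
    tree = ROLE_TO_TREE[role]
    # Launch a (hypothetical) BT executor with the role-specific tree; the
    # container image is identical across platforms, only ROBOT_ROLE differs.
    subprocess.run(["bt_executor", "--tree", tree], check=True)

if __name__ == "__main__":
    main()
```

Keeping the image identical across platforms means a fix or a new skill is built once and redeployed everywhere; only the role variable differs per robot.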

Darwin: An Ecosystem of Modular and Flexible Skillsets for Robotics Applications

Steven Swanbeck
Research Project
Oct 2024 - Present  •   Code (coming soon)

  • Darwin is a paradigm for creating maximally modular robotics skillsets. Each skillset is a ROS-enabled, BT.CPP-driven, and Docker-containerized application.
  • ROS enables communication between the containerized skillsets, and behavior trees allow skills to be easily chained together to produce complex, task-specific, and reactive system behavior. Containerization allows easy deployment across many different types of systems.
  • Current Darwin capabilities include visual sensor fusion, 3D SLAM, data aggregation, ML model integration (including Meta's Llama 3 and SAM 2), information-gain viewpose sampling, generative behavior tree construction and validation, raw data annotation, and corrosion detection, plus many supported robot and sensor drivers. New capabilities will be continuously added for the foreseeable future (a minimal skillset sketch follows below).
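
As a minimal sketch of what a Darwin-style skillset container's entry point might look like, assuming ROS 2 and rclpy; the node, service name, and stubbed skill are illustrative, not Darwin's actual interface:

```python
# Sketch of a containerized skillset exposing its capability as a ROS 2 service
# that a behavior tree leaf in another container could call (names illustrative).
import rclpy
from rclpy.node import Node
from std_srvs.srv import Trigger

class CorrosionDetectionSkill(Node):
    def __init__(self):
        super().__init__('corrosion_detection_skill')
        # The container advertises its capability under a namespaced service.
        self.create_service(Trigger, 'skills/detect_corrosion', self.on_trigger)

    def on_trigger(self, request, response):
        # Stand-in for the real inference pipeline packaged inside the container.
        response.success = True
        response.message = 'corrosion detection ran (stub)'
        return response

def main():
    rclpy.init()
    rclpy.spin(CorrosionDetectionSkill())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```

A behavior tree leaf elsewhere in the system can then call skills/detect_corrosion to chain this skill into a larger task.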

Hyla-SLAM: Extremely Scalable and Memory Efficient LiDAR-Based SLAM for Robotics Applications

Steven Swanbeck
Research Project
Aug 2024 - Present  •   Paper (coming soon)

  • Hyla-SLAM integrates the dynamic data loading properties of the Hylacomylus mapping plugin with a set of possible localization solutions (the default of which is KISS-ICP) to provide a scalable SLAM solution that can build infinite* maps and can be deployed on any system that has a 3D LiDAR sensor.
  • *Can generate maps with volume up to 7.4 million cubic light years. Dynamic data loading keeps the memory profile roughly constant during operation, preventing progressive performance degradation as the map grows (a toy illustration of the chunking scheme follows below).
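
A toy illustration of the dynamic chunk-loading idea, not the actual Hylacomylus/Hyla-SLAM implementation; the chunk size, active radius, and voxel discretization are arbitrary placeholders.

```python
# Toy illustration of dynamic chunk loading: only voxel chunks near the current
# pose stay in RAM, the rest are pickled to disk, so memory stays roughly
# constant as the map grows.
import os
import pickle
import numpy as np

CHUNK_SIZE = 20.0      # meters per cubic chunk (arbitrary)
ACTIVE_RADIUS = 2      # keep a (2*2+1)^3 neighborhood of chunks loaded
CACHE_DIR = "map_chunks"

class ChunkedMap:
    def __init__(self):
        os.makedirs(CACHE_DIR, exist_ok=True)
        self.active = {}   # chunk index (ix, iy, iz) -> set of voxel keys

    def _index(self, points):
        return np.floor(points / CHUNK_SIZE).astype(int)

    def insert(self, points_world):
        """Insert an Nx3 scan (already in the world frame) into the map."""
        for pt, idx in zip(points_world, self._index(points_world)):
            chunk = self.active.setdefault(tuple(idx), set())
            chunk.add(tuple(np.round(pt, 2)))   # toy 1 cm discretization

    def maintain(self, position):
        """Offload chunks far from the current position to disk."""
        center = np.floor(position / CHUNK_SIZE).astype(int)
        for idx in list(self.active):
            if np.abs(np.array(idx) - center).max() > ACTIVE_RADIUS:
                path = os.path.join(CACHE_DIR, "{}_{}_{}.pkl".format(*idx))
                with open(path, "wb") as f:
                    pickle.dump(self.active.pop(idx), f)

# Usage: after each localization update (e.g. a KISS-ICP pose), insert the
# transformed scan and offload distant chunks.
m = ChunkedMap()
scan = np.random.rand(1000, 3) * 50.0
m.insert(scan)
m.maintain(position=np.array([0.0, 0.0, 0.0]))
```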

The BT.CPP Multi-Agent Behavior Tree Toolbox

Steven Swanbeck
Research Project
Aug 2024 - Oct 2024  •   Code (coming soon)

  • Behavior trees are incredibly popular in robotics applications because of their flexibility in construction and their modularity. However, they do not generally scale to decentralized multi-agent applications.
  • The Multi-Agent Behavior Tree Toolbox aims to solve this by providing a set of general-purpose BT.CPP behaviors for decentralized coordination and communication between multiple systems (using ROS), transfer of on-disk data between systems (using a lightweight data transfer library written in Rust), and running of subprocesses (a hand-rolled analogue of the subprocess leaf is sketched below).
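
The toolbox itself is written against BT.CPP in C++; the hand-rolled Python analogue below only sketches how a "run subprocess" leaf composes into a sequence of coordination steps. The echo commands stand in for the real transfer and notification tools.

```python
# Tiny hand-rolled analogue (the real toolbox uses BT.CPP in C++) of a
# behavior-tree leaf that runs a shell subprocess and reports SUCCESS/FAILURE.
import subprocess

SUCCESS, FAILURE = "SUCCESS", "FAILURE"

class RunSubprocess:
    def __init__(self, command):
        self.command = command

    def tick(self):
        result = subprocess.run(self.command, capture_output=True)
        return SUCCESS if result.returncode == 0 else FAILURE

class Sequence:
    """Ticks children in order; fails fast on the first failing child."""
    def __init__(self, children):
        self.children = children

    def tick(self):
        for child in self.children:
            if child.tick() == FAILURE:
                return FAILURE
        return SUCCESS

# Example: hand data to a (placeholder) transfer tool, then notify a peer.
tree = Sequence([
    RunSubprocess(["echo", "transferring map to peer robot"]),
    RunSubprocess(["echo", "notifying peer over ROS topic"]),
])
print(tree.tick())
```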

A Scalable SLAM and Object Extraction Pipeline for Informing Contact and Non-Contact Manipulation Tasks

Steven Swanbeck
Research Project
Jan 2024 - Present  •   Code (coming soon)

  • Supports custom plugins for image-based predictions and fuses image data with depth images and point clouds generated by robot sensors.
  • A grounded Segment Anything plugin is implemented to extract objects using text inputs.
  • Using behavior trees, the processes associated with data collection, annotation, fusion, and storage can be easily adjusted to fit a desired task, and the stack supports any number of visual sensors out of the box (subject to computational constraints); a minimal projection-based fusion sketch follows below.
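
A minimal sketch of the projection-based fusion step, assuming known camera intrinsics K and a LiDAR-to-camera extrinsic transform; this is not the pipeline's actual code.

```python
# Minimal sketch of fusing an image-space segmentation mask with a point cloud:
# points are projected into the image with a pinhole model and inherit the
# mask label (intrinsics/extrinsics assumed known).
import numpy as np

def label_points(points_lidar, mask, K, T_cam_lidar):
    """points_lidar: Nx3, mask: HxW bool, K: 3x3, T_cam_lidar: 4x4 homogeneous."""
    n = points_lidar.shape[0]
    pts_cam = (T_cam_lidar @ np.hstack([points_lidar, np.ones((n, 1))]).T).T[:, :3]
    labels = np.zeros(n, dtype=bool)
    in_front = pts_cam[:, 2] > 0.1                  # only points ahead of the camera
    uvw = (K @ pts_cam[in_front].T).T
    uv = np.round(uvw[:, :2] / uvw[:, 2:3]).astype(int)
    h, w = mask.shape
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    idx = np.flatnonzero(in_front)[ok]
    labels[idx] = mask[uv[ok, 1], uv[ok, 0]]        # image rows are v, columns are u
    return labels
```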

Autonomous Obstacle Avoidance, Localization, Navigation, and SLAM on a Mobile Robot

Steven Swanbeck, Daniel Meza, Jared Rosenbaum
Course Project
Jan 2024 - Apr 2024  •   Code

  • Simple obstacle avoidance, particle filter-based localization, navigation using RRT* for global planning and a line-of-sight carrot planner for local planning, and SLAM using correlative scan matching with GTSAM for pose graph optimization, all implemented to run onboard an autonomous mobile robot in real time (the carrot planner idea is sketched below).
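
A minimal sketch of the line-of-sight carrot planner idea, assuming a 2D occupancy grid whose origin coincides with the map origin and a global path ordered outward from the robot; this is not the course project's exact implementation.

```python
# Minimal sketch of a line-of-sight "carrot" local planner: pick the farthest
# waypoint on the global path, within a lookahead radius, that is visible from
# the robot through free space.
import numpy as np

def line_of_sight_free(grid, p0, p1, resolution):
    """grid: 2D bool occupancy (True = occupied); p0, p1 in meters."""
    dist = np.linalg.norm(p1 - p0)
    n = max(2, int(dist / (0.5 * resolution)))
    for t in np.linspace(0.0, 1.0, n):
        cell = ((1 - t) * p0 + t * p1) / resolution
        if grid[int(cell[1]), int(cell[0])]:
            return False
    return True

def pick_carrot(grid, robot_xy, global_path, lookahead=2.0, resolution=0.05):
    """Return the farthest visible path point within `lookahead` meters."""
    carrot = None
    for waypoint in global_path:
        if np.linalg.norm(waypoint - robot_xy) > lookahead:
            break
        if line_of_sight_free(grid, robot_xy, waypoint, resolution):
            carrot = waypoint
    return carrot
```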

Hylacomylus: Memory Efficient Mapping for Operation in Large-Scale Environments

Steven Swanbeck
Research Project
Jan 2024 - Apr 2024  •   Code (coming soon)

  • Hylacomylus is a mapping plugin that enables dynamic data loading and unloading during robot deployment. This allows computationally- or memory-constrained robots to produce and maintain maps that are both extremely dense and extremely large.
  • Produced maps can be effortlessly reloaded into working memory, transferred to other systems for use, or extended on subsequent deployments (a toy reload sketch follows below).
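
A toy counterpart to the chunk-offloading sketch shown above for Hyla-SLAM (again, not the real plugin): previously persisted chunks are pulled back into working memory as the robot re-enters their neighborhood, which is also how a saved map can be extended on a later deployment.

```python
# Toy sketch of reloading cached map chunks around the current position into
# the in-memory map (keys are chunk indices, values are voxel sets).
import os
import pickle
import numpy as np

CHUNK_SIZE = 20.0
ACTIVE_RADIUS = 2
CACHE_DIR = "map_chunks"

def load_nearby_chunks(position, active):
    """Reload cached chunks around `position` into the in-memory map `active`."""
    center = np.floor(np.asarray(position) / CHUNK_SIZE).astype(int)
    for dx in range(-ACTIVE_RADIUS, ACTIVE_RADIUS + 1):
        for dy in range(-ACTIVE_RADIUS, ACTIVE_RADIUS + 1):
            for dz in range(-ACTIVE_RADIUS, ACTIVE_RADIUS + 1):
                idx = tuple(center + np.array([dx, dy, dz]))
                path = os.path.join(CACHE_DIR, "{}_{}_{}.pkl".format(*idx))
                if idx not in active and os.path.exists(path):
                    with open(path, "rb") as f:
                        active[idx] = pickle.load(f)
    return active
```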

GaTORS: A Game-Theoretic Tool for Optimal Robot Platform Selection and Design in Surface Coverage Applications

Steven Swanbeck, Daniel Meza, Jared Rosenbaum, David Fridovich-Keil, Mitch Pryor
Research Project
Jan 2024 - Mar 2024  •   Paper (coming soon)

  • GaTORS presents a framework that helps answer which and how many robots should be deployed in real-world environments to complete surface coverage tasks.
  • System designs can also be optimized by performing systematic parameter sweeps over the space of possible configurations to set targets for economic viability (a toy team-selection sketch follows below).
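
The sketch below is not GaTORS' game-theoretic formulation; it is a toy brute-force enumeration of team compositions against budget and completion-time constraints, the kind of platform-selection question the framework is designed to answer more rigorously. All costs and coverage rates are illustrative placeholders.

```python
# Toy enumeration of candidate robot team compositions against budget and
# completion-time constraints (all numbers are placeholders).
from itertools import product

PLATFORMS = {                       # cost ($), coverage rate (m^2 / hour)
    "quadruped_sprayer": (150_000, 12.0),
    "wheeled_sprayer":   (90_000, 20.0),
    "high_reach_arm":    (60_000, 8.0),
}
AREA = 400.0          # m^2 of corroded surface to cover
BUDGET = 400_000      # $
DEADLINE = 12.0       # hours

best = None
for counts in product(range(4), repeat=len(PLATFORMS)):
    cost = sum(n * PLATFORMS[p][0] for n, p in zip(counts, PLATFORMS))
    rate = sum(n * PLATFORMS[p][1] for n, p in zip(counts, PLATFORMS))
    if rate == 0 or cost > BUDGET:
        continue
    if AREA / rate <= DEADLINE and (best is None or cost < best[0]):
        best = (cost, counts)

print("cheapest feasible team:", best)
```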

AR-STAR: An Augmented Reality Tool for Online Modification of Robot Point Cloud Data

Frank Regal*, Steven Swanbeck*, Fabian Parra, Mitch Pryor
ACM/IEEE International Conference on Human-Robot Interaction (HRI '24)
Oct 2023 - Dec 2023  •   Paper

  • Uncertainty in sensor data collected onboard a deployed robot can hinder its ability to complete a mission, and it can be difficult for a robot to autonomously detect and overcome these uncertainties.
  • We develop an augmented reality tool that allows a human supervisor to visualize a robot's predictions and modify them online using one of three interaction modalities, and we study user preferences among these modalities.

Reinforcement Learning for Traversal of Uncertain Vertical Terrain using a Magnetic Wall-Climbing Robot

Steven Swanbeck, Jee-Eun Lee
Course Project
Sep 2023 - Dec 2023  •   Code

  • Using only its own kinematics, the magnetic forces experienced at its feet, and a goal position, a magnetic wall-climbing robot is trained to navigate a surface with unknown and irregular magnetic properties.
  • Developed a custom ROS-based bridge to interface the C++ simulation toolkit DART with Python RL libraries and used it to train Deep Q-Learning and PPO policies (a Gymnasium-style wrapper over such a bridge is sketched below).
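
A sketch of how such a bridge can be wrapped as a Gymnasium-style environment so off-the-shelf DQN/PPO implementations can train against the DART simulation. The bridge object and its reset/step methods are hypothetical stand-ins for the project's actual ROS interface, and the observation/action sizes are illustrative.

```python
# Sketch of wrapping a ROS<->DART bridge as a Gymnasium environment.
import numpy as np
import gymnasium as gym
from gymnasium import spaces

class MagneticClimberEnv(gym.Env):
    def __init__(self, bridge):
        self.bridge = bridge                     # hypothetical ROS<->DART bridge client
        # Joint states, per-foot magnetic forces, and goal offset (18 values assumed).
        self.observation_space = spaces.Box(-np.inf, np.inf, shape=(18,), dtype=np.float32)
        self.action_space = spaces.Discrete(8)   # discrete foot-motion primitives (assumed)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        obs = self.bridge.reset()                # hypothetical: resets the DART sim over ROS
        return np.asarray(obs, dtype=np.float32), {}

    def step(self, action):
        # hypothetical: sends the action over ROS, returns the next state and flags
        obs, reached_goal, detached = self.bridge.step(int(action))
        reward = 1.0 if reached_goal else (-1.0 if detached else -0.01)
        terminated = reached_goal or detached
        return np.asarray(obs, dtype=np.float32), reward, terminated, False, {}
```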

Game-Theoretic Modeling for Robot Platform Selection in Industrial Repair Applications

Steven Swanbeck, Daniel Meza
Course Project
Oct 2023 - Dec 2023  •   Code

  • Using game-theoretic modeling, the ability of different robotic hardware platforms to perform maintenance and inspection tasks in a shared environment is evaluated.
  • This competitive modeling approach was demonstrated to select a minimally-sized heterogeneous robot team capable of accomplishing comprehensive corrosion repairs within predefined budget and time constraints in complex industrial environments.

Using Augmented Reality to Assess and Modify Mobile Manipulator Surface Repair Plans

Frank Regal, Steven Swanbeck, Fabian Parra, Jared Rosenbaum, Mitch Pryor
XR-ROB Second International Workshop on "Horizons of Extended Robotics Reality" @ IEEE IROS (2023)   •   Second Prize
Jul 2023 - Aug 2023  •   Paper

  • Using an AR head-mounted display, a user is able to view predictions made by a surveying robot for surface repair.
  • The user can accept, reject, or modify the plan generated by the robot to prevent incidental covering of sensitive material or repair of unproblematic surfaces.

Non-Contact Surface Coverage of Corroded Material in Industrial Environments

Steven Swanbeck
Research Project
May 2023 - Present  •   Code (coming soon)

  • Surface identification is used to extract the locations and geometries of possibly corroded surfaces within industrial environments.
  • Coverage planning and execution with constraint-relaxed redundant replanning enable coverage of the identified surfaces by a mobile manipulator applying a protective spray coating, preventing further corrosion development.

Virtual Fixture Generation and Execution for Surface Coverage of Complex Geometries

Steven Swanbeck
Research Project
May 2023 - Jul 2023  •   Code (coming soon)

  • Using computer vision models, LiDAR detection models, or supervised scene labeling, surfaces on which a robot should perform surface inspection are denoted.
  • A discrete pose mesh, offset from the surface, is created that a manipulator can traverse to perform the surface coverage task (a minimal normal-offset sketch follows below).
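
A minimal sketch, not the project's actual code, of turning labeled surface points and normals into a discrete set of standoff poses that a manipulator can traverse.

```python
# Minimal sketch: each pose sits `standoff` meters off the surface with its
# approach axis pointing back at the surface point.
import numpy as np

def surface_to_poses(points, normals, standoff=0.15):
    """points, normals: Nx3 arrays; returns a list of 4x4 pose matrices."""
    poses = []
    for p, n in zip(points, normals):
        n = n / np.linalg.norm(n)
        z = -n                                   # approach axis points into the surface
        # Build any orthonormal frame around the approach axis.
        helper = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
        x = np.cross(helper, z)
        x /= np.linalg.norm(x)
        y = np.cross(z, x)
        T = np.eye(4)
        T[:3, :3] = np.column_stack([x, y, z])
        T[:3, 3] = p + standoff * n              # stand off along the outward normal
        poses.append(T)
    return poses
```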

Bat-Inspired Passive Drone Gripper for Angle-Invariant Perching

Steven Swanbeck
Course Project
Mar 2023 - Apr 2023  •   Code

  • A custom mechanism inspired by the passive inverted hanging ability of bats allows a drone to remain perched in one location for long periods of time without expending battery power.
  • In addition to perching in upright or inverted orientations, the gripper can also be used as landing gear for the drone or for holding objects during flight.

A MATLAB Toolbox using Screw Theory for Forward and Inverse Kinematics of Manipulators

Steven Swanbeck
Course Project
Feb 2023 - Apr 2023  •   Code

  • Robots of arbitrary structure can be defined and visualized using screw theory.
  • Space-frame and body-frame forward kinematics can be calculated and visualized, and manipulability measures are computed and monitored.
  • Various inverse kinematics algorithms, including Jacobian pseudo-inverse, Jacobian transpose, redundancy resolution, and damped least-squares, are available (a compact product-of-exponentials sketch follows below).
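
The toolbox itself is written in MATLAB; the numpy/scipy snippet below is a compact rendition of the space-frame product-of-exponentials forward kinematics it is built on, with a planar 2R arm as the worked example.

```python
# Product-of-exponentials FK: T(theta) = exp([S1]t1) ... exp([Sn]tn) M,
# with screw axes S_i expressed in the space frame.
import numpy as np
from scipy.linalg import expm

def twist_matrix(S):
    """Map a 6-vector screw axis S = (w, v) to its 4x4 se(3) matrix."""
    w, v = S[:3], S[3:]
    W = np.array([[0, -w[2], w[1]],
                  [w[2], 0, -w[0]],
                  [-w[1], w[0], 0]])
    T = np.zeros((4, 4))
    T[:3, :3] = W
    T[:3, 3] = v
    return T

def fk_space(M, screw_axes, thetas):
    """Space-frame product-of-exponentials forward kinematics."""
    T = np.eye(4)
    for S, theta in zip(screw_axes, thetas):
        T = T @ expm(twist_matrix(np.asarray(S, dtype=float)) * theta)
    return T @ M

# Example: a planar 2R arm with unit link lengths, zero pose M at (2, 0, 0).
M = np.eye(4)
M[0, 3] = 2.0
S1 = [0, 0, 1, 0, 0, 0]          # revolute joint at the origin, axis +z
S2 = [0, 0, 1, 0, -1, 0]         # revolute joint at (1, 0, 0): v = -w x q
print(fk_space(M, [S1, S2], [np.pi / 2, 0.0]))
```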

LiDAR & Image Data Fusion for Object Detection with Rapid Labeling and Training Pipeline

Steven Swanbeck
Research Project
Jan 2023 - Jun 2023  •   Code (coming soon)

  • Separate trained LiDAR-based and image-based prediction models are used to make semantic (point-wise/pixel-wise) predictions about the presence of objects of interest within the robot environment.
  • Predictions are fused spatially using projection and probabilistically in the Bayesian sense to estimate the locations of these objects in the environment (a minimal log-odds fusion sketch follows below).
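
A minimal sketch of the probabilistic half of the fusion, not the pipeline's actual code: per-point detection probabilities from repeated, assumed-independent observations are accumulated in log-odds form, as in standard occupancy-grid mapping.

```python
# Minimal Bayesian log-odds fusion of per-point detection probabilities.
import numpy as np

def log_odds(p):
    return np.log(p / (1.0 - p))

def fuse(prior_logodds, observation_probs):
    """Fold a new set of per-point probabilities into the running log-odds."""
    return prior_logodds + log_odds(np.clip(observation_probs, 1e-3, 1 - 1e-3))

def to_probability(logodds):
    return 1.0 / (1.0 + np.exp(-logodds))

# Example: three noisy observations of the same two points.
belief = np.zeros(2)                               # prior p = 0.5 per point
for obs in ([0.7, 0.4], [0.8, 0.3], [0.65, 0.45]):
    belief = fuse(belief, np.asarray(obs))
print(to_probability(belief))                      # point 0 trends positive, point 1 does not
```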

Visual Inspection and Mapping Stack for Industrial Survey Applications

Steven Swanbeck
Research Project
Nov 2022 - Mar 2023  •   Code (coming soon)

  • Custom mapping stack developed to create dense 3D representations of robot surroundings using fused 2D image and 3D LiDAR data.
  • Processing functionalities can be easily implemented to localize and ground regions of interest within the environment; the map is also used for robot localization as it is built.

Robotic Street Scam Artist

Steven Swanbeck
Course Project
Oct 2022 - Nov 2022  •   Code

  • Combining manipulator control and computer vision, an eye-in-hand system is able to localize and track a series of shells as they are shuffled, inspect each to look for a planted marker, and pick up the correct shell to reveal the money in a modified version of the classic street-scam shell game.
  • Hand tracking using the manipulator and writing out calculation results were also developed as intermediate steps.

Kinematic Modeling of a Twisted-String Actuated Soft Robotic Finger as Part of an Anthropomorphic Gripper

Steven Swanbeck, Revanth Konda, Jun Zhang
Modeling, Estimation, and Control Conference (MECC 2023)
Apr 2022 - Aug 2022  •   Paper

  • Modeling strategies for a twisted-string-actuator-driven soft robotic gripper were developed to enable control and autonomous capabilities.

STAR–2: A Soft Twisted-string-actuated Anthropomorphic Robotic Gripper: Design, Fabrication, and Preliminary Testing

Aaron Baker, Claire Foy, Steven Swanbeck, Revanth Konda, Jun Zhang
IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2023)
Mar 2022 - Aug 2022  •   Paper

  • Version 2.0 of the previous soft gripper, with increased range of motion and additional degrees of freedom.

Dexterous Soft Gripper Manipulator Integration for Human-Robot Interaction

Steven Swanbeck
Research Project
Feb 2022 - Aug 2022  •   Code

  • Enhanced version of the anthropomorphic gripper with 5 additional degrees of freedom, integrated with a UR3e manipulator.
  • ROS integration allowed for coordinated control of gripper and manipulator, enabling pick-and-place and human-robot interaction demonstrations.

Anthropomorphic Twisted String-Actuated Soft Robotic Gripper With Tendon-Based Stiffening

Revanth Konda*, David Bombara*, Steven Swanbeck*, Jun Zhang
IEEE Transactions on Robotics (April 2023)
Sep 2021 - Feb 2022  •   Paper

  • Soft gripper capable of mimicking many of the grasping capabilities of the human hand, including achieving 31/33 grasps of the Feix GRASP Taxonomy and resisting a maximum force of 72 N, over 13 times its own weight.

Sublunar Lava Tube Exploration Quadruped

Steven Swanbeck
NASA University Student Design Challenge
May 2021 - Aug 2021  •   Code

  • Project to design a robotic system capable of surveying lava tubes under the surface of the moon to assess their suitability for sustaining long-term human habitation.
  • Custom quadrupedal system capable of teleoperated walking, self-stabilization, and LiDAR mapping of its surroundings.
  • * indicates equal contribution