Runtime Obstacle Avoidance & Shared Control
A Jackal robot teleoperated by a human performs runtime obstacle avoidance: if an obstacle gets too close, the robot overrides the human's commands and autonomously moves away.
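A minimal sketch of this shared-control override logic, assuming a ROS 2 (rclpy) setup; the topic names (`scan`, `teleop_cmd_vel`, `cmd_vel`), the 0.5 m takeover threshold, and the reverse speed are illustrative placeholders, not the actual project values.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist


class SharedControlNode(Node):
    def __init__(self):
        super().__init__('shared_control')
        self.safe_distance = 0.5          # takeover threshold (m), assumed
        self.obstacle_too_close = False
        self.create_subscription(LaserScan, 'scan', self.on_scan, 10)
        self.create_subscription(Twist, 'teleop_cmd_vel', self.on_teleop, 10)
        self.cmd_pub = self.create_publisher(Twist, 'cmd_vel', 10)

    def on_scan(self, scan: LaserScan):
        # Track the closest obstacle seen by the laser scanner.
        valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
        min_range = min(valid, default=float('inf'))
        self.obstacle_too_close = min_range < self.safe_distance

    def on_teleop(self, human_cmd: Twist):
        if self.obstacle_too_close:
            # Override the human: back away slowly until clearance is regained.
            escape = Twist()
            escape.linear.x = -0.2
            self.cmd_pub.publish(escape)
        else:
            # Pass the human command through unchanged.
            self.cmd_pub.publish(human_cmd)


def main():
    rclpy.init()
    rclpy.spin(SharedControlNode())


if __name__ == '__main__':
    main()
```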
I am a Ph.D. candidate in the Department of Mechanical Engineering at Clemson University. I work in the Interdisciplinary Intelligent Research Laboratory at Clemson University under Dr. Yue "Sophie" Wang. I have also worked with a larger consortium of researchers in the VIPR-GS group with the Automotive Engineering Department at CU-ICAR. Before this, I completed my M.S. in Mechanical Engineering at Purdue University and my B.E. in Mechanical Engineering at the University of Pune.
Worked with a consortium of researchers in the VIPR-GS group, focusing on integrating semantic 3D mapping tools for off-road ground robot applications.
Contributed to the development of an Autonomous Train Robot for track health monitoring.
My research interests lie in neurosymbolic deep learning, formal verification, and control for robotics and autonomous vehicle applications. I have worked on developing path planning and navigation tools for ground robots using formal tools such as temporal logics [1] [2]. I have also integrated 3D semantic mapping tools for off-road ground robot applications using the OctoMap library. Currently, I am developing neurosymbolic tools to formally verify convolutional neural networks and neural network controllers for complex dynamical systems.
ICCPS, 2025
Aditya Parameshwaran, Yue Wang
SEVIN (Scalable and Explainable Verification of Image-Based Neural Network Controllers) uses variational autoencoders to encode high-dimensional images into an explainable latent space, creating annotated convex polytopes that enable efficient formal verification of neural network controllers for autonomous vehicles. This approach reduces computational complexity, enhances scalability, improves robustness against real-world perturbations, and provides explainable insights into controller behavior for safety-critical systems. This work will be presented at ICCPS 2025, part of CPS-IoT Week in Irvine, California.
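An illustrative sketch (not the SEVIN implementation) of the core idea: once a VAE encoder maps an image to a latent vector z, membership in an annotated convex polytope {z : Az <= b} reduces to checking a few linear inequalities. The encoder stand-in, the matrix A, and the vector b below are placeholders.

```python
import numpy as np

def encode(image: np.ndarray) -> np.ndarray:
    """Stand-in for the VAE encoder's mean output (assumed 2-D latent)."""
    flat = image.reshape(-1)
    return np.array([flat.mean(), flat.std()])

def in_polytope(z: np.ndarray, A: np.ndarray, b: np.ndarray) -> bool:
    """True if z satisfies every half-space constraint A z <= b."""
    return bool(np.all(A @ z <= b + 1e-9))

# Example polytope, imagined as annotated with a control label (e.g. "steer left").
A = np.array([[ 1.0,  0.0],
              [-1.0,  0.0],
              [ 0.0,  1.0],
              [ 0.0, -1.0]])
b = np.array([1.0, 1.0, 0.5, 0.5])   # the box [-1, 1] x [-0.5, 0.5]

z = encode(np.random.rand(64, 64, 3))
print("latent lies in annotated polytope:", in_polytope(z, A, b))
```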
IFAC, 2024
Aditya Parameshwaran, Yue Wang
2D controller synthesis combining linear temporal logic (LTL) and signal temporal logic (STL) specifications to guarantee safe and robust navigation for ground robots. This method updates the 2023 SAE paper below and is faster while maintaining similar levels of safety. It is published as part of the IFAC papers for the MECC 2024 conference.
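A minimal, hedged sketch of the kind of specification this work combines: an always-safe clearance requirement and a timed reach-the-goal requirement, evaluated as a robustness margin over a trajectory. The toy trajectory, distances, and thresholds below are made up for illustration and are not from the paper.

```python
import numpy as np

def rho_always_safe(dist_to_obstacle: np.ndarray, d_safe: float) -> float:
    # G (dist >= d_safe): worst-case clearance margin over the trajectory.
    return float(np.min(dist_to_obstacle - d_safe))

def rho_eventually_reach(dist_to_goal: np.ndarray, r_goal: float, T: int) -> float:
    # F_[0,T] (dist <= r_goal): best goal margin within the deadline.
    return float(np.max(r_goal - dist_to_goal[: T + 1]))

# Toy trajectory: the robot closes in on the goal while keeping clearance.
dist_to_goal = np.linspace(5.0, 0.2, 50)
dist_to_obstacle = 1.5 + 0.5 * np.sin(np.linspace(0, np.pi, 50))

# Conjunction of the two requirements: overall robustness is the minimum.
rho = min(rho_always_safe(dist_to_obstacle, d_safe=1.0),
          rho_eventually_reach(dist_to_goal, r_goal=0.5, T=49))
print("specification satisfied:", rho > 0, "| robustness margin:", round(rho, 3))
```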
Aditya Parameshwaran, Yue Wang
2D navigation model for an autonomous vehicle based on task specifications given in signal temporal logic (STL), with safety guarantees. This work was presented at the SAE WCX 2023 conference in Detroit.
Edwina Lewis, Aditya Parameshwaran
An off-road terrain roughness estimator based on a Bayesian calibration routine, combined with a Simplex controller for mobile robots. The approach is applied in NVIDIA's Isaac Sim environment with a Jackal robot to collect IMU data and predict terrain roughness. The roughness estimates allow the Simplex controller to switch between performance and safety modes of operation. This work is part of the SAE WCX 2025 conference in Detroit.
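A hedged sketch of the Simplex-style mode switch driven by a roughness estimate. The roughness proxy here is just the variance of vertical IMU acceleration over a sliding window; the paper's estimator is a Bayesian-calibrated model, and the window size and threshold below are illustrative.

```python
from collections import deque
import numpy as np

class RoughnessSimplex:
    def __init__(self, window: int = 200, threshold: float = 0.8):
        self.accel_z = deque(maxlen=window)   # recent vertical accelerations
        self.threshold = threshold            # roughness level that triggers safety mode

    def update(self, imu_accel_z: float) -> str:
        self.accel_z.append(imu_accel_z)
        roughness = float(np.var(self.accel_z)) if len(self.accel_z) > 1 else 0.0
        # Simplex logic: fall back to the verified safety controller on rough
        # terrain, otherwise run the high-performance controller.
        return "safety_controller" if roughness > self.threshold else "performance_controller"

# Example: smooth terrain first, then a rough stretch.
switcher = RoughnessSimplex()
smooth = 9.81 + 0.05 * np.random.randn(300)
rough = 9.81 + 2.0 * np.random.randn(300)
for a in np.concatenate([smooth, rough]):
    mode = switcher.update(float(a))
print("active mode after rough stretch:", mode)
```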
I have been involved in various robotics projects since completing my M.S. at Purdue, some in collaboration with companies like Wabtec Corporation, and others as side projects for the US Army VIPR-GS Center. These projects have spanned areas such as controls, deep learning, autonomous navigation, and computer vision.
A Jackal robot teleoperated by a human performs runtime obstacle avoidance: if an obstacle gets too close, the robot overrides the human's commands and autonomously moves away.
A pick-and-place task using a UR5 manipulator, developed with the MoveIt framework in ROS 2 and C++. The task involves identifying objects and planning safe trajectories in Isaac Sim.
Developed an autonomous railway bot for track health monitoring using LiDAR, stereo cameras, and an IMU. Deployed CNN models for real-time traffic sign recognition.
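A minimal sketch of the kind of CNN classifier used for real-time sign recognition; the architecture, 32x32 input size, and number of classes are illustrative, not the deployed model.

```python
import torch
import torch.nn as nn

class SignClassifier(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                  # (N, 32, 8, 8) for 32x32 input
        return self.classifier(x.flatten(1))  # class logits

# One inference pass on a dummy camera frame.
model = SignClassifier().eval()
with torch.no_grad():
    logits = model(torch.rand(1, 3, 32, 32))
print("predicted class:", int(logits.argmax(dim=1)))
```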
Generated semantically segmented 3D voxel maps by fusing RGB data with LiDAR point clouds using the OctoMap library in outdoor environments.
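A hedged sketch of the fusion step that precedes the octree update: project each LiDAR point into the camera image using assumed extrinsics and intrinsics, and attach the semantic label of the pixel it lands on. The calibration matrices and segmentation mask below are placeholders, and the insertion of labeled points into the OctoMap octree is not shown.

```python
import numpy as np

K = np.array([[600.0,   0.0, 320.0],   # camera intrinsics (fx, fy, cx, cy), assumed
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
T_cam_lidar = np.eye(4)                # LiDAR -> camera extrinsics (identity here)

def label_points(points_lidar: np.ndarray, seg_image: np.ndarray):
    """Return (point, label) pairs for LiDAR points visible in the image."""
    ones = np.ones((points_lidar.shape[0], 1))
    pts_cam = (T_cam_lidar @ np.hstack([points_lidar, ones]).T).T[:, :3]
    labeled = []
    for p in pts_cam:
        if p[2] <= 0:                  # behind the camera
            continue
        u, v, _ = K @ (p / p[2])       # pinhole projection to pixel coordinates
        u, v = int(round(u)), int(round(v))
        if 0 <= v < seg_image.shape[0] and 0 <= u < seg_image.shape[1]:
            labeled.append((p, int(seg_image[v, u])))
    # Each (point, label) pair would then be inserted into the semantic octree.
    return labeled

seg = np.random.randint(0, 5, size=(480, 640))              # fake segmentation mask
pts = np.random.uniform([-2, -2, 1], [2, 2, 8], (100, 3))   # fake LiDAR points
print("labeled points:", len(label_points(pts, seg)))
```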