
Location: Chennai / Hybrid
Duration: 6 Months
Type: Internship
Reference ID: INT00126
We are looking for a Software Engineer Intern to help build our fleet management and teleoperation platform—the interface between users and autonomous machines.
You will work across frontend and backend systems, contributing to a product that directly interacts with real-world robots.
This system enables:
Task scheduling and mission control
Live system monitoring dashboards
Real-time video streaming interfaces
Teleoperation controls
Preferred skills:
Real-time video streaming (WebRTC, RTSP, etc.)
WebSockets or event-driven systems
Docker or cloud deployment
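The WebSocket and event-driven experience asked for above boils down to a publish/subscribe pattern. A minimal in-process sketch (topic names and payloads are illustrative; a real dashboard would push these events to browsers over WebSockets rather than keep them in memory):

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process pub/sub; real fleet systems would use
    WebSockets or a message broker for the same pattern."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Register a callback to be invoked for every event on `topic`.
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Fan the payload out to every registered handler for `topic`.
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()
received = []
# A dashboard widget subscribing to robot telemetry (topic name is hypothetical).
bus.subscribe("robot/42/battery", received.append)
bus.publish("robot/42/battery", {"percent": 87})
```

The same subscribe/publish shape carries over directly once the transport becomes a WebSocket connection instead of a local callback list.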
Exposure to:
Human-machine interfaces (HMI)
Control systems or teleoperation
Please submit:
📩 Apply at: careers@zensomy.com
Location: Chennai / Remote
Duration: 6 Months
Type: Thesis / Research Internship
Reference ID: INT00226
Autonomous navigation in off-road environments (e.g., forests, agricultural land, construction zones, and rugged terrain) introduces challenges far beyond structured urban driving. These environments lack well-defined geometry, semantic consistency, and reliable priors.
Key difficulties include:
To address these challenges, perception systems must move beyond 2D understanding and adopt 3D spatial representations. In particular, 3D occupancy grids combined with semantic segmentation provide a powerful framework for modeling free space, obstacles, and terrain traversability in off-road settings.
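As a concrete illustration of the occupancy-grid idea above, a minimal sketch that collapses semantically labeled 3D points into a 2D traversability grid (the label sets and cell size are hypothetical; real off-road stacks use richer taxonomies and probabilistic cell states):

```python
# Hypothetical semantic label sets for illustration only.
TRAVERSABLE = {"grass", "dirt", "gravel"}
OBSTACLE = {"tree", "rock", "water"}

def build_traversability_grid(points, cell_size=0.5):
    """Collapse labeled 3D points (x, y, z, label) into a 2D grid.

    Each cell ends up 'free' or 'obstacle'; an obstacle label anywhere
    in a cell wins over traversable labels, which is a conservative
    choice for downstream path planning.
    """
    grid = {}
    for x, y, z, label in points:  # z is unused in this flat 2D projection
        cell = (int(x // cell_size), int(y // cell_size))
        if label in OBSTACLE:
            grid[cell] = "obstacle"
        elif label in TRAVERSABLE and grid.get(cell) != "obstacle":
            grid[cell] = "free"
    return grid

points = [
    (0.1, 0.2, 0.0, "grass"),
    (0.3, 0.1, 0.0, "grass"),
    (1.2, 0.4, 0.8, "rock"),  # falls in cell (2, 0)
    (1.4, 0.3, 0.0, "dirt"),  # same cell; obstacle still wins
]
grid = build_traversability_grid(points)
```

The "obstacle wins" rule is one of the design choices such a thesis would examine; probabilistic occupancy updates are the usual refinement.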
This thesis will explore critical challenges in off-road perception:
The primary objectives of this thesis are:
The thesis will involve both research and system-level development:
Exposure to research or prior relevant projects
Access to datasets, tools, and compute resources
Please submit:
📩 Apply at: careers@zensomy.com
Location: Chennai / Hybrid
Type: Full-Time
Reference ID: FTE00126
We are looking for a Deep Learning Perception Engineer to design and deploy robust perception systems for off-road autonomy. You will work on 3D scene understanding, semantic segmentation, object detection & tracking, and occupancy grid-based representations, contributing directly to production-grade autonomy stacks.
This role combines research and engineering, with a strong emphasis on real-world deployment and system integration.
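For flavor, the object tracking mentioned above can be sketched as greedy nearest-neighbor association between frames (a deliberate simplification; production stacks typically pair a motion model such as a Kalman filter with Hungarian assignment):

```python
import math

def greedy_track(prev_tracks, detections, max_dist=2.0):
    """Associate new detections with existing tracks by nearest centroid.

    prev_tracks: {track_id: (x, y)}; detections: list of (x, y).
    Detections farther than max_dist from every live track start new tracks.
    """
    tracks = {}
    next_id = max(prev_tracks, default=-1) + 1
    unclaimed = dict(prev_tracks)  # tracks not yet matched this frame
    for det in detections:
        best_id, best_d = None, max_dist
        for tid, pos in unclaimed.items():
            d = math.dist(pos, det)
            if d < best_d:
                best_id, best_d = tid, d
        if best_id is not None:
            tracks[best_id] = det       # continue an existing track
            del unclaimed[best_id]
        else:
            tracks[next_id] = det       # spawn a new track
            next_id += 1
    return tracks

prev = {0: (0.0, 0.0), 1: (5.0, 5.0)}
tracks = greedy_track(prev, [(0.4, 0.1), (9.0, 9.0)])
```

Here the first detection continues track 0, while the second is too far from any prior track and spawns a new one; track 1, unmatched, would be aged out by a real tracker.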
Contribute to system architecture and technical decision-making
Bridge the gap between research prototypes and deployable systems
Access to real-world deployment and testing environments
Please submit:
📩 Apply at: careers@zensomy.com
Degree in computer science, information technology, engineering, or a related field (Bachelor's, Master's, or PhD).
Strong development experience in at least one of the following fields: hands-on embedded code development, safety-critical software development for real-time systems, CI/CD pipelines, ROS/ROS2 middleware, simulation environments (IsaacSim, CARLA, AWSIM).
Strong technical foundation and expertise in at least one of the following domains: deep learning, computer vision, multi-sensor fusion, occupancy grids, reinforcement learning, and model predictive control.
Experience in robotics, driver assistance systems, and/or autonomous driving.
Experience with modern C++ (14/17/20), Python, and embedded software engineering for real-time applications.
Experience with modern software engineering tools such as CI/CD pipelines, Docker and Git.
Ability to work in a dynamic environment with complex technical challenges and requirements.
Readiness to tackle complex challenges and contribute to shaping our product into a cutting-edge autonomy platform for the next generation of mobile machines.
Strong team player with excellent collaboration skills.
Ability to communicate complex technical concepts clearly to diverse audiences.
Problem-solving mindset with a willingness to tackle challenging, ambiguous tasks.
Adaptability to work in a fast-paced, evolving environment.
Commitment to continuous learning and personal growth.
Open culture – transparent communication, flat hierarchies, and collaborative decision-making.
Opportunity to shape cutting-edge autonomous technologies with real-world impact.
Ownership of your work – real responsibility and the freedom to shape solutions.
Access to state-of-the-art tools, labs, and test environments.
Continuous learning and professional growth through mentorship, training, and conferences.