Physical AI for Robotics and Automation Training Course
Physical AI combines artificial intelligence and robotics to create machines capable of autonomous decision-making and interaction with their physical environment.
This instructor-led, live training (online or onsite) is aimed at intermediate-level participants who wish to enhance their skills in designing, programming, and deploying intelligent robotic systems for automation and beyond.
By the end of this training, participants will be able to:
- Understand the principles of Physical AI and its applications in robotics and automation.
- Design and program intelligent robotic systems for dynamic environments.
- Implement AI models for autonomous decision-making in robots.
- Leverage simulation tools for robotic testing and optimization.
- Address challenges such as sensor fusion, real-time processing, and energy efficiency.
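Sensor fusion, one of the challenges listed above, can be illustrated with a minimal sketch. The complementary filter below (all sensor values and gains are illustrative, not taken from the course materials) blends a fast-but-drifting gyro integration with a noisy-but-drift-free accelerometer angle:

```python
# Minimal complementary filter: fuses a gyroscope rate with an
# accelerometer angle estimate. Illustrative only; a real system
# would calibrate its sensors and handle timing jitter.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (fast, drifts) with the
    accelerometer angle (noisy, drift-free)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

angle = 0.0
# Simulated readings: constant 10 deg/s rotation, accel agrees.
for step in range(100):
    t = step * 0.01
    angle = complementary_filter(angle, 10.0, 10.0 * (t + 0.01), 0.01)
print(round(angle, 2))
```

The weighting constant `alpha` controls how much the estimate trusts the gyro over the accelerometer; values near 1 favor the gyro's short-term accuracy.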
Format of the Course
- Interactive lecture and discussion.
- Lots of exercises and practice.
- Hands-on implementation in a live-lab environment.
Course Customization Options
- To request a customized training for this course, please contact us to arrange.
Course Outline
Introduction to Physical AI and Robotics
- Overview of Physical AI and its evolution
- Applications in industrial automation and beyond
- Key components of intelligent robotic systems
Robotics System Design
- Mechanical design principles for robots
- Integration of sensors and actuators
- Power systems and energy efficiency
AI Models for Robotics
- Using machine learning for perception and decision-making
- Reinforcement learning in robotics
- Building AI pipelines for robotic systems
Real-Time Sensor Integration
- Sensor fusion techniques
- Processing data from LiDAR, cameras, and other sensors
- Real-time navigation and obstacle avoidance
Simulation and Testing
- Using simulation tools like Gazebo and MATLAB Robotics Toolbox
- Modeling dynamic environments
- Performance evaluation and optimization
Automation and Deployment
- Programming robots for industrial automation
- Developing workflows for repetitive tasks
- Ensuring safety and reliability in deployments
Advanced Topics and Future Trends
- Collaborative robots (cobots) and human-robot interaction
- Ethical and regulatory considerations in robotics
- The future of Physical AI in automation
Summary and Next Steps
Requirements
- Basic knowledge of robotics and automation systems
- Proficiency in programming, preferably Python
- Familiarity with AI fundamentals
Audience
- Robotics engineers
- Automation specialists
- AI developers
Delivery Options
Private Group Training
Our identity is rooted in delivering exactly what our clients need.
- Pre-course call with your trainer
- Customisation of the learning experience to achieve your goals
- Bespoke outlines
- Practical hands-on exercises containing data / scenarios recognisable to the learners
- Training scheduled on a date of your choice
- Delivered online, onsite/classroom or hybrid by experts sharing real world experience
Private Group Prices: RRP from €6840 for online delivery, based on a group of 2 delegates; €2160 per additional delegate (excludes any certification/exam costs). We recommend a maximum group size of 12 for most learning events.
Contact us for an exact quote and to hear our latest promotions
Public Training
Please see our public courses
Need help picking the right course?
Physical AI for Robotics and Automation Training Course - Enquiry
Physical AI for Robotics and Automation - Consultancy Enquiry
Testimonials (1)
its knowledge and utilization of AI for Robotics in the Future.
Ryle - PHILIPPINE MILITARY ACADEMY
Course - Artificial Intelligence (AI) for Robotics
Provisional Upcoming Courses (Contact Us For More Information)
Related Courses
Artificial Intelligence (AI) for Robotics
21 Hours
Artificial Intelligence (AI) for Robotics combines machine learning, control systems, and sensor fusion to create intelligent machines capable of perceiving, reasoning, and acting autonomously. Through modern tools like ROS 2, TensorFlow, and OpenCV, engineers can now design robots that navigate, plan, and interact with real-world environments intelligently.
This instructor-led, live training (online or onsite) is aimed at intermediate-level engineers who wish to develop, train, and deploy AI-driven robotic systems using current open-source technologies and frameworks.
By the end of this training, participants will be able to:
- Use Python and ROS 2 to build and simulate robotic behaviors.
- Implement Kalman and Particle Filters for localization and tracking.
- Apply computer vision techniques using OpenCV for perception and object detection.
- Use TensorFlow for motion prediction and learning-based control.
- Integrate SLAM (Simultaneous Localization and Mapping) for autonomous navigation.
- Develop reinforcement learning models to improve robotic decision-making.
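The Kalman-filter objective above can be previewed with a toy one-dimensional example. This sketch (noise values and the stationary-target setup are illustrative assumptions, not course code) shows the predict/update cycle at the heart of the technique:

```python
import random

# Toy 1D Kalman filter tracking a stationary target from noisy
# range readings. Noise variances here are illustrative.

def kalman_1d(measurements, meas_var=4.0, process_var=1e-4):
    x, p = 0.0, 1000.0        # initial estimate and (large) uncertainty
    for z in measurements:
        p += process_var      # predict: uncertainty grows over time
        k = p / (p + meas_var)  # Kalman gain: trust in the measurement
        x += k * (z - x)      # update estimate toward the measurement
        p *= (1 - k)          # update: uncertainty shrinks
    return x

random.seed(0)
true_pos = 5.0
zs = [true_pos + random.gauss(0, 2.0) for _ in range(200)]
estimate = kalman_1d(zs)
print(round(estimate, 2))
```

Even with measurement noise of standard deviation 2.0, the filtered estimate converges close to the true position of 5.0.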
Format of the Course
- Interactive lecture and discussion.
- Hands-on implementation using ROS 2 and Python.
- Practical exercises with simulated and real robotic environments.
Course Customization Options
To request a customized training for this course, please contact us to arrange.
AI and Robotics for Nuclear - Extended
120 Hours
In this instructor-led, live training in the Netherlands (online or onsite), participants will learn the different technologies, frameworks and techniques for programming different types of robots to be used in the field of nuclear technology and environmental systems.
The 6-week course is held 5 days a week. Each day is four hours long and consists of lectures, discussions, and hands-on robot development in a live lab environment. Participants will complete various real-world projects applicable to their work in order to practice their acquired knowledge.
The target hardware for this course will be simulated in 3D through simulation software. The ROS (Robot Operating System) open-source framework, C++ and Python will be used for programming the robots.
By the end of this training, participants will be able to:
- Understand the key concepts used in robotic technologies.
- Understand and manage the interaction between software and hardware in a robotic system.
- Understand and implement the software components that underpin robotics.
- Build and operate a simulated mechanical robot that can see, sense, process, navigate, and interact with humans through voice.
- Understand the necessary elements of artificial intelligence (machine learning, deep learning, etc.) applicable to building a smart robot.
- Implement filters (Kalman and Particle) to enable the robot to locate moving objects in its environment.
- Implement search algorithms and motion planning.
- Implement PID controls to regulate a robot's movement within an environment.
- Implement SLAM algorithms to enable a robot to map out an unknown environment.
- Extend a robot's ability to perform complex tasks through Deep Learning.
- Test and troubleshoot a robot in realistic scenarios.
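The Particle-filter objective listed above can be sketched in one dimension. The following example (the beacon setup, motion model, and noise values are all illustrative assumptions) shows the move/weight/resample loop that lets a robot localize from noisy range readings:

```python
import random, math

# Minimal 1D particle filter: estimate a robot's position on a line
# from noisy range measurements to a beacon at the origin.

def particle_filter(measurements, n=1000, move=1.0, meas_sd=0.5):
    particles = [random.uniform(0, 20) for _ in range(n)]
    for z in measurements:
        # Motion update: every particle moves, with some noise.
        particles = [p + move + random.gauss(0, 0.1) for p in particles]
        # Weight each particle by its measurement likelihood.
        weights = [math.exp(-((abs(p) - z) ** 2) / (2 * meas_sd ** 2))
                   for p in particles]
        total = sum(weights) or 1e-12
        weights = [w / total for w in weights]
        # Resample proportionally to the weights.
        particles = random.choices(particles, weights=weights, k=n)
    return sum(particles) / n

random.seed(1)
true = 5.0
zs = []
for _ in range(10):
    true += 1.0                          # robot moves 1 unit per step
    zs.append(abs(true) + random.gauss(0, 0.5))  # noisy range reading
est = particle_filter(zs)
print(round(est, 1))
```

After ten motion steps the true position is 15.0, and the particle cloud's mean lands close to it despite the noisy sensor.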
AI and Robotics for Nuclear
80 Hours
In this instructor-led, live training in the Netherlands (online or onsite), participants will learn the different technologies, frameworks and techniques for programming different types of robots to be used in the field of nuclear technology and environmental systems.
The 4-week course is held 5 days a week. Each day is four hours long and consists of lectures, discussions, and hands-on robot development in a live lab environment. Participants will complete various real-world projects applicable to their work in order to practice their acquired knowledge.
The target hardware for this course will be simulated in 3D through simulation software. The code will then be loaded onto physical hardware (Arduino or other) for final deployment testing. The ROS (Robot Operating System) open-source framework, C++ and Python will be used for programming the robots.
By the end of this training, participants will be able to:
- Understand the key concepts used in robotic technologies.
- Understand and manage the interaction between software and hardware in a robotic system.
- Understand and implement the software components that underpin robotics.
- Build and operate a simulated mechanical robot that can see, sense, process, navigate, and interact with humans through voice.
- Understand the necessary elements of artificial intelligence (machine learning, deep learning, etc.) applicable to building a smart robot.
- Implement filters (Kalman and Particle) to enable the robot to locate moving objects in its environment.
- Implement search algorithms and motion planning.
- Implement PID controls to regulate a robot's movement within an environment.
- Implement SLAM algorithms to enable a robot to map out an unknown environment.
- Test and troubleshoot a robot in realistic scenarios.
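The PID-control objective above can be illustrated with a minimal sketch. This example (gains, timestep, and the first-order velocity plant are illustrative assumptions, not tuned for any real robot) shows the proportional, integral, and derivative terms working together:

```python
# Minimal PID controller regulating a simple velocity plant toward
# a setpoint. Gains are illustrative, not tuned for real hardware.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt          # accumulate error
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# Toy plant: velocity changes in proportion to the control signal.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
v = 0.0
for _ in range(200):
    u = pid.step(1.0, v)   # target velocity: 1.0
    v += u * 0.1
print(round(v, 3))
```

The integral term removes steady-state error, so the simulated velocity settles at the setpoint.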
Autonomous Navigation & SLAM with ROS 2
21 Hours
ROS 2 (Robot Operating System 2) is an open-source framework designed to support the development of complex and scalable robotic applications.
This instructor-led, live training (online or onsite) is aimed at intermediate-level robotics engineers and developers who wish to implement autonomous navigation and SLAM (Simultaneous Localization and Mapping) using ROS 2.
By the end of this training, participants will be able to:
- Set up and configure ROS 2 for autonomous navigation applications.
- Implement SLAM algorithms for mapping and localization.
- Integrate sensors such as LiDAR and cameras with ROS 2.
- Simulate and test autonomous navigation in Gazebo.
- Deploy navigation stacks on physical robots.
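The mapping half of SLAM rests on a simple piece of bookkeeping that can be shown in a few lines. This sketch (sensor-model probabilities are illustrative assumptions) uses the log-odds update behind many occupancy-grid mapping backends:

```python
import math

# Toy occupancy-grid cell update using log-odds, the bookkeeping
# behind many SLAM mapping backends. Sensor-model probabilities
# (0.7 hit / 0.3 miss) are illustrative.

L_OCC = math.log(0.7 / 0.3)    # log-odds added when the sensor hits
L_FREE = math.log(0.3 / 0.7)   # log-odds added when it sees free space

def update_cell(log_odds, hit):
    """Fold one sensor observation into a cell's log-odds value."""
    return log_odds + (L_OCC if hit else L_FREE)

def probability(log_odds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))

cell = 0.0                              # prior: p = 0.5 (unknown)
for hit in [True, True, False, True]:   # three hits, one miss
    cell = update_cell(cell, hit)
p = probability(cell)
print(round(p, 3))
```

Working in log-odds turns each sensor update into a cheap addition, which is why grid mappers can fold in thousands of LiDAR rays per scan.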
Format of the Course
- Interactive lecture and discussion.
- Hands-on practice using ROS 2 tools and simulation environments.
- Live-lab implementation and testing on virtual or physical robots.
Course Customization Options
- To request a customized training for this course, please contact us to arrange.
Developing Intelligent Bots with Azure
14 Hours
The Azure Bot Service combines the power of the Microsoft Bot Framework and Azure Functions to enable rapid development of intelligent bots.
In this instructor-led, live training, participants will learn how to easily create an intelligent bot using Microsoft Azure.
By the end of this training, participants will be able to:
- Learn the fundamentals of intelligent bots
- Learn how to create intelligent bots using cloud applications
- Understand how to use the Microsoft Bot Framework, the Bot Builder SDK, and the Azure Bot Service
- Understand how to design bots using bot patterns
- Develop their first intelligent bot using Microsoft Azure
Audience
- Developers
- Hobbyists
- Engineers
- IT Professionals
Format of the course
- Part lecture, part discussion, exercises and heavy hands-on practice
Computer Vision for Robotics: Perception with OpenCV & Deep Learning
21 Hours
OpenCV is an open-source computer vision library that enables real-time image processing, while deep learning frameworks such as TensorFlow provide the tools for intelligent perception and decision-making in robotic systems.
This instructor-led, live training (online or onsite) is aimed at intermediate-level robotics engineers, computer vision practitioners, and machine learning engineers who wish to apply computer vision and deep learning techniques for robotic perception and autonomy.
By the end of this training, participants will be able to:
- Implement computer vision pipelines using OpenCV.
- Integrate deep learning models for object detection and recognition.
- Use vision-based data for robotic control and navigation.
- Combine classical vision algorithms with deep neural networks.
- Deploy computer vision systems on embedded and robotic platforms.
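The classical vision algorithms mentioned above, and the convolutional layers of deep networks, share one core operation: 2D convolution. This hand-rolled sketch (pure Python for clarity; OpenCV or NumPy would do this far faster) applies a Sobel-style kernel to detect a vertical edge:

```python
# Hand-rolled 2D convolution with a Sobel-style kernel, the core
# operation behind both classical filters and CNN layers.
# Pure Python for readability, not speed.

SOBEL_X = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

def convolve(image, kernel):
    """Apply a 3x3 kernel to every interior pixel of the image."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(kernel[j][i] * image[y + j - 1][x + i - 1]
                            for j in range(3) for i in range(3))
    return out

# A 5x5 "image" with a vertical edge: dark left, bright right.
img = [[0, 0, 0, 9, 9] for _ in range(5)]
edges = convolve(img, SOBEL_X)
print(edges[2])   # strong responses appear at the edge columns
```

The kernel responds only where pixel intensity changes horizontally, which is exactly what makes it useful as an edge detector.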
Format of the Course
- Interactive lecture and discussion.
- Hands-on practice using OpenCV and TensorFlow.
- Live-lab implementation on simulated or physical robotic systems.
Course Customization Options
- To request a customized training for this course, please contact us to arrange.
Developing a Bot
14 Hours
A bot, or chatbot, is a software assistant that automates user interactions on messaging platforms, letting users get things done faster without needing to speak to another human.
In this instructor-led, live training, participants will learn how to get started in developing a bot as they step through the creation of sample chatbots using bot development tools and frameworks.
By the end of this training, participants will be able to:
- Understand the different uses and applications of bots
- Understand the complete process in developing bots
- Explore the different tools and platforms used in building bots
- Build a sample chatbot for Facebook Messenger
- Build a sample chatbot using Microsoft Bot Framework
Audience
- Developers interested in creating their own bot
Format of the course
- Part lecture, part discussion, exercises and heavy hands-on practice
Edge AI for Robots: TinyML, On-Device Inference & Optimization
21 Hours
Edge AI enables artificial intelligence models to run directly on embedded or resource-constrained devices, reducing latency and power consumption while increasing autonomy and privacy in robotic systems.
This instructor-led, live training (online or onsite) is aimed at intermediate-level embedded developers and robotics engineers who wish to implement machine learning inference and optimization techniques directly on robotic hardware using TinyML and edge AI frameworks.
By the end of this training, participants will be able to:
- Understand the fundamentals of TinyML and edge AI for robotics.
- Convert and deploy AI models for on-device inference.
- Optimize models for speed, size, and energy efficiency.
- Integrate edge AI systems into robotic control architectures.
- Evaluate performance and accuracy in real-world scenarios.
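Model optimization for edge hardware often starts with weight quantization. This sketch (weight values are illustrative; real TinyML toolchains automate this per-layer) shows symmetric int8 quantization, the basic trick used to shrink models for on-device inference:

```python
# Sketch of symmetric int8 weight quantization, the basic technique
# edge AI toolchains use to shrink models. Weights are illustrative.

def quantize(weights):
    """Map floats to int8 using a single symmetric scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

weights = [0.31, -0.74, 0.05, 1.20, -1.19]
q, scale = quantize(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, round(max_err, 4))
```

Each weight now fits in one byte instead of four, at the cost of a rounding error bounded by half the scale factor.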
Format of the Course
- Interactive lecture and discussion.
- Hands-on practice using TinyML and edge AI toolchains.
- Practical exercises on embedded and robotic hardware platforms.
Course Customization Options
- To request a customized training for this course, please contact us to arrange.
Human-Centric Physical AI: Collaborative Robots and Beyond
14 Hours
This instructor-led, live training in the Netherlands (online or onsite) is aimed at intermediate-level participants who wish to explore the role of collaborative robots (cobots) and other human-centric AI systems in modern workplaces.
By the end of this training, participants will be able to:
- Understand the principles of Human-Centric Physical AI and its applications.
- Explore the role of collaborative robots in enhancing workplace productivity.
- Identify and address challenges in human-machine interactions.
- Design workflows that optimize collaboration between humans and AI-driven systems.
- Promote a culture of innovation and adaptability in AI-integrated workplaces.
Human-Robot Interaction (HRI): Voice, Gesture & Collaborative Control
21 Hours
Human-Robot Interaction (HRI): Voice, Gesture & Collaborative Control is a hands-on course designed to introduce participants to the design and implementation of intuitive interfaces for human–robot communication. The training combines theory, design principles, and programming practice to build natural and responsive interaction systems using speech, gesture, and shared control techniques. Participants will learn how to integrate perception modules, develop multimodal input systems, and design robots that safely collaborate with humans.
This instructor-led, live training (online or onsite) is aimed at beginner-level to intermediate-level participants who wish to design and implement human–robot interaction systems that enhance usability, safety, and user experience.
By the end of this training, participants will be able to:
- Understand the foundations and design principles of human–robot interaction.
- Develop voice-based control and response mechanisms for robots.
- Implement gesture recognition using computer vision techniques.
- Design collaborative control systems for safe and shared autonomy.
- Evaluate HRI systems based on usability, safety, and human factors.
Format of the Course
- Interactive lectures and demonstrations.
- Hands-on coding and design exercises.
- Practical experiments in simulation or real robotic environments.
Course Customization Options
- To request a customized training for this course, please contact us to arrange.
Industrial Robotics Automation: ROS-PLC Integration & Digital Twins
28 Hours
Industrial Robotics Automation: ROS-PLC Integration & Digital Twins is a hands-on course focused on bridging industrial automation with modern robotics frameworks. Participants will learn to integrate ROS-based robotic systems with PLCs for synchronized operations and explore digital twin environments to simulate, monitor, and optimize production processes. The course emphasizes interoperability, real-time control, and predictive analysis using digital replicas of physical systems.
This instructor-led, live training (online or onsite) is aimed at intermediate-level professionals who wish to build practical skills in connecting ROS-controlled robots with PLC environments and implementing digital twins for automation and manufacturing optimization.
By the end of this training, participants will be able to:
- Understand communication protocols between ROS and PLC systems.
- Implement real-time data exchange between robots and industrial controllers.
- Develop digital twins for monitoring, testing, and process simulation.
- Integrate sensors, actuators, and robotic manipulators within industrial workflows.
- Design and validate industrial automation systems using hybrid simulation environments.
Format of the Course
- Interactive lecture and architecture walkthroughs.
- Hands-on exercises integrating ROS and PLC systems.
- Simulation and digital twin project implementation.
Course Customization Options
- To request a customized training for this course, please contact us to arrange.
Artificial Intelligence (AI) for Mechatronics
21 Hours
This instructor-led, live training in the Netherlands (online or onsite) is aimed at engineers who wish to learn about the applicability of artificial intelligence to mechatronic systems.
By the end of this training, participants will be able to:
- Gain an overview of artificial intelligence, machine learning, and computational intelligence.
- Understand the concepts of neural networks and different learning methods.
- Choose artificial intelligence approaches effectively for real-life problems.
- Implement AI applications in mechatronic engineering.
Multi-Robot Systems and Swarm Intelligence
28 Hours
Multi-Robot Systems and Swarm Intelligence is an advanced training course that explores the design, coordination, and control of robotic teams inspired by biological swarm behaviors. Participants will learn how to model interactions, implement distributed decision-making, and optimize collaboration across multiple agents. The course combines theory with hands-on simulation to prepare learners for applications in logistics, defense, search and rescue, and autonomous exploration.
This instructor-led, live training (online or onsite) is aimed at advanced-level professionals who wish to design, simulate, and implement multi-robot and swarm-based systems using open-source frameworks and algorithms.
By the end of this training, participants will be able to:
- Understand the principles and dynamics of swarm intelligence and cooperative robotics.
- Design communication and coordination strategies for multi-robot systems.
- Implement distributed decision-making and consensus algorithms.
- Simulate collective behaviors such as formation control, flocking, and coverage.
- Apply swarm-based techniques to real-world scenarios and optimization problems.
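The consensus objective above has a compact core. This sketch (the ring topology, step size, and heading values are illustrative assumptions) shows a basic distributed averaging step: each robot nudges its value toward its neighbours' until the team agrees, with no central coordinator:

```python
# Minimal consensus protocol: each robot repeatedly averages its
# value with its neighbours' until the whole team agrees.
# Topology and starting values are illustrative.

def consensus_step(values, neighbours, eps=0.3):
    """One synchronous update: move each value toward its neighbours."""
    new = []
    for i, v in enumerate(values):
        new.append(v + eps * sum(values[j] - v for j in neighbours[i]))
    return new

# Ring of 4 robots, each starting with a different heading estimate.
neighbours = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
values = [0.0, 10.0, 20.0, 30.0]
for _ in range(100):
    values = consensus_step(values, neighbours)
print([round(v, 2) for v in values])
```

Because each robot only talks to its immediate neighbours, the same update rule scales to arbitrarily large swarms; all values converge to the initial average.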
Format of the Course
- Advanced lectures with algorithmic deep dives.
- Hands-on coding and simulation in ROS 2 and Gazebo.
- Collaborative project applying swarm intelligence principles.
Course Customization Options
- To request a customized training for this course, please contact us to arrange.
Smart Robots for Developers
84 Hours
A Smart Robot is an Artificial Intelligence (AI) system that can learn from its environment and its experience, building on its capabilities based on that knowledge. Smart Robots can collaborate with humans, working alongside them and learning from their behavior. Furthermore, they have the capacity not only for manual labor but for cognitive tasks as well. In addition to physical robots, Smart Robots can also be purely software based, residing in a computer as a software application with no moving parts or physical interaction with the world.
In this instructor-led, live training, participants will learn the different technologies, frameworks and techniques for programming different types of mechanical Smart Robots, then apply this knowledge to complete their own Smart Robot projects.
The course is divided into 4 sections, each consisting of three days of lectures, discussions, and hands-on robot development in a live lab environment. Each section will conclude with a practical hands-on project to allow participants to practice and demonstrate their acquired knowledge.
The target hardware for this course will be simulated in 3D through simulation software. The ROS (Robot Operating System) open-source framework, C++ and Python will be used for programming the robots.
By the end of this training, participants will be able to:
- Understand the key concepts used in robotic technologies
- Understand and manage the interaction between software and hardware in a robotic system
- Understand and implement the software components that underpin Smart Robots
- Build and operate a simulated mechanical Smart Robot that can see, sense, process, grasp, navigate, and interact with humans through voice
- Extend a Smart Robot's ability to perform complex tasks through Deep Learning
- Test and troubleshoot a Smart Robot in realistic scenarios
Audience
- Developers
- Engineers
Format of the course
- Part lecture, part discussion, exercises and heavy hands-on practice
Note
- To customize any part of this course (programming language, robot model, etc.) please contact us to arrange.
Smart Robotics in Manufacturing: AI for Perception, Planning, and Control
21 Hours
Smart Robotics is the integration of artificial intelligence into robotic systems for improved perception, decision-making, and autonomous control.
This instructor-led, live training (online or onsite) is aimed at advanced-level robotics engineers, systems integrators, and automation leads who wish to implement AI-driven perception, planning, and control in smart manufacturing environments.
By the end of this training, participants will be able to:
- Understand and apply AI techniques for robotic perception and sensor fusion.
- Develop motion planning algorithms for collaborative and industrial robots.
- Deploy learning-based control strategies for real-time decision making.
- Integrate intelligent robotic systems into smart factory workflows.
Format of the Course
- Interactive lecture and discussion.
- Lots of exercises and practice.
- Hands-on implementation in a live-lab environment.
Course Customization Options
- To request a customized training for this course, please contact us to arrange.