In recent years, robots have gained popularity as standard devices for teaching programming. The tangibility of robotics platforms allows for collaborative and interactive learning. Moreover, with these platforms, we also observe a shift of visual attention from the screen (on which the programming is done) to the physical environment (i.e., the robot). In this paper, we describe an experiment studying the effect of using augmented reality (AR) representations of sensor data in a robotic learning activity. We designed an AR system able to display in real time the data of the infrared sensors of the Thymio robot. In order to evaluate the impact of AR on the learners' understanding of how these sensors work, we designed a pedagogical lesson that can be run with or without the AR rendering. Two age groups of students participated in this between-subject experiment, 74 children in total. The tests were identical for the experimental (AR) and control (no AR) groups; the exercises differed only in the use of AR. Our results show that AR was beneficial for younger students dealing with difficult concepts. We discuss our findings and propose future work to establish guidelines for designing AR robotic learning sessions.