
The URSS Blog

To support students engaged in the Undergraduate Research Scholarship Scheme

Copyright: (C) 2018 Jennifer MacDonald

Beginnings

Tue, 03 Jul 2007 22:58:46 GMT

As an introduction to our work: we (Daniel James and Dominic Orchard) are undertaking a project in the area of robotics and navigation. We are working with the Department of Computer Science's robot to implement a navigation system for it, exploring navigation, perception, memory and other related areas in robotics.

Day 2 is now over and things have been going well so far. The first day was taken up mostly with standard "first day" activities, such as discussing what needs to be done and trying to get our uni cards updated, but it was a good start: Dan had a good look at the technical documentation for the Maxon motors and the API (Application Programming Interface) available to us, while Dom started work on a small graphical utility to aid in calibration.

Today we actually got things done. We tried out some example code from Maxon for the motor controllers, which worked really well and looks like a very promising basis on which to build the new API. Once Dom had finished writing the calibration utility, we had a good poke around with the robot and carried out some experiments to help calibrate the sensors and make sense of the results they return.

The robot has four ultra-sonic range sensors, one on each side, which will be used as its primary method of perception. The results of today's first experiment are shown here in graph form. We constructed a movable 'wall' out of empty paper boxes, placed at well-defined distances from the sensors, and measured the output from each sensor. The graph shows that there are some differences between the sensors (Sensor 3 seems particularly erratic). The ultra-sonic sensors themselves have an analogue output, which is processed and linearized by some hardware built by Stuart Valentine, one of the department's technicians.
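The wall-of-boxes experiment above (raw sensor output recorded at known distances) is exactly the data needed to fit a per-sensor calibration. As a minimal sketch of that idea, and not the project's actual code, here is an ordinary least-squares fit of a linear model `distance = a * raw + b` for one sensor; the raw readings and distances below are purely illustrative placeholders, not the measured values.

```python
# Sketch only: fit a linear calibration, distance = a * raw + b, for one
# ultra-sonic sensor from readings taken at known wall distances.
# All numeric values here are hypothetical placeholders.

def fit_linear(raws, distances):
    """Ordinary least-squares fit of distance = a * raw + b."""
    n = len(raws)
    mean_x = sum(raws) / n
    mean_y = sum(distances) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(raws, distances))
    sxx = sum((x - mean_x) ** 2 for x in raws)
    a = sxy / sxx            # slope
    b = mean_y - a * mean_x  # intercept
    return a, b

# Hypothetical raw readings at known wall distances (cm).
raw_values = [52, 101, 148, 203, 251]
true_distances_cm = [25, 50, 75, 100, 125]

a, b = fit_linear(raw_values, true_distances_cm)

def calibrated_distance(raw, a=a, b=b):
    """Convert a raw sensor reading into an estimated distance (cm)."""
    return a * raw + b
```

Since the output is already linearized in hardware, a fit like this mainly absorbs each sensor's offset and per-sensor gain differences (e.g. from the potentiometer settings).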
While the hardware (a small circuit with a PIC microprocessor) and firmware (the software on the PIC) are the same for each of the four sensors, the sensitivity is controlled by a potentiometer, which may account for some of the differences in the results, as may the realities of operating in a real environment. The plan is to carry out a similar experiment tomorrow to average out the results, then use this to write the API code for the sensors and apply any corrections.

We also need to perform experiments to rigorously define the 'beam width' of the sensors and their sensitivity to objects at the periphery of the beam. Time permitting, it would also be informative to test sensitivity to objects of varying size, shape, texture and material. The resolution of the sensor should determine the minimum size of object that gives a detectable echo. A shape that reflects all the sound waves away at an angle should be invisible to the sensor, although this is an unlikely property for any real object. An object with a texture that sufficiently disperses the sound waves may also be rendered invisible. Finally, material (with all other properties equal) should have no effect on the sensitivity of an ultra-sonic sensor, but for completeness' sake it would be good to verify this experimentally.

Additionally, here is a video of Dan getting some practice for his wedding (~2020), walking the robot down the aisle ;) mov00009.3gp [...]
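The plan of repeating the experiment and averaging the results could be sketched as follows. This is an illustrative outline, not the project's code: it averages readings from repeated runs for each sensor and distance, and uses the spread across runs to flag an erratic sensor (in the spirit of spotting Sensor 3). The sensor names, readings and the noise threshold are all hypothetical.

```python
# Sketch only: average readings from repeated experiments per sensor and
# distance, and flag sensors whose readings vary a lot between runs.
# All numbers below are hypothetical, not the actual measurements.
from statistics import mean, stdev

# runs[sensor][distance_cm] -> raw readings from repeated experiments
runs = {
    "sensor_1": {50: [99, 101, 100], 100: [201, 199, 200]},
    "sensor_3": {50: [90, 115, 102], 100: [180, 225, 198]},  # erratic
}

def summarize(runs, noisy_threshold=5.0):
    """Return, per sensor: (mean reading per distance, looks_noisy flag)."""
    report = {}
    for sensor, by_distance in runs.items():
        averaged = {d: mean(vals) for d, vals in by_distance.items()}
        # Worst-case standard deviation across distances for this sensor.
        spread = max(stdev(vals) for vals in by_distance.values())
        report[sensor] = (averaged, spread > noisy_threshold)
    return report

report = summarize(runs)
```

The averaged values would then feed the per-sensor correction in the API code, while the noise flag indicates which sensor's potentiometer (or wiring) deserves a closer look.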