Ian Johnson

Research project title: 
Efficient Sensor Fusion for Vision and Distance Data
Principal goal for project: 
Integrating semantics into a robot's ability to map its environment will allow human collaborators to interact with robots in the same way they would with other people. Allowing natural-language instructions to be passed to robots will expand participation beyond the expert practitioners to whom it is currently limited.
Research project: 

Project summary: Collaboration between robots and people has traditionally relied on human operators with a detailed understanding of under-the-hood robotics. Seemingly simple tasks like arm movements and grasping have required knowledge of physical fundamentals like kinematics and dynamics. This has naturally limited participation to expert practitioners, and constrained applications to platforms with a high tolerance for failure.

To encourage greater collaboration, a robot's understanding of the world needs to be brought closer to a human's. One barrier to this is the difference in environment representations. Robots tend to hold a geometric view of their environment, built from distance measurements and shape primitives. A person tends to have a semantic understanding, related to the meaning or use of space and objects.
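The contrast between the two representations can be sketched in code. This is a minimal, hypothetical illustration; the type names and fields are assumptions for clarity, not part of the project's design.

```python
from dataclasses import dataclass

# Illustrative sketch only: a robot's geometric map element vs. a
# semantically annotated one. All names and fields are hypothetical.

@dataclass
class GeometricPrimitive:
    """A typical robot map element: pure geometry, no meaning."""
    x: float       # position in metres
    y: float
    z: float
    shape: str     # e.g. "plane", "box", "cylinder"

@dataclass
class SemanticObject:
    """A human-style map element: geometry plus meaning and use."""
    geometry: GeometricPrimitive
    label: str       # what the thing is, e.g. "kitchen table"
    affordance: str  # what it is for, e.g. "place objects on"

# The same region of space, seen two ways:
table_geometry = GeometricPrimitive(x=1.2, y=0.5, z=0.75, shape="box")
table = SemanticObject(
    geometry=table_geometry,
    label="kitchen table",
    affordance="place objects on",
)
```

A purely geometric map can answer "what is at (1.2, 0.5, 0.75)?" only with a shape; the semantic layer lets a natural-language instruction like "put the cup on the kitchen table" be grounded in the same geometry.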

Student type: 
Current student