Altuve has honed natural reflexes, years of experience, knowledge of the pitcher's tendencies, and an understanding of the trajectories of various pitches. What he sees, hears, and feels seamlessly combines with his brain and muscle memory to time the swing that produces the hit. Could a robot get a hit in the same situation? Not likely. The robot, on the other hand, needs to use a linkage system to slowly coordinate data from its sensors with its motor capabilities. Strike three!

But there may be hope for the robot. A paper by University of Maryland researchers just published in the journal Science Robotics introduces a new way of combining perception and motor commands using so-called hyperdimensional computing theory, which could fundamentally alter and improve the basic artificial intelligence (AI) task of sensorimotor representation: how agents like robots translate what they sense into what they do.

"Learning Sensorimotor Control with Neuromorphic Sensors: Toward Hyperdimensional Active Perception" was written by computer science Ph.D. students Anton Mitrokhin and Peter Sutor, Jr.; Cornelia Fermüller, an associate research scientist with the University of Maryland Institute for Advanced Computer Studies; and Computer Science Professor Yiannis Aloimonos. Mitrokhin and Sutor are advised by Aloimonos.

Integration is the most important challenge facing the robotics field. A robot's sensors and the actuators that move it are separate systems, linked together by a central learning mechanism that infers a needed action given sensor data, or vice versa. This cumbersome three-part AI system, with each part speaking its own language, is a slow way to get robots to accomplish sensorimotor tasks. The next step in robotics will be to integrate a robot's perceptions with its motor capabilities. This fusion, known as "active perception," would provide a more efficient and faster way for the robot to complete tasks.

In the authors' new computing theory, a robot's operating system would be based on hyperdimensional binary vectors (HBVs), which exist in a sparse and extremely high-dimensional space. HBVs can represent disparate discrete things (for example, a single image, a concept, a sound or an instruction), as well as sequences of discrete things, and groupings of discrete things and sequences. They can account for all these types of information in a meaningfully constructed way, binding each modality together in long vectors of 1s and 0s of equal dimension. In this system, action possibilities, sensory input and other information occupy the same space, are in the same language, and are fused, creating a kind of memory for the robot.

Figure caption: From the event data (b) recorded on the DVS during drone flight (a), "event images" (c) and 3D motion vectors (d) are computed, and both are encoded as binary vectors and combined in memory via special vector operations (e). Given a new event image (f), the associated 3D motion can be recalled from memory.