2.23.21-AI-Yildiz
Conference Video | Duration: 21:32
February 23, 2021
Video details
Deep learning is a hugely successful and powerful approach to machine learning applications such as computer vision and natural language processing. However, the training of these neural networks is limited by the traditional von Neumann architecture of our current CPUs and GPUs. Shuttling data back and forth between the separate memory and computation units in such an architecture results in significant energy consumption, many orders of magnitude greater than the energy consumption of the human brain. Our research focuses on designing materials and hardware that can instead perform data storage and computation in a single architecture using ions, inspired by the human brain. In the project that I will present as an example, we have designed a protonic electrochemical synapse that changes conductivity deterministically through current-controlled shuffling of dopant protons across the active device layer, resulting in energy consumption on par with biological synapses in the brain. Through these strategies, we demonstrate a path toward neuromorphic hardware that has high yield and consistency, performs data storage and computation in a single device, and uses significantly less energy than current systems.
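To make the two ideas in the abstract concrete, here is a minimal, illustrative sketch in Python. It is not the authors' device model: the conductance bounds, step count, and voltages are entirely hypothetical. It shows (1) a synapse whose conductance is programmed in deterministic, equal steps by signed current pulses, and (2) a crossbar of such devices performing a vector-matrix multiply in place, so the computation happens where the weights are stored.

```python
# Illustrative sketch only; all device parameters are hypothetical.
import numpy as np

G_MIN, G_MAX, N_STATES = 1e-9, 1e-6, 1000   # assumed conductance bounds (S)
STEP = (G_MAX - G_MIN) / N_STATES           # deterministic change per pulse

def program(g, n_pulses):
    """Apply signed programming pulses: positive pulses potentiate, negative
    pulses depress; each pulse moves conductance exactly one step, clipped
    to the device's physical bounds."""
    return np.clip(g + n_pulses * STEP, G_MIN, G_MAX)

# Program a 4x3 crossbar of synapses toward random target conductances.
rng = np.random.default_rng(0)
g = np.full((4, 3), G_MIN)                  # all devices start fully depressed
target = rng.uniform(G_MIN, G_MAX, g.shape)
g = program(g, np.round((target - g) / STEP))

# Read mode: with voltages on the rows, Ohm's and Kirchhoff's laws give the
# column currents I_j = sum_i V_i * G_ij, i.e. the multiply-accumulate of a
# neural-network layer happens in the memory array, with no data shuttling.
v = rng.uniform(0.0, 0.2, 4)                # hypothetical row read voltages (V)
i_out = v @ g                               # column output currents (A)
print(i_out)
```

Because each pulse produces the same conductance increment, the weight update is deterministic rather than stochastic, which is what makes the analog array usable for training and not just inference.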
More Videos From This Event
February 2021 | Conference Video | 2.23.21-AI-Solar-Lezama
Towards AI that Learns to Write Code

February 2021 | Conference Video | 2.23.21-AI-Isola
Generative Models as Data++

February 2021 | Conference Video | 2.23.21-AI-Autonomy-Startups
MIT Startup Exchange Lightning Talks

February 2021 | Conference Video | 2.23.21-AI-Fan
Building Dependable and Verifiable Autonomous Systems

February 2021 | Conference Video | 2.23.21-AI-Roy
Autonomous Flight in Urban Environments: Challenges for Perception and Planning

February 2021 | Conference Video | 2.23.21-AI-Benjamin
The MIT MOOS-IvP Open Source Marine Autonomy Project