Big Blue is basing next-generation computers on the human brain and modeling the brain inside computers. Brace yourself for a supercomputer that's cooled and powered by electronic blood and small enough to fit in a backpack.
IBM Research is working on "interlayer cooling," in which water is pumped through tiny tubes penetrating chips that are stacked and linked by a high-speed communication technology called through-silicon vias. IBM's approach is designed to deal with the overheating problems that otherwise severely limit chip stacking. The protruding pipe fittings are for connecting water-cooling tubes. (Credit: Stephen Shankland/CNET)
ZURICH, Switzerland -- Despite a strong philosophical connection, computers and brains inhabit separate realms in research. IBM, though, believes the time is ripe to bring them together.
Through research projects expected to take a decade, Big Blue is using biological and manufactured forms of computing to shed light on each other.
On the computing side, IBM is using the brain as a template for breakthrough designs such as the idea of using fluids both to cool the machine and to distribute electrical power. That could enable processing power that's densely packed into 3D volumes rather than spread out across flat 2D circuit boards with slow communication links.
And on the brain side, IBM is supplying computing equipment to a $1.3-billion European effort called the Human Brain Project. It uses computers to simulate the actual workings of an entire brain -- a mouse's first, then a human's -- all the way down to the biochemical level of the neuron. Researchers will be able to tweak parameters as the simulation is running to try to figure out core mechanisms for conditions like Alzheimer's disease, schizophrenia, or autism.
It's all part of what IBM calls the cognitive systems era, in which computers aren't just programmed, but also perceive what's going on, make judgments, communicate with natural language, and learn from experience. It's a close cousin to that decades-old dream of artificial intelligence.
"If we want to make an impact in the cognitive systems era, we need to understand how the brain works," said Matthias Kaiserswerth, a computer scientist who's director of IBM Research in Zurich, speaking during a media tour of the labs on Wednesday.
One key challenge driving IBM's work is matching the brain's power consumption. Over millions of years, nature has evolved a remarkably efficient information-processing design, said Alessandro Curioni, manager of IBM Research's computational sciences department. The ability to process the subtleties of human language helped IBM's Watson supercomputer win at Jeopardy. That was a high-profile step on the road to cognitive computing, but from a practical perspective, it also showed how much farther computing has to go.
"Watson used 85 kilowatts," Kaiserwerth said. "That's a lot of power. The human brain uses 20 watts."
Bruno Michel describes Aquasar, an IBM Research prototype high-performance computing machine that uses unusually high-temperature liquid cooling. (Credit: Stephen Shankland/CNET)
Dense 3D computing
The shift in IBM's computing research shows in the units it uses to measure progress. For decades, the yardstick of choice for gauging computer performance has been operations per second -- the rate at which it can perform mathematical calculations, for example.
When energy constraints became a problem, meaning that computers required prohibitive amounts of electrical power and threw off problematic amounts of waste heat, a new measurement arrived: operations per joule of energy. It gauges a computer's energy efficiency.
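To see how the two yardsticks relate, divide a machine's speed by its power draw; one joule is one watt-second, so operations per second divided by watts gives operations per joule. A sketch with hypothetical numbers, not any particular IBM system:

```python
# Relating the two yardsticks. The inputs are hypothetical, chosen
# only to illustrate the arithmetic, not to describe a real machine.
ops_per_second = 1e12  # a machine doing a trillion operations per second
power_watts = 500      # while drawing 500 watts (500 joules per second)

ops_per_joule = ops_per_second / power_watts
print(f"{ops_per_joule:.1e} operations per joule")  # -> 2.0e+09
```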
Now IBM has a new yardstick: operations per liter. The company is judging success by how much data-processing ability it can squeeze into a given volume. Today's computers must be laid out on flat circuit boards that ensure plenty of contact with air that cools the chips.
"In a computer, processors occupy one millionth of the volume. In a brain, it's 40 percent. Our brain is a volumetric, dense, object," said Bruno Michel, an IBM Research advanced thermal packaging researcher who got his Ph.D in biophysics.
What's the problem with sprawl? In short, communication links between processing elements can't keep up with data-transfer demands, and they consume too much power as well, Michel said.
The fix is stacking chips into dense 3D configurations, with chips linked using a technology called through-silicon vias (TSVs). That's impossible today because even stacking two chips means crippling overheating problems. But IBM believes it's got an answer to the cooling problem: a branching network of liquid cooling channels that funnel fluid into ever-smaller tubes.
The liquid passes not merely next to the chip, but through it, drawing away heat in the thousandth of a second it takes to make the trip, Michel said. The company has demonstrated the approach in an efficient prototype system called Aquasar. (Get ready for another new yardstick: greenhouse gas emissions. Aquasar can perform 7.9 trillion operations per second per gram of carbon dioxide released into the atmosphere.)
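One way to arrive at such a carbon yardstick is to combine a machine's energy efficiency with the carbon intensity of the electricity feeding it. The sketch below uses made-up numbers, not Aquasar's actual measurements:

```python
# Deriving an operations-per-gram-of-CO2 figure from energy efficiency
# and grid carbon intensity. All values are illustrative, not Aquasar's.
ops_per_joule = 5e8        # hypothetical machine efficiency
grams_co2_per_kwh = 500.0  # illustrative grid emission factor
joules_per_kwh = 3.6e6     # one kilowatt-hour is 3.6 million joules

ops_per_gram = ops_per_joule * joules_per_kwh / grams_co2_per_kwh
print(f"{ops_per_gram:.2e} operations per gram of CO2")  # -> 3.60e+12
```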
IBM can deliver up to 1 watt of power per square centimeter with this technology, called a flow battery, which transports electrical power stored chemically. Here, vanadium electrolytes power a microfluidic chip in a lab demonstration. Ultimately, IBM hopes to use liquids both to cool and to power computers. (Credit: Stephen Shankland/CNET)
Liquid-based flow battery
But that's not all the liquid will do. IBM is also developing a system called a redox flow battery that uses the liquid to distribute power instead of relying on wires. Two liquids called electrolytes, each carrying oppositely charged ions, circulate through the system to deliver power. Think of it as a liquid battery interlaced through the interstices of the machine.
"We are going to provide cooling and power with a fluid," Michel said. "That's how our brain does it."
The electrolytes, vanadium-based at present, travel through ever-smaller tubes, said Patrick Ruch, another IBM researcher working on the effort. At the smallest, they're about 100 microns wide -- roughly the width of a human hair -- at which point they hand off their power to conventional electrical wires. Flow batteries can produce between 0.5 and 3 volts, and that in turn means IBM can use the technology today to supply 1 watt of power for every square centimeter of a computer's circuit board.
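Those voltage and power-density figures pin down how much current the electrolytes have to deliver, since power is voltage times current:

```python
# Current density implied by 1 W/cm^2 at the stated flow-battery voltages.
power_density = 1.0  # watts per square centimeter, per the figure above

for voltage in (0.5, 1.0, 3.0):  # within the stated 0.5-3 volt range
    current_density = power_density / voltage  # amps per square centimeter
    print(f"at {voltage:.1f} V: {current_density:.2f} A/cm^2")
```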
Liquid cooling has been around for decades in the computing industry, but most data centers avoid it given its expense and complexity. It's possible the redox battery could provide a new incentive to embrace it, though.
Michel estimates the liquid power technology will take 10 to 15 years to develop, but when it works, it'll mean supercomputers that fit into something the size of a backpack, not a basketball court.
"A 1-petaflop computer in 10 liters -- that's our goal," Michel said.
Performing at 1 petaflops means a computer can complete a quadrillion floating-point mathematical operations per second. Today's top supercomputer, China's Tianhe-2, clocked in at 33.86 petaflops, but it uses 32,000 Xeon processors and 48,000 Xeon Phi accelerator processors.
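Measured by IBM's new yardstick, the goal is striking. A rough comparison from the figures above:

```python
# Michel's goal, expressed in operations per liter, plus the average
# per-chip speed of today's leader, from the figures quoted above.
goal_flops, goal_liters = 1e15, 10
print(f"goal density: {goal_flops / goal_liters:.0e} flops per liter")
# -> goal density: 1e+14 flops per liter

top_flops = 33.86e15
chips = 32_000 + 48_000  # Xeon processors plus Xeon Phi accelerators
print(f"today's leader averages {top_flops / chips:.2e} flops per chip")
# -> today's leader averages 4.23e+11 flops per chip
```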
Matthias Kaiserswerth, director of IBM Research in Zurich, is working toward the era of "cognitive computing," in which machines get attributes of human thinking such as perception, learning, and judgment. (Credit: Stephen Shankland/CNET)
Human brains, too
More conventional supercomputers have been used so far for IBM's collaborations in brain research. The highlight of that work has been the Blue Brain project, which is on its third IBM Blue Gene supercomputer at the Ecole Polytechnique Federale de Lausanne, or EPFL, in Lausanne, Switzerland. The Blue Brain project and the Human Brain Project will take a new step with a Blue Gene/Q augmented by 128 terabytes of flash memory at the Swiss National Supercomputing Center in Lugano, Switzerland. It'll be used to simulate the formation and inner workings of an entire mouse brain, which has about 70 million neurons.
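Dividing the flash capacity by the neuron count gives a crude per-neuron budget; it ignores synapses and simulation overhead, so treat it only as an order-of-magnitude check:

```python
# Rough flash budget per simulated neuron, from the figures quoted above.
flash_bytes = 128e12  # 128 terabytes of flash memory
mouse_neurons = 70e6  # roughly 70 million neurons in a mouse brain

print(f"~{flash_bytes / mouse_neurons / 1e6:.1f} MB of flash per neuron")
# -> ~1.8 MB of flash per neuron
```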
The eventual human brain simulation will take place at the Juelich Supercomputing Centre in northern Germany, Curioni said. It's planned to be an "exascale" machine -- one that performs 1 exaflops, or a quintillion floating-point operations per second.
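That target puts the required leap in perspective:

```python
# How far exascale is beyond the 33.86-petaflops leader mentioned earlier.
exaflops = 1e18
todays_top = 33.86e15
print(f"an exascale machine would be ~{exaflops / todays_top:.0f}x faster")
# -> an exascale machine would be ~30x faster
```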
The project doesn't lack for ambition. One of its driving forces is co-director Henry Markram of EPFL, who has worked on the Blue Brain project for years and sees computing as the way to understand the true workings of the human brain.
"It's impossible to experimentally map the brain," simply because it's too complicated, Markram said. There are too many neurons, 55 different varieties of neuron, and 3,000 ways they can interconnect. That complexity is multiplied by differences that appear with 600 different diseases, genetic variation from one person to the next, and changes that go along with the age and sex of humans.
"If you can't experimentally map the brain, you have to predict it -- the numbers of neurons, the types, where the proteins are located, how they'll interact," Markram said. "We have to develop an entirely new scence where we predict most of the stuff that cannot be measured."
Liquid cooling has traditionally meant water traveling near chips, the hottest part of computers, but IBM Research has begun making chips with cooling conduits built directly in. (Credit: Stephen Shankland/CNET)
With the Human Brain Project, researchers will use supercomputers to reproduce how brains form -- basically, growing them in a virtual vat -- then see how they respond to input signals from simulated senses and a simulated nervous system.
The idea isn't to reproduce every last thing about the brain, but rather a model based on the understanding so far. If it works, actual brain behavior should emerge from the fundamental framework inside the computer, and where it doesn't work, scientists will know where their knowledge falls short.
"We take these rules and algorithmically reconstruct a model of the brain," Markram said. "We'll say this is biological prediction, then we can go back to the experiments and we can verify if the model is right. We celebrate when the model is wrong, because that's when it points to where we need more data or we don't understand the rules."
The result, if the work is successful, will be not just a better understanding of the brain, but better cooperation among brain researchers and medical experts. That could reverse recent declines in the development of new drugs to treat neural problems, he said.
And understanding the brain could usher in the era of "neuromorphic computing."
"Any new rules, circuits, or understanding of how the brain works will allow us to design neuromorphic machines that are much more powerful in terms of cognitive power, energy efficiency, and packaging," Curioni said.
And that, in turn, could lead to profoundly more capable computers. For starters, IBM has four markets in mind: machines that could find the best places to invest money, bring new depth and accuracy to medical diagnoses, research the appropriate legal precedents in court cases, or give people help when they dial a call center.
But it's not hard to imagine that's only the beginning. When computers can learn for themselves and program themselves, it's clear the divide separating biological and artificial computing will be a lot narrower.
IBM Research investigates supercomputing, nanotechnology, medicine, and more at its Zurich labs. (Credit: Stephen Shankland/CNET)