Neuroscience

Articles and news from the latest research reports.

Posts tagged science

61 notes

BrainHealth Team Studies Overeating as a Type of Addiction

The same insidious craving plagues all addicts, no matter their substance of choice. A new study published in NeuroImage from Center for BrainHealth scientists Dr. Francesca Filbey, assistant professor in the School of Behavioral and Brain Sciences, and doctoral student Samuel DeWitt has found that for binge eaters, as with all addiction sufferers, the compulsion to overeat is rooted in the brain’s reward center.

Read more

Filed under obesity overeating binge-eating reward system addiction neuroscience psychology science

94 notes


Study pinpoints brain area’s role in learning

An area of the brain called the orbitofrontal cortex is responsible for decisions made on the spur of the moment, but not those made based on prior experience or habit, according to a new basic science study from substance abuse researchers at the University of Maryland School of Medicine and the National Institute on Drug Abuse (NIDA). Scientists had previously believed that this area of the brain was responsible for both types of behavior and decision-making. The distinction is critical to understanding the neurobiology of decision-making, particularly with regard to substance abuse. The study was published online in the journal Science.

Scientists have assumed that the orbitofrontal cortex plays a role in “value-based” decision-making, when a person compares options and weighs consequences and rewards to choose the best alternative. The Science study shows that this area of the brain is involved in decision-making only when the value must be inferred or computed on the fly. If the value has been “cached” or pre-computed, like a habit, then the orbitofrontal cortex is not necessary.

The same is true for learning — if a person infers an outcome but it does not happen, the resulting error can drive learning. The study shows that the orbitofrontal cortex is necessary for the inferred value that is used for this type of learning.

"Our research showed that damage to the orbitofrontal cortex may decrease a person’s ability to use prior experience to make good decisions on the fly," says lead author Joshua Jones, Ph.D., a postdoctoral researcher at the University of Maryland School of Medicine and a research scientist at NIDA, part of the National Institutes of Health. "The person isn’t able to consider the whole continuum of the decision — the mind’s map of how choices play out further down the road. Instead, the person is going to regress to habitual behavior, gravitating toward the choice that provides the most value in its immediate reward."

The study enhances scientists’ understanding of how the brain works in healthy and unhealthy individuals, according to the researchers.

"This discovery has general implications in understanding how the brain processes information to help us make good decisions and to learn from our mistakes," says senior author Geoffrey Schoenbaum, M.D., Ph.D., adjunct professor at the University of Maryland School of Medicine and senior investigator and chief of the Cellular Neurobiology Research Branch at NIDA. "Understanding more about the orbitofrontal cortex also is important for understanding disorders such as addiction that seem to involve maladaptive decision-making and learning. Cocaine in particular seems to have long-lasting effects on the orbitofrontal cortex. One aspect of this work, which we are pursuing, is that perhaps some of the problems that characterize addiction are the result of drug-induced changes in this area of the brain."

(Image: iStock)
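The cached-versus-computed distinction the study draws maps loosely onto the "model-free" versus "model-based" dichotomy from reinforcement learning. As an illustration only (the action names and values below are invented, not taken from the study), a toy Python sketch shows why a cached habit fails to track a sudden change in outcome value while a value inferred on the fly does:

```python
# Toy contrast between a cached ("habitual") value table and a value
# inferred at decision time from a model of the world. Illustrative only;
# actions, outcomes, and numbers are invented, not from the study.

# Cached values: learned by repetition, available without computation.
cached_value = {"lever_A": 0.8, "lever_B": 0.4}

# A simple world model: action -> likely outcome, outcome -> current worth.
transition = {"lever_A": "food_pellet", "lever_B": "sugar_water"}
outcome_worth = {"food_pellet": 0.8, "sugar_water": 0.4}

def habitual_choice(actions):
    """Model-free: just read the cache, no inference needed."""
    return max(actions, key=lambda a: cached_value[a])

def inferred_choice(actions):
    """Model-based: compute value from the world model at decision time."""
    return max(actions, key=lambda a: outcome_worth[transition[a]])

# If the world changes (the pellet is devalued, say by satiation),
# only the inferred value reflects the change immediately.
outcome_worth["food_pellet"] = 0.0   # devalued outcome
actions = ["lever_A", "lever_B"]
print(habitual_choice(actions))   # "lever_A" -- stale cached value
print(inferred_choice(actions))   # "lever_B" -- updated on the fly
```

The habitual chooser keeps pulling the lever its stale cache prefers; only the model-based chooser, which computes value at decision time, notices the devaluation. That on-the-fly inference is the kind of computation the study attributes to the orbitofrontal cortex.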


Filed under brain orbitofrontal cortex substance abuse learning decision-making neuroscience psychology science

44 notes

Combination of two pharmaceuticals proves effective in the treatment of multiple sclerosis

A new substance class for the treatment of multiple sclerosis and other neurodegenerative diseases now promises increased efficacy paired with fewer side effects. To achieve this, a team of scientists under the leadership of Prof. Gunter Fischer (Max Planck Research Unit for Enzymology of Protein Folding, Halle/Saale, Germany) and Dr. Frank Striggow (German Center for Neurodegenerative Diseases (DZNE)) has combined two already approved pharmaceutical substances using a chemical linker structure. The objectives of this combination are to ensure maximum brain cell protection on the one hand and the suppression of unwanted side effects on the other. The new substance class has now been registered with the European Patent Office as the DZNE’s first patent, in the form of a joint patent application with the Max Planck Research Unit. “The patent approval process can take several years. During this phase we are planning to conclude the pre-clinical development. It is our aim to start with clinical research and development at the earliest possible time. Overall, we have identified substantial therapeutic potential as far as chronic and age-related neurodegenerative diseases are concerned,” comments Dr. Frank Striggow.

Read more


Filed under MS pharmaceutical substances treatment immunosuppressants nerve cells neuroscience science

1,138 notes

To Get the Best Look at a Person’s Face, Look Just Below the Eyes

They say that the eyes are the windows to the soul. However, to get a real idea of what a person is up to, according to UC Santa Barbara researchers Miguel Eckstein and Matt Peterson, the best place to check is right below the eyes. Their findings are published in the Proceedings of the National Academy of Sciences.

"It’s pretty fast, it’s effortless; we’re not really aware of what we’re doing," said Miguel Eckstein, professor of psychology in the Department of Psychological & Brain Sciences. Using an eye tracker and more than 100 photos of faces, Eckstein and graduate research assistant Peterson followed the gaze of the experiment’s participants to determine where they look in the first crucial moments of assessing a person’s identity, gender, and emotional state.

"For the majority of people, the first place we look at is somewhere in the middle, just below the eyes," Eckstein said. One possible reason could be that we are trained from youth to look there because it is polite in some cultures, or because it allows us to figure out where the person’s attention is focused.

However, Peterson and Eckstein hypothesize that, despite the ever-so-brief (250 millisecond) glance, the relatively featureless point of focus, and the fact that we’re usually unaware that we’re doing it, the brain is actually using sophisticated computations to plan an eye movement that ensures the highest accuracy in tasks that are evolutionarily important in determining flight, fight, or love at first sight.
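One way to see why a fixation just below the eyes could be computationally optimal is a toy "foveated observer" model: acuity falls off with distance from fixation, so the best single fixation trades off access to all informative features at once. The feature positions, weights, and acuity falloff below are invented for illustration; this is not the researchers' actual model:

```python
# Toy foveated observer: information gathered from each facial feature is
# discounted by its distance from the fixation point. All positions and
# weights are made-up illustrative values.
import math

# (x, y) positions on a schematic face; y increases downward from the eyes.
features = {"left_eye": (-1.0, 0.0), "right_eye": (1.0, 0.0),
            "nose": (0.0, 1.2), "mouth": (0.0, 2.2)}
# Assumed informativeness of each feature (eyes weighted most heavily).
weights = {"left_eye": 1.0, "right_eye": 1.0, "nose": 0.4, "mouth": 0.5}

def info_gathered(fixation):
    """Sum feature information, discounted by eccentricity (Gaussian falloff)."""
    fx, fy = fixation
    total = 0.0
    for name, (x, y) in features.items():
        ecc = math.hypot(x - fx, y - fy)      # distance from fixation
        total += weights[name] * math.exp(-ecc ** 2)
    return total

# Search candidate fixations along the face midline, from eyes to mouth.
candidates = [(0.0, y / 10) for y in range(0, 25)]
best = max(candidates, key=info_gathered)
print(best)   # lands a little below the eye line, above the nose
```

With these toy numbers the single best fixation falls slightly below the eyes rather than on them: from there the high-value eyes are still nearly foveal, while the nose and mouth are less eccentric than they would be from the eye line itself.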

Filed under eye movements face perception face processing neuroscience psychology science

68 notes


Scientists image brain structures that deteriorate in Parkinson’s

A new imaging technique developed at MIT offers the first glimpse of the degeneration of two brain structures affected by Parkinson’s disease.

The technique, which combines several types of magnetic resonance imaging (MRI), could allow doctors to better monitor patients’ progression and track the effectiveness of potential new treatments, says Suzanne Corkin, MIT professor emerita of neuroscience and leader of the research team. The first author of the paper is David Ziegler, who received his PhD in brain and cognitive sciences from MIT in 2011.

The study, appearing in the Nov. 26 online edition of the Archives of Neurology, is also the first to provide clinical evidence for the theory that Parkinson’s neurodegeneration begins deep in the brain and advances upward.

“This progression has never been shown in living people, and that’s what was special about this study. With our new imaging methods, we can see these structures more clearly than anyone had seen them before,” Corkin says.


Filed under brain neuroimaging parkinson's disease neurodegeneration neuroscience psychology science

138 notes


Yawning may cool brain when needed

Yawning isn’t triggered because you’re bored, tired or need oxygen. Rather, yawning helps regulate the brain’s temperature, according to Gary Hack, of the University of Maryland School of Dentistry, and Andrew Gallup, of Princeton University.

"The brain is exquisitely sensitive to temperature changes and therefore must be protected from overheating," they said in a University of Maryland news release. "Brains, like computers, operate best when they are cool."

During yawning, the walls of the maxillary sinuses (located in the cheeks on each side of the nose) flex like bellows and help with brain cooling, according to the researchers.

They noted that the actual function of sinuses is still the subject of debate, and this theory may help clarify their purpose.

"Very little is understood about them, and little is agreed upon even by those who investigate them. Some scientists believe that they have no function at all," Hack said in the news release.

The researchers said their theory that yawning helps cool the brain has medical implications. For example, excessive yawning often precedes seizures in people with epilepsy and pain in people with migraine headaches.

Doctors may be able to use excessive yawning as a way to identify patients with conditions that affect temperature regulation.

"Excessive yawning appears to be symptomatic of conditions that increase brain and/or core temperature, such as central nervous system damage and sleep deprivation," Gallup said in the news release.


Filed under brain brain cooling yawning temperature neuroscience psychology science

41 notes


How the animals lost their sensors

For free-living organisms, the ability to sense and respond to the outside environment is crucial for survival. Eukaryotes, such as animals and plants, often have highly complex network systems in place to monitor their surroundings and respond effectively, but bacteria have developed a remarkably simple system. It’s called the ‘Two Component System’ because it relies on just two components: a sensor and a responder. The sensor picks up the signal and communicates it to the responder, which then produces the effect.

The picture above shows this process happening. The ‘communication’ of the message from the sensor to the responder, shown by the coloured arrows, is carried out by transferring phosphoryl groups. The signal interacting with the sensor causes the sensor to autophosphorylate (phosphorylate itself) and then pass the phosphoryl group onto the responder to trigger the response. The letters “H” and “D” are the actual amino acids being phosphorylated: histidine and aspartate.

Although Two-Component Systems (TCS) are found in all three superkingdoms of life (archaea, bacteria and eukaryotes), they are suspiciously absent from the animal kingdom. Plants have them, as do fungi and several protozoa, but they just aren’t present in animals. For this reason they’ve been looked into as potential antibiotic targets, as knocking out the Two-Component Systems of most bacteria is fatal.

Why don’t animals use TCS? To answer this you have to look at the evolution of the system itself, because despite being nominally present in eukaryotes such as plants and fungi, TCS are used very differently there. Bacteria use TCS to sense a wide variety of signals: stress, metabolism, nutrient regulation, chemotaxis, pathogen-host interactions and so on. In eukaryotes, on the other hand, they are used sparingly: for ethylene responses and photosensitivity in plants, and osmoregulation in fungi and slime moulds.
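The sensor-to-responder phosphorelay described above can be sketched as a minimal simulation. The class names follow the text's terminology, but the specific signal and response strings are made up for illustration:

```python
# Minimal sketch of a bacterial Two-Component System: a sensor kinase
# autophosphorylates on its histidine ("H") when it detects a signal,
# then transfers the phosphoryl group to an aspartate ("D") on its
# response regulator, which triggers the output. Illustrative only.

class SensorKinase:
    def __init__(self):
        self.phosphorylated = False          # state of the conserved His residue

    def detect(self, signal_present):
        if signal_present:
            self.phosphorylated = True       # autophosphorylation on "H"

    def transfer_to(self, regulator):
        if self.phosphorylated:
            self.phosphorylated = False
            regulator.phosphorylated = True  # phosphotransfer to "D"

class ResponseRegulator:
    def __init__(self, response):
        self.phosphorylated = False          # state of the conserved Asp residue
        self.response = response

    def act(self):
        return self.response if self.phosphorylated else None

sensor = SensorKinase()
regulator = ResponseRegulator(response="activate osmotic stress genes")
sensor.detect(signal_present=True)           # signal arrives at the membrane
sensor.transfer_to(regulator)                # phosphoryl group handed over
print(regulator.act())                       # prints "activate osmotic stress genes"
```

The whole pathway is two proteins and one phosphoryl transfer, which is exactly why knocking either component out cripples the bacterium's ability to respond.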

Read more


Filed under animals bacteria evolution proteins Two Component System science

105 notes



Smoking ‘rots’ brain, says King’s College study

Smoking “rots” the brain by damaging memory, learning and reasoning, according to researchers at King’s College London. A study of 8,800 people over 50 showed high blood pressure and being overweight also seemed to affect the brain, but to a lesser extent.

The scientists involved said people needed to be aware that lifestyles could damage the mind as well as the body. Their study was published in the journal Age and Ageing.

Researchers at King’s were investigating links between the likelihood of a heart attack or stroke and the state of the brain. Data about the health and lifestyle of a group of over-50s were collected, and brain tests, such as making participants learn new words or name as many animals as they could in a minute, were also performed.

They were all tested again after four and then eight years. The results showed that the overall risk of a heart attack or stroke was “significantly associated with cognitive decline”, with those at the highest risk showing the greatest decline.

There was also a “consistent association” between smoking and lower scores in the tests. One of the researchers, Dr Alex Dregan, said: “Cognitive decline becomes more common with ageing and for an increasing number of people interferes with daily functioning and well-being.

"We have identified a number of risk factors which could be associated with accelerated cognitive decline, all of which could be modifiable." He added: "We need to make people aware of the need to do some lifestyle changes because of the risk of cognitive decline."

The researchers do not know how such a decline could affect people going about their daily life. They are also unsure whether the early drop in brain function could lead to conditions such as dementia.


(Image: Alamy)


Filed under brain smoking cognitive decline memory dementia neuroscience psychology science

131 notes


A Fresh Look at Psychiatric Drugs

For several years, Henry Lester, Bren Professor of Biology at Caltech, and his colleagues have worked to understand nicotine addiction by repeatedly exposing nerve cells to the drug and studying the effects. At first glance, it’s a simple story: nicotine binds to, and activates, specific nicotine receptors on the surface of nerve cells within a few seconds of being inhaled. But nicotine addiction develops over weeks or months; and so the Caltech team wanted to know what changes in the nerve cell during that time, hidden from view.

The story that developed is that nicotine infiltrates deep into the cell, entering a protein-making structure called the endoplasmic reticulum and increasing its output of the same nicotine receptors. These receptors then travel to the cell’s surface. In other words, nicotine acts “inside out,” directing actions that ultimately fuel and support the body’s addiction to nicotine.

"That nicotine works ‘inside out’ was a surprise a few years ago," says Lester. "We originally thought that nicotine acted only from the outside in, and that a cascade of effects trickled down to the endoplasmic reticulum and the cell’s nucleus, slowly changing their function."

In a new research review paper, published in Biological Psychiatry, Lester—along with senior research fellow Julie M. Miwa and postdoctoral scholar Rahul Srinivasan—proposes that psychiatric medications may work in the same “inside-out” fashion—and that this process explains how it takes weeks rather than hours or days for patients to feel the full effect of such drugs.

"We’ve known what happens within minutes and hours after a person takes Prozac, for example," explains Lester. "The drug binds to serotonin uptake proteins on the cell surface, and prevents the neurotransmitter serotonin from being reabsorbed by the cell. That’s why we call Prozac a selective serotonin reuptake inhibitor, or SSRI." While the new hypothesis preserves that idea, it also presents several arguments for the idea that the drugs also enter into the bodies of the nerve cells themselves.
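The "inside-out" timing argument can be illustrated with a toy first-order kinetic model: even if a drug boosts receptor production immediately, the number of receptors at the cell surface approaches its new steady state only as fast as receptor turnover allows. All rates and numbers below are invented for illustration; this is not the authors' model:

```python
# Toy kinetics for the slow-onset idea: chronic exposure doubles receptor
# production, but surface numbers change on the slow turnover timescale.
# dR/dt = production * boost - decay * R, integrated with daily Euler steps.

def surface_receptors(days, production=10.0, decay=0.05, boost=2.0):
    """Surface receptor count after `days` of boosted production,
    starting from the drug-free steady state R = production / decay."""
    r = production / decay        # pre-drug steady state (200 here)
    dt = 1.0                      # one-day time step
    for _ in range(days):
        r += dt * (production * boost - decay * r)
    return r

# New steady state is production * boost / decay = 400, but the approach
# is slow because the decay (turnover) rate sets the timescale.
print(surface_receptors(7))      # after one week: still far from 400
print(surface_receptors(60))     # after two months: close to 400
```

With these toy numbers the pre-drug steady state is 200 receptors and the boosted one is 400; a week of exposure covers only about a third of the gap, while two months nearly closes it, mirroring the weeks-long onset the review sets out to explain.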


Filed under nicotine nicotine addiction psychiatric drugs nerve cells endoplasmic reticulum neuroscience science

160 notes


Will machines kill mankind?

Academics at Cambridge University are pondering the risk to humanity from super-intelligent technology which could “threaten our own existence.”

Huw Price, Bertrand Russell Professor of Philosophy at Cambridge, said: “In the case of artificial intelligence, it seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology.”

Professor Price is planning to launch a research centre next year looking into the danger, teaming up with Cambridge professor of cosmology and astrophysics Martin Rees and Jaan Tallinn, one of the founders of Skype.

He wants to bring more attention to a future in which mankind might be at the mercy of “machines that are not malicious, but machines whose interests don’t include us.”

The group won’t be the first to ponder such a future, which has featured in science fiction since the dawn of the computer age, perhaps most famously with HAL, the malevolent computer from Stanley Kubrick’s 2001: A Space Odyssey, and most recently in I, Robot, starring Will Smith.

Acknowledging that many people believe his concerns are far-fetched, Professor Price said: “It tends to be regarded as a flaky concern, but given that we don’t know how serious the risks are, that we don’t know the time scale, dismissing the concerns is dangerous.”

He said that advanced technology could be a threat when computers start to direct resources towards their own goals, at the expense of human concerns like environmental sustainability.

He compared the risk to the way humans have threatened the survival of other animals by spreading across the planet and using up natural resources that other animals depend upon.


Filed under AI intelligence humanity robotics technology science
