Well, the yeast in our bread might outlive us ... but hopefully we can live longer to eat more bread ...
Turning Back the Clock: Genetic Engineers Rewire Cells for an 82% Increase in Lifespan
A team of researchers has developed a biosynthetic genetic ‘clock’ that significantly extends cellular lifespan, as reported [this week] in the journal Science. The study involved genetically rewiring the gene regulatory circuit that controls cell aging, transforming it from a toggle switch into a clock-like device, or gene oscillator. This oscillator periodically switches the cell between two detrimental aged states, preventing prolonged commitment to either and slowing cell degeneration. Working in yeast cells, the team achieved an 82% increase in lifespan compared to control cells. This ground-breaking research, underpinned by computational simulations and synthetic biology, could revolutionize scientific approaches to delaying aging, going beyond attempts to artificially revert cells to a state of ‘youth’. The team is now expanding its research to human cell types.
... “These gene circuits can operate like our home electric circuits that control devices like appliances and automobiles,” said Professor Nan Hao of the School of Biological Sciences’ Department of Molecular Biology, the senior author of the study and co-director of UC San Diego’s Synthetic Biology Institute. However, the UC San Diego group uncovered that, under the control of a central gene regulatory circuit, cells don’t necessarily age the same way. Imagine a car that ages either as the engine deteriorates or as the transmission wears out, but not both at the same time. The UC San Diego team envisioned a “smart aging process” that extends cellular longevity by cycling deterioration from one aging mechanism to another. ... The rewired circuit operates as a clock-like device, called a gene oscillator, that drives the cell to periodically switch between two detrimental “aged” states, avoiding prolonged commitment to either, and thereby slowing the cell’s degeneration. These advances resulted in a dramatically extended cellular lifespan, setting a new record for life extension through genetic and chemical interventions.
... “Our work represents a proof-of-concept example, demonstrating the successful application of synthetic biology to reprogram the cellular aging process,” the authors wrote, “and may lay the foundation for designing synthetic gene circuits to effectively promote longevity in more complex organisms.” The team is currently expanding its research to the aging of diverse human cell types, including stem cells and neurons.
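The toggle-switch-to-oscillator idea can be illustrated with a toy simulation. This is not the paper's actual circuit or parameters; it uses a generic delayed negative-feedback loop, a textbook recipe for a gene oscillator, with invented rates, to show why such a circuit keeps flipping between a high and a low state rather than committing to either:

```python
# A minimal sketch (not the study's model): a repressor shuts off its
# own production after a delay, so the system cycles between a high
# state and a low state instead of settling into one.  All parameter
# values here are made up for illustration.

def simulate_oscillator(beta=2.0, gamma=1.0, n=10, tau=2.0,
                        dt=0.01, t_end=100.0):
    """Euler-integrate dx/dt = beta/(1 + x(t-tau)^n) - gamma*x."""
    steps = int(t_end / dt)
    delay = int(tau / dt)
    x = 0.5                      # start away from the fixed point
    trace = [x]
    for i in range(steps):
        x_del = trace[i - delay] if i >= delay else trace[0]
        dx = beta / (1.0 + x_del ** n) - gamma * x
        x += dx * dt
        trace.append(x)
    return trace

trace = simulate_oscillator()
late = trace[len(trace) // 2:]           # discard the transient
mean = sum(late) / len(late)
# Sustained oscillation: the trajectory keeps crossing its own mean
# instead of converging to a single "aged" state.
crossings = sum(1 for a, b in zip(late, late[1:])
                if (a - mean) * (b - mean) < 0)
```

A plain toggle switch, by contrast, would converge to one of its two stable states and stay there, which is the "prolonged commitment" the rewired circuit avoids.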
A decoder that uses brain scans to know what you mean — mostly
Scientists have found a way to decode a stream of words in the brain using MRI scans and artificial intelligence.
The system reconstructs the gist of what a person hears or imagines, rather than trying to replicate each word, a team reports in the journal Nature Neuroscience.
"It's getting at the ideas behind the words, the semantics, the meaning," says Alexander Huth, an author of the study and an assistant professor of neuroscience and computer science at The University of Texas at Austin.
This technology can't read minds, though. It only works when a participant is actively cooperating with scientists. Still, systems that decode language could someday help people who are unable to speak because of a brain injury or disease. They also are helping scientists understand how the brain processes words and thoughts.
... Participants wore headphones that streamed audio from podcasts. "For the most part, they just lay there and listened to stories from The Moth Radio Hour," Huth says. Those streams of words produced activity all over the brain, not just in areas associated with speech and language. ... After participants listened to hours of stories in the scanner, the MRI data was sent to a computer. It learned to match specific patterns of brain activity with certain streams of words.
Next, the team had participants listen to new stories in the scanner. Then the computer attempted to reconstruct these stories from each participant's brain activity. ... The system got a lot of help constructing intelligible sentences from artificial intelligence: an early version of the famous natural language processing program ChatGPT.
What emerged from the system was a paraphrased version of what a participant heard.
So if a participant heard the phrase, "I didn't even have my driver's license yet," the decoded version might be, "she hadn't even learned to drive yet," Huth says. In many cases, he says, the decoded version contained errors.
In another experiment, the system was able to paraphrase words a person just imagined saying. ...
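The matching step described above can be sketched in miniature. This is a hypothetical illustration, not the study's code: a learned "encoding model" predicts a brain-activity pattern from a candidate phrase's semantic features, and the decoder keeps whichever candidate best fits the actual scan. The encoder matrix, features, and candidate phrases below are all made up:

```python
# Hypothetical sketch of pattern matching in decoding (not the real
# pipeline): score candidate phrasings by how closely their predicted
# brain pattern matches the observed scan, and keep the best.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Toy "encoding model": maps 2 semantic features -> 3 voxel responses.
# In the real system, such a map is fit to hours of MRI data.
ENCODER = [[1.0, 0.0],
           [0.5, 0.5],
           [0.0, 1.0]]

def predict_pattern(features):
    return [dot(row, features) for row in ENCODER]

def decode(scan, candidates):
    """Pick the candidate whose predicted pattern is closest to the scan."""
    def err(features):
        pred = predict_pattern(features)
        return sum((p - s) ** 2 for p, s in zip(pred, scan))
    return min(candidates, key=lambda c: err(candidates[c]))

# Candidate paraphrases with made-up semantic features (driving vs. food).
candidates = {
    "she hadn't learned to drive yet": [1.0, 0.1],
    "he was cooking dinner":           [0.1, 1.0],
}
scan = predict_pattern([0.9, 0.2])   # simulated scan for a driving thought
best = decode(scan, candidates)
```

Because the comparison happens in this semantic feature space rather than word by word, the winning candidate is a paraphrase of the gist, which is exactly why the decoded output preserves meaning but not exact wording.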
https://www.npr.org/sections/health-...ou-mean-mostly
AI pioneer quits Google to warn about the technology’s ‘dangers’
Geoffrey Hinton, who has been called the ‘Godfather of AI,’ confirmed Monday that he left his role at Google last week to speak out about the “dangers” of the technology he helped to develop.
Hinton’s pioneering work on neural networks shaped artificial intelligence systems powering many of today’s products. He worked part-time at Google for a decade on the tech giant’s AI development efforts, but he has since come to have concerns about the technology and his role in advancing it.
“I console myself with the normal excuse: If I hadn’t done it, somebody else would have,” Hinton told the New York Times, which was first to report his decision.
... Hinton’s decision to step back from the company and speak out on the technology comes as a growing number of lawmakers, advocacy groups and tech insiders have raised alarms about the potential for a new crop of AI-powered chatbots to spread misinformation and displace jobs. ... In the interview with the Times, Hinton echoed concerns about AI’s potential to eliminate jobs and create a world where many will “not be able to know what is true anymore.” He also pointed to the stunning pace of advancement, far beyond what he and others had anticipated.
“The idea that this stuff could actually get smarter than people — a few people believed that,” Hinton said in the interview. “But most people thought it was way off. And I thought it was way off. I thought it was 30 to 50 years or even longer away. Obviously, I no longer think that.” ... “I believe that the rapid progress of AI is going to transform society in ways we do not fully understand and not all of the effects are going to be good,” Hinton said in a 2021 commencement address at the Indian Institute of Technology Bombay in Mumbai. He noted how AI will boost healthcare while also creating opportunities for lethal autonomous weapons. “I find this prospect much more immediate and much more terrifying than the prospect of robots taking over, which I think is a very long way off.” ...
https://us.cnn.com/2023/05/01/tech/g...ars/index.html

Yes, We're All Being Spied On
Remember the Chinese spy balloon? Since then, we’ve seen leaked Pentagon spy documents on Discord and the discovery of fake Chinese police stations used for surveillance in the U.S. The line between espionage and everyday surveillance/data collection is more blurred than ever, thanks to the integration of technology into our daily lives. All of us are walking pieces of data being gobbled up and analyzed by spy agencies around the world. All of this spy news is a reminder of how high the stakes are, and how little we really know about the global fight for information. Audie talks with CNN Anchor and Chief National Security Correspondent Jim Sciutto, and former FBI intelligence official Javed Ali, about what spy balloons, leaked documents, and AI can tell us about the state of spying today.
https://edition.cnn.com/audio/podcas...8-aff00126e985
MIT’s Ingestible “Electroceutical” Capsule Controls Appetite by Hormone Modulation
... Hormones released by the stomach, such as ghrelin, play a key role in stimulating appetite. These hormones are produced by endocrine cells that are part of the enteric nervous system, which controls hunger, nausea, and feelings of fullness. ... MIT engineers have now shown that they can stimulate these endocrine cells to produce ghrelin, using an ingestible capsule that delivers an electrical current to the cells. This approach could prove useful for treating diseases that involve nausea or loss of appetite, such as cachexia (loss of body mass that can occur in patients with cancer or other chronic diseases). ... “This study helps establish electrical stimulation by ingestible electroceuticals as a mode of triggering hormone release via the GI tract,” says Giovanni Traverso, an associate professor of mechanical engineering at MIT, a gastroenterologist at Brigham and Women’s Hospital, and the senior author of the study. ... In the prototype used in this study, the current runs constantly, but future versions could be designed so that the current can be wirelessly turned on and off, according to the researchers. ...
Mammalian Tree of Life Redefined: Genomic Time Machine Traces Back 100 Million Years of Evolution
Researchers from Texas A&M University have used the largest mammalian genomic dataset to track the evolutionary history of mammals, concluding that mammal diversification began before and accelerated after the dinosaur extinction ...
... Their ultimate goal is to better identify the genetic basis for traits and diseases in people and other species.
... The research — which was conducted with collaborators at the University of California, Davis; University of California, Riverside; and the American Museum of Natural History — concludes that mammals began diversifying before the K-Pg extinction as the result of continental drifting, which caused the Earth’s land masses to drift apart and come back together over millions of years. Another pulse of diversification occurred immediately following the K-Pg extinction of the dinosaurs, when mammals had more room, resources and stability. This accelerated rate of diversification led to the rich diversity of mammal lineages — such as carnivores, primates and hoofed animals — that share the Earth today.
... “Being able to look at shared differences and similarities across the mammalian species at a genetic level can help us figure out the parts of the genome that are critical to regulate the expression of genes,” she continued. “Tweaking this genomic machinery in different species has led to the diversity of traits that we see across today’s living mammals.” ...
... Determining which parts of genes can be manipulated and which parts cannot be changed without causing harm to the gene’s function is important for human medicine.
... “For example, cats have physiological adaptations rooted in unique mutations that allow them to consume an exclusively high-fat, high-protein diet that is extremely unhealthy for humans,” Murphy explained. “One of the beautiful aspects of Zoonomia’s 241-species alignment is that we can pick any species (not just human) as the reference and determine which parts of that species’ genome are free to change and which ones cannot tolerate change. In the case of cats, for example, we may be able to help identify genetic adaptations in those species that could lead to therapeutic targets for cardiovascular disease in people.” ...

A short video worth watching, while recalling how fortunate we are to be alive as a result of it all ... all so silly podcast hosts can sit around making silly podcasts ...
More massive species-ending asteroids may have hit our planet
In a new study, scientists used high-resolution satellite data to measure the size of four known impact craters on Earth. They found that massive asteroids like the one that sent dinosaurs into extinction may strike Earth more often than previously thought.
Feeding the Future: Artificial Photosynthesis Transforms CO2 Into Food
Researchers at the Technical University of Munich have developed a sustainable method to create the essential amino acid L-alanine from CO2. This process uses artificial photosynthesis, converting CO2 to methanol and then to L-alanine. This new method requires less space than traditional agriculture, highlighting the potential of combining bioeconomy and hydrogen economy for a more sustainable future.
https://scitechdaily.com/feeding-the...co2-into-food/
First-Ever Identification of Schizophrenia Risk Markers in Newborns via DNA Methylation
Researchers have identified markers that may indicate early-life susceptibility to schizophrenia, potentially allowing for early detection and intervention. The international research team used blood samples drawn shortly after birth and analyzed 24 million methylation marks, validating their findings with transcriptional data from 595 postmortem brain samples. They concluded that certain differences in methylation in newborns indicate an increased risk of developing schizophrenia later in life.
... Although schizophrenia involves an inherited genetic component, there is strong evidence that environmental factors play a role in whether a person develops the disease. These environmental factors can trigger chemical changes to DNA that regulate which genes are turned on or off, through a process called methylation. ... Studying possible genetic triggers for a disease like schizophrenia is complicated because methylation changes can be caused by the disease itself and by related factors such as the stress and medications that usually accompany it.
Because of the effects of the disease on the methylome — the term for the set of nucleic acid methylation modifications in an organism’s genome or in a particular cell — ideally samples would be obtained before the disease occurs. But since schizophrenia is a disorder of the brain, this would be impossible. ... The researchers concluded that certain differences in methylation already present in newborns indicate an increased risk of developing schizophrenia.
“In other words, we could identify methylation differences between individuals that later on in life would develop schizophrenia and controls that are unique to specific cell-types in the neonatal blood,” said van den Oord, the first listed author of the paper in Molecular Psychiatry and director of the Center of Biomarker Research and Precision Medicine. “Research will continue around these methylation differences to develop potential future clinical biomarkers that will allow early detection and intervention.”
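The core comparison behind such findings can be sketched very simply. This is illustrative only, not the study's pipeline (which analyzed 24 million marks with cell-type-specific methods): per methylation site, ask whether newborns who later developed schizophrenia differ from controls, here with a plain Welch's t statistic over fabricated numbers:

```python
# Illustrative only (not the study's analysis): flag methylation sites
# where cases and controls differ.  All values below are fabricated.
import statistics

def t_stat(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

# Fabricated methylation fractions at two hypothetical sites.
cases    = {"site_A": [0.80, 0.78, 0.82, 0.79],
            "site_B": [0.50, 0.52, 0.49, 0.51]}
controls = {"site_A": [0.60, 0.62, 0.61, 0.59],
            "site_B": [0.51, 0.50, 0.52, 0.49]}

# Keep sites whose group difference is large relative to its noise.
hits = [site for site in cases
        if abs(t_stat(cases[site], controls[site])) > 3.0]
```

Only site_A clears the threshold here; the real study's challenge, as the article notes, is doing this before the disease itself can alter the methylome, which is why neonatal blood samples matter.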
https://scitechdaily.com/first-ever-...a-methylation/
The Brain’s “Chill Pill” – Gene That Suppresses Anxiety Discovered by Scientists
An international team of scientists has identified a gene in the brain linked to anxiety symptoms, with modifications to this gene shown to reduce anxiety levels.
A gene in the brain driving anxiety symptoms has been identified by an international team of scientists. Critically, modification of the gene is shown to reduce anxiety levels, offering an exciting novel drug target for anxiety disorders. The discovery, led by researchers at the Universities of Bristol and Exeter, was published on April 25 in the journal Nature Communications. ... Anxiety disorders are common, with 1 in 4 people diagnosed with a disorder at least once in their lifetime. Severe psychological trauma can trigger genetic, biochemical, and morphological changes in neurons in the brain’s amygdala — the brain region implicated in stress-induced anxiety — leading to the onset of anxiety disorders, including panic attacks and post-traumatic stress disorder.
However, the efficacy of currently available anti-anxiety drugs is low, with more than half of patients not achieving remission following treatment. Limited success in developing potent anxiolytic (anti-anxiety) drugs results from our poor understanding of the neural circuits underlying anxiety and of the molecular events behind stress-related neuropsychiatric states.
In this study, scientists sought to identify the molecular events in the brain that underpin anxiety. Working in animal models, they focused on a group of molecules known as miRNAs. This important group of molecules, also found in the human brain, regulates multiple target proteins controlling cellular processes in the amygdala.
... “miRNAs are strategically poised to control complex neuropsychiatric conditions such as anxiety. But the molecular and cellular mechanisms they use to regulate stress resilience and susceptibility were, until now, largely unknown. The miR483-5p/Pgap2 pathway we identified in this study, activation of which exerts anxiety-reducing effects, offers huge potential for the development of anti-anxiety therapies for complex psychiatric conditions in humans.”
https://scitechdaily.com/the-brains-...by-scientists/