(OBS) The distinction was awarded to the researcher by the scientific magazine “The Pathologist”, which spent two months surveying pathologists around the world about who they considered worthy of the title.
It was “with surprise”, “while working and checking email”, that Fátima Carneiro, professor at the Faculty of Medicine of the University of Porto (FMUP) and head of the Pathological Anatomy department at Centro Hospitalar São João (CHSJ), received the news that she had been named the most influential pathologist in the world.
The distinction was awarded by the scientific magazine “The Pathologist”, which, over two months, surveyed pathologists “from the four corners of the world” about who they considered most deserving of the title. This year, Fátima Carneiro, who is also part of the Institute of Molecular Pathology and Immunology of the University of Porto (Ipatimup), now integrated into i3S, was singled out for her abilities as a pathologist and university professor, taking first place on the 100-name list.
“Moreover, among her professional peers, Fátima Carneiro stands out not only as an expert in her specialty but also for her leadership skills,” FMUP said in a statement released this Thursday.
Although this is the first time she has reached the top spot, it is not the researcher’s debut on the list. She had already featured among the 100 most influential in 2015, the year in which Portuguese physician Manuel Sobrinho Simões, himself an FMUP professor, founder of Ipatimup and pathologist at CHSJ, was named the world’s most influential pathologist (he also appears on this year’s list).
For Fátima Carneiro, although the distinction is one “that brings pride to whoever receives it”, what matters most is that it represents “recognition of a vision of a particular reality, created in a specific working environment, and of a group of people who fell in love with this way of practising pathology”.
“Still, what we have is a competition among pathologists from the four corners of the world, and the fact that Portugal came out ahead of excellent candidates such as those from the United States or the United Kingdom is an important recognition for the country. Bringing Portuguese pathology to an international forum gives me joy, I cannot hide it,” Fátima Carneiro told Observador.
Born in Angola in 1954, Fátima Carneiro graduated in Medicine from FMUP in 1978 and is currently a Full Professor at that institution and head of the Pathological Anatomy department at Centro Hospitalar São João. In academia and research, she has authored more than 250 scientific papers and contributed chapters to several specialist books.
In a field she considers “somewhat neglected” yet “essential to the practice of Medicine given its current demands”, the researcher stresses the need for growing investment in pathology. “A much greater commitment is needed, and what already exists must be consolidated, to create structures that can provide permanent support and to keep training high-quality pathologists in an integrated way, so that they can carry out their duties, which carry immense responsibility,” she told Observador.
Looking back on her career, Fátima Carneiro tells the British magazine that, beyond her involvement in teaching and diagnostic work, she takes “particular pride in having achieved seniority in my research area, gastric cancer, and in all the research and teaching partnerships I have established throughout my career across four continents”.
The researcher has also led several international projects, served as president of the European Society of Pathology (2011-2013) and, in Portugal, coordinated the Rede Nacional de Bancos de Tumores (National Tumour Bank Network) in 2008. Fátima Carneiro currently presides over the Academia Nacional de Medicina Portuguesa.
(Economist) A book excerpt and interview with Walter Scheidel, author of “The Great Leveler”.
IN AN age of widening inequality, Walter Scheidel believes he has cracked the code on how to overcome it. In “The Great Leveler”, the Stanford professor posits that throughout history, economic inequality has only been rectified by one of the “Four Horsemen of Leveling”: warfare, revolution, state collapse and plague.
So are liberal democracies doomed to a repeat of the pattern that saw the gilded age give way to a breakdown of society? Or can they legislate a way out of the ominous cycle of brutal inequality and potential violence?
“For more substantial levelling to occur, the established order needs to be shaken up,” he says. “The greater the shock to the system, the easier it becomes to reduce privilege at the top.” Yet nothing is inevitable, and Mr Scheidel urges that society become “more creative” in devising policies that can be implemented. The Economist’s Open Future initiative asked Mr Scheidel to reply to five questions. An excerpt from the book appears thereafter.
* * *
The Economist: Is society incapable of tackling income inequality peacefully?
Walter Scheidel: No, but history shows that there are limits. There is a big difference between maintaining existing arrangements that successfully check inequality—Scandinavia is a good example—and significantly reducing it. The latter requires real change and that is always much harder to do: think of America or Britain, not to mention Brazil, China or India. The modern welfare state does a reasonably good job of compensating for inequality before taxes and transfers. However, for more substantial levelling to occur, the established order needs to be shaken up: the greater the shock to the system, the easier it becomes to reduce privilege at the top.
The Economist: Haven’t liberal democracy and capitalism tackled inequality in lots of areas if you look at it with the right frame, from LBJ’s Great Society to Deng’s economic reforms?
Walter Scheidel: Democracies have certainly made some progress in addressing various types of inequality: consider gender, race and disability. In America, transfer programmes from the New Deal to the Great Society managed to mitigate income inequality. But all that has since been reversed. China’s economic reforms since the 1980s have actually greatly increased inequality as a by-product of rapid economic growth. Capitalism is a great means of making the poor less poor, but it also continues to make the rich richer still.
The Economist: Are we really living in an unprecedented period of wealth inequality—or was the relatively equal society that followed the second world war the real aberration?
Walter Scheidel: When we view history over the long run we can see that this experience was certainly a novelty. We now know that modernisation as such does not reliably reduce inequality. Many things had to come together to make this happen, such as very high income and estate taxes, strong labour unions, and intrusive regulations and controls. Since the 1980s, liberalisation and globalisation have allowed inequality to rise again. Even so, wealth concentration in Europe is nowhere near as high as it was a century ago. America, meanwhile, is getting there—which shows that it all depends on where you look.
The Economist: If equality can only come about by war, revolution, state-collapse or plague, then is there an argument that we should simply learn to adapt to a new gilded age?
Walter Scheidel: No, but we need to appreciate that measures that worked well in the past may have done so because they were taken in the unique context of massive violent shocks and threats: the world wars and communism. This requires us to be more creative in dealing with inequality. Above all we must think harder about feasibility. It is not enough for economists to come up with recipes to reduce inequality; we also need to figure out how to implement them in an environment that is politically polarised and economically globalised. Both factors limit our scope for intervention.
The Economist: How do artificial intelligence and automation fit into your thinking? Will they be a calamity for employment and thus for equality? Or might they unleash extraordinary productivity and improvements in living standards that actually narrow inequality?
Walter Scheidel: Ideally we would like education to keep up with technological change to make sure workers have the skills they need to face this challenge. But in practice there will always be losers, and even basic-income schemes can take us only so far. At the end of the day, someone owns the robots. As long as the capitalist world system is in place, it is hard to see how even huge productivity gains from greater automation would benefit society evenly instead of funnelling even more income and wealth to those who are in the best position to pocket these gains.
* * *
Excerpt from “The Great Leveler” (Princeton University Press, 2017), by Walter Scheidel
There was always one Big Reason behind every known episode of substantial leveling. There was one Big Reason why John D. Rockefeller was an entire order of magnitude richer in real terms than his richest compatriots one and two generations later, why the Britain of Downton Abbey gave way to a society known for universal free healthcare and powerful labor unions, why in industrialized nations around the globe the gap between rich and poor was so much smaller in the third quarter of the twentieth century than it had been at its beginning – and, indeed, why a hundred generations earlier ancient Spartans and Athenians had embraced ideals of equality and sought to put them into practice. There was one Big Reason why by the 1950s the Chinese village of Zhangzhuangcun had come to boast a perfectly egalitarian distribution of farmland; one Big Reason why the high and mighty of Lower Egypt 3,000 years ago had to bury their dead with hand-me-downs or in shoddily manufactured coffins, why the remnants of the Roman aristocracy lined up for handouts from the pope and the successors of Maya chiefs subsisted on the same diet as hoi polloi; and one Big Reason why humble farmhands in Byzantine and early Islamic Egypt and carpenters in late medieval England and hired workers in early modern Mexico earned more and ate better than their peers before or after. These Big Reasons were not all the same, but they shared one common root: massive and violent disruptions of the established order. Across recorded history, the periodic compressions of inequality brought about by mass mobilization warfare, transformative revolution, state failure, and pandemics have invariably dwarfed any known instances of equalization by entirely peaceful means.
History does not determine the future. Maybe modernity really is different. In the very long run, it may well turn out to be. It may put us on a trajectory toward singularity, a point at which all human beings merge into a globally interconnected hybrid body-machine super-organism and no longer have to worry about inequality. Or perhaps technological advances will instead take inequalities to new extremes by separating a biomechatronically and genetically enhanced elite from ordinary mortals, the latter perpetually kept at bay by the ever more superior capabilities of their overlords. Or, just as likely, none of the above – we may be moving toward outcomes we cannot even yet conceive. But science fiction takes us only so far. For the time being, we are stuck with the minds and bodies we have and with the institutions they have created. This suggests that the prospects of future leveling are poor. It will be a challenge for the social democracies of continental Europe to maintain and adjust elaborate systems of high taxation and extensive redistribution or for the richest democracies of Asia to preserve their unusually equitable allocation of pretax incomes to stem the rising tide of inequality, which can grow only stronger as ongoing globalization and unprecedented demographic transformations add to the pressure. It is doubtful whether they will manage to hold the line: inequality has been inching up everywhere, a trend that undeniably works against the status quo. And if the stabilization of existing distributions of income and wealth will be increasingly difficult to achieve, any attempt to render them more equitable necessarily faces even bigger obstacles.
For thousands of years, history has alternated between long stretches of rising or high and stable inequality interspersed with violent compressions. For six or seven decades from 1914 to the 1970s or 1980s, both the world’s rich economies and those countries that had fallen to communist regimes experienced some of the most intense leveling in recorded history. Since then, much of the world has entered what could become the next long stretch – a return to persistent capital accumulation and income concentration. If history is anything to go by, peaceful policy reform may well prove unequal to the growing challenges ahead. But what of the alternatives? All of us who prize greater economic equality would do well to remember that with the rarest of exceptions, it was only ever brought forth in sorrow. Be careful what you wish for.
(SkyNews) Scientists do not yet know what causes the mysterious fast radio bursts, but a form of alien transportation has been suggested.
Scientists searching for extraterrestrial life say they have spotted 72 mysterious signals from an alien galaxy using artificial intelligence (AI).
The researchers at the SETI (Search for Extraterrestrial Intelligence) Institute discovered the unusual signals when examining 400 terabytes of radio data from a dwarf galaxy three billion light years away from Earth.
Almost all artificial intelligence technology involves automating data analysis, combing through huge data sets to identify patterns or unusual occurrences.
The signals they spotted – fast radio bursts (FRBs) – are bright, quick pulses that were first discovered in 2007 and are believed to come from distant galaxies, although it is not yet known what causes them.
“The nature of the object emitting them is unknown,” SETI said, adding: “There are many theories, including that they could be the signatures of technology developed by extraterrestrial intelligent life.”
One form of alien technology that has been suggested is the light sail, which would use the tiny amount of pressure exerted by light to produce a small but constant acceleration, allowing a spacecraft to reach great speed.
The FRBs were detected in data collected by the Green Bank Telescope, which sits in the US National Radio Quiet Zone, where wireless communications signals are banned to prevent interference with the telescopes.
Gerry Zhang, a PhD student at Berkeley, developed the machine-learning algorithm used to examine the 400 terabytes of data, in which another researcher had already identified 21 FRBs.
“Gerry’s work is exciting not just because it helps us understand the dynamic behavior of FRBs in more detail,” said SETI’s Dr Andrew Siemion, “but also because of the promise it shows for using machine learning to detect signals missed by classical algorithms.”
Dr Siemion added: “These new techniques are already improving our sensitivity to signals from extraterrestrial technologies.”
The results of their research have been accepted for publication in the Astrophysical Journal.
(GUA) Impact of high levels of toxic air ‘is equivalent to having lost a year of education’
Air pollution causes a “huge” reduction in intelligence, according to new research, indicating that the damage to society of toxic air is far deeper than the well-known impacts on physical health.
The research was conducted in China but is relevant across the world, with 95% of the global population breathing unsafe air. It found that high pollution levels led to significant drops in test scores in language and arithmetic, with the average impact equivalent to having lost a year of the person’s education.
“Polluted air can cause everyone to reduce their level of education by one year, which is huge,” said Xi Chen at Yale School of Public Health in the US, a member of the research team. “But we know the effect is worse for the elderly, especially those over 64, and for men, and for those with low education. If we calculate [the loss] for those, it may be a few years of education.”
Previous research has found that air pollution harms cognitive performance in students, but this is the first to examine people of all ages and the difference between men and women.
The damage in intelligence was worst for those over 64 years old, with serious consequences, said Chen: “We usually make the most critical financial decisions in old age.” Rebecca Daniels, from the UK public health charity Medact, said: “This report’s findings are extremely worrying.”
The new work, published in the journal Proceedings of the National Academy of Sciences, analysed language and arithmetic tests conducted as part of the China Family Panel Studies on 20,000 people across the nation between 2010 and 2014. The scientists compared the test results with records of nitrogen dioxide and sulphur dioxide pollution.
They found the longer people were exposed to dirty air, the bigger the damage to intelligence, with language ability more harmed than mathematical ability and men more harmed than women. The researchers said this may result from differences in how male and female brains work.
Derrick Ho, at the Hong Kong Polytechnic University, said the impact of air pollution on cognition was important and his group had similar preliminary findings in their work. “It is because high air pollution can potentially be associated with oxidative stress, neuroinflammation, and neurodegeneration of humans,” he said.
Chen said air pollution was most likely to be the cause of the loss of intelligence, rather than simply being a correlation. The study followed the same individuals as air pollution varied from one year to the next, meaning that many other possible causal factors such as genetic differences are automatically accounted for.
The scientists also accounted for the gradual decline in cognition seen as people age and ruled out people being more impatient or uncooperative during tests when pollution was high.
Air pollution was seen to have a short-term impact on intelligence as well and Chen said this could have important consequences, for example for students who have to take crucial entrance exams on polluted days.
“But there is no shortcut to solve this issue,” he said. “Governments really need to take concrete measures to reduce air pollution. That may benefit human capital, which is one of the most important driving forces of economic growth.” In China, air pollution is declining but remains three times above World Health Organisation (WHO) limits.
According to the WHO, 20 of the world’s most polluted cities are in developing countries. China, home to several of those cities, has been engaged in a “war against pollution” for the past five years.
The results would apply around the world, Chen added. The damage to intelligence was likely to be incremental, he said, with a 1mg rise in pollution over three years equivalent to losing more than a month of education. Small pollution particles are known to be especially damaging. “That is the same wherever you live. As human beings we have more in common than is different.”
Aarash Saleh, a registrar in respiratory medicine in the UK and part of the Doctors Against Diesel campaign, said: “This study adds to the concerning bank of evidence showing that exposure to air pollution can worsen our cognitive function. Road traffic is the biggest contributor to air pollution in residential areas and the government needs to act urgently to remove heavily-polluting vehicles from our roads.”
Daniels said: “The UK’s air is illegally polluted and is harming people’s health every day. Current policies are not up to the scale of the challenge: government must commit to bringing air pollution below legal limits as soon as possible.”
Barbra Streisand is not alone. At a South Korean laboratory, a once-disgraced doctor is replicating hundreds of deceased pets for the rich and famous. It’s made for more than a few questions of bioethics.
The surgeon is a showman. Scrubbed in and surrounded by his surgical team, a lavalier mike clipped to his mask, he gestures broadly as he describes the C-section he is about to perform to a handful of rapt students watching from behind a plexiglass wall. Still narrating, he steps over to a steel operating table where the expectant mother is stretched out, fully anesthetized. All but her lower stomach is discreetly covered by a crisp green cloth. The surgeon makes a quick incision in her belly. His assistants tug gingerly on clamps that pull back the flaps of tissue on either side of the cut. The surgeon slips two gloved fingers inside the widening hole, then his entire hand. An EKG monitor shows the mother’s heart beating in steady pulses.
Just like that the baby’s head pops out, followed by its tiny body. Nurses soak up fluids filling its mouth so the tyke can breathe. The surgeon cuts the umbilical cord. After some tender shaking, the little one moves his head and starts to cry. Looking triumphant, the surgeon holds up the newborn for the students to see—a baby boy that isn’t given a name but a number: he is a clone.
This is not some sci-fi, futuristic scenario—it’s happening right now, in Seoul, South Korea. The newborn, however, is not a human. It’s a puppy, a breed called Central Asian Ovcharka. He weighs only a few ounces, and his fur, slickened by fluid, is covered in black and white splotches, like a miniature Holstein. His eyes are not yet open. When he cries, it’s a barely perceptible squeak. The surgeon, Hwang Woo-suk, unclips his microphone and holds it close to little 1108’s mouth, amplifying its mewling over a loudspeaker so the students can hear its plaintive, what-the-hell-just-happened whine—eeee, eeee, eeee.
Hwang’s assistants, meanwhile, are busy suturing up the mother, a Labrador-sized mutt with shaggy yellow fur who was specially bred to give birth to and nurse cloned puppies. “She’s a mixed breed,” explains Jae Woong Wang, a canine-reproduction researcher who works for Hwang here at the Sooam Biotech Research Foundation, the world’s first company dedicated to cloning dogs. “We breed the surrogate moms to be docile and gentle.”
It has been more than two decades since the world collectively freaked out over the birth of Dolly the Sheep, the first-ever mammal cloned from an adult cell. The media jumped on the fear implicit in creating genetic replicas of living beings: Time featured a close-up of two sheep on its cover, accompanied by the headline “Will There Ever Be Another You?” Jurassic Park, meanwhile, was terrifying audiences with cloned T. rexes and velociraptors that broke free from their creators and ran amok, eating lawyers and terrorizing small children. But over the years, despite all the Jurassic sequels, the issue faded from the public imagination, eclipsed by the rapid pace of scientific and technological change. In an age of gene editing, synthetic biology, and artificial intelligence, our dread of cloning now seems almost quaint, an anxiety from a simpler, less foreboding time.
Then, last March, Barbra Streisand came out as a cloner. In an interview with Variety, the singer let slip that her two Coton de Tulear puppies, Miss Violet and Miss Scarlett, are actually clones of her beloved dog Samantha, who died last year. The puppies, she said, were cloned from cells taken from “Sammie’s” mouth and stomach by ViaGen Pets, a pet-cloning company based in Texas that charges $50,000 for the service. “I was so devastated by the loss of my dear Samantha, after 14 years together, that I just wanted to keep her with me in some way,” Streisand explained in a New York Times opinion piece, after the news provoked an outcry from animal-rights advocates. “It was easier to let Sammie go if I knew that I could keep some part of her alive, something that came from her DNA.”
Cloning pets is “like The Handmaid’s Tale,” says one ethicist. “It’s a canine version of reproductive machines.”
Ethicists from the White House to the Vatican have long debated the morality of cloning. Do we have the right to bioengineer a copy of a living creature, especially given the pain and suffering that the process requires? It can take a dozen or more embryos to produce a single healthy dog. Along the way, the surrogate mothers may be treated with hormones that, over time, can be dangerous, and many of the babies are miscarried, born dead, or deformed. When a dog was first cloned, in 2005—a scientific achievement that Time hailed as one of the breakthrough inventions of the year—it took more than 100 borrowed wombs, and more than 1,000 embryos. “Surrogate mothers are a little bit like The Handmaid’s Tale,” says Jessica Pierce, an ethicist and dog expert who teaches at the Center for Bioethics and Humanities at the University of Colorado. “It’s a canine version of reproductive machines.”
Yet here in the operating room at Sooam, everyone is all smiles—especially the veterinarian representing the customer who paid for Clone 1108. A slender man whose employer is Middle Eastern royalty, he stands in scrubs next to Dr. Hwang, posing for photos with the newborn pup. It’s a moment that has become almost as routine as it is lucrative for Sooam: over the past decade, the company has cloned more than 1,000 dogs, at up to $100,000 per birth. “Yes, cloning has become a business,” says Wang. If a dog owner provides DNA from a deceased pet quickly enough—usually within five days of its death—Sooam promises a speedy replacement. “If the cells from the dead dog are not compromised,” Wang explains, “we guarantee you will get a dog within five months.”
It’s fitting, perhaps, that the man at the center of the controversy over canine cloning is Hwang Woo-suk. The surgeon was, briefly, a hero of South Korea. In 2004, while serving on the faculty at Seoul National University, he co-authored a paper in the prestigious journal Science asserting that he and his team had successfully cloned a human embryo. A year later, he created the world’s first cloned dog. Using a cell from the ear of an Afghan hound, Hwang impregnated 123 surrogate mothers, only one of which gave birth to a pup that survived. He named it Snuppy—an amalgam of “Seoul National University” and “puppy.” In 2006, however, Hwang was kicked off the faculty when it was revealed that his claim to have cloned a human embryo was a spectacular hoax. The university determined that Hwang had fabricated evidence, embezzled government funds, and illegally paid for donor eggs from female researchers in his lab. After tearfully apologizing, he was sentenced to two years in prison, but escaped serving time when a judge suspended the sentence, writing in the verdict that Hwang “has shown he has truly repented for his crime.”
Undeterred, Hwang founded Sooam to continue his research. At first, he concentrated on cloning pigs and cows, which still makes up a sizable part of the company’s business. Then, in 2007, he was contacted by a representative of John Sperling, the billionaire founder of the University of Phoenix. Sperling had a girlfriend whose dog, Missy, had died a few years earlier. “She wanted to see Missy again,” says Wang, the Sooam researcher. Hwang cloned Missy in 2009, launching the lab’s foray into the commercial duplication of dogs.
The process itself, fine-tuned over years of trial and error, is known as “somatic cell nuclear transfer.” It starts with an egg from a donor dog. Using a high-powered microscope, scientists poke a micro-hole in the egg and remove the nucleus, where the DNA is housed. They then replace the nucleus with a cell from the dog that is being cloned—usually from its skin or inside its cheek. Finally, the hybrid egg is blasted with a short burst of electricity to fuse the cells and begin cell division. The embryo is then embedded in a surrogate’s womb. If the transfer takes, a puppy will be born some 60 days later.
The day after Hwang delivers Clone 1108, he agrees to meet me at Sooam’s headquarters, an imposing stone structure that hugs one of the many steep, wooded hills on the southern outskirts of Seoul. Built in 2011, the building looks like a modern-day version of Frankenstein’s castle, its imposing tower offset by a touch of Bauhaus. Hwang refuses most interviews, in part because he speaks limited English, and in part, one suspects, because he isn’t keen to relive his controversial past. Dressed in a light-gray suit, he greets me with a smile that lights up his whole face, which looks younger than his 64 years. He bows slightly and promises, with the reassuring look of an old friend, to answer any questions I submit via e-mail.
Why, I ask him, do so many people want to clone their dogs? “The main reason,” he replies, “is that their beloved companion dogs are like family members, and they would like to have as close to a continuation of that companionship as possible.” He makes clear, though, that customers do not get an exact replica of their dog. Clones often look like the original dog, and share some traits, but they don’t have the original dog’s memories, and their upbringing is inevitably different. “Cloned puppies are like identical twins born at a later date,” Hwang tells me. “A twin out of time.”
And why is the cloning process so expensive? “Unlike other species,” he explains, “there are currently no effective protocols for the in-vitro maturation of canine oocytes.” Translation: the eggs have to be harvested from donor dogs, which go into heat only twice a year, rather than grown in a lab, making them more difficult and expensive to obtain.
When I inquire about ethics, Hwang is brief. “Animal-cloning ethics and human-cloning ethics have completely different values,” he says. “Here in Sooam we are steadfastly against human cloning, but we believe that animal cloning can bring us benefits and help us contribute socially.”
Hwang is quick to tout the broader benefits of his work in cloning. His staff’s research into stem cells and embryo development has generated dozens of scientific papers that aim to better understand cell development in animals, and to more effectively treat human diseases like Alzheimer’s and diabetes. Sooam has a grant from the South Korean government to create a model to screen drugs for melanoma. In a nod to Jurassic Park, Hwang is also using intact tissue frozen for thousands of years in Siberia to attempt to resurrect the woolly mammoth, fusing ancient cells recovered from the frozen tundra with donor eggs from modern-day elephants—a process he hopes can be used to clone other extinct animals, like the Pyrenean ibex, and endangered species like the Ethiopian wolf. But despite Hwang’s years of quiet accomplishment, and supporters who claim he was the victim of a conspiracy to discredit him, the shame of his past deceit has not been forgiven: the South Korean government continues to bar Hwang from conducting research with human eggs and stem cells.
At Sooam’s headquarters, Hwang ends our meeting by handing me a peach-colored gift bag full of cosmetics. “For your wife or girlfriend,” he says with a bow. I had already visited the floor upstairs where Sooam uses enzymes and stem cells to make a variety of lotions, cleansing oils, and eye creams, marketed under names like Beauté de Cell, JunéCell, and Beauté de Cell Homme for men. I thank Hwang for the gift, though I’m not exactly wild about the thought of lathering stem cells on my face.
It was Barry Diller, the media mogul, who inspired Barbra Streisand to opt for cloning after the death of her Coton de Tulear. Streisand loved her pet so much that in 2016, she ended a Netflix special of one of her rare concerts with a tribute to Sammie. In the video, she sings a rendition of her hit “Closer” as snapshots fade in and out of the dog cavorting and cuddling with Streisand and her husband, James Brolin.
Diller told Streisand that after the death of his own dog, Shannon, he paid Sooam to clone the Jack Russell terrier. The result was three genetic replicas of Shannon. Two live in Diller’s Beverly Hills mansion: Tess, short for “test tube,” and DiNA, a play on DNA. The third, Evita, lives in the Connecticut home of Diller and his wife, Diane von Furstenberg. “These dogs, they’re the soul of Shannon,” Diller told The New York Times. “Diane was horrified that I was doing this, but she’s switched now to say, ‘Thank God you did.’” Streisand also wound up with three clones, one of which went to the 13-year-old daughter of her A&R man at Columbia Records.
ViaGen, the Texas-based company that cloned Miss Violet and Miss Scarlett, launched in 2002 to store and preserve the DNA of cows, pigs, and horses. Eventually, the company took over some of the stored tissue from the first-ever cat-cloning company, Genetic Savings and Clone, and acquired patents for technologies developed by the scientists who cloned Dolly the Sheep. At first ViaGen licensed the tech to Sooam, before starting a dog-cloning service of its own two years ago.
Streisand knows that Miss Violet and Miss Scarlett aren’t exact replacements for Sammie. “They have different personalities,” she told Variety. “I’m waiting for them to get older so I can see if they have Sammie’s brown eyes and her seriousness.” That’s because genes are only one factor among many that shape a clone’s looks, personality, and behavior. “The dogs are genetic duplicates,” explains Wang, the researcher at Sooam, “but the environment they grow up in also plays a big role in how they will look and act.”
Not everyone who clones a dog is as well off as Streisand. When Tom Rubython, a magazine publisher in Northampton, England, lost his cherished cocker spaniel, Daisy, he knew it was “ridiculous” to have Sooam clone her. “It was not a sensible decision,” he says. “My wife wasn’t very happy about it. But Daisy was special. I had a real connection with her.” Rubython owned two other spaniels who came from the same litter as Daisy, but he had no interest in cloning them. Nor was he interested in simply getting another dog from the same breed. “I don’t believe I would have gotten another dog if I didn’t do this,” he says.
To raise the $100,000 needed to clone Daisy, Rubython had to give up something else he loved. “I have money, but I’m not wealthy,” he says. “I had to sell two cars to pay for it.” He sends me photographs of the cars: a brand-new silver-blue Mercedes SL, and a cream-colored classic SL. “Now I drive a Mini,” he sighs. He also sends me a photo of Daisy, a gray spaniel with flecks of white and black. She has that bedraggled, old-dog look. The two clones, named Mabel and Myrtle, have thick fur and a playful gleam in their eyes. “They are very similar,” says Rubython, “but not the same. One of them looks very similar to the original, another looks like her sister. It’s 85 percent, against 100 percent.” But in every respect, they are indistinguishable from natural-born dogs. “They are staring at me right now,” Rubython says. “They know I’m talking about them.”
Researchers at Sooam, who insist that their cloning process is ethical, are eager to make it more efficient. “The hardest thing about cloning dogs is finding fresh eggs,” says Yeonwoo Jeong, director of Sooam’s biotech research. He hopes to one day grow eggs in the lab, using stem-cell technologies, rather than going through the time and expense to surgically extract eggs from other animals.
According to Jeong, Sooam has dramatically improved the cloning process since Snuppy was born 13 years ago. The company insists that it does not inject surrogates with hormones to induce ovulation, and says that most of the embryos that don’t make it die early in the pregnancy. Today, Jeong says, achieving one viable pregnancy requires implants of multiple embryos in only three dog moms—down from the hundreds of embryos and surrogates it took to give birth to Snuppy. “Through research,” he says, “we have minimized the stress on the dogs.”
Other researchers scoff at such claims. “I don’t believe they are getting one out of three,” says Rudolf Jaenisch, a leading expert on stem cells and cloning techniques at the Whitehead Institute in Boston. “Cloning is inefficient. You lose many clones. Some die in implantation. You also get abnormal epigenetics”—changes in the animal’s DNA as it ages. “When you take somatic cells from older animals and put them into an egg that will need to develop from an embryo into a viable animal, you get mistakes from the old DNA that would not occur in a naturally produced embryo.” Most of the dogs, he adds, don’t live a normal life span—although it’s hard to know for certain, since most of the dogs cloned to date are just a few years old.
Hank Greely, a bioethicist at Stanford, wonders what happens to the two out of three clones that don’t make it. “Are they delivered deformed or stillborn? Are they born in pain?” What makes cloning dogs unethical, he says, is when it causes more suffering than natural reproduction. During the process, critics say, surrogate mothers often receive injections of hormones to make them receptive to the embryos. “It’s the same hormones used in humans going through I.V.F.,” says CheMyong Jay Ko, who directs a research lab on reproduction and stem cells at the University of Illinois at Urbana-Champaign. “Injecting these hormones is not good for the dogs, particularly when it’s repeated over and over again.”
After Streisand revealed the origins of Miss Scarlett and Miss Violet, animal-rights activists launched a Twitter campaign called #adoptdontclone, urging people who lose their pets to choose a dog from among the millions of natural-borns that have no home. “People who pay $100,000 to create a new dog seem to forget that there are so many that have no one who cares about them,” says Vicki Katrinak, head of animal-research issues for the Humane Society. “We’re opposed to cloning of any animal for profit.”
The clone researchers at Sooam insist that they provide a necessary service for grieving dog lovers. “After death, it’s hard for people who were really close to their dogs,” says Wang. “For those people, a clone is the alternative to a funeral. Some people taxidermy their dogs, others cremate them. Cloning is another way of dealing with death—the closest thing to getting back the lost dog, or a part of it.”
It’s early morning, and I’m waiting with Wang in front of Sooam’s headquarters. The clone puppies are about to arrive for their morning playtime. The company cares for clients’ copies until their owners are able to take them home, in accordance with the quarantine laws of their home countries. I don’t know what to expect. With Sooam’s imposing “castle” looming over the big, grassy lawn, it feels like a scene out of some futuristic dystopia—clean and orderly and slightly unsettling.
So I’m startled when the puppies arrive and they are just . . . puppies. They come tumbling out of a dog crate and into a fenced-in play area. Immediately, they start dashing about. Feather-light Pomeranians become puffy blurs of white fur; what seems like dozens of Chihuahuas chase one another in circles, tiny pink tongues dangling. Wang tells me that Sooam has cloned a total of 49 Chihuahuas, all of them copies of “Miracle Milly,” a dog from Puerto Rico that holds the Guinness record as the world’s smallest Chihuahua. “We made 49 because we were curious about the smallness,” explains Jeong, the head researcher. “Would it transfer?” He shakes his head. “It didn’t—the clones turned out bigger.”
You can’t help but fall in love with these puppies. It’s weird to imagine most of them are copies of dead dogs, but they make you smile as they swarm you, wanting their tummies rubbed. When human minders in blue uniforms approach, the weeks-old canines swarm them too, thrilled to play with people. Around their little necks are collars with numbers written in Magic Marker—1078, 1092, 1094.
When playtime is over, Wang leads me back into the building and shows me the kennel where the puppies live. I see little 1108, born the day before. For now he’s being kept in an incubator, but he looks healthy and robust, curious about what’s going on around him. In one pen a yellow-haired surrogate mother is nursing a pup. One of the minders places 1108 next to a teat, and the newborn immediately starts to suckle, its eyes barely open. The mother doesn’t seem to mind. She lets the puppy feed, and then stands up and paces in her pen, wagging her tail. I scoop up a Saluki, No. 1102, who is four weeks old. He licks my hand and promptly falls asleep in my lap. I don’t want to move lest I disturb him.
When Louise Brown, the first “test tube” baby, was born in 1978 using in-vitro fertilization, people feared the worst. Many religious leaders denounced I.V.F. as unnatural; even James Watson, who co-discovered the double-helix shape of DNA, predicted that “all hell will break loose, politically and morally.” Then people saw that the babies were just babies, and the outrage evaporated. Today, more than seven million babies have been born worldwide using I.V.F. and other forms of assisted reproduction.
When I ask Jeong if the technology currently exists to clone humans, he repeats Sooam’s talking point: that the company has no interest in copying humans. He points out, however, that scientists in China successfully cloned primates earlier this year, creating two long-tailed macaques named Zhong Zhong and Hua Hua. “These monkeys are very close to us genetically,” says Jeong, “which means you should be able to clone a human.”
The macaque “success,” however, took 63 surrogate mothers to create two healthy monkeys—a process unlikely to be tolerated in human cloning. “Can you imagine making human clones and using that many human surrogate mothers?” asks Greely, the Stanford bioethicist. “And can you imagine a human clinical trial being approved? What if you ended up with a deformed or damaged human baby?”
It won’t be long, researchers say, before grief-stricken parents try to clone a child they lost.
These days, though, the real push among scientists isn’t just to clone a human being—it’s to rewrite our DNA to better treat diseases and create new, improved versions of ourselves. “There isn’t much point in just copying a person,” says George Church, a Harvard geneticist who is also working to clone the woolly mammoth. “You would want to create an improved version, with DNA for cancer, say, edited out.” Cloning, it seems, is now an antiquated fear. The lightning-quick advance of technology has given us new things to be scared about—the rampaging dinosaurs of Jurassic Park supplanted by the perhaps more-human-than-human replicants of Westworld.
Despite government bans, science is closer than ever to successfully cloning a human. “Grief-stricken parents lose a toddler, and they’re billionaires,” says Greely. “They want another child as close as possible to the one they lost. This is a human version of what is happening when people lose a pet they love.” If distraught parents think a clone would match 85 percent of their child’s appearance and personality—roughly what Tom Rubython got with one of his clones—it’s only a matter of time before the pressure to give it a shot becomes inexorable. If there’s enough demand, the market will do its best to respond.
Hwang Woo-suk once dreamed of being the first scientist to clone a human embryo. He wanted it so much, in fact, that he tried to con the world into believing he had done it. Now, given the restrictions placed on his research, he’s unlikely to ever get a crack at creating the first human Dolly, even if he wanted to. So he bio-engineers pigs and cows to study disease, tinkers with resurrecting the woolly mammoth, and runs his lucrative cloning empire, delivering little 1109, and beyond. There will always, it seems, be another grief-stricken customer, desperate to replace a lost companion: another Barbra Streisand, visiting the grave of her beloved Sammie, with Miss Violet and Miss Scarlett perched next to her in their stroller—two identical puffs of white fur, gazing at the tombstone of the dog they are.
An award-winning science journalist and best-selling author, Duncan is C.E.O. and curator of Arc Fusion, which focuses on the fusion of health, biomedicine, and I.T. His latest book, Talking to Robots: Tales from Our Human-Robot Futures (Dutton), will be published in 2019.
(BBG) Making carbon storage work is critical to fighting climate change. The question is where to put it all.
A Cold War-era joke has an American economist asking a Soviet peer how the communist economy is progressing. “In a word: good,” the Russian responds. “In two words: not good.”
So it goes this century with the rapidly changing energy industry. Advances are taking place in clean energy, transport and efficiency that would rightfully have been considered miraculous a decade ago.
But here’s the catch: As fast as everything is proceeding, it’s still not fast enough. The International Energy Agency (IEA) reported last year that a critical technology—capturing carbon dioxide emissions from generators and either burying or otherwise disposing of them—isn’t expanding fast enough. Current “carbon capture and storage” (CCS) facilities are capable of handling just 7.5 percent of the emissions the world will need eliminated every year by 2025 if nations are to meet the goal of keeping any increase in global warming below 2 degrees Celsius (3.6 degrees Fahrenheit).
In China, researchers have been looking for ways to accelerate CCS. They decided to look out to sea.
On land, CCS isn’t just promising in principle—it’s been shown to work. There will be more than 20 large-scale capture facilities available by the end of the year, according to the Global CCS Institute. But there’s still concern about making sure the CO2, once buried, stays buried. The same can be said for the idea China has about burying CO2 at sea. For companies and countries to exploit the vastness of the ocean floor, they also need some kind of confidence that it’ll stay there.
By studying the long-term interactions of major physical forces in “unconsolidated marine sediment” such as loose silt, clay and other permeable stuff below the sea floor, researchers Yihua Teng and Dongxiao Zhang report that extreme conditions at the bottom of the ocean essentially hold CO2 in place, “which makes this option a safe storage.”
Under great pressure and low temperature, CO2 and water trapped in the sediment below the sea floor crystallize into a stable ice called hydrate. (Through a similar process, energy-rich methane freezes with water beneath the ocean and terrestrial permafrost, a potential source of energy being scrutinized by China, Japan, the U.S. and others.) The new paper on CCS demonstrates through simulation that the hydrates become an impermeable “cap” that keeps the CO2 below it from migrating back up to the sea floor.
Peking University’s carbon capture and storage research receives support from the multinational metals, mining, and petroleum company BHP Billiton Ltd., according to the paper.
The research appears this week in the journal Science Advances. The study should provide some confidence, the authors write, that ocean CO2 storage remains a viable tool in the push to reduce emissions of the most dangerous heat-trapping gas, even as commercialization of the process remains a long way off. In the meantime, there are other questions to answer, including how CO2 may behave under different kinds of geological conditions.
The big unknown, as with most underground CO2 storage scenarios, is what the Earth’s living geology will do over the centuries and millennia. Fractures in the subsea sediment, either preexisting or created by tectonics or by the CO2 injection itself, could open a pathway for CO2 to escape—though significant uncertainty remains.
“In our assumption,” they write, “the unconsolidated marine sediment is intact.”
(BBG) President Donald Trump called for a new “Space Force” to be added to the U.S. military as an armed service separate from the Pentagon’s five traditional uniformed branches.
“When it comes to defending America, it is not enough to merely have an American presence in space,” Trump said Monday at a White House event on space policy. “We must have American dominance in space.”
Trump has been considering creation of a Space Force for months over resistance from the Air Force, which currently oversees military space programs. He announced his support for the idea at a White House meeting of the National Space Council, as the administration presented a directive setting a goal of a new moon landing within 10 years.
Congress would have to approve a new military service, and lawmakers have been divided on the idea.
Much of the push to formalize an off-planet branch of the U.S. armed forces is motivated by space investment by Russia and China, the latter of which is eager to establish itself as a superpower with plans for an orbiting space station and a permanent outpost on the moon.
Russia under President Vladimir Putin has become increasingly aggressive, annexing Crimea, deploying more sophisticated nuclear weapons and waging conventional warfare in eastern Ukraine and Syria. He, too, has aspirations for a military role in space.
On peaceful space exploration, the administration announced a goal to send robotic explorers to the moon as early as next year and do another human lunar landing within 10 years.
The push could result in Americans setting foot on the moon’s surface 55 years after doing so for the first time.
The directive also calls for better tracking and monitoring of space debris as commercial and civil space traffic increases.
The 1960s-era Apollo program to land U.S. astronauts on the moon was driven by President John F. Kennedy’s famous challenge and zealously funded by a Congress motivated by the Soviet Union’s perceived existential threat. That goal was achieved by the crew of Apollo 11 in 1969.
NASA’s current planning for Mars isn’t driven by any such urgency. The agency’s priorities tend to change depending on the administration: Under President George W. Bush, NASA was directed to return to the moon, while President Barack Obama set Mars as the longer-term priority. The Trump administration aims to do both, planning a lunar “gateway” orbiter and landings on the moon’s surface — with heavy assistance from commercial firms — and then using those outposts as a leaping-off point for Mars.
Bush proposed in 2004 sending robotic probes to the lunar surface by 2008, with a human mission as early as 2015, “with the goal of living and working there for increasingly extended periods of time.”
NASA estimated in 2005 that the Bush program to return to the moon, canceled by Obama, would cost $104 billion. The Trump administration didn’t immediately provide a cost estimate.
The Trump administration’s first crewed lunar gateway mission is planned for 2023 under NASA’s current plans, with humans heading to Mars in the 2030s.
Yes, it was already known that the Portuguese hold more cork than anyone else in the world, drink the most wine and eat the most cod; but what nobody in the land of fado expected is that Portugal would turn out to be the country of women scientists, or at least the country with the most women studying science degrees.
According to the OECD study The Pursuit of Gender Equality, 57% of Portuguese women study science, technology, engineering and/or mathematics; it is the highest percentage in the world, breaking every kind of stereotype. That is 17 points more than in the United States of Silicon Valley, 22 points more than in Spain or Denmark, and more than double the figure for Japan.
The presence of women in science degrees is not because the state budget lavishes money on this part of education, still less on research. Although the government’s goal is to reach 1.5% of the national budget, spending barely exceeds 0.8%.
The lack of money or prospects does not seem to dampen Portuguese women’s enthusiasm for science, although they lean more toward engineering than information technology, where the percentage of women remains tiny. Even so, Elvira Fortunato, of the Universidade Nova de Lisboa, stands out in that field. Her research on silicon-free integrated circuits, that is, paper chips, earned her a few weeks ago a 3.5-million-euro grant from the European Research Council to devote to projects in environmentally friendly technologies.
(Bloomberg) — Stephen Hawking, the British physicist and black-hole theorist who brought science to a mass audience with the best-selling book “A Brief History of Time,” has died. He was 76.
Hawking died peacefully at his home in Cambridge, England, in the early hours of Wednesday morning, a spokesman for his family said in an emailed statement.
“We are deeply saddened that our beloved father passed away today,” his children Lucy, Robert and Tim said in the statement. “He was a great scientist and an extraordinary man whose work and legacy will live on for many years. His courage and persistence with his brilliance and humor inspired people across the world. He once said, ‘It would not be much of a universe if it wasn’t home to the people you love.’ We will miss him forever.”
Hawking suffered from amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease, and was confined to an electric wheelchair for much of his adult life. Diagnosed at age 21, he was one of the world’s longest survivors of ALS.
A Cambridge University professor, Hawking redefined cosmology by proposing that black holes emit radiation and later evaporate. He also showed that the universe had a beginning by describing how Albert Einstein’s theory of general relativity eventually breaks down when time and space are traced back to the Big Bang about 13.7 billion years ago.
“Stephen’s remarkable combination of boldness, vision, insight and courage have enabled him to produce ideas that have transformed our understanding of space and time, black holes and the origin of the universe,” James Hartle, professor of physics at the University of California, Santa Barbara, said in 2002.
“A Brief History of Time,” first published in 1988, earned its author worldwide acclaim, selling at least 10 million copies in 40 languages and staying on the best-seller list of the U.K.’s Sunday Times newspaper for a record 237 weeks.
Often referred to as “one of the most unread books of all time” for its hard-to-grasp concepts, it included only one equation: E = mc², the equivalence of mass and energy, deduced by Einstein from his theory of special relativity. The book outlined the basics of cosmology for the general reader.
Hawking’s fame increased as his health worsened. After his degenerative muscle disorder was diagnosed, he defied medical opinion by living five decades longer than expected. He communicated his ideas through an American-accented speech synthesizer after a life-saving tracheotomy in 1985 took away his ability to speak. To the layman, the robot-like voice only seemed to give his words added authority.
“To my colleagues, I’m just another physicist, but to the wider public, I became possibly the best-known scientist in the world,” Hawking wrote in his 2013 memoir “My Brief History.” “This is partly because scientists, apart from Einstein, are not widely known rock stars, and partly because I fit the stereotype of a disabled genius.”
Hawking applied quantum theory—governing the subatomic world—to black holes, which he claimed discharge radiation that causes them to disappear. This process helps explain the notion that black holes have existed at a micro level since the Big Bang, and the smaller they are, the faster they evaporate.
Black holes are formed when a massive star collapses under the weight of its own gravity. Detected by the movement of surrounding matter, they devour everything in their path and may play a role in the birth of galaxies. Physicists say these invisible cosmic vacuums might allow travel through time and space via “wormholes,” a favorite of science-fiction writers.
With mathematician Roger Penrose, Hawking used Einstein’s theory of relativity to trace the origins of time and space to a single point of zero size and infinite density. Their work gave mathematical expression to the Big Bang theory, proposed by Belgian priest Georges Lemaitre in 1927 and supported two years later by Edwin Hubble’s discovery that the universe is expanding.
With Hartle, Hawking later tried to marry relativity with quantum theory by proposing the no-boundary principle, which held that space-time is finite and the laws of physics determined how the universe began in a self-contained system, without the need for a creator or prior cause.
The Nobel Prize in Physics proved elusive for Hawking, whose theories required observational confirmation to win over the awarding committee in Stockholm. The Nobel Foundation excludes posthumous nominees.
“By any reasonable standard, Stephen Hawking is a great scientist. Even if time shows some of his more radical proposals to be incorrect, Hawking will have had a profound impact on the history of science,” Henry F. Schaefer III, a chemistry professor at the University of Georgia, said in a 2001 lecture.
Stephen William Hawking was born in Oxford, England, on Jan. 8, 1942, exactly 300 years after the death of Italian physicist Galileo Galilei. Hawking’s father, Frank, was a doctor of tropical medicine. His mother, Isobel, was a tax inspector and a secretary. He had two younger sisters and a brother.
At age 8, Hawking moved with his family to St. Albans, where he went to school. He then graduated with first-class honors in natural science at Oxford’s University College. While he was a doctoral candidate at Cambridge, Hawking was diagnosed with ALS, also known as motor neuron disease. He was told he had only a few years to live.
As the illness progressed slower than expected and he found inspiration in his girlfriend, Jane Wilde, Hawking began to work at his studies for the first time. He completed his doctorate on the origins of the universe, became a research fellow at Caius College and married Wilde in 1965.
In 1970, Hawking realized the mathematical approaches he developed with Penrose could be applied to black holes, a term coined by physicist John Wheeler. Hawking worked for the next four years on black holes, discovering they weren’t totally black, but leaked radiation, now known as “Hawking radiation.”
For 30 years, Hawking was Cambridge’s Lucasian professor of mathematics, a chair once held by Isaac Newton. U.S. President Barack Obama awarded the Presidential Medal of Freedom to Hawking in 2009, the year of his retirement.
His other popular books included “The Universe in a Nutshell” (2001), “On the Shoulders of Giants” (2002), “A Briefer History of Time” (2005) and “The Grand Design” (2010). In 2015, Eddie Redmayne won an Oscar for his portrayal of Hawking in “The Theory of Everything,” a film about the physicist’s life.
Hawking separated from his wife in 1991 and married his nurse, Elaine Mason, four years later. They divorced in 2007.
By 2017 Hawking was spending more time pondering humanity’s future and concluding that we should plan to colonize other planets. “We are running out of space, and the only place we can go to are other worlds,” he told a gathering of scientists. “It is time to explore other solar systems. Spreading out may be the only thing that saves us from ourselves. I am convinced that humans need to leave Earth.”
(SkyNews) A model of the Sun’s magnetic activity suggests the River Thames may freeze over within two decades, experts say.
A mini ice age that would freeze major rivers could hit Britain in less than two decades, according to research from universities in the UK and Russia.
A mathematical model of the Sun’s magnetic activity suggests temperatures could start dropping here from 2021, with the potential for winter skating on the River Thames by 2030.
A team led by maths professor Valentina Zharkova at Northumbria University built on work from Moscow to predict the movements of two magnetic waves produced by the Sun.
It predicts rapidly decreasing magnetic waves for three solar cycles beginning in 2021 and lasting 33 years.
Very low magnetic activity on the Sun corresponds with historically documented cold periods on Earth.
Professor Zharkova claims 97% accuracy for the model, which dovetails with previous mini ice ages, including the Maunder Minimum period from 1645 to 1715, when frost fairs were held on the frozen Thames.
But she cautions that her mathematical research cannot be used as proof that there will be a mini ice age this time around, not least because of global warming.
“I hope global warming will be overridden by this effect, giving humankind and the Earth 30 years to sort out our pollution,” she said.
But Professor Zharkova warned that any downward impact on global warming will last only until the Sun’s two magnetic waves become active again in the 2050s.
“We have to be sorted by that time and prepare everything on Earth for the next big solar activity,” she said.
(MIT) Clusters of human brain cells can integrate into rat brains, and that’s raising concerns about giving animals some form of human consciousness.
Researchers can grow stem cells into tiny clumps of cells, called organoids, that display similar activity and structure to human brains. To find out more about how exactly that works, read our primer from when we made the technique one of our Ten Breakthrough Technologies of 2015.
Now, though, reports Stat, several labs have inserted those organoids into rat brains and connected them to blood vessels; some of the organoids have even grown physical links with the rat brains. From Stat’s report:
Some of the axons grew as much as 1.5 millimeters, connecting to the corpus callosum, a bundle of neurons connecting the left and right cerebral hemispheres. When the scientists shined light on a rat’s eye, or stimulated brain regions involved in vision, neurons in the implanted organoid fired. That suggested the human brain tissue had become functionally integrated with the rat’s.
The aim of this kind of research is noble: to work out how lab-grown clusters of brain cells could be used to understand or even treat brain diseases or injuries. But while a handful of cells in a rat brain may not be a problem now, and the idea of imbuing animals with human characteristics or consciousness seems distant, the integration reported by the labs in Stat’s report is giving some ethicists cause for concern.
That’s especially the case as the number of organoids placed inside a rat’s head increases. “People are talking about connecting three or four,” Stanford bioethicist Hank Greely tells Stat. “But what if you could connect 1,000? That would be getting close to the number of cells in a mouse brain … At some future point it could be that what you’ve built is entitled to some kind of respect.”
(PUB) O neurocientista tem um novo livro, A Estranha Ordem das Coisas – A Vida, os Sentimentos e as Culturas Humanas, que chega sexta-feira, 3 de Novembro, às livrarias portuguesas. Este é o excerto de um capítulo intitulado “A crise”. No domingo publicaremos uma entrevista ao cientista português.
Junto à margem do mar da Galileia, numa manhã de Inverno cheia de sol, a poucos passos da sinagoga de Cafarnaum onde Jesus de Nazaré falou aos seus seguidores, penso nos problemas longínquos do Império Romano mas sobretudo na crise actual da condição humana. É uma crise curiosa, pois embora as condições locais sejam distintas em cada ponto do mundo onde ocorre, as respostas que a definem são semelhantes, marcadas pela zanga, fúria e confronto violento, a par de apelos ao isolamento dos países e de uma preferência por governação autocrática.
But the crisis is, above all, disappointing, because it should not be happening at all. One might have expected that the most advanced societies, at least, would have been immunized by the horrors of the Second World War and the threats of the Cold War, and would have found ways to overcome, gradually and peacefully, whatever problems complex cultures inevitably face. On reflection, we should have been less complacent.
The times we live in could be the best of times to be alive: because we are surrounded by spectacular scientific discoveries and by technical brilliance that make life ever more comfortable and convenient; because the amount of knowledge available, and the ease of access to it, have never been greater, and the same is true of human interconnection on a planetary scale, as shown by travel, electronic communication and international agreements on every kind of scientific, artistic and commercial cooperation; because our capacity to diagnose, manage and even cure disease keeps growing, and longevity keeps extending, to the point where human beings born after the year 2000 are expected to live, and live well, to an average age of 100. Soon we will be driven around by robotic vehicles that spare us effort and save lives, since, at some point, there should be fewer fatal accidents.
And yet, to regard our days as the best of all time, we would have to be very distracted, not to say indifferent to the plight of the many human beings who live in misery. Although scientific and technical literacy has never been so widespread, the public devotes very little time to reading novels or poetry, which remain the surest and most rewarding way of entering the comedy and drama of existence, and of having the chance to reflect on what we are or may become. There is, it seems, no time to waste on the unprofitable matter of simply “being”. Some of the societies that celebrate modern science and technology, and profit most from them, appear to be in a state of “spiritual” bankruptcy, in both the secular and the religious sense of the term. Judging by their untroubled acceptance of troubling financial crises – the internet bubble of 2000, the mortgage abuses of 2007 and the banking collapse of 2008 – they appear to be in a state of moral bankruptcy as well. Curiously, or perhaps not so curiously, the level of happiness in the societies that have benefited most from the astonishing progress of our time is stable or declining, if the respective surveys are to be trusted.
Over the past four or five decades, the general public in the most advanced societies has accepted, with little or no resistance, the increasingly distorted treatment of news and public affairs, shaped to fit the entertainment model of commercial television and radio. Less advanced societies have been quick to imitate that attitude. The conversion of almost all public-interest media to the for-profit business model has further reduced the quality of information. Although a viable society ought to care about how its government promotes the welfare of its citizens, the notion that one should pause for a few minutes each day and make an effort to keep up with the difficulties and successes of governments and citizens has not merely become old-fashioned; it has almost disappeared. As for the notion that we should learn about such matters with seriousness and respect, it is by now an alien concept. Radio and television turn every matter of governance into a “story”, with the “shape” and entertainment value of the story counting for more than its factual content. When Neil Postman wrote his book Amusing Ourselves to Death: Public Discourse in the Age of Show Business in 1985, he made a correct diagnosis, but he never dreamed how much we would suffer before dying. The problem has worsened with the cutting of funds for public education and the predictable decline in the preparation of citizens, and, in the case of the United States, with the repeal, in 1987, of the Fairness Doctrine, which since 1949 had required balanced treatment of political commentary. The result, intensified by the decline of print newspapers and by the rise and near-total dominance of digital communication and television, is a profound lack of detailed, non-partisan knowledge of public affairs, together with the gradual abandonment of the practices of careful reflection and of discernment about facts.
We must be careful not to indulge in nostalgia for a time that never fully existed. Not all of the public was ever seriously informed, reflective and demanding. Not all citizens revered truth and nobility of spirit, let alone revered life. Even so, the current collapse of serious public consciousness is troubling. Human societies are predictably fragmented along a variety of measures, such as literacy, educational attainment, civic behaviour, spiritual aspirations, freedom of expression, access to justice, economic status, health and environmental safety. Under these circumstances, it is harder than ever to encourage the public to promote and defend a list of values, rights and obligations that are not negotiable.
Given the astonishing progress of the new media, the public has the opportunity to learn, in greater detail than ever before, the facts behind economies, the state of local and global governments, and the state of the societies it lives in – without doubt an advantage that confers real power; moreover, the internet provides means of deliberation outside the traditional commercial or governmental institutions, another potential advantage. On the other hand, the public generally has neither the time nor the method to turn immense quantities of information into reasonable, practically useful conclusions. Furthermore, the companies that manage the distribution and aggregation of information help the public in a dubious way: the flow of information is steered by company algorithms that in turn shape its presentation to suit a variety of financial, political and social interests, along with users’ tastes, so that users can remain locked inside the silo of opinions that entertains them.
Let it be acknowledged, in fairness, that the wise voices of the past – the voices of the experienced and judicious editors of newspapers and of television and radio programmes – were not entirely impartial; they favoured particular views of how societies should work. In most cases, however, those views were identified with specific philosophical or socio-political perspectives, which each of us could resist or support. Today the general public has no such opportunity. Each of us has direct access to the world through a portable device, and is encouraged to maximise our autonomy. There is little incentive to debate, let alone to accept divergent opinions.
The new world of communication is a blessing for citizens trained to think critically and knowledgeably about history. But what is the fate of citizens who have been seduced by a model of life as entertainment and commerce? To a large extent, they have been formed by a world in which negative emotional provocation is the rule rather than the exception, and in which the best solutions to a problem are, first and foremost, self-interested and short-term. Can we blame them?
The widespread availability of abundant and near-instantaneous communication of public and personal information, an obvious benefit, paradoxically reduces the time available for reflecting on that very information. Managing the flow of available knowledge often forces a rapid classification of facts as good or bad, pleasant or not. This arguably contributes to a rise in polarised opinions about social and political events. The exhaustion produced by the excess of facts encourages an escape into pre-set beliefs and opinions, generally those of the group to which the individual belongs. This is aggravated by the fact that we naturally tend to resist changing our minds, whatever the evidence to the contrary, and however intelligent and well informed we may be.
Work carried out at our institute [the Brain and Creativity Institute at the University of Southern California, US] shows that this is true of political beliefs, but I imagine it also applies to a wide range of beliefs, from religion and justice to aesthetics. Our work shows that resistance to change is associated with a conflicted relationship between brain systems related to emotion and to reason. Resistance to change is associated, for example, with the activation of systems responsible for producing anger and fury. We create a kind of natural refuge to defend ourselves against contradictory information. All over the world, disgruntled voters refuse to turn up at the polls. In such a climate, the spread of fake news and post-truths is made easy. The dystopian world that George Orwell once described, with the Soviet Union as his model, now corresponds to a different socio-political situation. The speed of communications, and the resulting acceleration of the pace of life, may also be contributing to the decline of civility, visible in the impatience of public discourse and the growing rudeness of urban life.
A separate but important issue that continues to be underestimated is the addictive nature of electronic media, from simple email communications to social networks. The addiction diverts time and attention from the immediate experience of our surroundings to an experience mediated by a wide variety of electronic devices. The addiction widens the mismatch between the volume of information and the time needed to process it.
The breach of privacy that accompanies universal use of the web and of social networks guarantees the monitoring of every human gesture and idea. Every kind of surveillance, from that required for public safety to that which is intrusive and even abusive, is now a reality, practised by government and the private sector alike with total impunity. It makes espionage, even superpower espionage – an established activity that has been with us for millennia – look honourable and quaint. We even find surveillance for sale, at high profit, from a string of technology companies. Unlimited access to private information is being used to create embarrassing scandals, even when the subject of the surveillance is not of a criminal nature. The result is the silencing of political candidates, lest they and their campaigns be destroyed by personal revelations. This has become a major factor in public governance. Across vast sectors of the world’s most technologically advanced regions, scandals of every size influence electoral results and deepen public distrust of political institutions and professional elites. Societies that were already struggling with severe problems of wealth inequality and of human displacement caused by unemployment and war have become almost ungovernable. Disoriented electorates nostalgically recall long-vanished and mythically better pasts or, alternatively, display a deep revolt. The nostalgia, however, is misplaced, and the fury, in general, is misdirected. Such reactions reflect a limited understanding of the myriad facts presented by the various media – facts designed above all to entertain, to promote particular social, political and commercial interests, and to reap large financial rewards in the process.
There is a growing tension between the power of a vast public that seems better informed than ever but lacks the time or the tools to judge and interpret that information, and the power of the companies and governments that control the information and know everything there is to know about that same public. How is the resulting conflict to be resolved? There are also notable risks to consider. The possibility of catastrophic conflicts involving nuclear and biological weapons poses risks that are real, and possibly higher now than when those weapons were controlled by the Cold War powers; the risks of terrorism and the new risk of cyberwar are also real, as is the risk of antibiotic-resistant infections. We can blame modernity, globalisation, wealth inequality, unemployment, too little education, too much entertainment, diversity, and the radically paralysing speed and ubiquity of digital communications, but assigning blame neither reduces the risks in the short term nor solves the problem of ungovernable societies, whatever their causes.
(GUA) After thousands of years of failure, some scientists believe a breakthrough might finally be in sight. By Nicola Davison
The common cold has the twin distinction of being both the world’s most widespread infectious disease and one of the most elusive. The name is a problem, for starters. In almost every Indo-European language, one of the words for the disease relates to low temperature, yet experiments have shown that low temperature neither increases the likelihood of catching a cold, nor the severity of symptoms. Then there is the “common” part, which seems to imply that there is a single, indiscriminate pathogen at large. In reality, more than 200 viruses provoke cold-like illness, each one deploying its own peculiar chemical and genetic strategy to evade the body’s defences.
It is hard to think of another disease that inspires the same level of collective resignation. The common cold slinks through homes and schools, towns and cities, making people miserable for a few days without warranting much afterthought. Adults suffer an average of between two and four colds each year, and children up to 10, and we have come to accept this as an inevitable part of life.
Public understanding remains a jumble of folklore and false assumption. In 1984, researchers at the University of Wisconsin-Madison decided to investigate one of the best-known ways of catching a cold. They infected volunteers with a cold virus and instructed them to kiss healthy test subjects on the mouth for at least one minute. (The instruction for participants was to use whichever technique was “most natural”.) Sixteen healthy volunteers were kissed by people with colds. The result: just one confirmed infection.
The most common beliefs about how to treat the disease have turned out to be false. Dubious efficacy has done little to deter humankind from formulating remedies. The Ebers Papyrus, a medical document from ancient Egypt dated to 1550BC, advises a cold sufferer to recite an incantation, “in association with the administration of milk of one who has borne a male child, and fragrant gum”. In 1924, US President Calvin Coolidge sat down in an airtight chlorine chamber and inhaled the pungent, noxious gas for almost an hour on the advice of his physicians, who were certain that his cold would be cured quickly. (It wasn’t.)
Today, “winter remedy” sales in the UK reach £300m each year, though most over-the-counter products have not actually been proven to work. Some contain paracetamol, an effective analgesic, but the dosage is often sub-optimal. Taking vitamin C in regular doses does little to ward off disease. Hot toddies, medicated tissues and immune system “boosts” of echinacea or ginger are ineffective. Antibiotics do nothing for colds. The only failsafe means of avoiding a cold is to live in complete isolation from the rest of humanity.
Although modern science has changed the way medicine is practised in almost every field, it has so far failed to produce any radically new treatments for colds. The difficulty is that while all colds feel much the same, from a biological perspective the only common feature of the various viruses that cause colds is that they have adapted to enter and damage the cells that line the respiratory tract. Otherwise, they belong to quite different categories of organisms, each with a distinct way of infecting our cells. This makes a catch-all treatment extremely tricky to formulate.
Scientists today identify seven virus families that cause the majority of colds: rhinovirus, coronavirus, influenza and parainfluenza virus, adenovirus, respiratory syncytial virus (RSV) and, finally, metapneumovirus, which was first isolated in 2001. Each has a branch of sub-viruses, known as serotypes, of which there are about 200. Rhinovirus, the smallest cold pathogen by size, is by far the most prevalent, causing up to three-quarters of colds in adults. To vanquish the cold we will need to tackle all of these different families of virus at some stage. But, for now, rhinovirus is the biggest player.
Scientists first attempted to make a rhinovirus vaccine in the 1950s. They used a reliable method, pioneered by French biologist Louis Pasteur in the 1880s, in which a small amount of virus is introduced to a host in order to provoke a defensive immunological reaction that then protects the body from subsequent infection. Yet those who had been vaccinated caught colds just as easily as those who had not.
Over the next decade, as the techniques for isolating cold viruses were refined, it became clear that there were many more rhinoviruses than first predicted. Researchers realised it would not be possible to make a vaccine in the traditional way. Producing dozens of single-serotype vaccines, each one targeting a different strain, would be impractical. The consensus that a rhinovirus vaccine was not possible deepened. The last human clinical trial took place in 1975.
Then, in January last year, an editorial appeared in the Expert Review of Vaccines that once again raised the prospect of a vaccine. The article was co-authored by a group of the world’s leading respiratory disease specialists based at Imperial College London. It was worded cautiously, yet the claim it made was striking. “Perhaps the quest for an RV [rhinovirus] vaccine has been dismissed as too difficult or even impossible,” it said, “but new developments suggest that it may be feasible to generate a significant breadth of immune protection.” The scientists were claiming to be on the way to solving a riddle that has stumped virologists for decades. One virologist told me it was as if a door that had been closed for many, many years had been re-opened.
Part of the Imperial scientists’ motivation was the notion that since we now have vaccines for many of the most dangerous viruses (measles, polio, yellow fever, cholera, influenza, and so on), it is time to tackle the disease that afflicts us most often. “Rhinovirus is by far the most common cause of illness,” says Sebastian Johnston, a professor at Imperial and one of the authors of the editorial. “Look at what people spend on ineffective over-the-counter medications. If you had a safe and effective treatment, you’d take it.”
I asked Johnston if he was optimistic. He pointed out that because their studies so far have only been in mice, they are not sure that the vaccine will work in humans. “The data is limited,” he says. “But it’s encouraging.” It was not the resounding triumphalism that I was expecting, but then cold scientists learned long ago to be careful about making grand proclamations. Theirs is an undertaking that, more than anything, has been defined by consistent disappointment.
The first scientist to try and fail to make a rhinovirus vaccine was also the first scientist to distinguish it from the jumble of other cold viruses. In 1953, an epidemiologist called Winston Price was working at Johns Hopkins University in Baltimore when a group of nurses in his department came down with a mild fever, a cough, sore throat and runny nose – symptoms that suggested the flu. Price took nasal washings from the nurses and grew their virus in a cell culture. What he found was too small to be influenza virus. In a 1957 paper, “The isolation of a new virus associated with respiratory clinical disease in humans”, Price initially named his discovery “JH virus”, after his employer.
Price decided to try to develop a vaccine using a bit of dead rhinovirus. When the immune system encounters an invading virus – even a dead or weakened virus – it sets out to expel it. One defence is the production of antibodies, small proteins that hang around in the blood system long after the virus is gone. If the virus is encountered a second time, the antibodies will swiftly recognise it and raise the alarm, giving the immune system the upper hand.
At first, Price was encouraged. In a trial that involved several hundred people, those vaccinated with JH virus had eight times fewer colds than the unvaccinated. Newspapers across the US wanted to know: had the common cold been cured? “The telephone by my bed kept ringing until 3 o’clock in the morning,” Price told the New York Times in November 1957. The celebration would be short-lived. Though Price’s vaccine was effective against his particular “JH” rhinovirus strain, in subsequent experiments it did nothing. This indicated that more than one rhinovirus was out there.
By the late 1960s, dozens of rhinoviruses had been discovered. Even in the alien menagerie of respiratory disease, this level of variation in one species was unusual; there are just three or four influenza viruses circulating at any one time. Scientists at the University of Virginia decided to try a different tactic. Instead of inoculating patients with a single strain of rhinovirus, they combined 10 different serotypes in one injection. But after this, too, failed to shield participants from infection, they were out of ideas.
As hope for a vaccine receded, scientists began investigating other ways to combat colds. From 1946 until it closed in 1990, most research into respiratory viruses in the UK was undertaken at the Common Cold Unit (CCU), a facility backed by the Medical Research Council that occupied a former wartime military hospital in the countryside near Salisbury. In its four decades of operation, some 20,000 volunteers passed through the doors of the CCU, many to be willingly infected with cold virus in the name of scientific progress.
An early experiment at the CCU involved a group of volunteers being made to take a bath and then to stand dripping wet and shivering in a corridor for 30 minutes. After they were allowed to get dressed, they had to wear wet socks for several hours. Despite a drop in body temperature, the group did not get any more colds than a control group of volunteers who had been kept cosy.
The CCU began focusing on cold treatments in the 1960s and 70s, when research into a substance produced by the human body called interferon was gaining momentum. Interferons are proteins that are secreted by cells when they are attacked by a virus. They act as messengers, alerting nearby cells to the invader. These cells in turn produce an antiviral protein that inhibits, or interferes with, the virus’s ability to spread, hence the name.
In 1972, researchers at the CCU decided to investigate whether interferon could be used as a treatment for colds. They infected 32 volunteers with rhinovirus and then sprayed either interferon or placebo up their noses. Of the 16 given a placebo, 13 came down with colds. But of the 16 given interferon, only three got ill. The findings, published in The Lancet, made the front page of the New York Times (below a story on Watergate). A rush of interferon research got underway. But, once again, the excitement was premature. A review by the CCU in the 1980s uncovered a fatal flaw: interferon only worked when it was given to the patient at the same time as the virus. But in real life – that is, outside the lab – a rhinovirus enters the nose between eight and 48 hours before the onset of cold symptoms. By the time you feel a cold coming on, it is already too late.
As the 20th century drew to a close, attempts to find a cure grew more desperate. At the CCU, molecules that were found in traditional Chinese medicine, Japanese tea and oranges were all seriously interrogated. In 1990, the CCU closed. The centre had done much to advance our understanding of the virology of the cold, yet it had also exposed the enormity of the task of defeating it.
In the 1990s, as many virologists focused on HIV and Aids, research into the cold tailed off. “Common acute respiratory infections were seen as less important compared with this threat of a worldwide, lethal plague,” writes David Tyrrell, the former director of the CCU, in his 2002 book Cold Wars. A cure seemed more remote than ever.
Sebastian Johnston’s lab is on the third floor of the School of Medicine, part of Imperial College’s St Mary’s Hospital campus in Paddington, west London. Opened in 1851, the original hospital building is red-brick, with high ceilings, arched colonnades and turrets, but numerous extensions, each progressively more box-like, now hem it in. A round blue plaque on the facade states that Sir Alexander Fleming (1881-1955) discovered penicillin in a second-storey room. Entry to a recreation of Fleming’s lab is £4.
Johnston, a professor of respiratory medicine and an asthma specialist, is 58 and bespectacled, with a mop of grey curls that form a peak on his forehead. As a PhD student in 1989, he was dispatched to the CCU, not long before it closed down, to study virus detection methods. “I spent six months there,” Johnston said. “It was a strange place, basically a bunch of nissen huts connected by wooden runways, with lots of rabbits.”
For his PhD on asthma, Johnston used a technique called polymerase chain reaction (PCR), which amplifies DNA so that viruses can be identified more precisely. To his amazement, Johnston discovered that viruses were behind 85% of asthma attacks in children; about half of those were rhinoviruses. Previously, most studies had detected viruses in fewer than 20% of asthma attacks. Johnston went on to find that rhinovirus also exacerbates symptoms in 95% of cases of smoker’s cough (formally known as chronic obstructive pulmonary disease, or COPD).
It wasn’t until the 1990s that scientists fighting rhinovirus properly understood what they were up against. By that time, electron microscopy had advanced and it was possible to see the organism up close. For a pathogen so spectacularly good at infecting our nasal passages – the “rhin” of the name is from the Greek for “nose” – rhinoviruses are astonishingly simple, being little more than strands of ribonucleic acid (RNA) surrounded by a shell: “a piece of bad news wrapped in a protein coat”, as the Nobel Prize-winning biologist Peter Medawar once observed. Under an electron microscope, they are spherical with a shaggy surface like the bobble on a knitted hat.
Though all the rhinoviruses are pretty much the same internally, a subtle alteration to the pattern of proteins on their outer shell means that, to the immune system, they all look different. It’s a cloak-and-dagger strategy, and the reason why early vaccines such as Winston Price’s failed. Antibodies produced for one rhinovirus serotype do not detect the rest. Until recently, it was believed that there were around 100 different strains, and these were grouped into the “A” and “B” families. Then, in 2007, a new cache of viruses was discovered, the “C” group, making the total more like 160.
In 2003, Johnston, who was then working at Imperial, contacted Jeffrey Almond, a former professor of virology at Reading University who had been recently appointed as head of vaccine development at the pharmaceutical giant Sanofi. The company was already manufacturing a jab for influenza and was interested in tackling the common cold. Having bumped into Johnston at academic conferences, Almond felt that their ambitions were aligned. “I said: ‘Let’s think about whether we can do something dramatic,’” Almond told me. “Let’s think about how we can make a vaccine against rhino.”
For doctors, vaccines are preferable to drugs because they shield the host from invasive organisms before they cause any damage. For pharmaceutical companies, vaccines are significantly less attractive. Not only do they take years and hundreds of millions of dollars to develop, even if that process is successful – which it often isn’t – it can still be hard to make much money. Vaccines are usually injections administered on a single occasion, while drugs are taken for prolonged periods. And people don’t want to pay much for vaccines. “Everybody wants vaccines for pennies rather than pounds because you get them when you’re healthy,” Almond said. “Nobody wants to pay anything when they’re healthy. It’s like car insurance, right? But when you’re sick you will empty your wallet, whatever it takes.”
Still, Almond thought there might be a commercial case for a rhinovirus vaccine. Totting up the days off school and work, plus the secondary infections such as sinusitis that require supplementary treatment and even hospitalisation, rhinovirus places a huge burden on health systems. Last year, in the UK, coughs and colds accounted for almost a quarter of the total number of days lost to sickness, about 34m. In the US, a survey carried out in 2002 calculated that each cold experienced by an adult causes an average loss of 8.7 working hours, while a further 1.2 hours are lost attending to cold-ridden children, making the total cost of lost productivity almost $25bn (£19bn) each year. Almond convinced his bosses that, if it were possible to make one, a rhinovirus vaccination would be financially viable. “Our back-of-the-envelope calculations on what we could charge, and what the numbers of sales could be, mean that it’s likely to be quite profitable and quite interesting for a company to develop,” Almond says.
Reviewing the approaches taken in the 1960s and 70s, Almond and Johnston dismissed the idea of a mega-vaccine of all the 160 rhinovirus serotypes, believing it would be too heavy, too complex and too expensive to make. They wondered instead if there was a tiny part of the structure of viruses that is identical, or “conserved”, across the entire species that could form the basis of what is called a subunit vaccine, an approach that has had success with hepatitis B and the human papilloma virus, or HPV.
After comparing the genetic sequences of the different rhinovirus serotypes, the researchers homed in on a particular protein on the virus shell that seemed to recur across many of the serotypes. They took a piece of the conserved shell from a single rhinovirus, number 16, and mixed it with an adjuvant – a stimulus that mimics the danger signals that trigger an immune response – and injected it into mice as a vaccine. The hope was that the immune system would be jolted into recognising the shell protein as an invasive pathogen, conferring immunity against the entire rhinovirus family.
In petri dishes, the scientists mixed the immunised mouse blood with three other rhinovirus serotypes, numbers 1, 14 and 29. An immunological response to rhinovirus 1 was likely because its genetic sequence is similar to 16, but serotypes 14 and 29 are unalike. The mice’s white blood cells responded vigorously against all three strains. “Seeing responses against those two [different serotypes] was very encouraging,” Johnston said. This gave hope that the vaccine might protect against the full gamut of rhinoviruses.
The scientists gathered a group of respiratory medicine specialists to review the findings. The reviewers agreed that the results looked promising. But just as the scientists were ready to take the vaccine forward, there was a setback at Sanofi. “There was a change of direction, a change of guys at the top,” Almond said. “I took early retirement for different reasons. My boss retired as well.”
In 2013, the new management decided that the company’s priorities were elsewhere, handing back to Imperial College the patent that protects the vaccine idea from being developed by other groups. Imperial did not have the resources to develop the vaccine without outside investment. For Johnston, it was frustrating – years of research and toil in the lab had seemed to be finally yielding results. But there was little he could do. The vaccine was shelved.
Across the Atlantic, as Imperial began to search for new backers, Martin Moore, a paediatrician at Emory University in Atlanta, was working on a rival approach to the same problem. A specialist in children’s respiratory disease, Moore had spent the past three years working on a solution so straightforward that when he presented the results of his paper, published in Nature Communications last year, his colleagues struggled to accept them. “But if I pushed them, I couldn’t get a good reason for that other than, just: it hadn’t been done before,” he says.
Moore first resolved to do something about the common cold in 2014, while on holiday with his family in Florida. Shortly after they had arrived, his son, then a toddler, came down with a cold. “He wanted me to hold him day and night,” Moore said. The pair hunkered down in the hotel room watching movies while the rest of the family went to the beach. “It was frustrating because, as a virologist, we can go into the lab and slice and dice these viruses. But what are we really doing about them?”
Moore reviewed the papers from the 1960s and 70s that described the early attempts at a vaccine. He saw that the scientists had demonstrated that if they took one rhinovirus, killed it and then injected it, it would protect people against that same strain. “People actually made decent vaccines against rhinovirus in the 1960s,” Moore told me. What scientists did not account for at the time was that there were so many different serotypes. But where the scientists of the past had seen defeat, Moore saw promise. Why not simply make a vaccine made up of all the rhinoviruses? There was nothing to suggest that it would not work. The problem was not with the science, but with logistics. “I thought, the only thing between us and doing this is manufacturing and economics.”
Moore secured funding from the National Institutes of Health (NIH) and applied for samples of the different serotypes from the Centers for Disease Control and the American Type Culture Collection, a biological material repository headquartered in Virginia. He stopped short of calling in all 160 serotypes, reasoning that 50 would be enough to support his hypothesis.
After developing the vaccine, composed of these 50 serotypes, Moore tested it on a number of rhesus macaque monkeys. When their blood was later mixed with viruses in petri dishes, there was a strong antibody response to 49 of the 50 serotypes. It was not possible to see whether the vaccinated monkeys themselves would be protected from colds, since human rhinoviruses do not infect monkeys. But the ability to induce antibodies in monkey blood does correlate with protection in people.
“Maybe I shouldn’t say this, but I never had a doubt that it would produce antibodies,” Moore told me. “Our paper was about showing it can be done.” There is still a long way to go before Moore’s dream becomes reality. For the vaccine to be tested in a clinical trial, it will need to be made under good manufacturing practice (GMP) conditions – regulations that companies must adhere to for licensing. Under these regulations, substances need to be kept separate to avoid cross-contamination – a substantial challenge for a vaccine that potentially encompasses 160 serotypes (currently, the largest number of serotypes in a single vaccine, for pneumonia, is 23).
For a manufacturing model, Moore is looking to the polio vaccine, since polio and rhinovirus are biologically related. The scale of production would be many times greater, but the basic processes would be alike. In May, Moore’s start-up, Meissa Vaccines, received a $225,000 (£170,000) grant from the NIH for work on rhinovirus. He is taking leave from academia to work on the vaccines.
At this point in time, perhaps the biggest barrier to us curing the common cold is commercial. Researchers at universities can only go so far; the most generous grants from bodies such as the UK Medical Research Council are around £2m. It falls to pharmaceutical companies to carry out development beyond the initial proof of concept. “You’re looking at 10-15 years’ work, minimum, with teams of people, and you’re going to spend $1bn (£760m) at least,” Almond told me.
Successes have been rare, and there have been spectacular flops. Last year, shares in US firm Novavax fell by 83% after its vaccine for RSV, one of the virus families responsible for colds, failed in a late-stage clinical trial. While it is less common than rhinovirus, RSV can cause great harm and even death in those with weakened immunity, including infants and the elderly. An effective vaccine presented an estimated $1bn opportunity for Novavax in the US alone. Before the results came through, chief executive Stanley Erck said it could be “the largest-selling vaccine in the history of vaccines”. But in the phase III trial of elderly patients, it did little to protect against infection. In the hours after the news broke, Novavax share prices fell from $8.34 to $1.40.
Episodes such as this have made pharmaceutical companies wary. Today, vaccines constitute less than 5% of the overall pharmaceutical market, and development is consolidated in a handful of companies: Sanofi Pasteur, GlaxoSmithKline, Pfizer, AstraZeneca, Merck and Johnson & Johnson, among a few other smaller players.
After the $1bn or so spent on development, there are also manufacturing and distribution costs to consider. There needs to be a return on the initial investment. “You sure as hell can’t do it if there’s not a market at the end, you’re wasting the company’s money, and if you do that too often, you’ll bankrupt the company,” Almond says. “There isn’t a conspiracy out there that says, ‘Let’s not do vaccines so people can get ill and we charge them a lot’, nothing like that. It genuinely isn’t easy.”
In August, I called Sebastian Johnston to see if there was any news on his vaccine. He told me that he had just received confirmation of further funding from Apollo Therapeutics, a startup backed by AstraZeneca, GSK and Johnson & Johnson. This would allow his lab to test the vaccine on more strains of rhinovirus. Johnston believes that if the vaccine proves to be protective against, say, 20 serotypes, there is a good chance it will protect against all the rhinoviruses. Beginning in October, the research should take about a year and a half. “At that point, I think we’ll be at a stage where we’ll be able to go to major vaccine companies.”
If the vaccine were to make it through the clinical trials, and was approved by regulators, it would first be rolled out to high-risk groups – those with asthma and COPD, and perhaps the elderly, as the flu jab is in the UK – and then to the rest of the population. In time, as the proportion of vaccinated individuals reached a critical mass, the viruses would cease to circulate because the chain of infection would be broken – a phenomenon called herd immunity.
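The "critical mass" at which herd immunity kicks in follows from a standard epidemiological formula: if each infection would otherwise cause R0 new infections, transmission dies out once more than 1 − 1/R0 of the population is immune. The R0 values below are purely illustrative, since the article gives no reproduction number for rhinovirus.

```python
# Herd-immunity threshold: the fraction of the population that must be
# immune so that each infection causes, on average, fewer than one new
# infection. Standard formula: threshold = 1 - 1/R0, where R0 is the
# basic reproduction number of the pathogen.

def herd_immunity_threshold(r0: float) -> float:
    """Minimum immune fraction needed to break the chain of infection."""
    if r0 <= 1:
        return 0.0  # the infection already dies out on its own
    return 1.0 - 1.0 / r0

# Illustrative values only; R0 for rhinovirus is not given in the article.
for r0 in (1.5, 2.5, 6.0):
    print(f"R0 = {r0}: at least {herd_immunity_threshold(r0):.0%} must be immune")
```

The sketch makes the intuition in the paragraph concrete: the more contagious the virus, the larger the vaccinated proportion needed before circulation stops.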
From where we are today, this scenario is still distant: about 80% of drugs that make it into clinical trials because they worked in mice do not go on to work in humans. Still, for the first time in decades there are now major pharmaceutical companies with rhinovirus vaccine programmes, as well as smaller university research groups like Johnston’s which, through different approaches, are all pursuing the same goal of a cure. Once again, Johnston said, “people are starting to believe it may be possible.”
(Reuters) Scientists Rainer Weiss, Barry Barish and Kip Thorne won the 2017 Nobel Prize for Physics for decisive contributions in the observation of gravitational waves, the award-giving body said on Tuesday.
“This is something completely new and different, opening up unseen worlds,” the Royal Swedish Academy of Sciences said in a statement on awarding the 9 million Swedish crown ($1.1 million) prize.
“A wealth of discoveries awaits those who succeed in capturing the waves and interpreting their message.”
Physics is the second of this year’s crop of Nobel Prizes and comes after Americans Jeffrey Hall, Michael Rosbash and Michael Young were awarded the Nobel Prize for Physiology or Medicine on Monday.
(Economist) Science will win the technical battle against cancer. But that is only half the fight.
THE numbers are stark. Cancer claimed the lives of 8.8m people in 2015; only heart disease caused more deaths. Around 40% of Americans will be told they have cancer during their lifetimes. It is now a bigger killer of Africans than malaria. But the statistics do not begin to capture the fear inspired by cancer’s silent and implacable cellular mutiny. Only Alzheimer’s exerts a similar grip on the imagination.
Confronted with this sort of enemy, people understandably focus on the potential for scientific breakthroughs that will deliver a cure. Their hope is not misplaced. Cancer has become more and more survivable over recent decades owing to a host of advances, from genetic sequencing to targeted therapies. The five-year survival rate for leukemia in America has almost doubled, from 34% in the mid-1970s to 63% in 2006-12. America is home to about 15.5m cancer survivors, a number that will grow to 20m in the next ten years. Developing countries have made big gains, too: in parts of Central and South America, survival rates for prostate and breast cancer have jumped by as much as a fifth in only a decade.
From a purely technical perspective, it is reasonable to expect that science will one day turn most cancers into either chronic diseases or curable ones. But cancer is not fought only in the lab. It is also fought in doctors’ surgeries, in schools, in public-health systems and in government departments. The dispatches from these battlefields are much less encouraging.
First, the good news. Caught early, many cancers are now highly treatable. Three out of four British men who received a prostate-cancer diagnosis in the early 1970s did not live for another ten years; today four out of five do. Other cancers, such as those of the lung, pancreas and brain, are harder to find and treat. But as our Technology Quarterly in this issue shows, progress is being made. Techniques to enable early diagnosis include a device designed to detect cancer on the breath; blood tests can track fragments of DNA shed from tumours. Genome sequencing makes it ever easier to identify new drug targets.
The established trio of 20th-century cancer treatments—surgery, radiation and chemotherapy—are all still improving. Radiotherapists can create webs of gamma rays, whose intersections deliver doses high enough to kill tumours but which do less damage to healthy tissue as they enter and leave the body. Some new drugs throttle the growth of blood vessels bringing nutrients to tumours; others attack cancer cells’ own DNA-repair kits. Cancer may be relentless; so too is science.
The greatest excitement is reserved for immunotherapy, a new approach that has emerged in the past few years. The human immune system is equipped with a set of brakes that cancer cells are able to activate; the first immunotherapy treatment in effect disables the brakes, enabling white blood cells to attack the tumours. It is early days, but in a small subset of patients this mechanism has produced long-term remissions that are tantamount to cures. Well over 1,000 clinical trials of such treatments are under way, targeting a wide range of different cancers. It is even now possible to reprogram immune cells to fight cancer better by editing their genomes; the first such gene therapy was approved for use in America last month.
Yet cancer sufferers need not wait for the therapies of tomorrow to have a better chance of survival today. Across rich and poor countries, the survivability of cancer varies enormously. Men die at far higher rates than women in some countries; in other countries, at similar levels of development, they do comparably well. The five-year survival rate for a set of three common cancers in America and Canada is above 70%; Germany achieves 64%, whereas Britain manages a mere 52%. Disparities exist within countries, too. America does well in its treatment of cancer overall, but suffers extraordinary inequalities in outcomes. The death rate of black American men from all cancers is 24% higher than it is for white males; breast-cancer death rates among blacks are 42% higher than for whites. A diagnosis in rural America is deadlier than one in its cities.
Practical as well as pioneering
Variations between countries are partly a reflection of health-care spending: more than half of patients requiring radiotherapy in low- and middle-income countries do not have access to treatment. But big budgets do not guarantee good outcomes. Iceland and Portugal do not outspend England and Denmark on health care as a proportion of GDP, yet past studies show wide variation in cancer survival across all four countries.
Instead, the problem is often how money is spent, not how much of it there is. To take one example, a vaccine exists against the human papillomavirus (HPV), which causes cancers of the cervix in women, as well as cancers of the head and neck. Rwanda started a programme of routine vaccination in 2011, and aims to eradicate cervical cancer by 2020. Other countries are far less systematic. Vaccinations could help prevent cervical cancer in 120,000 Indian women each year.
Policymakers are not powerless. More can be done to verify which treatments (and combinations thereof) work best. A £1.3bn ($2bn) cancer-drug fund in England, which made expensive new medicines easier to obtain, did not assess the efficacy of the drugs it provided—a huge missed opportunity. Measuring the incidence and survival of cancer, through cancer registries, spotlights where patients are being failed. Access to health care matters, too: the number of Americans whose cancers were diagnosed at the earliest possible opportunity went up after Obamacare was enacted. And prevention remains the best cure of all. Efforts to rein in tobacco use averted 22m deaths (many of them to cancer) between 2008 and 2014. Yet only a tenth of the world’s population lives in countries where taxes make up at least three-quarters of the price of cigarettes, as recommended by the World Health Organisation.
Taxes and budgeting are a lot less exciting than tumour-zapping proton beams and antibodies with superpowers. But the decisions of technocrats are as important as the work of technicians. Cancer kills millions of people not simply for want of scientific advance, but also because of bad policy.
(BBG) Over Humira’s lifetime, AbbVie has secured more than 100 patents to prevent anyone from copying the biologic, which brings in $16 billion in annual sales.
Humira, a treatment for inflammatory diseases such as rheumatoid arthritis and psoriasis made by AbbVie Inc., is the planet’s best-selling drug. It’s also been around almost 15 years. Those two facts alone would normally have rival drugmakers eagerly circling, ready to roll out generic versions that could win a piece of the aging medicine’s $16 billion in annual sales. Yet last year, when the patent on Humira’s main ingredient expired, not a single competitor launched a copycat version. Figuring out how to manufacture it wasn’t the obstacle. The real challenge was the seemingly impregnable fortress of patents AbbVie has methodically constructed around its prized moneymaker.
(OBS) Portugal’s top distinction in the field of vision has been awarded to Sightsavers and CBM. The two organisations focus on preventing blindness, curing it, and supporting those who live with it in developing countries.
The winners of the 2017 António Champalimaud Vision Award, the world’s largest prize in the field of vision given by a Portuguese institution, have been announced: Sightsavers and CBM, two organisations that have fought for decades against blindness and the prejudice that surrounds it. Both told Observador they were “greatly honoured” by the distinction, which also carries a prize of one million euros. And they explained to us how they have been working in some of the poorest and most conflict-ridden countries in the world.
For 11 years the Champalimaud Foundation has brought together international scientists and public figures involved in humanitarian projects to honour those who fight hardest against the vision problems afflicting developing countries. This year the prize went to two institutions that have spent decades combating the most common causes of blindness in countries such as Nepal, Mozambique, Uganda, Ethiopia and Bangladesh. Both Sightsavers and CBM fight the stigmas associated with blindness in developing countries, and they have already enabled millions of people to become better integrated into the societies in which they live.
The prize jury is composed of Alfred Sommer (ophthalmologist and epidemiologist), Paul Sieving (director of the National Eye Institute, in the United States), Jacques Delors (former president of the European Commission, one of the architects of the European Union and author of the UNESCO Report of the International Commission on Education for the Twenty-first Century), Amartya Sen (writer and economist, author of many works on poverty), Carla Shatz (neuroscientist), Joshua Sanes (researcher in cell and molecular biology), Mark Bear (neuroscientist), Gullapalli Rao (ophthalmologist), José Cunha-Vaz (full professor, ophthalmologist and president of the Association for Innovation and Biomedical Research on Light and Image), António Guterres (Secretary-General of the United Nations) and Susumu Tonegawa (Nobel laureate in Medicine, 1987).
Combining clinical treatment and social inclusion
Six decades old, Sightsavers is a UK charity that works to prevent blindness, restore sight and champion the social inclusion and equal rights of people with visual impairments in more than 30 developing countries. In an interview with Observador, Izidine Hassane, the organisation’s director in Mozambique, explained that this mission is fulfilled through partnerships with health ministries and with medical brands present in the countries where they operate: “We do everything within the model that governments have laid out for health care in each country and adapt our plans to local political specificities. That takes us towards our goal of strengthening the health system permanently,” he tells Observador. Despite its 67 years, Sightsavers still faces problems, Izidine Hassane admits: “Our biggest problem is the scarcity of resources, such as money, infrastructure, medical staff and medicines,” he lists. To overcome these obstacles, Sightsavers works to publicise its services and so win donations from other institutions.
Beyond the medical care it has provided to the world’s poorest populations, Sightsavers also works for the social inclusion of people with disabilities. “In some countries, families are ashamed of having a disabled relative and keep them hidden. In others, they are considered invalids and are marginalised,” says the organisation’s head in Mozambique. That is why, in addition to combating and eliminating diseases or supporting surgery in the most urgent cases, Sightsavers organises training, uses the media, prepares awareness campaigns and even stages plays to “change behaviour” in the countries where disabled people are most stigmatised.
It is thanks to these practices that Sightsavers takes pride in having carried out more than 500 million surgical procedures – 6.6 million of them for cataracts – since 2016 alone. It also completed the largest known mapping of trachoma, a tropical disease, covering 29 countries over three years. More: through its “Connecting the Dots” campaign, it ensured that 75% of the 400,000 disabled people trained by Sightsavers professionals were brought into the labour market.
CBM, a Christian international development organisation, is likewise committed to improving the quality of life of people with disabilities in the world’s poorest communities. In 100 years of experience, CBM has helped more than 28 million people with disabilities take a more active, included part in the social life of the countries where they live. It has done so by working with local partners who help it detect vision problems among the populations of poorer countries, treating the milder cases and referring the more serious ones to hospitals.
Reaching the world’s 39 million blind people
Babar Qureshi, the organisation’s international director for inclusive eye health, told Observador that poverty is both “cause and consequence” of disability in the 59 developing countries where CBM works, always in partnership with local and national civil-society organisations. The poorest people have the greatest difficulty accessing health care, which is too expensive for them. They also know the least about how to prevent, detect or treat vision problems. “What we do is make the health system more accessible to the poorest and most marginalised people. Then, for cases that can no longer be cured, we try to rehabilitate them in society by making it more inclusive,” Babar Qureshi explains. It is a long road, “especially in countries in conflict: from time to time we have to withdraw or be more careful,” he tells Observador. Even so, in a single year, CBM helped more than eight million people.
With a million euros in hand, the largest sum of money in the world awarded as a prize to institutions of this kind, plans for the money are already drawn up. Sightsavers intends to invest it in the activities it already has under way, namely the health care it provides and the scientific studies in which it takes part. CBM wants to use the prize to “reach more people and make the health system stronger,” it tells Observador. It is a step forward in a world with 39 million blind people, 80% of whom could be cured or whose problems could have been prevented.