Category Archives: Science

(BBG) The Key to Saving the Planet May Be Under the Sea

(BBG) Making carbon storage work is critical to fighting climate change. The question is where to put it all.

A Cold War-era joke has an American economist asking a Soviet peer how the communist economy is progressing. “In a word: good,” the Russian responds. “In two words: not good.”

So it goes this century with the rapidly changing energy industry. Advances are taking place in clean energy, transport and efficiency that would rightly have been considered miraculous a decade ago.

But here’s the catch: As fast as everything is proceeding, it’s still not fast enough. The International Energy Agency (IEA) reported last year that a critical technology, capturing carbon dioxide emissions from power generators and either burying them or otherwise disposing of them, isn’t expanding quickly enough. Existing “carbon capture and storage” (CCS) facilities can handle just 7.5 percent of the emissions that will need to be eliminated every year by 2025 if nations are to meet the goal of keeping any increase in global warming below 2 degrees Celsius (3.6 degrees Fahrenheit).

In China, researchers have been looking for ways to accelerate CCS. They decided to look out to sea.

On land, CCS isn’t just promising in principle; it has been shown to work. More than 20 large-scale capture facilities will be in operation by the end of the year, according to the Global CCS Institute. But there’s still concern about making sure the CO2, once buried, stays buried. The same can be said for China’s idea of burying CO2 at sea: for companies and countries to exploit the vastness of the ocean floor, they also need some confidence that the gas will stay there.

By studying the long-term interactions of major physical forces in “unconsolidated marine sediment,” the loose silt, clay and other permeable material below the sea floor, researchers Yihua Teng and Dongxiao Zhang report that extreme conditions at the bottom of the ocean essentially hold CO2 in place, “which makes this option a safe storage.”

Under great pressure and low temperature, CO2 and water trapped in the sediment below the sea floor crystallize into a stable ice called hydrate. (Through a similar process, energy-rich methane freezes with water beneath the ocean and terrestrial permafrost, a potential source of energy being scrutinized by China, Japan, the U.S. and others.) The new paper on CCS demonstrates through simulation that the hydrates become an impermeable “cap” that keeps the CO2 below it from migrating back up to the sea floor.
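
To see why depth matters, note that hydrostatic pressure rises by roughly 10 MPa per kilometre of water while deep-sea temperatures hover a few degrees above freezing. The sketch below is a back-of-the-envelope illustration with an assumed stability rule, not the CO2-hydrate phase boundary from the paper:

```python
# Rough check of seabed pressure versus depth. The hydrate_plausible rule
# is an illustrative placeholder, not real CO2-hydrate phase data.
RHO_SEAWATER = 1025.0   # seawater density, kg/m^3
G = 9.81                # gravitational acceleration, m/s^2
ATM = 101_325.0         # atmospheric pressure, Pa

def seabed_pressure_mpa(depth_m: float) -> float:
    """Hydrostatic pressure at the given water depth, in MPa."""
    return (ATM + RHO_SEAWATER * G * depth_m) / 1e6

def hydrate_plausible(depth_m: float, temp_c: float) -> bool:
    # Assumed rule of thumb: cold water plus more than a few MPa of pressure.
    return temp_c < 10.0 and seabed_pressure_mpa(depth_m) > 5.0

for depth in (300, 1000, 3000):
    print(f"{depth:>5} m: {seabed_pressure_mpa(depth):5.1f} MPa, "
          f"plausible at 3 °C: {hydrate_plausible(depth, 3.0)}")
```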

Peking University’s carbon capture and storage research receives support from the multinational metals, mining, and petroleum company BHP Billiton Ltd., according to the paper.

The research appears this week in the journal Science Advances. The study should provide some confidence, the researchers write, that ocean CO2 storage remains a viable tool in the push to reduce emissions of the most dangerous heat-trapping gas, even as commercialization of the process remains a long way off. In the meantime, there are other questions to answer, including how CO2 may behave under different kinds of geological conditions.

The big caveat, as with most underground CO2 storage scenarios, is that there’s no telling what the Earth’s living geology will do over the centuries and millennia. Fractures in the subsea sediment, whether preexisting or created by tectonics or by the CO2 injection itself, could open a pathway for the gas to escape, though significant uncertainty remains.

“In our assumption,” they write, “the unconsolidated marine sediment is intact.”

(BBG) Trump Wants ‘Space Force’ Added to Military as New U.S. Service

(BBG) President Donald Trump called for a new “Space Force” to be added to the U.S. military as an armed service separate from the Pentagon’s five traditional uniformed branches.

“When it comes to defending America, it is not enough to merely have an American presence in space,” Trump said Monday at a White House event on space policy. “We must have American dominance in space.”

Trump has been considering the creation of a Space Force for months, over resistance from the Air Force, which currently oversees military space programs. He announced his support for the idea at a White House meeting of the National Space Council, as the administration presented a directive setting a goal of a new moon landing within 10 years.

Congress would have to approve a new military service, and lawmakers have been divided on the idea.

Russia, China

Much of the push to formalize an off-planet branch of the U.S. armed forces is motivated by space investment by Russia and China, the latter of which is eager to establish itself as a superpower with plans for an orbiting space station and a permanent outpost on the moon.

Hold Your Rockets: Why Trump’s Space Force Could Take Years to Launch

Russia under President Vladimir Putin has become increasingly aggressive, annexing Crimea, deploying more sophisticated nuclear weapons and waging conventional warfare in eastern Ukraine and Syria. He, too, has aspirations for a military role in space.

Robotic Explorers

On peaceful space exploration, the administration announced a goal of sending robotic explorers to the moon as early as next year and carrying out another human lunar landing within 10 years.

The push could result in Americans setting foot on the moon’s surface more than 55 years after they first did so.

The directive also calls for better tracking and monitoring of space debris as commercial and civil space traffic increases.

The 1960s-era Apollo program to land U.S. astronauts on the moon was driven by President John F. Kennedy’s famous challenge and zealously funded by a Congress motivated by the Soviet Union’s perceived existential threat. That goal was achieved by the crew of Apollo 11 in 1969.

NASA’s current planning for Mars isn’t driven by any such urgency. The agency’s priorities tend to change depending on the administration: Under President George W. Bush, NASA was directed to return to the moon, while President Barack Obama set Mars as the longer-term priority. The Trump administration aims to do both, planning a lunar “gateway” orbiter and landings on the moon’s surface — with heavy assistance from commercial firms — and then using those outposts as a jumping-off point for Mars.

Bush proposed in 2004 sending robotic probes to the lunar surface by 2008, with a human mission as early as 2015, “with the goal of living and working there for increasingly extended periods of time.”

NASA estimated in 2005 that the Bush program to return to the moon, canceled by Obama, would cost $104 billion. The Trump administration didn’t immediately provide a cost estimate.

The Trump administration’s first crewed lunar gateway mission is planned for 2023 under NASA’s current plans, with humans heading to Mars in the 2030s.

(EP) The country with the most women scientists is… Portugal

(EP) The percentage of Portuguese women in science degrees is double that of Japanese women.

A scientist in a laboratory. SERGIO PÉREZ / REUTERS

Yes, it was already known that the Portuguese have more cork than anyone else in the world, drink the most wine and eat the most cod; but what nobody in the land of fado had reckoned on is that Portugal would be the country of women scientists or, at least, the country with the most women studying science degrees.

According to the OECD study The Pursuit of Gender Equality, 57% of Portuguese women study science, technology, engineering and/or mathematics; it is the highest percentage in the world, breaking all kinds of stereotypes. That is 17 points more than in the United States of Silicon Valley, 22 points more than in Spain or Denmark, and more than double the figure for Japan.

The presence of women in science degrees is not because the state budget lavishes money on this part of education, still less on research. Although the Government’s goal is to reach 1.5% of the national budget, spending barely exceeds 0.8%.

The lack of money or of prospects does not seem to dampen Portuguese women’s enthusiasm for science, although they lean more toward engineering than toward information technology, where the percentage of women is still tiny. Even so, one standout in that field is Elvira Fortunato of the Universidade Nova de Lisboa. Her research on silicon-free integrated circuits, that is, paper chips, earned her a few weeks ago a grant of 3.5 million euros from the European Research Council for projects on environmentally friendly technologies.

The Global Portuguese Scientist survey by the Santos Foundation, which tracks Portuguese scientists around the world, also confirms this female predominance. For years the Portuguese scientific diaspora, present in 50 countries, has had a female majority (50.3%).

Although graduates are one thing, workers another, and posts in company management quite another (in the United States, women are 57% of graduates but only 6% of the executives of S&P companies), in Portugal two women head the two largest scientific foundations: the Champalimaud (Leonor Beleza) and the Instituto Gulbenkian de Ciência (Mónica Bettencourt).

(BBG) Stephen Hawking, Physicist Who Reshaped Cosmology, Dies at 76

(Bloomberg) — Stephen Hawking, the British physicist and black-hole theorist who brought science to a mass audience with the best-selling book “A Brief History of Time,” has died. He was 76.

Hawking died peacefully at his home in Cambridge, England, in the early hours of Wednesday morning, a spokesman for his family said in an emailed statement.

“We are deeply saddened that our beloved father passed away today,” his children Lucy, Robert and Tim said in the statement. “He was a great scientist and an extraordinary man whose work and legacy will live on for many years. His courage and persistence with his brilliance and humor inspired people across the world. He once said, ‘It would not be much of a universe if it wasn’t home to the people you love.’ We will miss him forever.”

Hawking suffered from amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig’s disease, and was confined to an electric wheelchair for much of his adult life. Diagnosed at age 21, he was one of the world’s longest-lived survivors of ALS.

A Cambridge University professor, Hawking redefined cosmology by proposing that black holes emit radiation and later evaporate. He also showed that the universe had a beginning, by describing how Albert Einstein’s theory of general relativity eventually breaks down when time and space are traced back to the Big Bang, about 13.7 billion years ago.

“Stephen’s remarkable combination of boldness, vision, insight and courage have enabled him to produce ideas that have transformed our understanding of space and time, black holes and the origin of the universe,” James Hartle, professor of physics at the University of California, Santa Barbara, said in 2002.

Best-Seller

“A Brief History of Time,” first published in 1988, earned its author worldwide acclaim, selling at least 10 million copies in 40 languages and staying on the best-seller list of the U.K.’s Sunday Times newspaper for a record 237 weeks.

Often referred to as “one of the most unread books of all time” for its hard-to-grasp concepts, it included only one equation: E = mc², the equivalence of mass and energy that Einstein deduced from his theory of special relativity. The book outlined the basics of cosmology for the general reader.

Hawking’s fame increased as his health worsened. After his degenerative muscle disorder was diagnosed, he defied medical opinion by living five decades longer than expected. He communicated his ideas through an American-accented speech synthesizer after a life-saving tracheotomy in 1985 took away his ability to speak. To the layman, the robot-like voice only seemed to give his words added authority.

“To my colleagues, I’m just another physicist, but to the wider public, I became possibly the best-known scientist in the world,” Hawking wrote in his 2013 memoir “My Brief History.” “This is partly because scientists, apart from Einstein, are not widely known rock stars, and partly because I fit the stereotype of a disabled genius.”

Black Holes

Hawking applied quantum theory, which governs the subatomic world, to black holes, which he claimed discharge radiation that causes them to disappear. This process helps explain the notion that black holes have existed at a micro level since the Big Bang, and that the smaller they are, the faster they evaporate.
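
That last point can be made concrete with the standard textbook formula for a black hole’s evaporation time, which grows with the cube of its mass. The sketch below is our illustration, not a calculation from the article:

```python
# Hawking evaporation time: t = 5120 * pi * G^2 * M^3 / (hbar * c^4).
# Standard textbook result, shown here to illustrate why small black holes
# evaporate fast while stellar-mass ones effectively never do.
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s
YEAR = 3.156e7     # seconds per year

def evaporation_time_years(mass_kg: float) -> float:
    """Time for a black hole of the given mass to evaporate completely."""
    seconds = 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
    return seconds / YEAR

print(f"{evaporation_time_years(1.989e30):.1e} years")  # 1 solar mass: ~1e67 years
print(f"{evaporation_time_years(2e11):.1e} years")      # ~2e11 kg: of order the universe's age
```
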
Black holes are formed when a massive star collapses under the weight of its own gravity. Detected by the movement of surrounding matter, they devour everything in their path and may play a role in the birth of galaxies. Physicists say these invisible cosmic vacuums might allow travel through time and space via “wormholes,” a favorite of science-fiction writers.

With mathematician Roger Penrose, Hawking used Einstein’s theory of relativity to trace the origins of time and space to a single point of zero size and infinite density. Their work gave mathematical expression to the Big Bang theory, proposed by Belgian priest Georges Lemaitre in 1927 and supported two years later by Edwin Hubble’s discovery that the universe is expanding.

With Hartle, Hawking later tried to marry relativity with quantum theory by proposing the no-boundary principle, which held that space-time is finite and the laws of physics determined how the universe began in a self-contained system, without the need for a creator or prior cause.

‘Profound Impact’

The Nobel Prize in Physics proved elusive for Hawking, whose theories required observational data to win the praise of the awarding committee in Stockholm. The Nobel Foundation excludes posthumous nominees.

“By any reasonable standard, Stephen Hawking is a great scientist. Even if time shows some of his more radical proposals to be incorrect, Hawking will have had a profound impact on the history of science,” Henry F. Schaefer III, a chemistry professor at the University of Georgia, said in a 2001 lecture.

Stephen William Hawking was born in Oxford, England, on Jan. 8, 1942, exactly 300 years after the death of Italian physicist Galileo Galilei. Hawking’s father, Frank, was a doctor of tropical medicine. His mother, Isobel, was a tax inspector and a secretary. He had two younger sisters and a brother.

At age 8, Hawking moved with his family to St. Albans, where he went to school. He then graduated with first-class honors in natural science from Oxford’s University College. While he was a doctoral candidate at Cambridge, Hawking was diagnosed with ALS, also known as motor neuron disease. He was told he had only a few years to live.

Universe’s Origin

As the illness progressed more slowly than expected, and as he found inspiration in his girlfriend, Jane Wilde, Hawking began to work at his studies in earnest for the first time. He completed his doctorate on the origins of the universe, became a research fellow at Caius College and married Wilde in 1965.

In 1970, Hawking realized the mathematical approaches he had developed with Penrose could be applied to black holes, a term coined by physicist John Wheeler. Hawking worked for the next four years on black holes, discovering they weren’t totally black, but leaked radiation, now known as “Hawking radiation.”

For 30 years, Hawking was Cambridge’s Lucasian professor of mathematics, a chair once held by Isaac Newton. U.S. President Barack Obama awarded the Presidential Medal of Freedom to Hawking in 2009, the year of his retirement.

His other popular books included “The Universe in a Nutshell” (2001), “On the Shoulders of Giants” (2002), “A Briefer History of Time” (2005) and “The Grand Design” (2010). In 2015, Eddie Redmayne won an Oscar for his portrayal of Hawking in “The Theory of Everything,” a film about the scientist’s life.

Hawking separated from his wife in 1991 and married his nurse, Elaine Mason, four years later. They divorced in 2007.

By 2017, Hawking was spending more time pondering humanity’s future and concluding that we should plan to colonize other planets. “We are running out of space, and the only place we can go to are other worlds,” he told a gathering of scientists. “It is time to explore other solar systems. Spreading out may be the only thing that saves us from ourselves. I am convinced that humans need to leave Earth.”

(SkyNews) Scientists predict ‘mini ice age’ could hit UK by 2030

(SkyNews) A model of the Sun’s magnetic activity suggests the River Thames may freeze over within two decades, experts say.

Image: Frost fairs were once held on the River Thames. Pic: Museum of London

A mini ice age that would freeze major rivers could hit Britain in less than two decades, according to research from universities in the UK and Russia.

A mathematical model of the Sun’s magnetic activity suggests temperatures could start dropping here from 2021, with the potential for winter skating on the River Thames by 2030.

A team led by maths professor Valentina Zharkova at Northumbria University built on work from Moscow to predict the movements of two magnetic waves produced by the Sun.

The model predicts rapidly decreasing magnetic waves over three solar cycles, beginning in 2021 and lasting 33 years.
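
The underlying idea is easiest to picture as two waves with slightly different periods drifting in and out of phase: when they cancel, overall solar activity collapses for several cycles. The sketch below illustrates that beat effect with made-up periods and thresholds; it is not Zharkova’s fitted model:

```python
# Two dynamo waves with slightly different periods interfere; where they
# cancel, combined activity stays low for decades (a "grand minimum").
# Periods, amplitudes and the threshold are illustrative assumptions.
import numpy as np

years = np.arange(1600, 2100)
wave1 = np.sin(2 * np.pi * years / 22.0)   # ~22-year magnetic (Hale) cycle
wave2 = np.sin(2 * np.pi * years / 23.5)   # second wave, slightly longer period
activity = np.abs(wave1 + wave2)

# Smooth over roughly one cycle, then flag stretches of very low activity
smoothed = np.convolve(activity, np.ones(22) / 22, mode="same")
minima = years[smoothed < 0.35]
print(minima)   # clusters of years that would count as candidate grand minima
```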

Image: Experts measured the Sun’s magnetic waves

Very low magnetic activity on the Sun corresponds with historically documented cold periods on Earth.

Professor Zharkova claims 97% accuracy for the model, which dovetails with previous mini ice ages, including the Maunder Minimum period from 1645 to 1715, when frost fairs were held on the frozen Thames.

But she cautions that her mathematical research cannot be used as proof that there will be a mini ice age this time around, not least because of global warming.

Image: The River Thames froze over in the 17th century. Pic: Museum of London

“I hope global warming will be overridden by this effect, giving humankind and the Earth 30 years to sort out our pollution,” she said.

But Professor Zharkova warned that any downward impact on global warming will last only until the Sun’s two magnetic waves become active again in the 2050s.

“We have to be sorted by that time and prepare everything on Earth for the next big solar activity,” she said.

(MIT) Tiny Human Brains Inside Rats Are Sparking Ethical Concern

(MIT) Clusters of human brain cells can integrate into rat brains, and that’s raising concerns about giving animals some form of human consciousness.

(PUB) António Damásio’s warning about the spiritual and moral bankruptcy of our societies

(PUB) The neuroscientist has a new book, A Estranha Ordem das Coisas – A Vida, os Sentimentos e as Culturas Humanas (The Strange Order of Things), which arrives in Portuguese bookshops on Friday, 3 November. This is an excerpt from a chapter entitled “The Crisis”. On Sunday we will publish an interview with the Portuguese scientist.

On the shore of the Sea of Galilee, on a sun-filled winter morning, a few steps from the synagogue of Capernaum where Jesus of Nazareth spoke to his followers, I think about the distant problems of the Roman Empire but above all about the current crisis of the human condition. It is a curious crisis, because although the local conditions differ at each point of the world where it occurs, the responses that define it are similar, marked by anger, fury and violent confrontation, alongside calls for countries to isolate themselves and a preference for autocratic government.

But the crisis is above all disappointing, because it should not be happening at all. One might have expected that at least the most advanced societies would have been immunized by the horrors of the Second World War and the threats of the Cold War, and would have found ways of gradually and peacefully overcoming whatever problems complex cultures necessarily face. On reflection, we should have been less complacent.

These times could be the best of times to be alive, because we are surrounded by spectacular scientific discoveries and by a technical brilliance that make life ever more comfortable and convenient; because the amount of knowledge available, and the ease of access to that knowledge, have never been greater, and the same is true of human interconnection on a planetary scale, as shown by travel, by electronic communication and by international agreements on every kind of scientific, artistic and commercial cooperation; because the ability to diagnose, manage and even cure diseases keeps expanding, and longevity keeps lengthening, to the point where human beings born after the year 2000 are expected to live, and to live well, to an average age of 100. Soon we will be driven around by robotic vehicles that spare us effort and save lives, since at some point there should be fewer fatal accidents.

Yet to consider our days the best ever, we would have to be very distracted, not to say indifferent to the plight of the other human beings who live in misery. Although scientific and technical literacy has never been more developed, the public devotes very little time to reading novels or poetry, which remain the surest and most rewarding way of entering into the comedy and the drama of existence, and of having the chance to reflect on who we are and who we may become. Apparently there is no time to waste on the unprofitable business of simply “being”. Part of the societies that celebrate modern science and technology, and that profit most from them, appears to be in a state of “spiritual” bankruptcy, in both the secular and the religious sense of the term. Judging by the untroubled acceptance of the problematic financial crises – the Internet bubble of 2000, the mortgage abuses of 2007 and the banking collapse of 2008 – they appear to be in a state of moral bankruptcy as well. Curiously, or perhaps not so curiously, the level of happiness in the societies that have benefited most from the astonishing progress of our time is flat or declining, if the relevant surveys can be trusted.

Over the past four or five decades, the general public in the most advanced societies has accepted, with little or no resistance, the increasingly distorted treatment of news and public affairs designed to fit the entertainment model of commercial television and radio. Less advanced societies have been quick to imitate that attitude. The conversion of almost all public-interest media to the for-profit business model has further reduced the quality of information. Although a viable society must care about how its government promotes the well-being of its citizens, the notion that one should pause for a few minutes a day and make an effort to keep up with the difficulties and successes of governments and citizens has not merely become old-fashioned; it has almost disappeared. As for the notion that we should learn about these matters with seriousness and respect, it is by now a strange concept. Radio and television turn every question of government into a “story”, with the “shape” and entertainment value of the story counting for more than its factual content.

When Neil Postman wrote his book Amusing Ourselves to Death: Public Discourse in the Age of Show Business in 1985, he made a correct diagnosis, but he never dreamed we would suffer so much before dying. The problem has been aggravated by the reduction of funding for public education and by the predictable decline in the preparation of citizens, and, in the case of the United States, it was made worse by the repeal, in 1987, of the Fairness Doctrine, which since 1949 had required balanced treatment of political commentary. The result, intensified by the decline of printed newspapers and by the rise and near-absolute dominance of digital communication and television, is a profound lack of detailed, non-partisan knowledge of public affairs, together with the gradual abandonment of the practice of thoughtful reflection and discernment about facts.

One must be careful not to overdo the nostalgia for a time that never fully existed. Not all of the public was ever seriously informed, reflective and demanding. Not all citizens had reverence for truth and nobility of spirit, let alone reverence for life. Even so, the present collapse of serious public awareness is troubling. Human societies are predictably fragmented along a variety of measures, such as literacy, educational attainment, civic behavior, spiritual aspirations, freedom of expression, access to justice, economic status, health and environmental safety. Under the circumstances, it becomes harder than it has ever been to encourage the public to promote and defend a list of values, rights and obligations that are not negotiable.

Given the astonishing progress of the new media, the public has the opportunity to learn in greater detail than ever the facts behind economies, the state of local and global government, and the state of the societies in which it lives, which is without doubt an advantage that confers real power; moreover, the Internet provides means of deliberation outside traditional commercial or governmental institutions, another potential advantage. On the other hand, the public generally has neither the time nor the method to turn immense quantities of information into reasonable, practically useful conclusions. Furthermore, the companies that manage the distribution and aggregation of information help the public in a dubious way: the flow of information is steered by company algorithms which in turn shape its presentation to suit a variety of financial, political and social interests, along with users’ tastes, so that users can remain locked inside the silo of opinions that entertains them.

It must be acknowledged, in fairness, that the wise voices of the past – the voices of the experienced and judicious editors of newspapers, television and radio programs – were not entirely impartial, favoring particular views of how societies should work. In most cases, however, those particular views were identified with specific philosophical or sociopolitical perspectives, which one could resist or support. Today the general public does not have that opportunity. Each of us has direct access to the world through a handheld device, and is encouraged to maximize his or her autonomy. There is little incentive to debate, much less to accept, divergent opinions.

The new world of communication is a blessing for citizens trained to think critically and knowledgeably about history. But what is the fate of citizens who have been seduced by a model of life as entertainment and commerce? To a large extent, they were formed by a world in which negative emotional provocation is the rule rather than the exception, and in which the best solutions to a problem are governed, first and foremost, by short-term self-interest. Can we blame them?

The widespread availability of abundant, near-instantaneous communication of public and personal information, an obvious benefit, paradoxically reduces the time needed to reflect on that very information. Managing the flow of available knowledge often forces a rapid classification of facts as good or bad, agreeable or not. This arguably contributes to a rise in polarized opinions about social and political events. The exhaustion produced by the excess of facts invites an escape into pre-set beliefs and opinions, usually those of the group to which the individual belongs. This is made worse by the fact that we naturally tend to resist changing our minds, whatever the evidence to the contrary, and however intelligent and well informed we may be.

Work carried out at our institute [the Brain and Creativity Institute at the University of Southern California] shows that this is true of political beliefs, but I imagine it also applies to a wide variety of beliefs, from religion and justice to aesthetics. Our work shows that resistance to change is associated with the conflicted relationship between brain systems tied to emotion and to reason. Resistance to change is linked, for example, to the activation of the systems responsible for producing anger and rage. We create a kind of natural refuge to defend ourselves against contradictory information. All over the world, discontented voters refuse to show up at the polls. In such a climate, the spread of fake news and post-truths is made easier. The dystopian world that George Orwell once described, with the Soviet Union as its model, now corresponds to a different sociopolitical situation. The speed of communications and the resulting acceleration of the pace of life are also likely contributors to the decline of civility, visible in the impatience of public discourse and in the growing rudeness of urban life.

A separate but important issue that continues to be underestimated is the addictive nature of electronic media, from simple email communication to social networks. The addiction diverts time and attention from the immediate experience of our surroundings to an experience mediated by a wide variety of electronic devices. The addiction widens the mismatch between the volume of information and the time needed to process it.

The breach of privacy that accompanies the universal use of the web and social networks guarantees the monitoring of every human gesture and idea. Every kind of surveillance, from that required for public safety to that which is intrusive and even abusive, is now a reality, practiced by both government and the private sector with total impunity. Surveillance makes espionage, even superpower espionage, an established activity that has been with us for millennia, look honorable and childish. We even find surveillance for sale, at high profit, at the hands of a series of technology companies. Unlimited access to private information is being used to create embarrassing scandals, even when the subject of the surveillance is not criminal in nature. The result is the silence of political candidates, lest they and their campaigns be destroyed by personal revelations. This has become an important factor in public governance.

In vast sectors of the world’s most technologically advanced regions, scandals of every size influence electoral outcomes and strengthen public distrust of political institutions and professional elites. Societies that were already struggling with great problems of wealth inequality and of human displacement caused by unemployment and war have become almost ungovernable. Disoriented electorates recall with nostalgia long-vanished and mythically better pasts, or, alternatively, display a deep-seated revolt. The nostalgia, however, is misplaced, and the fury is generally misdirected. Such reactions reflect a limited understanding of the myriad facts presented by the various media, facts designed above all to entertain, to promote particular social, political and commercial interests, and to reap large financial rewards from doing so.

There is a growing tension between the power of a vast public that seems better informed than ever but lacks the time or the instruments to judge and interpret information, and the power of the companies and governments that control information and know everything there is to know about that same public. How is the resulting conflict to be healed? There are also notable risks to consider. The possibility of catastrophic conflicts involving nuclear and biological weapons represents risks that are real and possibly higher now than when those weapons were controlled by the Cold War powers; the risks of terrorism and the new risk of cyberwarfare are also real, as is the risk of antibiotic-resistant infections. We can blame modernity, globalization, wealth inequality, unemployment, too little education, too much entertainment, diversity, and the radically paralyzing speed and ubiquity of digital communication, but assigning blame neither reduces the risks in the short term nor solves the problem of ungovernable societies, whatever their causes.

(GUA) Why can’t we cure the common cold?

(GUA) After thousands of years of failure, some scientists believe a breakthrough might finally be in sight.

The common cold has the twin distinction of being both the world’s most widespread infectious disease and one of the most elusive. The name is a problem, for starters. In almost every Indo-European language, one of the words for the disease relates to low temperature, yet experiments have shown that low temperature neither increases the likelihood of catching a cold nor worsens the severity of symptoms. Then there is the “common” part, which seems to imply that there is a single, indiscriminate pathogen at large. In reality, more than 200 viruses provoke cold-like illness, each one deploying its own peculiar chemical and genetic strategy to evade the body’s defences.

It is hard to think of another disease that inspires the same level of collective resignation. The common cold slinks through homes and schools, towns and cities, making people miserable for a few days without warranting much afterthought. Adults suffer an average of between two and four colds each year, and children up to 10, and we have come to accept this as an inevitable part of life.

Public understanding remains a jumble of folklore and false assumption. In 1984, researchers at the University of Wisconsin-Madison decided to investigate one of the best-known ways of catching a cold. They infected volunteers with a cold virus and instructed them to kiss healthy test subjects on the mouth for at least one minute. (The instruction for participants was to use whichever technique was “most natural”.) Sixteen healthy volunteers were kissed by people with colds. The result: just one confirmed infection.

The most common beliefs about how to treat the disease have turned out to be false. Dubious efficacy has done little to deter humankind from formulating remedies. The Ebers Papyrus, a medical document from ancient Egypt dated to 1550BC, advises a cold sufferer to recite an incantation, “in association with the administration of milk of one who has borne a male child, and fragrant gum”. In 1924, US President Calvin Coolidge sat down in an airtight chlorine chamber and inhaled the pungent, noxious gas for almost an hour on the advice of his physicians, who were certain that his cold would be cured quickly. (It wasn’t.)

Today, “winter remedy” sales in the UK reach £300m each year, though most over-the-counter products have not actually been proven to work. Some contain paracetamol, an effective analgesic, but the dosage is often sub-optimal. Taking vitamin C in regular doses does little to ward off disease. Hot toddies, medicated tissues and immune system “boosts” of echinacea or ginger are ineffective. Antibiotics do nothing for colds. The only failsafe means of avoiding a cold is to live in complete isolation from the rest of humanity.

Although modern science has changed the way medicine is practised in almost every field, it has so far failed to produce any radically new treatments for colds. The difficulty is that while all colds feel much the same, from a biological perspective the only common feature of the various viruses that cause colds is that they have adapted to enter and damage the cells that line the respiratory tract. Otherwise, they belong to quite different categories of organisms, each with a distinct way of infecting our cells. This makes a catch-all treatment extremely tricky to formulate.

Scientists today identify seven virus families that cause the majority of colds: rhinovirus, coronavirus, influenza and parainfluenza virus, adenovirus, respiratory syncytial virus (RSV) and, finally, metapneumovirus, which was first isolated in 2001. Each has a branch of sub-viruses, known as serotypes, of which there are about 200. Rhinovirus, the smallest cold pathogen by size, is by far the most prevalent, causing up to three-quarters of colds in adults. To vanquish the cold we will need to tackle all of these different families of virus at some stage. But, for now, rhinovirus is the biggest player.

Scientists first attempted to make a rhinovirus vaccine in the 1950s. They used a reliable method, pioneered by French biologist Louis Pasteur in the 1880s, in which a small amount of virus is introduced to a host in order to provoke a defensive immunological reaction that then protects the body from subsequent infection. Even so, those who had been vaccinated caught colds just as easily as those who had not.

Over the next decade, as the techniques for isolating cold viruses were refined, it became clear that there were many more rhinoviruses than first predicted. Researchers realised it would not be possible to make a vaccine in the traditional way. Producing dozens of single-serotype vaccines, each one targeting a different strain, would be impractical. The consensus that a rhinovirus vaccine was not possible deepened. The last human clinical trial took place in 1975.

Then, in January last year, an editorial appeared in the Expert Review of Vaccines that once again raised the prospect of a vaccine. The article was co-authored by a group of the world’s leading respiratory disease specialists based at Imperial College London. It was worded cautiously, yet the claim it made was striking. “Perhaps the quest for an RV [rhinovirus] vaccine has been dismissed as too difficult or even impossible,” it said, “but new developments suggest that it may be feasible to generate a significant breadth of immune protection.” The scientists were claiming to be on the way to solving a riddle that has stumped virologists for decades. One virologist told me it was as if a door that had been closed for many, many years had been re-opened.

Part of the Imperial scientists’ motivation was the notion that since we now have vaccines for many of the most dangerous viruses (measles, polio, yellow fever, cholera, influenza, and so on), it is time to tackle the disease that afflicts us most often. “Rhinovirus is by far the most common cause of illness,” says Sebastian Johnston, a professor at Imperial and one of the authors of the editorial. “Look at what people spend on ineffective over-the-counter medications. If you had a safe and effective treatment, you’d take it.”

I asked Johnston if he was optimistic. He pointed out that because their studies so far have only been in mice, they are not sure that the vaccine will work in humans. “The data is limited,” he says. “But it’s encouraging.” It was not the resounding triumphalism that I was expecting, but then cold scientists learned long ago to be careful about making grand proclamations. Theirs is an undertaking that, more than anything, has been defined by consistent disappointment.


The first scientist to try and fail to make a rhinovirus vaccine was also the first scientist to distinguish it from the jumble of other cold viruses. In 1953, an epidemiologist called Winston Price was working at Johns Hopkins University in Baltimore when a group of nurses in his department came down with a mild fever, a cough, sore throat and runny nose – symptoms that suggested the flu. Price took nasal washings from the nurses and grew their virus in a cell culture. What he found was too small to be influenza virus. In a 1957 paper, “The isolation of a new virus associated with respiratory clinical disease in humans”, Price initially named his discovery “JH virus”, after his employer.

Price decided to try to develop a vaccine using a bit of dead rhinovirus. When the immune system encounters an invading virus – even a dead or weakened virus – it sets out to expel it. One defence is the production of antibodies, small proteins that hang around in the blood system long after the virus is gone. If the virus is encountered a second time, the antibodies will swiftly recognise it and raise the alarm, giving the immune system the upper hand.

At first, Price was encouraged. In a trial that involved several hundred people, those vaccinated with JH virus had eight times fewer colds than the unvaccinated. Newspapers across the US wanted to know: had the common cold been cured? “The telephone by my bed kept ringing until 3 o’clock in the morning,” Price told the New York Times in November 1957. The celebration would be short-lived. Though Price’s vaccine was effective against his particular “JH” rhinovirus strain, in subsequent experiments it did nothing. This indicated that more than one rhinovirus was out there.

By the late 1960s, dozens of rhinoviruses had been discovered. Even in the alien menagerie of respiratory disease, this level of variation in one species was unusual; there are just three or four influenza viruses circulating at any one time. Scientists at the University of Virginia decided to try a different tactic. Instead of inoculating patients with a single strain of rhinovirus, they combined 10 different serotypes in one injection. But after this, too, failed to shield participants from infection, they were out of ideas.

As hope for a vaccine receded, scientists began investigating other ways to combat colds. From 1946 until it closed in 1990, most research into respiratory viruses in the UK was undertaken at the Common Cold Unit (CCU), a facility backed by the Medical Research Council that occupied a former wartime military hospital in the countryside near Salisbury. In its four decades of operation, some 20,000 volunteers passed through the doors of the CCU, many to be willingly infected with cold virus in the name of scientific progress.

An early experiment at the CCU involved a group of volunteers being made to take a bath and then to stand dripping wet and shivering in a corridor for 30 minutes. After they were allowed to get dressed, they had to wear wet socks for several hours. Despite a drop in body temperature, the group did not get any more colds than a control group of volunteers who had been kept cosy.

Illustration: Nathalie Lees

The CCU began focusing on cold treatments in the 1960s and 70s, when research into a substance produced by the human body called interferon was gaining momentum. Interferons are proteins that are secreted by cells when they are attacked by a virus. They act as messengers, alerting nearby cells to the invader. These cells in turn produce an antiviral protein that inhibits, or interferes with, the virus’s ability to spread, hence the name.

In 1972, researchers at the CCU decided to investigate whether interferon could be used as a treatment for colds. They infected 32 volunteers with rhinovirus and then sprayed either interferon or placebo up their noses. Of the 16 given a placebo, 13 came down with colds. But of the 16 given interferon, only three got ill. The findings, published in The Lancet, made the front page of the New York Times (below a story on Watergate). A rush of interferon research got underway. But, once again, the excitement was premature. A review by the CCU in the 1980s uncovered a fatal flaw: interferon only worked when it was given to the patient at the same time as the virus. But in real life – that is, outside the lab – a rhinovirus enters the nose between eight and 48 hours before the onset of cold symptoms. By the time you feel a cold coming on, it is already too late.
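
Small as those groups were, the split is lopsided enough to be statistically convincing. As a rough check (our addition, not part of the original report), Fisher’s exact test on the trial counts gives a p-value of about 0.001:

```python
# Significance check on the 1972 CCU interferon trial:
# placebo: 13 of 16 caught colds; interferon: 3 of 16 caught colds.
from scipy.stats import fisher_exact

table = [[13, 3],    # placebo:    colds, no colds
         [3, 13]]    # interferon: colds, no colds
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.4f}")  # ~18.8, ~0.001
```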

As the 20th century drew to a close, attempts to find a cure grew more desperate. At the CCU, molecules that were found in traditional Chinese medicine, Japanese tea and oranges were all seriously interrogated. In 1990, the CCU closed. The centre had done much to advance our understanding of the virology of the cold, yet it had also exposed the enormity of the task of defeating it.

In the 1990s, as many virologists focused on HIV and Aids, research into the cold tailed off. “Common acute respiratory infections were seen as less important compared with this threat of a worldwide, lethal plague,” writes David Tyrrell, the former director of the CCU, in his 2002 book Cold Wars. A cure seemed more remote than ever.


Sebastian Johnston’s lab is on the third floor of the School of Medicine, part of Imperial College’s St Mary’s Hospital campus in Paddington, west London. Opened in 1851, the original hospital building is red-brick, with high ceilings, arched colonnades and turrets, but numerous extensions, each progressively more box-like, now hem it in. A round blue plaque on the facade states that Sir Alexander Fleming (1881-1955) discovered penicillin in a second-storey room. Entry to a recreation of Fleming’s lab is £4.

Johnston, a professor of respiratory medicine and an asthma specialist, is 58 and bespectacled, with a mop of grey curls that form a peak on his forehead. As a PhD student in 1989, he was dispatched to the CCU, not long before it closed down, to study virus detection methods. “I spent six months there,” Johnston said. “It was a strange place, basically a bunch of nissen huts connected by wooden runways, with lots of rabbits.”

For his PhD on asthma, Johnston developed a virus-detection method based on the polymerase chain reaction (PCR), which amplifies DNA so that viruses can be identified more precisely. To his amazement, Johnston discovered that viruses were behind 85% of asthma attacks in children; about half of those were rhinoviruses. Previously, most studies had detected viruses in fewer than 20% of asthma attacks. Johnston went on to find that rhinovirus also exacerbates symptoms in 95% of cases of smoker’s cough (formally known as chronic obstructive pulmonary disease, or COPD).

It wasn’t until the 1990s that scientists fighting rhinovirus properly understood what they were up against. By that time, electron microscopy had advanced and it was possible to see the organism up close. For a pathogen so spectacularly good at infecting our nasal passages – the “rhin” of the name is from the Greek for “nose” – rhinoviruses are astonishingly simple, being little more than strands of ribonucleic acid (RNA) surrounded by a shell: “a piece of bad news wrapped in a protein coat”, as the Nobel Prize-winning biologist Peter Medawar once observed. Under an electron microscope, they are spherical with a shaggy surface like the bobble on a knitted hat.

Though all the rhinoviruses are pretty much the same internally, a subtle alteration to the pattern of proteins on their outer shell means that, to the immune system, they all look different. It’s a cloak-and-dagger strategy, and the reason why early vaccines such as Winston Price’s failed. Antibodies produced for one rhinovirus serotype do not detect the rest. Until recently, it was believed that there were around 100 different strains, and these were grouped into the “A” and “B” families. Then, in 2007, a new cache of viruses was discovered, the “C” group, making the total more like 160.

Image: A computer-generated image of the human rhinovirus. Photograph: Alamy

In 2003, Johnston, who was then working at Imperial, contacted Jeffrey Almond, a former professor of virology at Reading University who had been recently appointed as head of vaccine development at the pharmaceutical giant Sanofi. The company was already manufacturing a jab for influenza and was interested in tackling the common cold. Having bumped into Johnston at academic conferences, Almond felt that their ambitions were aligned. “I said: ‘Let’s think about whether we can do something dramatic,’” Almond told me. “Let’s think about how we can make a vaccine against rhino.”

For doctors, vaccines are preferable to drugs because they shield the host from invasive organisms before they cause any damage. For pharmaceutical companies, vaccines are significantly less attractive. Not only do they take years and hundreds of millions of dollars to develop, even if that process is successful – which it often isn’t – it can still be hard to make much money. Vaccines are usually injections administered on a single occasion, while drugs are taken for prolonged periods. And people don’t want to pay much for vaccines. “Everybody wants vaccines for pennies rather than pounds because you get them when you’re healthy,” Almond said. “Nobody wants to pay anything when they’re healthy. It’s like car insurance, right? But when you’re sick you will empty your wallet, whatever it takes.”

Still, Almond thought there might be a commercial case for a rhinovirus vaccine. Totting up the days off school and work, plus the secondary infections such as sinusitis that require supplementary treatment and even hospitalisation, rhinovirus places a huge burden on health systems. Last year, in the UK, coughs and colds accounted for almost a quarter of the total number of days lost to sickness, about 34m. In the US, a survey carried out in 2002 calculated that each cold experienced by an adult causes an average loss of 8.7 working hours, while a further 1.2 hours are lost attending to cold-ridden children, making the total cost of lost productivity almost $25bn (£19bn) each year. Almond convinced his bosses that, if it were possible to make one, a rhinovirus vaccination would be financially viable. “Our back-of-the-envelope calculations on what we could charge, and what the numbers of sales could be, mean that it’s likely to be quite profitable and quite interesting for a company to develop,” Almond says.

Reviewing the approaches taken in the 1960s and 70s, Almond and Johnston dismissed the idea of a mega-vaccine of all the 160 rhinovirus serotypes, believing it would be too heavy, too complex and too expensive to make. They wondered instead if there was a tiny part of the structure of viruses that is identical, or “conserved”, across the entire species that could form the basis of what is called a subunit vaccine, an approach that has had success with hepatitis B and the human papilloma virus, or HPV.

After comparing the genetic sequences of the different rhinovirus serotypes, the researchers homed in on a particular protein on the virus shell that seemed to recur across many of the serotypes. They took a piece of the conserved shell from a single rhinovirus, number 16, and mixed it with an adjuvant – a stimulus that mimics the danger signals that trigger an immune response – and injected it into mice as a vaccine. The hope was that the immune system would be jolted into recognising the shell protein as an invasive pathogen, conferring immunity against the entire rhinovirus family.
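
At its core, that search for conserved stretches is a sequence-comparison exercise. The toy scan below shows the idea on made-up, pre-aligned protein sequences; the real work involved full genomes, proper alignment and structural analysis:

```python
# Toy conservation scan: find positions where every serotype carries the
# same residue. The three sequences are hypothetical stand-ins, not real
# rhinovirus capsid data.
aligned = [
    "MGAQVSRQNVGTH",   # "serotype A" (hypothetical)
    "MGAQVSTQNVGSH",   # "serotype B" (hypothetical)
    "MGAQVSRQNVGAH",   # "serotype C" (hypothetical)
]

conserved = [
    (pos, residues[0])
    for pos, residues in enumerate(zip(*aligned))
    if len(set(residues)) == 1    # identical in every serotype
]
print(conserved)   # candidate positions for a cross-serotype subunit vaccine
```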

In petri dishes, the scientists mixed the immunised mouse blood with three other rhinovirus serotypes, numbers 1, 14 and 29. An immunological response to rhinovirus 1 was likely because its genetic sequence is similar to 16, but serotypes 14 and 29 are unalike. The mice’s white blood cells responded vigorously against all three strains. “Seeing responses against those two [different serotypes] was very encouraging,” Johnston said. This gave hope that the vaccine might protect against the full gamut of rhinoviruses.

The scientists gathered a group of respiratory medicine specialists to review the findings. The reviewers agreed that the results looked promising. But just as the scientists were ready to take the vaccine forward, there was a setback at Sanofi. “There was a change of direction, a change of guys at the top,” Almond said. “I took early retirement for different reasons. My boss retired as well.”

In 2013, the new management decided that the company’s priorities were elsewhere, handing back to Imperial College the patent that protects the vaccine idea from being developed by other groups. Imperial did not have the resources to develop the vaccine without outside investment. For Johnston, it was frustrating – years of research and toil in the lab had seemed to be finally yielding results. But there was little he could do. The vaccine was shelved.


Across the Atlantic, as Imperial began to search for new backers, Martin Moore, a paediatrician at Emory University in Atlanta, was working on a rival approach to the same problem. A specialist in children’s respiratory disease, for the past three years Moore has been working on a solution so straightforward that when he presented the results of his paper, published in Nature Communications last year, his colleagues struggled to accept them. “But if I pushed them, I couldn’t get a good reason for that other than, just: it hadn’t been done before,” he says.

Moore first resolved to do something about the common cold in 2014, while on holiday with his family in Florida. Shortly after they had arrived, his son, then a toddler, came down with a cold. “He wanted me to hold him day and night,” Moore said. The pair hunkered down in the hotel room watching movies while the rest of the family went to the beach. “It was frustrating because, as a virologist, we can go into the lab and slice and dice these viruses. But what are we really doing about them?”

Moore reviewed the papers from the 1960s and 70s that described the early attempts at a vaccine. He saw that the scientists had demonstrated that if they took one rhinovirus, killed it and then injected it, it would protect people against that same strain. “People actually made decent vaccines against rhinovirus in the 1960s,” Moore told me. What scientists did not account for at the time was that there were so many different serotypes. But where the scientists of the past had seen defeat, Moore saw promise. Why not simply make a vaccine made up of all the rhinoviruses? There was nothing to suggest that it would not work. The problem was not with the science, but with logistics. “I thought, the only thing between us and doing this is manufacturing and economics.”

Moore secured funding from the National Institutes of Health (NIH) and applied for samples of the different serotypes from the Centers for Disease Control and the American Type Culture Collection, a biological material repository headquartered in Virginia. He stopped short of calling in all 160 serotypes, reasoning that 50 would be enough to support his hypothesis.

After developing the vaccine, composed of these 50 serotypes, Moore tested it on a number of rhesus macaque monkeys. When their blood was later mixed with viruses in petri dishes, there was a strong antibody response to 49 of the 50 serotypes. It was not possible to see whether the vaccinated monkeys themselves would be protected from colds, since human rhinoviruses do not infect monkeys. But the ability to induce antibodies in monkey blood does correlate with protection in people.

“Maybe I shouldn’t say this, but I never had a doubt that it would produce antibodies,” Moore told me. “Our paper was about showing it can be done.” There is still a long way to go before Moore’s dream becomes reality. For the vaccine to be tested in a clinical trial, it will need to be made under good manufacturing practice (GMP) conditions – regulations that companies must adhere to for licensing. Under these regulations, substances need to be kept separate to avoid cross-contamination – a substantial challenge for a vaccine that potentially encompasses 160 serotypes (currently, the largest number of serotypes in a single vaccine, for pneumonia, is 23).

For a manufacturing model, Moore is looking to the polio vaccine, since polio and rhinovirus are biologically related. The scale of production would be many times greater, but the basic processes would be alike. In May, Moore’s start-up, Meissa Vaccines, received a $225,000 (£170,000) grant from the NIH for work on rhinovirus. He is taking leave from academia to work on the vaccines.


At this point in time, perhaps the biggest barrier to us curing the common cold is commercial. Researchers at universities can only go so far; the most generous grants from bodies such as the UK Medical Research Council are around £2m. It falls to pharmaceutical companies to carry out development beyond the initial proof of concept. “You’re looking at 10-15 years’ work, minimum, with teams of people, and you’re going to spend $1bn (£760m) at least,” Almond told me.

Successes have been rare, and there have been spectacular flops. Last year, shares in US firm Novavax fell by 83% after its vaccine for RSV, one of the virus families responsible for colds, failed in a late-stage clinical trial. While it is less common than rhinovirus, RSV can cause great harm and even death in those with weakened immunity, including infants and the elderly. An effective vaccine presented an estimated $1bn opportunity for Novavax in the US alone. Before the results came through, chief executive Stanley Erck said it could be “the largest-selling vaccine in the history of vaccines”. But in the phase III trial of elderly patients, it did little to protect against infection. In the hours after the news broke, Novavax share prices fell from $8.34 to $1.40.

Episodes such as this have made pharmaceutical companies wary. Today, vaccines constitute less than 5% of the overall pharmaceutical market, and development is consolidated in a handful of companies: Sanofi Pasteur, GlaxoSmithKline, Pfizer, AstraZeneca, Merck and Johnson & Johnson, among a few other smaller players.

After the $1bn or so spent on development, there are also manufacturing and distribution costs to consider. There needs to be a return on the initial investment. “You sure as hell can’t do it if there’s not a market at the end, you’re wasting the company’s money, and if you do that too often, you’ll bankrupt the company,” Almond says. “There isn’t a conspiracy out there that says, ‘Let’s not do vaccines so people can get ill and we charge them a lot’, nothing like that. It genuinely isn’t easy.”

In August, I called Sebastian Johnston to see if there was any news on his vaccine. He told me that he had just received confirmation of further funding from Apollo Therapeutics, a startup backed by AstraZeneca, GSK and Johnson & Johnson. This would allow his lab to test the vaccine on more strains of rhinovirus. Johnston believes that if the vaccine proves to be protective against, say, 20 serotypes, there is a good chance it will protect against all the rhinoviruses. Beginning in October, the research should take about a year and a half. “At that point, I think we’ll be at a stage where we’ll be able to go to major vaccine companies.”

If the vaccine were to make it through the clinical trials and be approved by regulators, it would first be rolled out to high-risk groups – those with asthma and COPD, and perhaps the elderly, as the flu jab is in the UK – and then to the rest of the population. In time, as the proportion of vaccinated individuals reaches a critical mass, the viruses would cease to circulate because the chain of infection would be broken – a phenomenon called herd immunity.
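
As a back-of-the-envelope illustration (the threshold formula below is standard epidemiology, not something from Johnston’s work), the share of a population that must be immune before a virus stops circulating depends on its basic reproduction number \(R_0\), the average number of people each infected person goes on to infect:

\[ p_c = 1 - \frac{1}{R_0} \]

If a rhinovirus strain spread with, say, \(R_0 = 3\), chains of infection would begin to die out once more than \(1 - 1/3 \approx 67\%\) of people were immune, whether through vaccination or past infection.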

From where we are today, this scenario is still distant: about 80% of drugs that make it into clinical trials because they worked in mice do not go on to work in humans. Still, for the first time in decades there are now major pharmaceutical companies with rhinovirus vaccine programmes, as well as smaller university research groups like Johnston’s which, through different approaches, are all pursuing the same goal of a cure. Once again, Johnston said, “people are starting to believe it may be possible.”

(Reuters) Gravitational wave scientists win 2017 Nobel Physics Prize

(Reuters) Scientists Rainer Weiss, Barry Barish and Kip Thorne won the 2017 Nobel Prize for Physics for decisive contributions in the observation of gravitational waves, the award-giving body said on Tuesday.

“This is something completely new and different, opening up unseen worlds,” the Royal Swedish Academy of Sciences said in a statement on awarding the 9 million Swedish crown ($1.1 million) prize.

“A wealth of discoveries awaits those who succeed in capturing the waves and interpreting their message.”

Physics is the second of this year’s crop of Nobel Prizes and comes after Americans Jeffrey Hall, Michael Rosbash and Michael Young were awarded the Nobel Prize for Physiology or Medicine on Monday.

(Economist) Closing in on cancer

(Economist) Science will win the technical battle against cancer. But that is only half the fight.

THE numbers are stark. Cancer claimed the lives of 8.8m people in 2015; only heart disease caused more deaths. Around 40% of Americans will be told they have cancer during their lifetimes. It is now a bigger killer of Africans than malaria. But the statistics do not begin to capture the fear inspired by cancer’s silent and implacable cellular mutiny. Only Alzheimer’s exerts a similar grip on the imagination.

Confronted with this sort of enemy, people understandably focus on the potential for scientific breakthroughs that will deliver a cure. Their hope is not misplaced. Cancer has become more and more survivable over recent decades owing to a host of advances, from genetic sequencing to targeted therapies. The five-year survival rate for leukemia in America has almost doubled, from 34% in the mid-1970s to 63% in 2006-12. America is home to about 15.5m cancer survivors, a number that will grow to 20m in the next ten years. Developing countries have made big gains, too: in parts of Central and South America, survival rates for prostate and breast cancer have jumped by as much as a fifth in only a decade.

From a purely technical perspective, it is reasonable to expect that science will one day turn most cancers into either chronic diseases or curable ones. But cancer is not fought only in the lab. It is also fought in doctors’ surgeries, in schools, in public-health systems and in government departments. The dispatches from these battlefields are much less encouraging.

Cell-side research

First, the good news. Caught early, many cancers are now highly treatable. Three out of four British men who received a prostate-cancer diagnosis in the early 1970s did not live for another ten years; today four out of five do. Other cancers, such as those of the lung, pancreas and brain, are harder to find and treat. But as our Technology Quarterly in this issue shows, progress is being made. Techniques to enable early diagnosis include a device designed to detect cancer on the breath; blood tests can track fragments of DNA shed from tumours. Genome sequencing makes it ever easier to identify new drug targets.

The established trio of 20th-century cancer treatments—surgery, radiation and chemotherapy—are all still improving. Radiotherapists can create webs of gamma rays, whose intersections deliver doses high enough to kill tumours but which do less damage to healthy tissue as they enter and leave the body. Some new drugs throttle the growth of blood vessels bringing nutrients to tumours; others attack cancer cells’ own DNA-repair kits. Cancer may be relentless; so too is science.

The greatest excitement is reserved for immunotherapy, a new approach that has emerged in the past few years. The human immune system is equipped with a set of brakes that cancer cells are able to activate; the first immunotherapy treatment in effect disables the brakes, enabling white blood cells to attack the tumours. It is early days, but in a small subset of patients this mechanism has produced long-term remissions that are tantamount to cures. Well over 1,000 clinical trials of such treatments are under way, targeting a wide range of different cancers. It is even now possible to reprogram immune cells to fight cancer better by editing their genomes; the first such gene therapy was approved for use in America last month.

Yet cancer sufferers need not wait for the therapies of tomorrow to have a better chance of survival today. Across rich and poor countries, the survivability of cancer varies enormously. Men die at far higher rates than women in some countries; in other countries, at similar levels of development, they do comparably well. The five-year survival rate for a set of three common cancers in America and Canada is above 70%; Germany achieves 64%, whereas Britain manages a mere 52%. Disparities exist within countries, too. America does well in its treatment of cancer overall, but suffers extraordinary inequalities in outcomes. The death rate of black American men from all cancers is 24% higher than it is for white males; breast-cancer death rates among blacks are 42% higher than for whites. A diagnosis in rural America is deadlier than one in its cities.

Practical as well as pioneering

Variations between countries are partly a reflection of health-care spending: more than half of patients requiring radiotherapy in low- and middle-income countries do not have access to treatment. But big budgets do not guarantee good outcomes. Iceland and Portugal do not outspend England and Denmark on health care as a proportion of GDP, yet past studies show that survival rates across all cancers vary widely among these countries.

Instead, the problem is often how money is spent, not how much of it there is. To take one example, a vaccine exists against the human papillomavirus (HPV), which causes cancers of the cervix in women, as well as cancers of the head and neck. Rwanda started a programme of routine vaccination in 2011, and aims to eradicate cervical cancer by 2020. Other countries are far less systematic. Vaccinations could help prevent cervical cancer in 120,000 Indian women each year.

Policymakers are not powerless. More can be done to verify which treatments (and combinations thereof) work best. A £1.3bn ($2bn) cancer-drug fund in England, which made expensive new medicines easier to obtain, did not assess the efficacy of the drugs it provided—a huge missed opportunity. Measuring the incidence and survival of cancer, through cancer registries, spotlights where patients are being failed. Access to health care matters, too: the number of Americans whose cancers were diagnosed at the earliest possible opportunity went up after Obamacare was enacted. And prevention remains the best cure of all. Efforts to rein in tobacco use averted 22m deaths (many of them from cancer) between 2008 and 2014. Yet only a tenth of the world’s population lives in countries where taxes make up at least three-quarters of the price of cigarettes, as recommended by the World Health Organisation.

Taxes and budgeting are a lot less exciting than tumour-zapping proton beams and antibodies with superpowers. But the decisions of technocrats are as important as the work of technicians. Cancer kills millions of people not simply for want of scientific advance, but also because of bad policy.

(BBG) This Shield of Patents Protects the World’s Best-Selling Drug

(BBG) Over Humira’s lifetime, AbbVie has secured more than 100 patents to prevent anyone from attempting to copy the biologic, with $16 billion in annual sales.


Humira, a treatment for inflammatory diseases such as rheumatoid arthritis and psoriasis made by AbbVie Inc., is the planet’s best-selling drug. It’s also been around almost 15 years. Those two facts alone would normally have rival drugmakers eagerly circling, ready to roll out generic versions that could win a piece of the aging medicine’s $16 billion in annual sales. Yet last year, when the patent on Humira’s main ingredient expired, not a single competitor launched a copycat version. Figuring out how to manufacture it wasn’t the obstacle. The real challenge was the seemingly impregnable fortress of patents AbbVie has methodically constructed around its prized moneymaker.

(OBS) Sightsavers and CBM win the Champalimaud Vision Award

(OBS) Portugal’s biggest award in the field of vision has gone to Sightsavers and CBM. The two organisations are focused on preventing blindness, curing it and supporting those who live with it in developing countries.

The winners of the 2017 António Champalimaud Vision Award, the world’s largest prize in the field of vision given by a Portuguese institution, have been revealed: Sightsavers and CBM, two organisations that have spent decades fighting blindness and the prejudices that surround it. Both told Observador they are “greatly honoured” by the distinction, which also carries a prize of one million euros. And they explained to us how they have been working in some of the poorest and most conflict-ridden countries in the world.

For 11 years the Champalimaud Foundation has been bringing together international scientists and public figures involved in humanitarian projects to reward those who do most to fight the vision problems afflicting developing countries. This year the award went to two institutions that have spent decades combating the most common causes of blindness in countries such as Nepal, Mozambique, Uganda, Ethiopia and Bangladesh. Both Sightsavers and CBM fight the stigmas attached to blindness in developing countries, and they have already helped millions of people become better integrated into the societies in which they live.

The award jury comprises Alfred Sommer (ophthalmologist and epidemiologist), Paul Sieving (director of the National Eye Institute in the United States), Jacques Delors (former president of the European Commission, one of the architects of the European Union and author of the UNESCO Report of the International Commission on Education for the Twenty-first Century), Amartya Sen (writer and economist, author of many works on poverty), Carla Shatz (neuroscientist), Joshua Sanes (researcher in cell and molecular biology), Mark Bear (neuroscientist), Gullapalli Rao (ophthalmologist), José Cunha-Vaz (professor, ophthalmologist and president of the Association for Innovation and Biomedical Research on Light and Image), António Guterres (Secretary-General of the United Nations) and Susumu Tonegawa (winner of the 1987 Nobel prize in Medicine).

Combining clinical care and social inclusion

Now six decades old, Sightsavers is a UK charity that works to prevent blindness, restore sight and champion the social inclusion and equal rights of people with visual impairments in more than 30 developing countries. In an interview with Observador, Izidine Hassane, the organisation’s director in Mozambique, explained that this mission is carried out through partnerships with health ministries and with medical suppliers in the countries where it operates: “We do everything within the model each government has set out for its country’s health care, and we adapt our plans to the local political circumstances. That takes us towards our goal of strengthening the health system permanently,” he tells Observador. Despite its 67 years, Sightsavers still faces difficulties, Hassane admits: “Our biggest problem is the scarcity of resources, such as money, infrastructure, medical staff and medicines.” To overcome these obstacles, Sightsavers promotes its services in order to win donations from other institutions.

Beyond the medical care it has provided to the world’s poorest populations, Sightsavers also works for the social inclusion of people with disabilities. “In some countries, families are ashamed of having a disabled relative and keep them hidden. In others, disabled people are considered invalids and are marginalised,” says the organisation’s head in Mozambique. So, as well as fighting and eliminating diseases or funding surgery in the most urgent cases, Sightsavers runs training courses, uses the media, mounts awareness campaigns and even stages plays to “change behaviour” in the countries where disabled people are most stigmatised.

It is thanks to these practices that Sightsavers can point to more than 500 million operations, 6.6 million of them cataract operations, since 2016 alone. It has also produced the largest known mapping of trachoma, a tropical disease, covering 29 countries over three years. What is more, through its “Juntando os Pontos” (“Connecting the Dots”) campaign, 75% of the 400,000 disabled people trained by Sightsavers professionals have found their way into the labour market.

CBM, a Christian international development organisation, is likewise committed to improving the quality of life of people with disabilities in the world’s poorest communities. In its 100 years of experience, CBM has helped more than 28 million people with disabilities to play a more active and included part in the social life of the countries where they live. It has done so by working with local partners who help it detect vision problems among the populations of poorer countries, treating the milder cases and referring the most serious ones to hospitals.

Reaching the world’s 39 million blind people

Babar Qureshi, CBM’s international director for inclusive eye health, told Observador that poverty is both “cause and consequence” of disability in the 59 developing countries where CBM works, always in partnership with local and national civil-society organisations. The poorest people find it hardest to gain access to health care, which is too expensive for them. They also know least about how to prevent, detect or treat vision problems. “What we do is make the health system more accessible to the poorest and most marginalised people. Then, in cases that can no longer be cured, we try to rehabilitate them into society by making it more inclusive,” Qureshi explains. It is a long road, “especially in countries in conflict: from time to time we have to pull out or be more careful,” he tells Observador. Even so, in a single year CBM has helped more than eight million people.

With one million euros in hand, the largest sum anywhere awarded as a prize to institutions of this kind, the plans for the money are already drawn up. Sightsavers intends to invest it in its existing activities, namely the health care it provides and the scientific studies in which it takes part. CBM wants to use the award to “reach more people and make the health system stronger,” it tells Observador. It is a step forward in a world where 39 million people are blind, 80% of whom could have been cured or whose condition could have been prevented.

(AP) Putin: Leader in artificial intelligence will rule world

(AP)

Russian President Vladimir Putin says that whoever reaches a breakthrough in developing artificial intelligence will come to dominate the world.

Putin, speaking Friday at a meeting with students, said the development of AI raises “colossal opportunities and threats that are difficult to predict now.”

He warned that “the one who becomes the leader in this sphere will be the ruler of the world.”

Putin warned that “it would be strongly undesirable if someone wins a monopolist position” and promised that Russia would be ready to share its know-how in artificial intelligence with other nations.

The Russian leader predicted that future wars will be fought by drones, and “when one party’s drones are destroyed by drones of another, it will have no other choice but to surrender.”

(Economist) The Economist explains: What are algorithms?

(Economist) Though capable of great feats, they are simply lists of instructions.

ALGORITHMS are everywhere. They play the stockmarket, decide whether you can have a mortgage and may one day drive your car for you. They search the internet when commanded, stick carefully chosen advertisements into the sites you visit and decide what prices to show you in online shops. As Uber and Waymo will tell you, they can be the subjects of legal arguments; they cause regulatory worries too (earlier this month a group of luminaries called for a ban on battlefield robots running algorithms designed to kill people). PageRank—the algorithm that powers Google’s search results—has made its inventors very rich indeed. Algorithmically curated “filter bubbles” may even affect the way a country votes. But what exactly are algorithms, and what makes them so powerful?

An algorithm is, essentially, a brainless way of doing clever things. It is a set of precise steps that need no great mental effort to follow but which, if obeyed exactly and mechanically, will lead to some desirable outcome. Long division and column addition are examples that everyone is familiar with—if you follow the procedure, you are guaranteed to get the right answer. So is the strategy, rediscovered thousands of times every year by schoolchildren bored with learning mathematical algorithms, for playing a perfect game of noughts and crosses. The brainlessness is key: each step should be as simple and as free from ambiguity as possible. Cooking recipes and driving directions are algorithms of a sort. But instructions like “stew the meat until tender” or “it’s a few miles down the road” are too vague to follow without at least some interpretation.
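
To make that concreteness literal, here is a minimal sketch (ours, not the Economist’s) of schoolbook long division written out as a Python routine: a fixed list of simple, unambiguous steps that, followed mechanically, always produces the right answer.

```python
def long_division(dividend: int, divisor: int) -> tuple:
    """Schoolbook long division, digit by digit; returns (quotient, remainder)."""
    if divisor <= 0 or dividend < 0:
        raise ValueError("sketch assumes a non-negative dividend and a positive divisor")
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):                            # take the next digit
        remainder = remainder * 10 + int(digit)            # "bring it down"
        quotient_digits.append(str(remainder // divisor))  # how many times does the divisor fit?
        remainder = remainder % divisor                    # carry the rest to the next step
    return int("".join(quotient_digits)), remainder

print(long_division(7312, 9))  # (812, 4): 9 * 812 + 4 == 7312
```

Every line is trivial on its own; the cleverness lives entirely in the recipe.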

Algorithms are closely associated with computers and code. They do not have to be. Alan Turing, a British mathematician who did a great deal of pioneering work on how to treat algorithms with mathematical rigour, once wrote a fairly complicated chess-playing algorithm on paper. He tested it in a match against a friend, scanning down the list of instructions with every move and doing what his instructions told him. But, as Turing’s opponent conceded, humans generally find such repetitive, mindless work boring and frustrating (there was so much paperwork and arithmetic involved that it reportedly took about half an hour to play each move). Computers, though, excel at quickly churning through dull, repetitive tasks such as “add these two numbers”, “decide if this number is bigger than that one” and “store the answer over there.” It is, in fact, the only thing that they are capable of doing.

For that reason, computers have allowed humans to build—and execute—ever bigger and more baroque algorithmic constructs. And it turns out that, like Lego bricks, piling up enough simple instructions allows you to build far more intricate and interesting things than is apparent at first. Every computer program, from Chrome to Call of Duty to a climate model, is, at its root, nothing more than a big pile of algorithms being executed at high speed. In a nice bit of symmetry, some of the most advanced algorithms are not written by humans at all, but by other algorithms. Machine learning is a fashionable artificial-intelligence technique used to teach computers to do things that people can do, such as decode speech or recognise faces, but which humans cannot explain in a sufficiently mechanical algorithmic fashion. So a machine-learning algorithm does the translation work for them. It ingests lots of examples of the thing in question—spoken language, say, or pictures of faces—which have been labelled by humans. It then produces another algorithm that recognises them reliably. Brainlessness, in other words, is no impediment to intelligence.
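
To make the machine-learning loop concrete, the sketch below (our toy illustration, nothing from the article) trains a perceptron, one of the oldest machine-learning algorithms, on a few hand-labelled points. The training loop is itself an algorithm; its output, a handful of learned weights, defines the second algorithm that classifies examples it has never seen.

```python
def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of ((x1, x2), label) pairs, with label +1 or -1."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            prediction = 1 if w1 * x1 + w2 * x2 + b > 0 else -1
            if prediction != label:   # a mistake: nudge the boundary towards the example
                w1 += lr * label * x1
                w2 += lr * label * x2
                b += lr * label
    return w1, w2, b

# Hand-labelled training data: points above the line y = x are +1, points below are -1.
data = [((0, 1), 1), ((1, 2), 1), ((2, 3), 1),
        ((1, 0), -1), ((2, 1), -1), ((3, 2), -1)]

w1, w2, b = train_perceptron(data)
classify = lambda x1, x2: 1 if w1 * x1 + w2 * x2 + b > 0 else -1
print(classify(0, 5), classify(5, 0))  # 1 -1: the learned rule generalises to new points
```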

(BBG) Bill Gates and Richard Branson Back Startup That Grows ‘Clean Meat’

(BBG) Cargill Inc., one of the largest global agricultural companies, has joined Bill Gates and other business giants to invest in a nascent technology to make meat from self-producing animal cells amid rising consumer demand for protein that’s less reliant on feed, land and water.

Memphis Meats, which produces beef, chicken and duck directly from animal cells without raising and slaughtering livestock or poultry, raised $17 million from investors including Cargill, Gates and billionaire Richard Branson, according to a statement Tuesday on the San Francisco-based startup’s website. The fundraising round was led by venture-capital firm DFJ, which has previously backed several social-minded retail startups.

“I’m thrilled to have invested in Memphis Meats,” Branson said in an email in response to questions from Bloomberg News. “I believe that in 30 years or so we will no longer need to kill any animals and that all meat will either be clean or plant-based, taste the same and also be much healthier for everyone.”

This is the latest move by an agricultural giant to respond to consumers, especially Millennials, who are rapidly leaving their mark on the U.S. food world. That’s happening through surging demand for organic products, increasing focus on food that’s considered sustainable and greater attention on animal treatment. Big poultry and livestock processors have started to take up alternatives to traditional meat.

“The world loves to eat meat, and it is core to many of our cultures and traditions,” Uma Valeti, co-founder and chief executive officer of Memphis Meats, said in the statement. “The way conventional meat is produced today creates challenges for the environment, animal welfare and human health. These are problems that everyone wants to solve.”

‘Clean Meat’

To date, Memphis Meats has raised $22 million, signaling a commitment to the “clean-meat movement,” the company said.

Cargill has “taken an equity position in Memphis Meats’ first series of funding,” Sonya Roberts, the president of growth ventures at Cargill Protein, said in an email, without disclosing the investment amount.

“Our equity position with Memphis Meats gives Cargill entry into the cultured protein market and allows us to work together to further innovate and commercialize,” Roberts said. “We believe that consumers will continue to crave meat, and we aim to bring it to the table, as sustainably and cost-effectively as we can. Cultured meats and conventionally produced meats will both play a role in meeting that demand.”

The investment is just the most recent by traditional meat companies. Tyson Foods Inc., the largest U.S. meat producer, has created a venture capital fund focused on investing in companies “to sustainably feed” the world’s growing population and in December announced a stake in plant-based protein producer Beyond Meat, which counts Gates among its early funders.

(Economist) Diamonds are rare on Earth. Elsewhere, they fall from the sky

(Economist) A hard rain descends on Uranus and Neptune.

IN THE marketplaces of planet Earth diamonds are both desirable and scarce, and that makes them expensive. Both the demand and the rarity are, however, largely artificial. Diamonds were made desirable in the 20th century mainly by a marketing campaign from De Beers, a big South African producer of the stones. The scarcity was, until recently, a result of the same company—which at one point controlled about 90% of the world’s production—ensuring that the number of stones which found their way into the world’s jewellery shops was well regulated.

In nature, though, diamonds are unremarkable. They are simply crystals of carbon, albeit crystals of a type that needs a fair amount of pressure to form. And carbon is the fourth-most abundant element in the universe. For that reason, diamonds are thought to be the commonest gemstones on Earth. Elsewhere in the cosmos, as demonstrated in a paper just published in Nature Astronomy, they are probably available in embarrassing abundance.

Dominik Kraus, a physicist at the Helmholtz Centre in Dresden, and his colleagues are interested in ice-giant planets, such as Uranus and Neptune. Unlike gas giants (Jupiter and Saturn being local examples), which are made mostly of hydrogen and helium, ice giants are rich in comparatively heavy elements such as oxygen, nitrogen and, crucially, carbon. That carbon is locked up in compounds, mostly hydrocarbons such as methane, ethane and the like.

Ice giants, as the name suggests, are also big. This means that, in the depths of their thick atmospheres, temperatures are high enough to split those hydrocarbons into hydrogen and carbon, and pressures are sufficient to compress the carbon into diamonds. The consequence, 10,000km or so beneath the top of the atmosphere, is a constant rain of diamonds. Those diamonds sink towards the planet’s core, encrusting it in a thick layer of gemstones.

That, at least, is the prediction. Testing it is tricky. Previous attempts, using anvils to compress hydrocarbons and lasers to heat them, have hinted that theory may, with a few tweaks, match reality. But Dr Kraus’s paper is definitive. He and his colleagues put tiny samples of polystyrene—which, like methane, is made of carbon and hydrogen—in front of a giant X-ray laser at the SLAC National Accelerator Laboratory, near Stanford University in California, in order to squeeze and heat them at the same time.

The results confirmed what researchers had long suspected. Diamonds do indeed form in such conditions, although the pressure required is a bit higher than previously thought. And Dr Kraus’s research will be of interest to more than just gem-cutters of the distant future looking for new sources of supply. Knowing the temperature and pressure at which parts of an ice giant’s atmosphere start to decompose into their elementary constituents can help astronomers fix the relationship between the radius and mass of such planets. That is useful, for these days scientists are interested in planets outside the solar system as well as those within it. For such bodies, mass and radius are often the only data available. Knowing how they relate will help astronomers catalogue just how many more diamond-encrusted planets are lurking out there in the cosmos.
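
As a rough worked example of why mass and radius alone are so informative (our arithmetic, not the paper’s), the two numbers fix a planet’s mean density, which already hints at what it is made of; interior models of the kind Dr Kraus’s data feed into are what turn that hint into a composition.

```python
import math

def mean_density(mass_kg: float, radius_m: float) -> float:
    """Mean density in g/cm^3 from a planet's mass and radius,
    often the only two quantities measurable for an exoplanet."""
    volume_m3 = (4.0 / 3.0) * math.pi * radius_m ** 3
    return mass_kg / volume_m3 / 1000.0   # convert kg/m^3 to g/cm^3

# Neptune: mass about 1.02e26 kg, mean radius about 2.46e7 m
print(round(mean_density(1.02e26, 2.46e7), 2))  # ~1.64 g/cm^3: denser than water ice,
# so heavier, compressed material must lurk inside
```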

(GUA) Gene editing to remove viruses brings transplant organs from pigs a step closer

(GUA) Study shows gene editing can remove porcine endogenous retroviruses from DNA, potentially making it safe to grow human transplant organs in pigs.


Growing human transplant organs in pigs has become a more realistic prospect after scientists used advanced gene editing to remove threatening viruses from the animals’ DNA.

Porcine endogenous retroviruses (Pervs) are permanently embedded in the pig genome but research has shown they can infect human cells, posing a potential hazard.

The existence of Pervs has been a major stumbling block preventing the development of genetically engineered pigs to provide kidneys and other organs for transplant into human patients.

That hurdle may now have been cleared, according to new research reported in the journal Science.

Researchers in the US used the precision gene editing tool Crispr-Cas9 combined with gene repair technology to deactivate 100% of Pervs in a line of pig cells.

Piglets cloned from the fibroblast (connective tissue) cells turned out to be Perv-free.

Dr Luhan Yang, co-founder and chief scientific officer at the biotech company eGenesis, said: “This is the first publication to report on Perv-free pig production.

“We generated a protocol to enable multiplex genome editing, eradicated all Perv activity using Crispr technology in cloneable primary porcine fibroblasts and successfully produced Perv-free piglets.

“This research represents an important advance in addressing safety concerns about cross-species viral transmission. Our team will further engineer the Perv-free pig strain to deliver safe and effective xenotransplantation.”

The scientists first mapped the Pervs present in the pig genome, identifying 25 in total. Tests demonstrated that pig cells could infect human cells with Pervs in the laboratory. The viruses could then be transmitted to other cells not exposed to pig tissue.
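
By way of illustration only, the “mapping” step amounts to locating every copy of a conserved viral sequence in the genome. The toy below is not the eGenesis pipeline (which relies on sequence alignment against the viral pol gene rather than exact matching), and the genome and marker strings are invented for the example.

```python
def find_copies(genome: str, marker: str) -> list:
    """Return the start position of every (possibly overlapping) copy of marker."""
    positions, start = [], genome.find(marker)
    while start != -1:
        positions.append(start)
        start = genome.find(marker, start + 1)
    return positions

# Invented sequences: two copies of a made-up marker embedded in filler DNA.
genome = "ACGT" * 5 + "GGATTACA" + "TTGCA" * 4 + "GGATTACA" + "ACGT" * 3
hits = find_copies(genome, "GGATTACA")
print(len(hits), hits)  # 2 [20, 48]: one hit per embedded copy
```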

Whether or not Pervs would actually cause diseases in humans is unknown, but they are considered an unacceptable risk.

Other endogenous retroviruses (ERVs) in humans have been suggested to play a role in cancers and autoimmune disorders, although evidence for this is lacking. Their involvement in multiple sclerosis and motor neurone disease has also been proposed.

British expert Professor Ian McConnell, from the University of Cambridge, said the research was a “promising first step”.

He added: “Successful transplantation of tissues and organs from animals to man, known as xenotransplantation, has been one of the goals of modern medicine for the last 20 years.

“The safe use of pig organs such as kidneys in xenotransplantation has been seen as an approach which could be used to overcome the shortage of donor organs in human transplantation.

“The problem is that all pig cells carry cancer viruses embedded in their DNA. These are known as endogenous retroviruses which, although normally silent, can be activated to become fully infectious for human cells when pig cells carrying these retroviruses are co-incubated with human cells.

“Since xenotransplantation involves long-term intimate cell-to-cell contact the potential for the species jump of retroviruses for the entire life-time of the transplants is a very real one.”

(BBC) Gold ‘could be used in cancer treatment’

(BBC)


Tiny flecks of gold could be used in the fight against cancer, new research has suggested.

Scientists at Edinburgh University found the precious metal increased the effectiveness of drugs used to treat lung cancer cells.

Minute fragments, known as gold nanoparticles, were encased in a chemical by the research team.

The research involved zebrafish but the team are hopeful the technique could be used to develop human treatments.

Gold is a safe element which can accelerate – or catalyse – chemical reactions.

It is hoped such a method could one day be used to reduce side effects of current chemotherapy treatments by precisely targeting diseased cells without damaging healthy tissue.

Hard-to-treat cancers

Dr Asier Unciti-Broceta, from Cancer Research UK’s Edinburgh centre, said: “We have discovered new properties of gold that were previously unknown and our findings suggest that the metal could be used to release drugs inside tumours very safely.

“There is still work to do before we can use this on patients, but this study is a step forward. We hope that a similar device in humans could one day be implanted by surgeons to activate chemotherapy directly in tumours and reduce harmful effects to healthy organs.”

Dr Aine McCarthy, Cancer Research UK’s senior science information officer, said: “By developing new, better ways of delivering cancer drugs, studies like this have the potential to improve cancer treatment and reduce side effects.

“In particular, it could help improve treatment for brain tumours and other hard-to-treat cancers. The next steps will be to see if this method is safe to use in people, what its long and short-term side effects are, and if it’s a better way to treat some cancers.”

The study was carried out in collaboration with researchers at the University of Zaragoza’s Institute of Nanoscience of Aragon in Spain and published in the scientific journal Angewandte Chemie.