What should digital nerds and bio geeks be worried about? (opinion)




Genes and genomes are based on code – just like the binary language of computers. But instead of zeros and ones, four letters of DNA – A, C, T, G – encode all of life. (Life is messy and there are all kinds of edge cases, but ignore that for the moment.) If you have the sequence that encodes an organism, in theory you can recreate it. If you can write new working code, you can edit an existing organism or create a novel one.
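The DNA-as-code analogy above can be made concrete with a toy sketch: just as a program is a string over {0, 1}, a gene is a string over {A, C, T, G} that the cell "executes" three letters at a time. The codon table below is a small excerpt of the real genetic code; the input sequence is invented purely for illustration.

```python
# Toy illustration of the DNA-as-code analogy. The codon table is a
# small excerpt of the real genetic code; the sequence is made up.

CODON_TABLE = {  # codon -> amino acid (one-letter code)
    "ATG": "M",                          # start codon / methionine
    "TTT": "F", "GGC": "G", "AAA": "K",  # phenylalanine, glycine, lysine
    "TAA": "*", "TAG": "*", "TGA": "*",  # stop codons
}

def translate(dna: str) -> str:
    """Read a DNA string three letters at a time and 'execute' it
    into a protein sequence, halting at a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE.get(dna[i:i + 3], "?")  # '?' = not in excerpt
        if amino_acid == "*":  # stop codon: end of the 'program'
            break
        protein.append(amino_acid)
    return "".join(protein)

print(translate("ATGTTTGGCAAATAA"))  # -> MFGK
```

A single-letter change to the input string changes the "output" protein, which is exactly why the editing mistakes discussed below matter so much more in biology than in software.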

If that sounds a lot like software coding, you're right. As synthetic biology comes to resemble computer technology more and more closely, the risks of the latter become the risks of the former. Code is code, but because we are dealing with molecules – and sometimes with actual life forms – the risks can be much greater.

Imagine a bioengineer trying to increase the expression of a gene that maintains normal function in blood cells. Although this is a relatively straightforward operation by current standards, it will almost certainly take several attempts to get right. If this were computer code, the only damage those failed attempts could do would be to crash the computer they run on. With a biological system, the code could instead increase the probability of several types of leukemia and destroy cells critical to the patient's immune system.

We have understood the mechanisms of DNA for more than 60 years. The field of modern biotechnology began in 1972, when Paul Berg spliced a gene from one virus into another, producing the first "recombinant" DNA. Synthetic biology emerged in the early 2000s, when biologists adopted the mindset of engineers; instead of moving single genes around, they designed complex genetic circuits.

In 2010, Craig Venter and his colleagues recreated the genome of a simple bacterium. More recently, researchers at the Medical Research Council's Laboratory of Molecular Biology in the UK created a new, simpler version of E. coli. In both cases, the researchers created what can fairly be called new life forms.

This is the new bioengineering, and it will only become more powerful. Today, you can write DNA code in much the same way a programmer writes computer code. You can then use a DNA synthesizer, or order DNA from a commercial vendor, and use precision editing tools such as CRISPR to "run" it in an existing organism, from a virus to a wheat plant.

In the future, it may be possible to build an entire complex organism, such as a dog or a cat, or to recreate an extinct mammoth (in progress). Today, biotechnology companies are developing new gene therapies and international consortia are studying the feasibility and ethics of making changes to the human genome that could be passed on to future generations.
Within the bioscience community, urgent discussions are under way about "cyberbiosecurity", a contested term covering the intersection of biological systems and information systems, where vulnerabilities in each can affect the other. These include the security of DNA databases, the fidelity with which those data are transmitted, and the danger of information about specific DNA sequences that could encode novel pathogens for which no treatment exists.
These risks have not only occupied scholarly bodies – the National Academies of Sciences, Engineering, and Medicine have published at least half a dozen reports on biosecurity risks and how to address them proactively – they have also gone mainstream: genome editing was a major plot point in season 3 of Netflix's "Designated Survivor".

Our concerns are more mundane. As the "programming" of synthetic biology reaches the complexity of traditional computer programming, the risks of computer systems will be transferred to biological systems. The difference is that biological systems can cause much greater and far more lasting damage than computer systems.

Programmers write software by trial and error. Because computer systems are so complex and there is no real theory of software, programmers repeatedly test the code they write until it works properly. This makes sense, because both the cost of an error and the cost of trying again are very low. There are even jokes about it: a programmer would diagnose a car crash by putting another car in the same situation and checking whether it happens again.


Even finished code still has problems. Again, owing to the complexity of modern software systems, "working properly" does not mean perfectly correct. Modern software is full of bugs – thousands of software flaws – that sometimes affect performance or security. That is why the software you use is updated regularly; developers keep fixing bugs even after the software is released.

Bioengineering will be essentially the same: writing biological code will have the same reliability properties. Unfortunately, the software approach of making lots of mistakes and fixing them as you go does not work in biology.

In nature, a similar kind of trial and error is handled by "survival of the fittest" and plays out slowly over many generations. But man-made code generated from scratch has no such correction mechanism. Unintentional or intentional release of these newly coded "programs" could result in pathogens with expanded host ranges (e.g., swine flu) or in organisms that disrupt delicate ecological balances.

Unlike computer software, there is as yet no way to "patch" biological systems once they have been released into the wild, although researchers are trying to develop one. Nor is there any way to "patch" the humans (or animals or crops) susceptible to such agents. Rigorous biological containment helps, but no containment system offers zero risk.

Opportunities for mischief and malfeasance often arise when expertise is compartmentalized: when fields overlap only marginally, and when knowledge held by small groups of experts fails to reach the larger body of practitioners who have important contributions to make.

Biologists, security agencies and governance experts have made a good start. But these efforts tend to be siloed: confined to either the biological or the digital sphere of influence, classified and kept within the military, or shared only among a very small group of investigators.

What we need are more opportunities for integration between the two disciplines. We need to share information and experience, classified and unclassified. Within our digital and biological communities, we have the tools to identify and mitigate biological risks, and the tools to write and deploy secure computer systems.

These opportunities will not arise without effort and financial support. Let's find those resources – public, private, philanthropic, or a combination of them. And then let's use them to create new opportunities for digital nerds and bio geeks – as well as for ethicists and policymakers – to share their experiences and concerns, and to come up with creative, constructive solutions to these problems that are more than just patches.

These are global problems; we must not let compartmentalized thinking or funding stand in the way of removing the barriers between communities. And we must not let any technology, whatever it is, undermine the public good.
