This essay is about the question of responsibility. As a reference point, I'll begin with an example: responsibility in relation to firearms. Different countries have different laws and cultural attitudes when it comes to firearms. A comparison of three countries makes these differences very obvious.
The countries are Germany, the United States of America, and Switzerland. In the US and Switzerland, gun ownership is common (in fact, in Switzerland it's partially compulsory), whereas in Germany it's strictly regulated and, among civilians, mainly limited to hunters. Switzerland has one of the highest rates of gun ownership in the world, yet the risk of being killed by someone carrying a firearm is far smaller there than in the US.
This indicates that it's not just a question of accessibility, but rather a cultural issue. Maybe people in the US are generally more inclined to reach for their guns, or maybe the social conditions in the US drive people to despair more frequently. Who knows. Simply recognizing this fact adds an enormous amount of complexity to the question of who is responsible for a firearm. For now, I'll set this complexity aside and stick to a couple of clear and simple points.
I'd like to present the following list of four roles relating to responsibility for the use of firearms. In doing so, I'd also like to point out that this list can be extended at will, to illustrate that we are all connected to one another and that we all carry some degree of responsibility for each other.
The policy maker is the person who initially either allows or forbids the manufacturing, ownership, and use of guns, and who then regulates accordingly. In a state, this would be the lawmaker, with the help of the judiciary and executive organs.
The manufacturer is the person who builds the firearm. Depending on the product, a huge number of people all over the globe can be part of its manufacture. A weapon has to be designed, it requires raw materials such as steel and gunpowder, and it requires someone to turn those raw materials into the end product.
The dealer makes sure that the materials for the weapon make it from one point to the next during the manufacturing process. They bring the steel to the manufacturer and bring finished firearms from the manufacturer to the seller. The dealer and the manufacturer can be the same person. The seller is the last link in the chain before the firearm reaches the user.
The last link in the Quadriga is the user, the owner of the firearm. He is the person pulling the trigger.
When a living being is killed with a firearm, all four of these people are involved in the act of killing. The weapon's user is only the most obvious person who has to take responsibility. It is obvious that laws and conditions of use affect and direct the behaviour of people. If this weren't the case, there would be no need for laws. Clearly, if no one manufactures firearms, no one will be killed by them. Clearly, if no one can acquire a firearm, no one will be killed by one, and clearly, if no one owns a firearm, no one will kill anyone with one.
This is the Quadriga of responsibility.
We are all connected to one another.
One problem regarding responsibility can easily be observed in the exploration of new technologies in the 20th and 21st centuries. Alfred Nobel invented dynamite, a discovery that later caused him headaches as he came to see the potential uses of his invention. An even more extreme case is the invention of the atomic bomb, where many of the physicists who took part in inventing it later spoke out loudly against it. Too late. Friedrich Dürrenmatt parodies such historical irony in his didactic drama "The Physicists". In the 21st century we have reached a point where young people, using easily accessible resources, a computer and the internet, can become inventors. Facebook wasn't invented by scientists in their fifties, but by Mark Zuckerberg, who was 20 years old when Facebook first saw the light of day. Years later we are discussing whether Facebook helped Donald Trump get elected president and whether Facebook is making people lonely. Young people struggle with taking responsibility for the future consequences of their actions.
One of the next big technological revolutions, which is already under way, is considered to be artificial intelligence. The philosopher Nick Bostrom, who among other things advises the United Nations on questions of AI, draws attention to the possibility of artificial intelligence becoming so powerful and unpredictable that we humans need to proceed with precaution, not postcaution. This means that a certain number of "in case something goes wrong" mechanisms need to be built into the technology even before it's finished, in order to make sure artificial superintelligences don't start annihilating humanity.
A strange thing about humanity is that it seems to view the exploration of new technologies as something unavoidable, something unquestionable, a God-given fact. In a few years, the self-driving car will probably be mass-produced, an important milestone in the exploration of artificial intelligence. We all know that the self-driving car could leave many professional drivers, like taxi drivers and truck drivers, unemployed. We all know that these new technologies could change society as a whole.
Currently, old jobs are disappearing and new ones are appearing all the time. The call of the times is to constantly reinvent oneself. Seemingly, this is the only way to make sure one still has work in 10 years. A small contingent of people is changing the entire society in a permanent way. It's not a democratic process. In today's corporatocracy, a few people with financial and intellectual resources can do whatever they want. Off the top of my head, I can't think of anyone around me who would passionately say they're excited about the self-driving car and are looking forward to the day it's finally available. To be honest, I do frequently have moments where I wish I could let go of the steering wheel and do other things without worrying. On the other hand, I firmly believe that automation always brings with it a loss of the feeling of "being in the world". I like doing things myself. Admittedly not all things, though.
The fact is that the self-driving car is being forced upon the world by corporations, and that the rest of the world, even politics, is powerless and voiceless in the face of this new technology. I don't mean to say that the self-driving car is bad, but rather that the self-driving car is coming, that it'll change society, that we don't yet fully know how, and that these things are viewed as inevitable and unstoppable. Even if the internet companies and the car manufacturers were suddenly to cancel their work on automated driving, some talented people have already gone into their garages and converted their cars to drive themselves. This progress truly is unstoppable. There are too many intelligent people.
Progress is, of course, not a problem in itself. Progress can be the greatest thing that happens to us. The problem is progress that hasn't been thought through, where suddenly, accidentally, the atomic bomb is created. This kind of progress brings with it technologies that are extremely dangerous. It's easy to imagine that the atomic bomb isn't the most dangerous thing we humans can create. Because of this, scientists need to take responsibility. Because of this, they need to think pre-emptively and ask themselves what the technology in question could mean and what could become of it. Alfred Nobel needs to recognize what his explosives can bring about.
We all need to take responsibility for what we do.
Joseph Bartz. Translation: Oskar Henke, 2018.