Moral and ethical issues are among the more challenging issues associated with the fourth industrial revolution. Identify three moral/ethical issues related to IR-4, and propose a solution for resolving them (120 words)
We live in an age of transformative scientific powers, capable of changing the very nature of the human species and radically remaking the planet itself.
Advances in information technologies and artificial intelligence are combining with advances in the biological sciences (including genetics, reproductive technologies, neuroscience and synthetic biology) and in the physical sciences to create breathtaking synergies – now recognized as the Fourth Industrial Revolution.
These new powers hold great promise for curing and preventing disease, improving agricultural output, and enhancing quality of life in many ways. However, no technology is neutral – and the powers of the Fourth Industrial Revolution certainly are not.
Since these technologies will ultimately decide so much of our future, it is deeply irresponsible not to consider together whether and how to deploy them. Thankfully, there is growing global recognition of the need for governance. Professor Klaus Schwab, Executive Chairman of the World Economic Forum, for example, has called for “agile governance,” achieved through public-private collaborations among business, government, science, academia and nongovernmental civic organizations. Wendell Wallach and Gary Marchant, both scholars in this area, have proposed “governance coordinating committees,” or GCCs, that would be created for each major technology sector and serve as honest brokers.
Whatever forms governance takes, and it will (and should) take many forms, we need to make sure that governing bodies and public discussion address four critical questions. The answers to these questions will require both scientific input and a willingness to discuss the ethical and social implications of the choices we face.
1. Should the technology be developed in the first place?
This question, for example, is now being asked with regard to a possible ban on autonomous lethal weapons, or militarized robots. To date, there is no record of a lethal autonomous weapon picking its own target and destroying it, without humans being involved in the decision-making. However, many experts see this prospect materializing in the near future, unless a worldwide ban is instituted soon.
Another example is geoengineering, which is the use of technology to alter planetary conditions, often to change the climate so as to reduce the earth’s warming. This is a truly global issue that needs a collective approach, since one nation-state may make climate changes that are beneficial for itself but detrimental to others. Furthermore, some of the strategies, such as proposals to seed the stratosphere with nanoparticles, carry unknown but potentially large risks for the planet as a whole. Science may or may not be able to quantify the risk, but even if we have risk estimates, discerning how much risk we should take, if any, is not something science alone can answer. Ultimately it is a moral assessment we need to make collectively.
2. If a technology is going to proceed, to what ends should it be deployed?
During the Fourth Industrial Revolution, there will be a wide variety of so-called human enhancements on offer. Some will focus on eliminating diseases; others may alter human capacities we may wish to promote or reduce, such as greater athletic ability, greater memory, or less aggressive behavior. Rather than making endorsements or prohibitions about enhancements in general, each type should be considered on a case-by-case basis in terms of how likely it is to advance, or diminish, human flourishing.
3. If the technology is to go forward, how should it proceed?
It matters how a technology is researched and how it enters the world. For example, the National Academies of Sciences, Engineering, and Medicine in the United States recently issued a landmark report that takes a precautionary approach to the use of gene drives. Gene drives are technologies that, in combination with CRISPR-Cas9 gene editing, can exponentially increase the prevalence of specific genetic elements in whole populations of certain kinds of wild plants or animals. Right now, for example, gene drives are being considered as a way of controlling, or even eradicating, mosquitoes that are disease vectors for human illnesses like malaria and Zika. The National Academies’ report encourages the development of gene drive technology, but calls for carefully paced research, first in laboratory settings and small field studies, before engineered organisms are released into the wild.
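To make the claim about exponential spread concrete, here is a minimal sketch assuming a simple deterministic random-mating model rather than any specific published gene-drive model; the transmission rates and starting frequency are illustrative, not measured values. It compares how an ordinary Mendelian allele (50% transmission) and a drive allele with biased transmission from heterozygotes spread through a population.

```python
# Minimal sketch: deterministic allele-frequency model under random mating.
# Assumption: a neutral allele with no fitness cost; 'transmission' is the
# fraction of gametes from heterozygotes that carry the drive allele
# (0.5 = ordinary Mendelian inheritance, ~0.95 = a CRISPR-style gene drive).

def next_frequency(p, transmission):
    """Frequency of the drive allele in the next generation."""
    homozygotes = p * p              # DD parents: all gametes carry D
    heterozygotes = 2 * p * (1 - p)  # Dd parents: a biased share carries D
    return homozygotes + heterozygotes * transmission

def simulate(p0, transmission, generations=20):
    p, history = p0, [p0]
    for _ in range(generations):
        p = next_frequency(p, transmission)
        history.append(p)
    return history

if __name__ == "__main__":
    mendelian = simulate(p0=0.01, transmission=0.5)    # stays near 1%
    gene_drive = simulate(p0=0.01, transmission=0.95)  # sweeps toward 100%
    for gen in (0, 5, 10, 15, 20):
        print(f"generation {gen:2d}: Mendelian {mendelian[gen]:.3f}, drive {gene_drive[gen]:.3f}")
```

Even from a 1% starting frequency, the drive allele approaches fixation within roughly a dozen generations in this toy model, which is why the National Academies’ report stresses contained, carefully paced research before any release.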
4. Once norms have been set, how will the field be monitored to ensure adherence?
Right now, there are guidelines for many aspects of research and technology diffusion, but serious gaps in our ability to monitor adherence or hold bad actors accountable. For example, there are sound regulations for the management of some kinds of toxic chemicals, but extremely inadequate funds for regulatory staff to monitor and inspect chemical sites. Governance mechanisms for the 21st century will have to grapple with what areas need mandatory regulation and how to enforce them.
"Facts alone are insufficient"
The answers to these questions need to be informed by facts, but facts alone are insufficient. All four questions require a willingness to discuss the values we hold dear, even when values discussions may lead to controversy and conflict.
Safety is perhaps the least controversial value. Most of us around the globe believe that there is an obligation to reduce the likelihood that individuals will be harmed by new technologies. Indeed, the primary responsibility of most existing regulatory bodies is to promote safety.
But there are other very important values at stake, and they are often given short shrift. First, we should commit to equity – to doing all that is possible to ensure that all people, regardless of their economic means, will have access to technology’s benefits. Otherwise, we run the risk of exacerbating what Hastings Center scholar Erik Parens has called “the already obscene gap between the haves and have nots.”
To overcome or resolve issues related to IR-4, the transformation at hand must go beyond the mere adoption of technologies and other innovative tools. It is a deeper, more fundamental transition to new roles, new structures and new systems for the delivery of social good. Here are three considerations for civil society organizations as they continue along their journey of change.
1) Be ready to play new and different roles
A key function of a thriving civil society sector within democracies is its ability to promote accountability, fairness, trust and transparency in society. This role is more important than ever in the current context, where technological change, if not governed appropriately, risks creating new inequalities and reinforcing existing ones. There is a huge need for civil society involvement in influencing how the Fourth Industrial Revolution unfolds: how, on the one hand, its positive impacts can be directed towards the groups in society most in need of help, and how, on the other hand, its negative impacts can be prevented in the first place. Civil society organizations are expected to enter radically new spaces and to demonstrate their (historical) value as, for example, advocates, watchdogs or capacity builders within the complex and ever-changing context of the Fourth Industrial Revolution.
2) Be ready to address and resolve a range of tensions in order to perform those roles responsibly
There are a series of hard questions that organizations will need to grapple with as they navigate their approach to innovation and technology. What is driving civil society’s motivations to use technology? Which problems are we solving? How do we design systems, organizations and cultures for innovation? How do we remain independent while still relying on algorithmic tools or corporate-owned digital platforms for our work? How do we learn and share best practices? How do we allocate limited resources to technology in the short versus the long term? A civil society organization’s ability to navigate technological change successfully and responsibly depends on how it handles these tensions and the decisions it takes around them.
3) Be ready to work with other stakeholders in shaping the future of the sector
The nature of technological change, combined with other drivers such as closing civic spaces, means that civil society organizations cannot change on their own, or in silos. To a certain extent, their partnership, operational and funding models require support from, and benefit from, multi-stakeholder action that incentivises radical change. Philanthropy, government, industry: they all share responsibility for creating the supporting structures, collaborative platforms and enabling conditions needed to accelerate civil society’s readiness for the Fourth Industrial Revolution. It is ultimately a game of systemic change that all actors must play in concert.
Artificial Intelligence and Robotics
Artificial intelligence sits at the top of this privacy-risk list, and several researchers have identified the privacy risks of AI. Real-time image processing can reveal a person’s identity and leak large amounts of personal information. In terms of data privacy, this study finds the major issues of AI and robotics to be: the lack of privacy standardization for AI-based technologies; inefficient consent gathering from users; and the need to monitor AI decision making (profiling). A sketch of how easily real-time image processing picks out faces follows.
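This is a minimal sketch assuming the opencv-python package and a local webcam (neither is named above; they are used purely for illustration). Off-the-shelf computer vision needs only a few lines to detect faces in a live video stream; matching each detected face against a database is the step that actually reveals identity, and the point of the sketch is how low the technical barrier to that pipeline is.

```python
# Sketch: real-time face detection with a pre-trained detector shipped with OpenCV.
# Assumes the opencv-python package and a local webcam (device index 0).
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

capture = cv2.VideoCapture(0)  # default webcam
try:
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        # Each detected face could now be cropped and matched against a face
        # database -- the profiling step that, as noted above, needs oversight.
        for (x, y, w, h) in faces:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("faces", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
finally:
    capture.release()
    cv2.destroyAllWindows()
```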
Augmented Reality (AR) and Virtual Reality (VR)
To enjoy the world of AR and VR, we must share some personal data. Recent studies [17], [18] analysed AR and VR devices from companies and platforms including mixed reality, Sony, Oculus, PlayStation, Daydream, Viron, Next, Samsung’s Gear VR, and HTC’s Vive, and found strong similarities among them with respect to the gathering of private information: almost all of them use cookies or beacons to gather information. A sketch of inspecting the cookies a service sets is given below.
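As a small illustration of the cookie point, here is a sketch assuming the Python requests package and using a placeholder URL rather than any real vendor portal. It lists the cookies a web service sets on first contact, before any consent dialogue has been shown.

```python
# Sketch: inspect the cookies a web portal sets on the very first request.
# Assumes the requests package; the URL is a placeholder, not a vendor endpoint
# (example.com sets no cookies, but a real vendor portal typically sets several).
import requests

PORTAL_URL = "https://example.com/"  # placeholder for an AR/VR vendor portal

response = requests.get(PORTAL_URL, timeout=10)

print(f"Status: {response.status_code}")
print("Cookies set on first request:")
for cookie in response.cookies:
    # Long-lived cookies with opaque values are typical tracking identifiers.
    print(f"  {cookie.name} (domain={cookie.domain}, expires={cookie.expires})")
```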
Internet of Things (IoT)
Internet of Things (IoT) data privacy has been raised by several studies across different domains of use. A future global network of “things” brings challenges concerning privacy. The main reasons for IoT data-privacy leakage are: raw data stored by default; difficulty in securing devices; energy limitations; encryption limitations; insufficient standardization organizations; and little information from IoT device producers about device-level data collection and sharing. One mitigation for the raw-data problem is sketched below.
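As one illustration of mitigating default raw-data storage, here is a sketch assuming the Python cryptography package; the device name and reading fields are invented for the example and do not come from any specific IoT platform.

```python
# Sketch: encrypt sensor readings on the device before storage or transmission,
# so intercepted or leaked records do not expose raw personal/household data.
# Assumes the cryptography package; field names are illustrative.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, provisioned securely per device
cipher = Fernet(key)

reading = {"device_id": "thermo-42", "temp_c": 21.7, "ts": "2023-01-01T12:00:00Z"}

# Encrypt before writing to storage or sending upstream.
token = cipher.encrypt(json.dumps(reading).encode("utf-8"))
print("stored ciphertext:", token[:40], b"...")

# Only holders of the key can recover the reading.
recovered = json.loads(cipher.decrypt(token).decode("utf-8"))
print("decrypted reading:", recovered)
```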
Cyber-Physical System (CPS)
Cyber-Physical Systems (CPS) are the fundamental enabler of Industry 4.0, where critical quality management is a must, as noted in a study by Aich [22]. Privacy leakage in CPS is typically passive. The study mentioned two ways privacy can leak in CPS:
Physical: this kind of privacy attack directly interferes with the physical properties of the system, for example by altering the power of an implantable healthcare chip.
Cyber: computer viruses and software- and network-based attacks are cyber-attacks on CPS, for instance forging sensor data (a simple defence against forged readings is sketched below).
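The forged-sensor-data example lends itself to a small sketch using only the Python standard library: the sensor signs each reading with a shared secret, and the controller rejects readings whose HMAC does not verify. The key-provisioning scheme shown is an assumption made for illustration, not a description of any particular CPS.

```python
# Sketch: detect forged sensor data by authenticating each reading with an HMAC.
# Standard library only; the shared-key handling is simplified for illustration.
import hashlib
import hmac
import json

SHARED_KEY = b"provisioned-at-manufacture"  # illustrative key, not a real scheme

def sign_reading(reading: dict) -> dict:
    payload = json.dumps(reading, sort_keys=True).encode("utf-8")
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"reading": reading, "hmac": tag}

def verify_reading(message: dict) -> bool:
    payload = json.dumps(message["reading"], sort_keys=True).encode("utf-8")
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["hmac"])

genuine = sign_reading({"sensor": "pressure-7", "value": 101.3})
forged = {"reading": {"sensor": "pressure-7", "value": 999.9}, "hmac": genuine["hmac"]}

print("genuine accepted:", verify_reading(genuine))    # True
print("forged rejected:", not verify_reading(forged))  # True
```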
Blockchain Technology
A key blockchain drawback related to privacy is the tension between blockchain technology and the General Data Protection Regulation (immutability versus the right to erasure). On one side, blockchain stores data immutably, yet personal data must ethically be erasable after use, so storing personal information directly on a blockchain seriously violates personal data privacy. Identity hiding can also be misused, and privacy breaches may be carried out anonymously. Moral considerations are easily overridden in blockchain technology; blockchain has been proposed for human resources, for example, but moral considerations such as privacy were absent from the proposal. A common workaround for the immutability tension is sketched below.
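A commonly discussed workaround for the immutability-versus-erasure tension is to keep personal data off-chain and anchor only a salted hash on-chain; erasing the off-chain record satisfies a deletion request while the immutable hash alone reveals nothing. The sketch below uses only the Python standard library, with a plain list standing in for the ledger.

```python
# Sketch: anchor a salted hash on an immutable "chain" while keeping the
# personal data in an erasable off-chain store. Standard library only;
# the chain is modelled as a simple append-only list.
import hashlib
import os

chain = []      # stand-in for an append-only, immutable ledger
off_chain = {}  # mutable store where the personal data actually lives

def record_personal_data(record_id: str, personal_data: str) -> None:
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + personal_data.encode("utf-8")).hexdigest()
    chain.append({"record_id": record_id, "hash": digest})       # immutable anchor
    off_chain[record_id] = {"salt": salt, "data": personal_data}  # erasable data

def erase(record_id: str) -> None:
    # GDPR-style erasure: drop the off-chain data; the on-chain hash remains
    # but can no longer be linked back to the person.
    off_chain.pop(record_id, None)

record_personal_data("emp-001", "Jane Doe, HR file, salary band B")
erase("emp-001")
print("on-chain entries:", chain)    # salted hash still present, immutable
print("off-chain data:", off_chain)  # personal data gone
```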