Autonomous weapon systems (AWS) could in principle release military personnel from the onus of killing during combat missions, reducing the associated risk of suffering a moral injury and its debilitating psychological effects. Does it follow that the armed forces are obliged to replace human soldiers with machines in order to reduce the incidence of moral injury? We address this question from a virtue ethics perspective that construes moral injury as a form of character deterioration, a disgrace that just societies and institutions are morally committed to preventing. The question is divided into two sub-questions: (1) Can the use of AWS reduce the risk of moral injury, and is such a solution more effective than comparable alternatives? (2) Is the use of AWS an ethically desirable way to prevent moral injury, or does it carry unethical implications that make it ultimately unsuitable? We tackle these questions by comparing the opposite risks of character deterioration posed by moral injury and moral deskilling, discussing how the proposed solution raises problematic trade-offs for the cultivation of military virtue.
- Moral injury
- Lethal autonomous weapons
- Moral deskilling