It’s that time of year again. The new resolution is here, and people are already forming their opinions for next year’s debate season. We already ran an Online LD Camp (GET THE RECORDING HERE FOR ONLY $25/members – https://lastingimpact.info/product/workshop22659/ ) and the Lasting Impact! LD Guide is about to be released NEXT WEEK!! As a coach, I have already taken a deeper look at the NCFCA Lincoln Douglas value debate resolution selected for the 2024-2025 season. Resolved: In combat, the use of automation should be valued above the use of personnel. At first glance, and without context, it may not be clear whether this resolution raises ethical questions significant enough to debate. I would argue, however, that there is practically an ocean of relevant moral philosophy behind this highly relevant and deceptively nuanced resolution…
The inner debater in me seems physically incapable of starting any serious conversation without first addressing definitions. For the TPers of the world who want to see my evidence, everything I say here about the terms in the resolution is based on the definitions offered in the NCFCA white papers on the proposed resolutions. The main term that needs unpacking is “automation.” To paraphrase, automation essentially means “doing with machines what used to be done by people, usually to the extent of being independent of human oversight.” “Personnel” can be understood as roughly meaning “employed or recruited people.” Some historical examples of automation replacing personnel would be the first windmills used to grind grain, the elimination of the telephone operator as a necessary job, or even the transition from the horse-drawn carriage and its driver to the automobile. We are seeing this happen more and more as technology becomes more sophisticated, so any task done by a computer instead of a person also falls into this category, including artificial intelligence that lets us automate things like writing, creating images, playing chess, and more. On its own, automation (and especially artificial intelligence) has its own built-in ethical dilemmas, but it gets even more complex in the context of the resolution.
The NCFCA white papers outline a few ways we’ve seen automation affect the world of combat. Homing projectiles, drones, the US Phalanx, and the Israeli Iron Dome all fall under the umbrella of automation within the realm of combat. Beyond that, almost all technology related to weapons, vehicles, and communications falls under the umbrella of automation and affects the development of warfare in some way. With the strides we are seeing in robotics and artificial intelligence, it is only a matter of time before even further implementation of automation in combat becomes an option. If the question were “can we” use automation over personnel in the future, the answer is an inevitable yes; but the question we are called to ask (a value debater’s favorite question) is: should we?
Some of the questions the resolution should lead us to ask ourselves are: Is warfare a justifiable means to an end? Should we value the lives of our citizens and soldiers above those of other countries? If it is an ethical dilemma to let artificial intelligence wield power over life and death in warfare, where is the ethical limit of what we allow artificial intelligence to decide? Education? Autonomous vehicles could also cost lives; do we hold self-driving cars to the same standard as AI used in war? Are we ethically accountable for the effects of the machines we create? As Christians, do we believe it is good for our own country to have weapons others don’t? What would be the geopolitical implications of any one country making a technological breakthrough that gives it a massive combat advantage? Sure, one country might use the next advancement in combat automation to bring peace, but another could feasibly use the same advancement to subjugate its peers. As Christians, should we take the deontological or the utilitarian approach to warfare?
Some of the potential conflicts we could see are: utilitarian ethics versus deontological ethics, globalism versus nationalism, technological advancement for good versus technological advancement for harm. Some of the possible contexts (or actors) we could see in the resolution are: commanding officers in a time of war, independent companies and developers who would build autonomous technology for combat, terrorism versus antiterrorism, or even law enforcement versus organized crime.
The value debate resolution matters for a number of reasons. Speaking from personal experience, much of the value I get from competing in debate comes from how studying the subject matter shapes me and my understanding of the world. After spending a year in Locke and Adam Smith for the property rights resolution, I came away with a new understanding of the concept of justice and of what it means to engage with the government as a Christian. Studying the previous year’s resolution on rationalism and empiricism led me to develop an understanding of why I believe what I do, and helped me find understanding for people who believe differently.
Maybe you aren’t personally excited about this resolution. Maybe you were around a few years ago and don’t want another value debate about warfare. Maybe you weren’t aware of the relevant moral questions. Maybe you find the idea of automation and technology incredibly boring. Or maybe you just found one of the other proposed resolutions more interesting. But for the reasons given above, I believe this year’s resolution offers a lot of promise for a season of real-world-applicable, culturally relevant, and ethically complex debates.
Since arriving at college and beginning my studies toward a degree in IT Innovation (emerging technologies and how to apply them to solve human problems), I’ve found that the ethical dilemmas around technology need more attention. The industry at large is still looking for answers to the questions this resolution asks, so there’s no time like the present to begin asking them ourselves. At Lasting Impact! we walk alongside students, parents, coaches, and clubs. We have opinions, but we want you to form your own.
Bio
Drew is an NCFCA alumnus who competed in speech and debate. He competed in NCFCA only during his senior year, but went to Nationals in Lincoln Douglas debate after winning the regional championship. After graduating from Classical Conversations, he went on to attend the University of Nebraska at Omaha. Drew is fascinated by ideas. He likes to think, to hear from those who disagree with him, and to challenge what is widely accepted. He believes in debating with integrity and in being a debater who values truth, understanding, and honesty. Perhaps most instrumental to Drew’s debate success is his view of debate as a sport, or a game. While he believes that how you conduct yourself is far more important than whether you win, he would argue there is no sport better than debate for creative and strategic thinking. After graduating, he began coaching in order to impart his love of debate as a sport, and of communication as a skill, to his students. Schedule a summer prep coaching call with Drew today!