Kazi Md. Modassir Hossain
Introduction
War has become a tool for states to expand their territories, and armed conflicts have resurged as a result. During armed conflicts, various methods and means of warfare are used, causing damage on all sides depending on how many parties are involved. Many types of weapons are used in war, and these weapons change over time. In the modern era, many states have turned to autonomous weapons driven by artificial intelligence (AI) as the latest use of technology in warfare. Such a weapon is based on AI, a technology developed by many countries, and can select targets without intervention by a human operator. However, these weapons are increasingly dangerous for civilians. Because an autonomous weapon system cannot discriminate between civilians and combatants, it can harm civilians, and many groups can use it for illegal purposes, which leads to war crimes and violations of International Humanitarian Law (IHL) and the Rome Statute of the International Criminal Court (ICC). For such war crimes, there is a dispute about who is accountable, because the autonomous weapon system (AWS) targets combatants without the intervention of a human operator. From this dispute many questions arise, such as individual responsibility and accountability for such war crimes.
The International Criminal Court and Its Jurisdiction
The Rome Statute was adopted by the international community in July 1998 and came into force in 2002. Under the Statute, the International Criminal Court was established, as mentioned in paragraph 10 of the Preamble and Article 1 of the Rome Statute. The ICC became the first permanent and long-awaited international criminal court to prosecute the crime of genocide, crimes against humanity, war crimes, and the crime of aggression, as provided under Articles 6, 7, 8, and 8 bis, respectively. Moreover, individuals can be held liable for those crimes under Articles 25 and 30 of the Rome Statute. The Rome Statute also uses the term "person" in Articles 1, 20, 22, and 23 to make it clear that individuals, rather than states or other organizations, can be held accountable for genocide, war crimes, and crimes against humanity. Since these crimes are international crimes by nature, states have an obligation to investigate and prosecute them. If a state is unable or unwilling to do so, the International Criminal Court (ICC) has jurisdiction over those crimes under Article 17 of the Rome Statute.
Autonomous Weapons under International Humanitarian Law
International humanitarian law provides several legal instruments that set out general restrictions on the means and methods of warfare. The first instrument to stipulate the importance of reviewing the legality of new weapons was the Declaration of St. Petersburg (1868); Article 36 of Additional Protocol I (1977) continues this by mandating states to assess the legality of new weapons. This includes determining whether a weapon is prohibited under International Humanitarian Law or another rule of international law. Article 36 makes reviewing the legality of the intended deployment of a new weapon an obligation of the state. This review is crucial to ensure that the armed forces of a state are capable of conducting hostilities in line with their international responsibilities. Article 36 of Additional Protocol I further implies that, when developing new weapon technology, lawyers and policymakers must maintain respect for the law and accountability for those who seriously violate it, as stipulated under Article 49 of Geneva Convention I, which states that "[t]he High Contracting Parties undertake to enact any legislation necessary to provide effective penal sanctions for persons committing, or ordering to be committed, any of the grave breaches of the present Convention."
In reviewing a new weapon, proportionality is a key principle: the weapon should not cause harm greater than the anticipated military benefit. Autonomous weapon systems (AWS) that operate without human control may fail to comply with this principle because they cannot make complex contextual decisions as humans can, which could lead to violations of humanitarian principles. Such risks are addressed under the Convention on Certain Conventional Weapons (1980), within whose framework states seek to maintain human control over weapon systems.
Individual Criminal Accountability and Autonomous Weapons Based on AI
In current times, the sophistication and capabilities of weaponry have significantly surpassed those of earlier periods. Weapons are now highly advanced and need no human intervention. Weapons that target combatants without human intervention are called autonomous weapons based on artificial intelligence (AI). Such weapons are divided into three categories based on the amount of human involvement in their actions: (1) "human-in-the-loop" weapons, robots that can select targets and deliver force only with a human command; (2) "human-on-the-loop" weapons, robots that can select targets and deliver force under the oversight of a human operator who can override the robot's actions; and (3) "human-out-of-the-loop" weapons, robots capable of selecting targets and delivering force without any human input or interaction. Fully autonomous weapons of the third type do not yet exist, but technology is moving in the direction of their development, and precursors are already in use. The other two types are already used by the militaries of different countries, such as the MQ-9 Reaper drone or the Phalanx CIWS (close-in weapon system). These weapons target combatants through the use of sophisticated software. However, there is a possibility that such a weapon may harm civilians and violate International Humanitarian Law (IHL), because the weapon cannot discriminate between combatants and civilians, and here the question of accountability arises: who committed the crimes, and whether the software developers of autonomous weapons are accountable. With "human-in-the-loop" weapons, the operator will be directly accountable under Article 30 of the Rome Statute of the ICC, because the weapon only identifies the target and the ultimate decision to fire rests with the operator. According to Article 30, criminal responsibility requires both intent and knowledge: a person must intend to engage in the conduct and either mean to cause the consequence or be aware that it will occur in the ordinary course of events. "Knowledge" refers to awareness of circumstances or of consequences that will occur in the ordinary course of events. The operator will therefore be accountable if a war crime occurs, because he had full intent and knowledge. On the other hand, the individual who develops the software of the weapon can be indirectly accountable if war crimes arise, under Article 25(3)(d) of the Rome Statute of the ICC. Article 25(3)(d) provides that individuals are criminally liable if they intentionally contribute to a crime committed by a group. The contribution must either aim to further the group's criminal activity or be made with knowledge of the group's intent to commit the crime. So, if the individual who develops the software of an autonomous weapon knows there is a risk that the weapon can be used to create unlawful circumstances, that person can be indirectly liable for war crimes. An individual can be indirectly responsible if he has knowledge that a crime may be committed; he does not need to intend the specific crimes or outcomes. In Prosecutor v. Germain Katanga before the International Criminal Court, Germain Katanga, a leader of the Patriotic Resistance Force in Ituri (FRPI), was charged with multiple crimes, including crimes against humanity and war crimes, under Article 25(3)(d), for providing support to a militia group in the Democratic Republic of the Congo. Under Article 25(3)(d), the prosecution had to establish that Katanga knew about the militia's criminal intent. They did not need to prove that he specifically intended the crimes committed, but had to show that he knew the militia was engaged in criminal activities. Similarly, if a software developer knows that the design of a weapon carries a high risk of causing civilian harm, he can be held accountable for a crime by violating Article 51(4) of Additional Protocol I to the Geneva Conventions and Article 13(1) of Additional Protocol II to the Geneva Conventions.
Conclusion
In conclusion, individuals who develop software for autonomous weapons can be held indirectly accountable for war crimes under Article 25(3)(d) of the Rome Statute if they are aware that their contributions may further a group's criminal activity. Full intent is not required; knowledge of potential unlawful use suffices for liability. The case of Prosecutor v. Germain Katanga shows that even without direct intent to commit specific crimes, awareness of a group's criminal intent can result in accountability. Similarly, if software developers know their design could lead to civilian harm, they could be criminally liable for violations under international law.