
The Era of Killer Robots: The Technology and Global Debate Behind Autonomous Weapons
In today’s warfare, weapons have begun to appear that make decisions, identify targets, and carry out attacks entirely on their own. These lethal drones are equipped with Artificial Intelligence (AI) and are known as Lethal Autonomous Weapon Systems (LAWS). More than 30 countries around the world are demanding a ban on them, but several major powers oppose this demand. Here is a look at the technology behind killer drones and the global debate surrounding them…
The ‘killer robot’ of science-fiction films has now stepped off the screen and onto the real battlefield. Lethal Autonomous Weapon Systems (LAWS) are changing the definition of war. These are weapons that identify and destroy their targets on their own, without any human intervention, and this is why a major international debate has erupted around them. More than 30 countries want a complete ban on these weapons, because the question is not just about technology but also about ethics. Today, we look in detail at these weapons and the technology inside them.
What exactly are Autonomous Killer Drones?
Autonomous killer drones are essentially machines that perform three key tasks in war on their own: 1. Finding a target, 2. Identifying it, and 3. Attacking it. A human only activates them; after that, the entire decision-making power shifts to the machine’s algorithms and Artificial Intelligence.
These weapons are generally divided into three categories:
The first category requires human approval before an attack (often called “human in the loop”).
The second allows the machine to make decisions, but a human can intervene and stop it (“human on the loop”).
In the third category, humans have no role at all (“human out of the loop”). These are considered the “true” LAWS, and they cause the most concern.
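The difference between the three categories comes down to a single question: who is allowed to pull the trigger? A minimal sketch (all names and the decision rule are illustrative, not taken from any real weapon system) of how such a control gate might be modeled:

```python
from enum import Enum

class AutonomyLevel(Enum):
    """The three commonly described control categories (names are illustrative)."""
    HUMAN_IN_THE_LOOP = 1      # human approval required before any attack
    HUMAN_ON_THE_LOOP = 2      # machine decides, but a human can veto
    HUMAN_OUT_OF_THE_LOOP = 3  # fully autonomous: the "true" LAWS

def may_engage(level: AutonomyLevel, human_approved: bool, human_vetoed: bool) -> bool:
    """Return whether engagement is permitted under the given control category."""
    if level is AutonomyLevel.HUMAN_IN_THE_LOOP:
        return human_approved      # nothing happens without explicit approval
    if level is AutonomyLevel.HUMAN_ON_THE_LOOP:
        return not human_vetoed    # proceeds unless a human intervenes
    return True                    # no human role at all

# The same situation yields different outcomes under each category:
print(may_engage(AutonomyLevel.HUMAN_IN_THE_LOOP, human_approved=False, human_vetoed=False))   # False
print(may_engage(AutonomyLevel.HUMAN_ON_THE_LOOP, human_approved=False, human_vetoed=False))   # True
print(may_engage(AutonomyLevel.HUMAN_OUT_OF_THE_LOOP, human_approved=False, human_vetoed=True))  # True
```

Note that in the third category the human inputs are simply ignored, which is precisely what makes it the most contested.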
The Entire System Runs on Artificial Intelligence
The strength of these killer machines lies in their Artificial Intelligence and sophisticated sensor systems. Various types of sensors are installed inside the drone, which constantly scan the surrounding environment.
LiDAR technology creates a 3D map of the area using lasers. Thermal cameras can detect human body heat even in the dark, while radar can identify movements at a distance. By combining information from all these sensors, the drone creates a complete picture of its surroundings.
Following this, the AI-based Deep Learning system goes to work. It is trained on millions of images and data points. Analyzing images from the camera within milliseconds, it can decide whether the target in front of it is a soldier, a military vehicle, or a civilian.
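The fusion-then-classification pipeline described above can be sketched in a few lines. This is a deliberately toy version: real systems use trained deep neural networks, while here invented sensor weights and a hand-written decision rule stand in for the learned model.

```python
# Each sensor reports how strongly it "sees" an object (0.0-1.0).
# The weights below are illustrative, not from any real system.
SENSOR_WEIGHTS = {"lidar": 0.4, "thermal": 0.35, "radar": 0.25}

def fuse(readings: dict) -> float:
    """Weighted combination of per-sensor confidences for one object."""
    return sum(SENSOR_WEIGHTS[s] * v for s, v in readings.items())

def classify(fused_score: float, heat_signature: bool, moving: bool) -> str:
    """Toy stand-in for a trained classifier's decision rule."""
    if fused_score < 0.5:
        return "unknown"        # too little evidence to label anything
    if heat_signature and not moving:
        return "person"         # warm and stationary -> likely a human
    if moving and not heat_signature:
        return "vehicle"        # cold but moving -> likely a machine
    return "ambiguous"

obj = {"lidar": 0.9, "thermal": 0.8, "radar": 0.6}
score = fuse(obj)
print(round(score, 2), classify(score, heat_signature=True, moving=False))  # 0.79 person
```

Even this toy shows the core worry: the boundary between "person" and "vehicle" is just a rule over numbers, and a misread heat signature flips the label.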
The most controversial stage comes next, when the algorithm decides whether or not to attack. This is called the Threat Assessment Algorithm. In this, the machine makes a decision by looking at several aspects such as the level of threat, nearby people, and military significance—but it does not involve human morality.
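A threat-assessment score of the kind described above might weigh several factors and let nearby civilians suppress the result. The factor names, weights, and threshold below are all invented for illustration; no real system's logic is being reproduced.

```python
def threat_score(hostility: float, military_value: float, civilians_nearby: int) -> float:
    """Combine factors (each 0.0-1.0) into one score; penalize bystanders."""
    score = 0.6 * hostility + 0.4 * military_value
    score -= 0.2 * civilians_nearby   # each bystander lowers the score
    return max(score, 0.0)

def decide(hostility, military_value, civilians_nearby, threshold=0.7):
    """The controversial step: attack only if the score clears the threshold."""
    return threat_score(hostility, military_value, civilians_nearby) >= threshold

print(decide(0.9, 0.8, civilians_nearby=0))  # True: clear military target
print(decide(0.9, 0.8, civilians_nearby=2))  # False: bystanders suppress the score
```

The ethical problem the article raises is visible right in the code: moral judgments about human life have been reduced to a weight of 0.2 per bystander, chosen by a programmer.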
How Does it Attack?
For navigation, these drones have advanced technologies along with GPS. For example, with the help of SLAM (Simultaneous Localization and Mapping) technology, the drone can determine its path by creating a map of its surroundings itself. Even if the GPS signal is jammed, it does not lose its way.
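The simplest building block behind GPS-denied navigation is dead reckoning: when the GPS fix is lost, the drone keeps advancing its position estimate from its own heading and speed. Full SLAM adds map-building and correction on top of this; the sketch below shows only the basic estimate (a hypothetical, simplified API).

```python
import math

def dead_reckon(position, heading_deg, speed, dt):
    """Advance an (x, y) estimate using heading (degrees) and speed (m/s)."""
    x, y = position
    rad = math.radians(heading_deg)
    return (x + speed * math.cos(rad) * dt,
            y + speed * math.sin(rad) * dt)

pos = (0.0, 0.0)
# GPS jammed: integrate the drone's own motion for three one-second steps, heading due east.
for _ in range(3):
    pos = dead_reckon(pos, heading_deg=0.0, speed=10.0, dt=1.0)
print(pos)  # (30.0, 0.0)
```

Because errors accumulate step by step, real systems constantly correct this estimate against the map they build, which is what the "mapping" half of SLAM provides.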
Many times, these drones are used in groups, known as Swarm Technology. In this, many drones work by forming a network with each other. If one drone receives any information, it immediately reaches all the other drones.
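The swarm idea above can be sketched as a broadcast: one drone's discovery becomes everyone's knowledge. A shared in-memory list stands in here for the real radio mesh network; the class and method names are invented for illustration.

```python
class Drone:
    def __init__(self, name, swarm):
        self.name = name
        self.known_targets = set()
        swarm.append(self)        # join the shared network
        self.swarm = swarm

    def spot(self, target):
        """One drone detects a target and broadcasts it to the whole swarm."""
        for member in self.swarm:
            member.known_targets.add(target)

swarm = []
drones = [Drone(f"d{i}", swarm) for i in range(4)]
drones[0].spot("radar-site")   # only d0 actually saw it...
print(all("radar-site" in d.known_targets for d in drones))  # True: all 4 now know
```

This shared picture is also why swarms are hard to stop: destroying the drone that made the detection changes nothing, because every other member already holds the same information.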
For attacks, “Loitering Munitions” are often used. Such drones hover in the sky for a long time and explode by colliding directly with the target as soon as it is found. Some drones drop missiles or bombs, while others can also launch cyber or electronic attacks to jam the enemy’s electronic systems.
Which Weapons of This Kind Exist in the World?
Israel’s Harop drone is considered a prominent example of this technology. It loiters in the air for hours and attacks as soon as it detects emissions from an enemy radar.
The ZALA KYB, developed by the Russian company Kalashnikov, is also considered a similar autonomous attack drone.
The Boeing MQ-28 Ghost Bat, a combat drone developed by Boeing’s Australian arm for the Royal Australian Air Force, can fly alongside manned fighter jets and make many decisions on its own.
Turkey’s Kargu-2 drone has already become a cause for international debate. According to a 2021 UN report, it was deployed in Libya in 2020, when an autonomous drone is said to have attacked people without human orders.
Why Do More Than 30 Countries Want a Ban?
The biggest concern regarding these weapons is that Artificial Intelligence can also make mistakes. If an algorithm mistakes a civilian for a soldier, the result could be extremely dangerous.
Apart from this, the threat of a cyberattack is significant. If an enemy hacks these machines, the same weapons could be used against their own soldiers. In the event of a swarm drone attack, stopping hundreds of machines at once could be extremely difficult.
Questions regarding ethics are even deeper. Should a machine be given the right to decide whose life to take? If a mistake occurs, who will be responsible? The programmer who wrote the software, the military officer, or the company that manufactured the drone?
Due to these concerns, more than 30 countries, including Austria and New Zealand, are demanding a total ban on these weapons at the United Nations. However, several major military powers like the US, Russia, China, and Israel are opposing it, as they believe this technology can provide a strategic advantage in the wars of the future.
Specific Events and Factual Analysis
1. Libyan Civil War and the Kargu-2 Drone:
In 2020, a sensational incident occurred in Libya. According to a United Nations report published in 2021, an autonomous drone named ‘Kargu-2’ identified and attacked enemy forces on its own, without any human instruction. This has been described as history’s first “autonomous hunt,” and it sparked deep unease across the globe.
2. Use of AI in the Ukraine-Russia War:
In recent times, certain drones deployed on the Ukrainian battlefield can find their targets and strike even under radio jamming. Artificial Intelligence is used in these ‘First-Person View’ (FPV) drones, enabling them to destroy enemy tanks or troops even when the operator’s control link is lost.
3. Artificial Intelligence (AI) and Target Selection:
Many modern defense systems (such as Israel’s ‘Gospel’ AI) can now identify thousands of potential targets within seconds. Although human approval is still required in these cases, the AI produces data so rapidly that it becomes nearly impossible for a human to verify the accuracy of those decisions.
The Terrifying Consequences:
Target Identification Error: If the AI mistakenly identifies an ordinary citizen as a terrorist, the machine will show no mercy and attack immediately.
The Removal of War’s Limits: With autonomous weapons, war becomes like a ‘video game’ in which the attacking country suffers no casualties of its own, and so it will not hesitate to cause more bloodshed.
Editorial Opinion: Today, science is on the verge of creating a “Frankenstein-like” monster over which no one will have control. If international bans are not imposed on these ‘Killer Robots’ starting today, the very existence of mankind may be endangered in the future.
Short Analysis
1. Lack of Human Control:
In traditional warfare, humans decide who is the enemy and who is innocent. However, these autonomous weapons or ‘Killer Robots’ are based solely on algorithms. There is no place for human conscience or sensitivity in them.
2. Ethical and Legal Issues:
If an autonomous weapon makes a wrong decision or kills innocent people, who will be held accountable? A machine cannot be punished, which presents a major complexity for international laws.
3. Threat to Global Peace:
This technology has made it easier to wage war. Since the risk to soldiers’ lives is minimal, countries will not hesitate to launch devastating attacks even over minor incidents, which increases the apprehension of a Third World War.
Key Essence: Technology should empower us, but if the weapon itself becomes the master, then destruction is inevitable.