Scharre says that while today's weapons are not like those seen in the movies, the technology is advancing, whether people like it or not.
"Things like more advanced hobby drones, the same technology that will go into self-driving cars, all of those sensors and intelligence will make autonomous weapons also possible," he says.
In his book, Scharre looks at the question: "How hard would it be for someone to build a simple, autonomous weapon in their garage?"
And while that's a scary scenario, he says it's already happening on some level, as students today are learning programming skills with free, readily available online resources.
"These tools are available for free download. You can download them online," he says. "[It] took me about three minutes online to find all of the free tools you would need to download this technology and make it happen."
And while high school students aren't building these autonomous weapons, the ability to do so is a real possibility. Because of that, Scharre says the debate isn't so much about whether this type of technology should be created as about what should be done about it.
"What do we do with this? Do we build weaponized versions of them? Do you build them en masse? Do militaries invest in this?" are all questions being asked as this technology would drastically change warfare.
"[It would create] a domain of warfare where humans have less control over what happens on the battlefield — where humans are no longer deciding who lives and who dies, and machines are making those decisions," Scharre says.
Debates like this are happening in countries all around the world, including those that have repeatedly violated international rules.
In Russia, the military is working to create a fleet of armed ground robots.
"They're building large, ground combat vehicles that have anti-tank missiles on them," Scharre says. "Russian generals have talked about a vision in the future of fully robotized units that are independently conducting operations, so other countries are leaning hard into this technology."
Scharre says that one of the fears about this technology advancing is that "flash wars" could occur. Much like a "flash crash" in the stock market, a "flash war" would unfold at such a fast pace that humans would not be involved.
"The worry is that you get an equivalent — a flash war, where algorithms interact in some way and the robots start shooting each other and running amuck, and then humans are scrambling to put a lid back on it," Scharre says.
But though some scenarios are terrifying, others argue that autonomous weapons could save lives by avoiding the mistakes that result from human error.
"Just like self-driving cars could someday make the roads much safer, some people have argued, 'Well, maybe autonomous weapons could be more precise and more humane. By avoiding civilian casualties in war and only killing the enemy,' " Scharre says.
Drawing on his own experience serving in special operations, Scharre says he has been in a situation in which an autonomous weapon would have killed a girl the Taliban was using as a scout, but whom the soldiers did not target.
He says situations like that highlight the difference between what is legal under the laws of war and what is morally right, a distinction autonomous weapons might not be able to make.
"That is one of the concerns that people raise about autonomous weapons is a lack of an ability to feel empathy and to engage in mercy in war," Scharre says. "And that if we built these weapons, they would take away a powerful restraint in warfare that humans have."
There's still a lot to consider and discuss when it comes to autonomous weapons and the technology advancing behind them, Scharre says. But as for whether humans are doomed, he says there is no clear answer.
"We do have the opportunity to shape how we use technology. We're not at the mercy of it," Schare says. "The problem at the end of the day isn't the technology. It's getting humans to cooperate together on how we use the technology and make sure that we're using it for good and not for harm."
Noah Caldwell and Emily Kopp produced and edited the audio for this story. Wynne Davis adapted it for Web.