'Slaughterbots' film shows potential horrors of killer drones

Fictional 'Slaughterbots' film warns of autonomous killer drones

Perhaps the most nightmarish, dystopian film of 2017 didn't come from Hollywood. It came from autonomous weapons critics, led by a university professor, who put together a horror show of their own.

It’s a seven-minute video, a collaboration between University of California-Berkeley professor Stuart Russell and the Future of Life Institute that shows a future in which palm-sized, autonomous drones use facial recognition technology and on-board explosives to commit untraceable massacres.

The film is the researchers’ latest attempt to build support for a global ban on autonomous weapon systems, which kill without meaningful human control.

They released the video to coincide with meetings the United Nations’ Convention on Conventional Weapons is holding this week in Geneva, Switzerland, to discuss autonomous weapons.

“We have an opportunity to prevent the future you just saw, but the window to act is closing fast,” said Russell, an artificial intelligence professor, at the film’s conclusion. “Allowing machines to choose to kill humans will be devastating to our security and freedom.”

In the film, thousands of college students are killed in attacks at a dozen universities after drones swarm their campuses. Some of the drones first attach themselves to buildings, blowing holes in walls so other drones can enter and hunt down specific students. A similar scene plays out at the U.S. Capitol, where a select group of senators is killed.

Such atrocities aren’t possible today, but the researchers warn that, given the trajectory of the technology’s development, they soon could be. Several powerful nations are already moving toward autonomous weapons, they say, and if one nation deploys them, it could trigger a global arms race as others rush to keep up.


Because of these concerns, top artificial intelligence researchers have spent several years calling for a ban on autonomous weapons, which are sometimes called “killer robots.” The researchers warn that one day terrorists may be able to buy and use such drones to easily kill in huge numbers.

“A $25 million order now buys this, enough to kill half a city,” a defense contractor in the film says as swarms of tiny drones fly out of a cargo plane.

The film marks a sensationalistic turn in how autonomous weapons critics have pushed for a ban. In the past, they relied on open letters and petitions written in academic language. In 2015, thousands of AI and robotics researchers joined tech leaders such as Elon Musk and Stephen Hawking in calling for a ban on offensive autonomous weapons. That letter spoke of “armed quadcopters,” while this week’s video warns of “slaughterbots.”

Earlier this month, leading artificial intelligence researchers in Canada and Australia called on their governments to support a ban on lethal autonomous weapon systems.

The shift to a more visceral approach reflects what critics see as the growing gravity of the situation. This summer, a report from Harvard University’s Belfer Center warned that weapons using artificial intelligence will be as transformative as nuclear weapons.
