When humans finally create Artificial Intelligence, will Ctrl-Alt-Del equal murder? That's about all it takes for a computer in the new book Robopocalypse to decide it's time to eradicate all humans. In fairness, Archos, the book's supercomputer antagonist, was rebooted 14 times. By that standard, every computer running Microsoft Vista could arguably justify genocide.

It's becoming a bit of a cliché that every time someone writes a story about AI, the body count grows in proportion to the size of the network. Since the internet became popular, no fictional computer stops short of global Armageddon. The methods vary, but apparently all supercomputers of the future have it in for us. Skynet took one hour to decide that all humans were a threat before it dug out the launch codes and unleashed an army of time-travelling cyborgs on the world. The Cylons of Battlestar Galactica just make me think of Terminators with a drama-queen complex. In the Matrix movies, humanity's only saving grace, as far as the machines are concerned, is our ability to double as AAA batteries.

Stories about our creations turning against us aren't anything new (Frankenstein, anyone?). Computers going rogue seemed to enter the mainstream in the '60s, and 1968 was a banner year: HAL 9000 went haywire in 2001: A Space Odyssey, and Do Androids Dream of Electric Sheep?, the novel that would later send bounty hunters chasing replicants through the streets of Los Angeles in Blade Runner, hit the shelves.

Robopocalypse, written by Daniel Wilson, is in many ways typical of the genre. It's Maximum Overdrive meets World War Z. There are no lingering ethical debates here about creating life; kill or be killed by robots is what most people will take away from this book. But Wilson has one thing going for him: he has a PhD in robotics, and he gives readers better insight into the mind of machines than anyone since perhaps Isaac Asimov. Despite that, he still doesn't really delve into the reasons why robots eventually see humanity as the enemy. It's not really an answer, but Wilson does at least offer us a quote from H.P. Lovecraft. In the opening lines of The Call of Cthulhu, Lovecraft warns that if humans were ever able to correlate all the world's data and see the big picture, we'd all happily crawl back into the Stone Age. Wilson takes the position that if computers get there first, they'll push us back into the Stone Age themselves.

Maybe the reason most writers go down the doomsday path is that we can't seem to conceive of an intelligence, even an artificial one, unlike our own. Even when humans give AI a physical form, that form is usually human-shaped, like Honda's ASIMO. And let's face it, we all know what we're like. Anyone with an elementary school education these days can point out any number of self-centred, egomaniacal humans with little regard for life. If we make AI in our own image, why wouldn't computers turn on us?

Some smart aleck at this point will want to bring up Lieutenant Commander Data from Star Trek or Asimov's Three Laws of Robotics. Fools: parents have been trying to program ethical behaviour into their children since Cain gave Abel the first wedgie. Imagine how long it would take a supercomputer to work its way around some ethics coding. Ethics have often ended up as a grease spot on the road to power, and computers won't change that.

If you think all of this is just theory, think again. The fictional numbers pale in comparison to the real ones. Skynet went postal running at 90 teraflops.
IBM now makes a computer running at 500 teraflops. A less powerful IBM computer called Watson, running at only 80 teraflops, was able to beat the top Jeopardy! champions, although I'm not sure answering trivia questions is an indicator of homicidal tendencies. You can even find the results of AI research in your home. The Xbox Kinect uses AI to help interpret the movements of the human body, and IBM uses AI to analyse real-world data to help ease traffic congestion and save energy. AI is creeping in everywhere. If you haven't even noticed that, what makes you think you stand a chance when that iPad learns how to swipe back?

From gulfnews.