Monday, April 30, 2012

The #Turing Test and a thought experiment

I just came across a beautifully written piece in the New Yorker from last year called Alan Turing's Apple. It does a lovely job of weaving Turing's experience in and out of contemporary times, along with the Snow White fairy tale. At its heart its author, Amy Davidson, asks us to consider a thought experiment: if, in a Turing Test, a computer were asked whether the treatment that resulted in Turing's death was fair, would it be as heartless as some people were in 1952?
    We tend to assume machines lack compassion and empathy, yet clearly some people do as well.

Sunday, April 29, 2012

Sebastian Thrun on #GoogleGlass & #GoogleCar

There's a very interesting interview from The Charlie Rose Show with Sebastian Thrun. He's the former director of SAIL (Stanford Artificial Intelligence Laboratory), and was part of the team which won the DARPA Grand Challenge in 2005 (a race involving driverless vehicles). Thrun now works for Google as the head of the Google X labs, and is responsible for the driverless Google cars and the Glass Project, which I blogged about recently.
   In this 20-minute interview he talks about the new Google glasses, which he's wearing, the motivation behind driverless cars, the future of education, and other projects he's currently working on.




Saturday, April 28, 2012

#Turing's obituaries

Hollymeade in Wilmslow, the house where Turing died
Thanks to David Stutz for making some of the obituaries of Alan Turing available on his blog. His (and my) favourite is from Sherborne, Turing's school.


“For those who knew him here [at Sherborne] the memory is of an even-tempered, lovable character with an impish sense of humour and a modesty proof against all achievement. You would not take him for a Wrangler, the youngest Fellow of King’s and the youngest F.R.S. [Fellow of the Royal Society], or as a Marathon runner, or that behind a negligĂ© appearance he was intensely practical. Rather you recollected him as one who buttered his porridge, brewed scientific concoctions in his study, suspended a weighted string from the staircase wall and set it swinging before Chapel to demonstrate the rotation of the Earth by its change of direction by noon, produced proofs of the postulates of Euclid, or brought bottles of imprisoned flies to study their “decadence” by inbreeding. On holidays in Cornwall or Sark he was a lively companion even to the extent of mixed bathing at midnight. During the war he was engaged in breaking down enemy codes, and had under him a regiment of girls, supervised to his amusement by a dragon of a female. His work was hush-hush, not to be divulged even to his mother. For it he was awarded the O.B.E. He also adopted a young Jewish refugee and saw him through his education. Besides long distance running, his hobbies were gardening and chess; and occasionally realistic water-colour painting.
    In all his preoccupation with logic, mathematics, and science he never lost the common touch; in a short life he accomplished much, and to the roll of great names in the history of his particular studies added his own.” — The Sherbornian, Summer Term 1954


This obituary from his old school does seem slightly at odds with their opinion of him whilst he was actually a student there. His recently published school reports paint a slightly different picture. For example, in mathematics one master comments, "Not very good.  He spends a good deal of time apparently in investigations in advanced mathematics to the neglect of his elementary work.  A sound ground work is essential in any subject.  His work is dirty." You can read his school reports on Alex Bellos' blog.

Friday, April 27, 2012

Travelling Salesman - A Movie About P = NP

Honestly, a movie is being made about the travelling salesman problem. The plot is based on what would happen if a computer scientist proved that P = NP. If this happened, all public key cryptography would be useless, as the NP problems it is based on could be converted into problems in P and solved in a "reasonable" time. The movie, called Travelling Salesman, premieres on June 16 and, according to its makers, "is an intellectual thriller about four of the world's smartest mathematicians hired by the U.S. government to solve the most elusive problem in computer science history -- P vs. NP. The four have jointly created a "system" which could be the next major advancement for humanity or the downfall of society. As the mathematicians are about to sign documents that will give the government sole and private ownership of their solution, they wrestle with the moral dilemma of how their landmark discovery will be used."
   You can find out more information on the movie's website - and you thought computer science was boring.
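To see why the problem is so elusive, here is a minimal brute-force sketch of the travelling salesman problem in Python (the distance matrix is invented purely for illustration). It tries every possible tour, which takes factorial time - exactly the kind of exhaustive search that a constructive proof of P = NP would let us bypass:

```python
from itertools import permutations

def shortest_tour(dist):
    """Brute-force TSP: try every ordering of cities 1..n-1,
    starting and ending at city 0. O(n!) time - fine for 4 cities,
    hopeless for 40."""
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

# A small symmetric distance matrix, invented for illustration.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(shortest_tour(dist))  # (18, (0, 1, 3, 2, 0))
```

With 4 cities there are only 6 tours to check; with 20 cities there are over 10^17, which is why no one solves large instances this way.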

How Alan #Turing invented the computer age

I'm very pleased to announce that Scientific American has published an essay I wrote titled "How Alan Turing invented the computer age" in their Guest Blogs. The essay is 1,000 words long and gives a brief history of Turing's life, from his 1936 paper "On computable numbers..." through his WWII code breaking to his post-war work on machine intelligence, his conviction and his tragic suicide. It was quite a challenge to get all that into 1,000 words.
   Scientific American has an interesting range of blogs, which I recommend to you.

Thursday, April 26, 2012

The first video phone call

An advert for the Bell Picturephone
According to the blog Skype Numerology, Skype estimates it has approximately 124 million users, though it admits not all of these are active. I use Skype (and Apple's FaceTime) to talk with colleagues, students I'm supervising and of course to keep in touch with friends and family. You might be surprised to find out that the first public transcontinental video phone call was made in April 1964. Bell had placed their new Picturephones in booths at the New York World's Fair and in Disneyland, California. Members of the public in New York and California could see and speak to each other, and long lines rapidly grew at each location.
    A commercial service started in June 1964 from calling booths in three cities: New York, Washington, D.C. and Chicago - but the service failed to excite. Customers needed to schedule their calls in advance and it was very expensive: a 3-minute video call from New York to Washington cost $16, about $120 in today's money. Despite slashing prices the following year, the service failed to take off and the plug was pulled in 1968. So next time you Skype, appreciate how convenient and cheap video calling has become.

Wednesday, April 25, 2012

New science-fiction movie The Creator features Alan #Turing

The World Science Festival is showing the premiere of a new movie by Al Holmes and Al Taylor (Al+Al) called The Creator, "a beautiful and surreal short-form film by award-winning British filmmakers Al+Al, which follows sentient computers from the future on a mystical odyssey to discover their creator: legendary computer scientist Alan Turing. Decades ago, Turing famously asked, ‘Can machines think?’ and ever since, the notion of computers exceeding human intelligence has transfixed researchers and popular culture alike. Marking the centenary of Turing’s birth, The Creator will launch a wide-ranging conversation among leading computer scientists and physicists about the promise and perils of artificial intelligence, as we take a personal look at the remarkable and tragic life of this computer visionary."
  The World Science Festival 2012 runs from May 30 to June 3 in New York and the whole programme looks very interesting. Tickets for the premiere of The Creator, which screens on Thursday, May 31, 2012, 8:00-10:00 PM at the Museum of the Moving Image, cost from $15.



Tuesday, April 24, 2012

Why cut computer science research?

Given that employment in the IT sector has been steadily growing over the last decade, partly driven by innovations coming out of computer science research labs, it seems crazy for a university to axe computer science research. But that is exactly what the Dean of Engineering at the University of Florida is planning to do. The Computer and Information Science and Engineering Department (CISE) will become a teaching-only department and lose its research activity.
   If we leave aside the argument that active researchers actually make better teachers, this plan still seems crazy at a time when companies, in the US in particular, value computer science research like never before. Perhaps the University of Florida should spend less on its football team, the Gators, and more on research that actually may benefit the economy!
- This story was brought to my attention by my colleague Mark Wilson.

Alan #Turing and the Unsolvable Problem

There will be a public lecture on Alan Turing's halting problem this Thursday evening (April 26) at the University of Auckland, New Zealand. Professor Cristian Calude will give a lecture titled Alan Turing and the Unsolvable Problem: To Halt or Not to Halt - That is the Question. The lecture will explain and explore the theories that led to Turing's invention of the Turing Machine and the implications that follow from it, including, obviously, the halting problem.
    The lecture starts at 6:00 pm, with complimentary refreshments beforehand at 5:30 pm. If you can't attend, the lecture will be available as a webcast. More information about the speaker, the lecture and the venue is available here.
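The core of Turing's argument can be sketched in a few lines of Python. Suppose a perfect oracle `halts(f, x)` existed that always correctly answered whether `f(x)` eventually halts (the stub below is only a placeholder; no such total, correct function can exist):

```python
def halts(f, x):
    """Hypothetical halting oracle: returns True iff f(x) halts.
    This is a placeholder - Turing's argument below shows no
    correct implementation is possible."""
    raise NotImplementedError("no such oracle can exist")

def troublemaker(f):
    """Do the opposite of whatever the oracle predicts f does
    when fed its own source (Turing's diagonal construction)."""
    if halts(f, f):
        while True:      # loop forever if f(f) would halt
            pass
    else:
        return           # halt immediately if f(f) would loop

# Now ask: does troublemaker(troublemaker) halt? If the oracle
# says yes, troublemaker loops forever; if it says no, it halts.
# Either answer is wrong, so halts() cannot exist.
```

The contradiction in the final comment is the whole proof: any claimed halting oracle can be fed this deliberately contrary program and forced into an error.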

AI has been brain-dead since the 1970s

Well, that's what Marvin Minsky said recently at Boston University, according to an article in Wired. Minsky is in a good position to judge since he's considered one of the founding fathers of AI and co-founded the MIT Artificial Intelligence Laboratory back in 1959 with John McCarthy. Minsky "accused researchers of giving up on the immense challenge of building a fully autonomous, thinking machine."
    Read the Wired article and come back for my take on it...


...I think Minsky has a valid point. There are two issues which have troubled me in recent years. One is the rise in popularity of competitions within AI: RoboCup, trading agents, computer poker, etc. These are very popular with grad students, who like to win, but it is questionable whether there is a big benefit to AI as a whole. Take RoboCup, the robot soccer competition: several years ago RoboCup was co-located with IJCAI, the main biennial AI conference. More delegates were registered for RoboCup than for IJCAI! That can't be right - it means more people were working on the narrow application of making robots kick a ball about than in the entire discipline of artificial intelligence. That's not balanced or healthy for the development of the subject.
    I've also noticed over the last decade and a half - and it's true that this is largely a reaction to AI's failure with expert systems in the 1980s - the rise of what I call the "smart algorithms approach." That is, given a problem, AI researchers now typically attack it using machine learning methods and never attempt to use any explicit knowledge, even when that knowledge may be easily available. This approach has been successful, but eventually AIs will need to be able to use knowledge, codify it, pass it amongst themselves and generalize knowledge from their experience. Consider an analogy: whom would you trust more, a doctor who said "take this drug, it always seems to work," or a doctor who said "take this drug, it always seems to work because it inhibits the protein receptors on the virus and interferes with its reproduction"? The latter has explicit knowledge used to support an observation based on data; the former just has data.
   Minsky is right, AI has to get back to explicitly handling knowledge, both expert and common sense.
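The doctor analogy can be made concrete with a toy sketch (every drug name and rule below is invented for illustration). Both recommenders give the same advice from the same outcome data, but only the one holding explicit knowledge can explain - and therefore share and generalize - its reasoning:

```python
# Bare outcome data: "this seems to work" with no reason attached.
observations = {"virus-x": "drugazol"}

# Explicit, inspectable knowledge that could be codified and
# passed between agents (the "expert knowledge" Minsky wants back).
knowledge = {
    "drugazol": "inhibits the protein receptors virus-x uses to reproduce",
}

def data_only_doctor(disease):
    """The 'smart algorithms' doctor: data in, advice out, no why."""
    drug = observations[disease]
    return f"Take {drug}; it always seems to work."

def knowledge_doctor(disease):
    """Same advice, but backed by an explicit, shareable reason."""
    drug = observations[disease]
    return (f"Take {drug}; it always seems to work "
            f"because it {knowledge[drug]}.")

print(data_only_doctor("virus-x"))
print(knowledge_doctor("virus-x"))
```

The point of the sketch is the second dictionary: a system that only has `observations` can predict, but a system that also has `knowledge` can justify, debug and transfer what it knows.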

Monday, April 23, 2012

Fighters for Internet freedom

Wikipedia's homepage on the day it blacked itself in protest
As part of its Battle for the Internet series, The Guardian has published a list (with short bios) of the top 20 "fighters for Internet freedom." The list includes obvious people like Tim Berners-Lee, the inventor of the Web, and Linus Torvalds, the creator of Linux, plus more controversial figures like Julian Assange, ex-hacker and editor of WikiLeaks, and the hacktivist collective Anonymous. My favourite person on their list is Lady Ada Lovelace, who died in 1852, over a century before the Internet was even invented. Check out the Guardian's list and suggest anyone you think they've overlooked.

Sunday, April 22, 2012

The #Turing Test on Twitter

Twittest is a form of the Turing Test for Twitter, asking "are you the twit in twitter?" It's "a playful project for children and adults to explore identity across social networks. Loosely based upon the Turing Test, the game is simple: Can you spot the real students, teachers, celebrities and bots from the fakers by reading their tweets? And can you successfully fake being someone else?"
    In Alan Turing's centenary year this is an excellent way of exploring the concept of machine intelligence.

Saturday, April 21, 2012

An old case of computer identity theft


Acker Bilk
Mr. Acker Bilk is a jazz clarinetist who plays in the English style called traditional or “trad”. He is still performing – see his website – but his heyday was in the 1950s and 60s. His biggest hit and signature tune was “Stranger on the Shore,” which topped the charts in 1962.
    Mr Bilk developed his own distinctive image: goatee beard, bowler hat and striped waistcoat. Given that he was so successful, I have always wondered why Mr Bilk found it necessary to advertise computers, for I distinctly remember him in an advertisement for an Elliott 503 computer – the brand that was Victoria University of Wellington’s first machine. I recently sent a note to Mr Bilk asking about the advertisement, but he denied having ever advertised computers – indeed, he claims to detest them.
    A puzzle! Nothing on the Internet – too long ago. But looking through bound editions of old magazines in the Engineering library, I eventually found the advertisement, run in 1962 at the height of Acker Bilk’s success. Zooming in on the player, you can see that it isn’t Acker Bilk at all, just somebody dressed in his image. They wouldn’t get away with it nowadays!
    You can see the full advertisement in the Computer Science Department's on-line history pages.



- Bob Doran

Friday, April 20, 2012

#Anonymous wins vote for Time magazine's most influential person

The hacktivist collective Anonymous has won the popular vote by readers for Time magazine's "most influential person" of 2012. Time magazine revealed yesterday that Anonymous had polled 395,793 votes in comparison to the runner-up, Eric Martin, who polled 264,193 votes.
   Notwithstanding the results of its poll, Time has declared Jeremy Lin (no, I hadn't heard of him either - he's a basketball player) the most influential person, and ranked Anonymous 36th. If I were chief of network security for Time magazine I don't think I'd be sleeping too easy, as who knows how the hacktivists (and they are legion) will react to this blatant disregard of the people's voice. Of course, some people are suggesting that Anonymous may have hacked the poll, and others are saying that they're not eligible anyway since they're not a "person." The former argument may be true, but the latter isn't, since in 1982 Time put the home computer on its cover as "Machine of the Year."

Pentagon orders augmented reality contact lenses

Here's a story related to a recent post about Google's Project Glass; the BBC has reported that the Pentagon has ordered Innovega's iOptik augmented reality technology, which pairs contact lenses with glasses containing tiny flat-panel displays. iOptik will enable soldiers to see data about targets superimposed over the actual battlefield. Of course there will be many civilian uses as well. In chapter 13, "Machines of Loving Grace," of The Universal Machine, augmented reality and ubiquitous computing are explained - the irony is that as our computers become more powerful they will practically disappear into the fabric of our clothing, our surroundings, and our glasses and contact lenses. This is going to happen much sooner than you might think. The BBC report says that "the lenses are still going through clinical trials as part of the US Food and Drug Administration's approval process, but Mr Willey [Innovega's CEO] said he was confident the tech should be available to the public towards the end of 2014."

Thursday, April 19, 2012

The #Turing 100 - special event

Celebrations for the Alan Turing centenary are starting to take off as the anniversary of his birth approaches. There are far too many events, competitions, exhibitions, performances, publications and other things happening to feature them all, but this one caught my eye.
    The University of Reading has announced a special one-day event called Turing100 that will take place on his 100th birthday, June 23, 2012, at Bletchley Park, where Turing worked as a code breaker during WWII. "Turing100 will be based around Turing's famous question and answer game, commonly known as the Turing test. Using the question and answer method, the experiments which will be part of the Turing 100 day will play a serious role in helping raise awareness of cyber crime using artificial conversation systems aiming to increase the detection rate for online deception and preventing the risk of Internet grooming."
    Reading reports that in an experiment in 2008 "in 25 of the 96 Turing tests there was a failure by human interrogators to correctly recognise at least one of two hidden machines (the human judges classified machines as human; humans were misclassified as machine; sex and age were difficult to recognise)." Frankly, I find this result surprising, since no chatbot I've seen yet comes close to being a convincing human. Try out Cleverbot, currently considered one of the best chatbots, for yourself.
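It is easy to see why chatbots struggle to convince: most descend from ELIZA-style shallow pattern matching, with no understanding behind the replies. A minimal sketch of the technique (the patterns and canned responses here are invented for illustration):

```python
import re

# An ELIZA-style chatbot in miniature: match a keyword pattern,
# reflect the user's own words back, fall through to a stock
# phrase. A judge who steps outside the patterns exposes it fast.
RULES = [
    (re.compile(r"\bI am (.+)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bI feel (.+)", re.I), "How long have you felt {0}?"),
    (re.compile(r"\bbecause (.+)", re.I), "Is that the real reason?"),
]
FALLBACK = "Please tell me more."

def reply(utterance):
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return FALLBACK

print(reply("I am worried about the Turing test"))
# -> Why do you say you are worried about the Turing test?
print(reply("What is 2 + 2?"))
# -> Please tell me more.
```

A few probing questions - simple arithmetic, a follow-up on an earlier answer - are enough to drive such a program into its fallback, which is why the 2008 misclassification rate is so striking.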
   My university, The University of Auckland, in New Zealand, is holding a month of public lectures to commemorate Turing's centenary. For details of all the celebrations worldwide visit The Alan Turing Year.

Wednesday, April 18, 2012

35th anniversary of the #Apple II

Woz & Jobs with the Apple II
Thirty-five years ago, if you'd gone to the West Coast Computer Faire at San Francisco’s Civic Auditorium, front and centre as you walked through the doors you'd have seen a smart display from a new computer start-up - Apple Computer. Steve Jobs, just 22 years old, had in typical fashion talked his way into the prime spot at the exhibition, because Apple had something revolutionary to show off.
    The Apple II was Steve Wozniak's (Woz's) masterpiece - what he'd been put on this planet to do. He designed most of its circuitry, including new circuits for displaying color graphics. He'd spent the night before the exhibition writing code to control the disk drive - most other PCs of the day used cassette tapes, but Apple wanted the faster random access of a disk. Woz also wrote the initial operating system and software. All this he did not to make money, but for fun - just to prove he could!
    Jobs had a significant input as well. He oversaw the visual design of the Apple II. It was the first PC to ship in an attractive, chic case; most computers looked like they were built by ham radio enthusiasts - the Apple II looked more like a stylish European hi-fi. Jobs knew it was important that the Apple II "worked right out of the box," and it was one of the first PCs to do color graphics and sound; it even shipped with two game paddles. The Apple II was a computer you could take home, unpack and use right away. But for the enthusiast it featured eight expansion slots, which eventually enabled a wealth of aftermarket products - more memory, printers, modems and light pens - to be connected. The Apple II was simple yet flexible.
    Although the Apple II was expensive - starting the slur that Apple fans have more money than sense - it was a best seller, at its peak grossing Apple over $1 billion a year. Remarkably, the last incarnation of the Apple II shipped 16 years later, in November 1993; over 6 million were sold in total.
    Happy birthday Apple II, you were a remarkable computer.

Tuesday, April 17, 2012

The future of work in a digital world

Robots in a warehouse
Since the economic crisis, politicians from all sides have been pledging a return to full employment as soon as normality returns, but all seem to be in denial - the new normal will be profoundly different from the old normal. An article in The Economist points out that in the US "employment in retail trade is down by more than 300,000 jobs over the past decade, employment in computer systems design and related services is up nearly 400,000 jobs. Crucially, these employment trends are not symmetric. Retail employment is middle- to low-skill... The e-commerce services that are taking its place employ different sorts of workers" - i.e., highly skilled computer science graduates.
   In March Amazon announced that it was buying the industrial robotics firm Kiva Systems for $775 million. Kiva makes robots that can pick items in warehouses and so can automate Amazon's vast distribution service. Whilst this purchase makes a lot of sense for Amazon, our politicians seem to be ignoring, or worse, are ignorant of, its implications. Until quite recently, if a person was willing to put in a hard day's work they could get a job with a living wage. But as warehouses are staffed by robots, e-commerce replaces many shops and banks, vehicles become driverless, and even fast-food outlets use robots, where will the jobs for the low-skilled come from?
   I love technology, but frankly this worries me. This dilemma is discussed more in chapter 13 "Machines of Loving Grace" and chapter 14 "Digital Consciousness" of The Universal Machine.

Monday, April 16, 2012

Battle for the Internet

The Guardian newspaper is running a week long special into the future of the Internet. They have opened with an interview with Google co-founder Sergey Brin who describes a range of threats to openness from government censorship to Apple and Facebook's walled gardens.
    The week will feature special reports on cyberwar, intellectual property and copyright, walled gardens, hacktivism, digital democracy and privacy. You can get involved through the "Comments" section on each story.

Sunday, April 15, 2012

The cost of knowledge

Cambridge University professor Timothy Gowers started a campaign to boycott Elsevier last January. The number of signatures recently passed 9,600 as researchers across the world have signed the boycott at The Cost Of Knowledge website. The boycott targets Elsevier specifically for charging “exorbitantly high prices for subscriptions,” for forcing libraries to buy expensive journal bundles rather than choosing the individual titles they want, and for supporting measures such as SOPA and PIPA, which attempt to control the free exchange of information for corporate profit.
    So what's the deal here? Surely these journals have been published for many years without issue. In 2010, Elsevier reported a profit margin of 36% on revenues of $3.2 billion. Any business person would tell you that a profit margin of 36% is more than healthy. Here's how Elsevier makes so much profit: researchers write a paper describing their research and submit it to a journal for publication. The editor of the journal, a senior academic, is an unpaid volunteer who selects several academics to review the paper. The reviewers are unpaid volunteers who read the paper, assess its quality and maybe suggest improvements to the authors. If the paper is accepted for publication, the authors prepare it to very precise formatting instructions so that all papers in the journal look the same. The authors then submit the formatted paper to the editor and sign away their copyright to Elsevier. When enough new papers have been accepted, a new issue of the journal is published. Journals used to be physically printed, at a cost to the publisher, but now most academics access them online. Elsevier then charges a subscription for access to the journal. An annual subscription to Artificial Intelligence, the leading journal in my discipline, costs $2,999 (USD)!
    So, at every stage of the production process academics give their time and effort for free and then Elsevier charges the very same academics a fee to access their own work - and you thought university professors were supposed to be smart!
    You can read a colleague's blog post about the Elsevier boycott here.

Saturday, April 14, 2012

What is a drone, anyway?

John Villasenor has written a useful article called What Is a Drone, Anyway? in the Scientific American blogs that guides the reader through the distinctions between drones, planes on autopilot, unmanned aerial vehicles, remotely piloted aircraft and first-person-view unmanned aircraft. We're becoming increasingly used to seeing drones in conflicts such as Afghanistan and Iraq, and their use is spreading to civilian law enforcement and border patrol, so it's a good idea if we understand the jargon.
   A drone is an autonomous aerial vehicle that can fly without human intervention from point A to point B, or patrol a given area or perimeter. If the drone is armed, like the US Reaper drones, then target acquisition and weapon firing are under human control. However, in a very worrying development, South Korea has developed sentry robots to patrol the DMZ between it and North Korea that are capable of identifying targets and firing without human intervention. So the prospect of totally autonomous lethal drones is certainly here! Ron Arkin at Georgia Tech has recently written a book called Governing Lethal Behavior in Autonomous Robots to explore the important issues these developments may lead to.
    By the way, Scientific American has some very interesting blogs; I encourage you to check them out.

AI could be on brink of passing #Turing Test


There's an interesting blog post in Wired called Artificial Intelligence Could Be on Brink of Passing Turing Test, which argues that some computer scientists think we might soon be able to build an intelligent computer capable of passing the Turing Test. The key breakthrough, they believe, is that thanks to the Internet, the cloud, and massive distributed processing power, it might be possible for software to catalog, analyze, correlate, and cross-link everything in the digital realm. "These data and the capacity to analyze them appropriately could allow a machine to answer heretofore computer-unanswerable questions and even pass a Turing test." [source: Wired]
   In the past AI has proposed three ways that computers might become intelligent:

  1. we engineer it from the ground up, as in the Cyc Project, but this approach has been largely discredited,
  2. we build a computer that can learn, as exemplified by the HAL 9000, in 2001: A Space Odyssey, remembering its first lessons, or
  3. computers with a very large number of interconnections, like neurons, suddenly just become conscious, like the computer Mike in Robert A. Heinlein's classic science-fiction book The Moon Is a Harsh Mistress.

Perhaps it will be a combination of all three: engineering, learning and serendipity. But then, once conscious, would it be ethical to switch the machine off?

Friday, April 13, 2012

Is computer code property?

If you write a computer program then you own it, right? If somebody steals your source code then that's theft, isn't it? Well apparently it's not that simple. A US Court of Appeal in New York has recently ruled that, "Former Goldman Sachs programmer Sergey Aleynikov, who downloaded source code for the investment firm’s high-speed trading system from the company’s computers, was wrongly charged with theft of property because the code did not qualify as a physical object under a federal theft statute, according to a court opinion published Wednesday." [sourced from Wired]
    Let's run that by again: "code did not qualify as a physical object under a federal theft statute." So Aleynikov couldn't have stolen the code because it has no physical existence. This sounds similar to the early days of computing, when programmers couldn't patent their software for the same reason. Consider the example of Dan Bricklin and Bob Frankston, who came up with the idea of the spreadsheet, which they called VisiCalc, in 1979. Obviously, since they'd invented the most important piece of PC software after the word processor, they must both have become fabulously wealthy. Sadly, no: they couldn't patent the idea because software had no physical existence - courts ruled it would be like patenting music. They could copyright VisiCalc and prevent people copying their software, but that wouldn't prevent others (e.g., Microsoft) from implementing their own spreadsheet.
   Developers are now able to patent ideas and techniques used in software, and I thought this legal notion of physical existence with regard to software had been dealt with long ago - apparently not.

#Turing & the origin myth of #Apple's logo

Maria Konnikova has written an interesting piece in the Scientific American blogs called Hunters of Myths: Why Our Brains Love Origins. She focuses on the origin myth of Apple's logo, which is often, incorrectly, believed to be a tribute to Alan Turing and the poisoned apple with which he committed suicide. She finds it fascinating "that Steve Jobs never denied the story of Turing-as-muse, even when asked about it head on. Instead, he just looked enigmatic." She goes on to argue that people prefer the origin myth to the dull pragmatic truth, and that Jobs was happy to let the falsehood spread.
    If you read down to the bottom of the article you'll see a reference to me and to this blog. I pointed out to Maria that last year Stephen Fry, in his BBC QI TV programme, recounted that Jobs had told him the logo was not in honor of Turing - saying to Fry, "It isn't true, but God, we wish it were!" I blogged about this last year.
    I totally agree with Maria though, I certainly prefer the myth and am often reminded of Turing and his remarkable achievements when I see the Apple logo.

Thursday, April 12, 2012

The Universal Machine blog is syndicated


That's correct, this blog has been syndicated. I work at the Computer Science Department of the University of Auckland and I've been asked to create a blog for the department. It will take the majority of posts from The Universal Machine but will also feature additional posts from topics suggested by staff and students. The CS Blog "aims to improve the public understanding of computer science," so it is a good fit with most of my posts.

BBC #Horizon - Defeating Cancer (documentary)

The Da Vinci surgical robot
Another week and another excellent documentary from BBC Horizon. This week's episode, Defeating Cancer, is ostensibly about the latest cancer treatments: targeted radiotherapy, precision surgery and new drug treatments. However, there was an obvious sub-plot - each of these innovations was only possible because of computing.
    The targeted radiotherapy involved using an incredibly sophisticated robotic arm and sensing system that could target a tumor so accurately it could even adjust for the patient's breathing. The precision surgery was performed by a surgical robot, controlled remotely by a surgeon using a 3D visualisation and control system. The new drug treatment showcased was the product of genetic sequencing, itself heavily dependent on computation, and then involved designing a new drug molecule using 3D visualisation of the active sites on a potential cancer causing gene.
   At the end of the doco the narrator did say, "none of these breakthroughs would be possible without technology."  What she meant was, without the universal machine - the computer.
   Defeating Cancer can be watched on the BBC iPlayer (in the UK), but a clip is available on YouTube.

Luck and Death at the Edge of the World

You meet some interesting people online, and recently I met Nas Hedron, author and blogger. He runs The Turing Centenary blog, is an active tweeter on all things related to Turing (@TuringCentenary) and he runs the Homo Artificialis blog. Recently he asked if I'd like to read a pre-publication version of his new science-fiction thriller Luck & Death at the Edge of the World. I've always been a sci-fi fan, which is probably why I research in AI, so I agreed. One of the key characters in the book is an AI, who takes on the physical form of Alan Turing when it needs to materialize itself. I thoroughly enjoyed the book but can't give away too much of the plot.
   Nas Hedron is using the IndieGoGo crowdfunding site to raise funds to bring the book to publication. If you enjoy quality science-fiction and noir thrillers I suggest you take a look. You can sample three chapters for free, and I recommend listening to the reading of the first chapter. You can get involved with this project for as little as $5, which gets you a special edition of the book, but you should check out all the ways to help and the perks you'll receive.

Wednesday, April 11, 2012

Alan #Turing and Life's Enigma - Manchester Museum

We're all familiar with Alan Turing's work in mathematics, code-breaking, computing and AI - or at least you should be if you've been reading this blog or paying any attention to The Alan Turing Year. However, most people are less familiar with Turing's pioneering work on biological morphogenesis. The Manchester Museum currently has a special exhibition called Alan Turing and Life's Enigma: "Inspired by 1950s design and combining Alan Turing’s notes with museum objects, this exhibition documents Turing’s investigation into one of the great mysteries of nature: how complex shapes and patterns arise from simple balls of cells."
    If, like me, you can't get to the exhibition I recommend looking at the Biology Curator's recent blog post, which does an excellent job of giving you a feel for it. I was particularly interested to see the influences on Turing's work - Snow White and the Seven Dwarfs had a productive effect on him, beyond the poisoned apple with which he committed suicide.

Vesuvius - a 512 Qubit quantum computer

A Vesuvius 512 Qubit chip
D-Wave, the quantum computing company, has developed a 512 Qubit quantum computer. Codenamed Vesuvius, D-Wave say it will be capable of executing a massive number of computations at once - more than 100,000,000,000,000,000,000,000,000,000,000,000,000 (10^38) - which would take millions of years on a standard desktop PC.
    D-Wave claim they will be able to use Vesuvius for:

  • Binary classification – Enables the quantum computer to be fed vast amounts of complex input data, including text, images, and videos, and label the material.
  • Quantum Unsupervised Feature Learning (QUFL) – Enables the computer to learn on its own, as well as create and optimize its own programs to make itself run more efficiently.
  • Temporal QUFL – Enables the computer to predict the future based on information it learns through binary classification and the QUFL feature.
  • Artificial Intelligence via a Quantum Neural Network – Enables the computer to completely reconstruct the human brain’s cognitive processes and teach itself how to make better decisions and better predict the future.
Wired has a good article on quantum cloud computing in a recent issue, and quantum computing is explained in the final chapter, "Digital Consciousness," of The Universal Machine.
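For context, the computations these machines actually perform are energy minimisations: a D-Wave-style annealer searches for low-energy settings of binary variables in a QUBO (quadratic unconstrained binary optimization) problem. Here is a minimal classical sketch of that task for comparison - the function `solve_qubo` and the toy matrix `Q` are my own illustrations, not D-Wave's software:

```python
from itertools import product

def solve_qubo(Q):
    """Brute-force minimisation of x^T Q x over binary vectors x.

    A classical stand-in for what an annealer does: with n variables
    there are 2^n candidate solutions, which is why brute force is
    hopeless at scale and hardware search becomes attractive.
    """
    n = len(Q)
    best_x, best_e = None, float("inf")
    for x in product([0, 1], repeat=n):
        energy = sum(Q[i][j] * x[i] * x[j]
                     for i in range(n) for j in range(n))
        if energy < best_e:
            best_x, best_e = list(x), energy
    return best_x, best_e

# Toy objective: each variable alone lowers the energy, but picking
# both incurs a penalty - the optimum selects exactly one of them.
Q = [[-1, 2],
     [0, -1]]
print(solve_qubo(Q))  # ([0, 1], -1)
```

The promise of a quantum annealer is to explore this exponentially large search space natively rather than one candidate at a time.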

Tuesday, April 10, 2012

RSA Conference Alan #Turing animation

The RSA Conference 2008 had a great animation featuring Alan Turing.



I previously thought RSA was The Royal Society for the encouragement of Arts, rather than a conference for cyber security professionals. However, that other RSA has produced a fascinating series of lectures accompanied by wonderful hand-drawn illustrations in their RSA Animate series - I highly recommend taking a look; they're informative and entertaining.

Monday, April 9, 2012

The uncanny valley

The term "uncanny valley" was coined by Japanese robotics professor Masahiro Mori in 1970. Basically it describes our emotional reaction to robots: when they look different from us - think WALL•E - we react with "cute," but when robots approach a humanoid form we can find them creepy and repulsive (i.e. uncanny). For us to warm to humanoid robots their design must cross the uncanny valley and become so similar to us that we are no longer grossed out - think of the robot boy David in the movie A.I.
  However, crossing this valley is very hard since we're highly trained to notice almost imperceptible nuances in human behavior. I think we'd all agree that this humanoid robot, called ECCE, is right in the uncanny valley - skin would help!

Alan #Turing's life, according to his family

Cambridge University Press have just announced they are republishing a biography of Alan Turing written by his mother, Sara. The book was originally published in 1957, but the "Centenary Edition" includes a new foreword and a new memoir by Turing's brother. The original book has been out of print for years and hard to track down. The publisher also says, "The contrast between this memoir and the original biography reveals tensions and sheds new light on Turing's relationship with his family, and on the man himself."
    In a striking coincidence the new edition's cover shares strong design elements with The Universal Machine's cover, namely the red apple with a bite taken out of it. Frankly I'm a bit surprised that a book by Sara Turing would use this cover image, since Sara always refused to accept that her son had committed suicide, preferring to believe it had been a tragic accident. For my book, though, the apple was the perfect image, since it not only references Turing but also Apple, the company, which features prominently in several chapters.

Sunday, April 8, 2012

Computer Science is essential for everyone

Back at the beginning of March I posted a blog item about how the UK was reconsidering the teaching of ICT in favour of a more computer-science-based approach - including actually teaching kids to program. An excellent article in the Guardian describes the success one school has had using Scratch, and even teaching 11-year-old children to program in Python.
    The author says that, "In the decade following the introduction of the ICT curriculum the UK, once a world class computing champion, witnessed both a surge in demand and a downturn in success. UK software developers, together with the games and interactive entertainment industry, report a deepening chasm in recruitment to their profession. Universities have experienced a decline in applications to undergraduate courses in computing sciences, likewise, colleges report similar falls in applications to A level computing courses. A common thread dominates conversations with university staff and industry – ICT is not the same as computing."  I couldn't agree more.
    It appears that this movement against ICT teaching is spreading. A recent article in Forbes, called Computer Science is Essential for Everyone, makes a strong case that having knowledge about the fundamentals of computer science is a necessity in our increasingly digital and networked world.
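To give a sense of what those 11-year-olds might be writing, here is a hypothetical first Python program of the guess-the-number variety (the function name and the non-interactive design are mine, not from the Guardian article):

```python
def guessing_game(secret, guesses):
    """Play a guess-the-number round against a fixed list of guesses.

    A real beginner version would read guesses with input(); using a
    list keeps this example self-contained and testable.
    """
    for guess in guesses:
        if guess < secret:
            print("Too low!")
        elif guess > secret:
            print("Too high!")
        else:
            print("You got it!")
            return True
    print("Out of guesses!")
    return False

print(guessing_game(7, [5, 9, 7]))  # prints the hints, then True
```

A few lines like these teach variables, loops, conditionals and functions - exactly the computational thinking the ICT curriculum was criticised for missing.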

Saturday, April 7, 2012

Google Art Project adds New Zealand museums

The new Auckland Art Gallery
Google announced on their blog this week that they had added more museums from around the world to their excellent Google Art Project. This now includes two museums from New Zealand: Te Papa, The Museum of New Zealand in Wellington and the Auckland Art Gallery. So if you've not been to New Zealand and think we just do sheep, rugby and extreme sports, you can now see some of our best art.
    Google says, "The Art Project is part of our efforts to bring culture online and make it accessible to the widest possible audience. Under the auspices of the Google Cultural Institute..." I think this is a wonderful use for the Web. The Google Art Project now provides free access to 30,000 high-res images from 151 museums in 40 countries. You can browse some of the most famous art in the world and discover new treasures, all from the comfort of home.

BBC Horizon The Hunt for AI - review

I watched the BBC's Horizon documentary The Hunt for AI last night and here is my review. I have to say that I wasn't really very impressed. Don't get me wrong, I'm not being an AI snob with an "I work in AI and that was much too simplistic" kind of attitude. I just didn't think it added anything to previous documentaries. Prof. Marcus Du Sautoy is a very engaging presenter and the doco touches all the main points: Alan Turing, the Chinese Room, embodied vs. disembodied intelligence, learning and creativity. But previous Horizon documentaries have already dealt with the same material, and done it better.
    Horizon aired a documentary on AI in 1987 called Thinking, which to my mind is better than the much newer Hunt for AI. In this old doco Derek Jacobi appears as Alan Turing, from the play Breaking the Code; John Searle himself introduces the Chinese Room, Marvin Minsky counters, and Herbert Dreyfus gets involved. Personally I think this new documentary's only advantage is that it's contemporary and has robots in it.
   You can view episode 1 of 5 of Thinking on YouTube and The Hunt for AI on the BBC iPlayer, and decide for yourself.

Friday, April 6, 2012

Google - Project Glass - augmented reality glasses

Back in February I posted that Google was starting work on augmented reality glasses, which would provide a heads-up display of relevant information to you. Google have just made public Project Glass, which looks like it aims to replace your smartphone with glasses: providing access to maps, calendars, contact information, music and all the information on the web. I wonder how Angry Birds will work?
    In chapter 14 "Machines of Loving Grace" in The Universal Machine I introduce readers to the concept of ubiquitous computing and augmented reality. You can watch a video that Google have prepared showing possible functionality of their glasses - it's really quite cool.



Thursday, April 5, 2012

2012 Gibbons Lecture Series – The #Turing Legacy

Auckland University's Department of Computer Science is holding a series of public lectures to commemorate Alan Turing's centenary as part of the Alan Turing Year.

Alan Mathison Turing, who was born in 1912, is now widely accepted as one of the most important founders of both theoretical and practical computing. Turing's work was the basis for many areas of computing research and development that are still on-going. The Gibbons lectures for 2012 will involve local speakers discussing four topics in the rough order of Turing's involvement during his lifetime.



  • Apr 26: Alan Turing and the Unsolvable Problem: To Halt or Not to Halt - That is the Question by Prof Cristian S. Calude. Professor Calude from our department will talk on the Theory of Computing, the first area for which Turing is renowned and where Cristian has made many contributions himself.
  • May 3: Alan Turing and the Secret Cyphers: Breaking the German Codes at Bletchley Park by Prof Jack Copeland. Professor Copeland from the University of Canterbury, is one of the world's leading experts on Turing and will address Turing's secret involvement with Cryptanalysis during WW2. Note: This talk has been pre-recorded and will be webcast only.
  • May 10: Alan Turing and the Computing Engine: Turing's achievements in practical computing by Professor Brian Carpenter & Professor Bob Doran. Turing emerged from the war with a burning interest in building a practical electronic computer - this is covered in the third talk by Professors Carpenter and Doran of our department who have had a long interest in the origins of computing.
  • May 17: Alan Turing and the Artificial Brain: The Development of Artificial Intelligence by Associate Professor Ian Watson of our department. As computers started to become available Turing turned his interest to using them to perform intellectual tasks rather than just calculation. He is recognized as the founder of Artificial Intelligence - the subject to be covered in this final lecture.
If you are in Auckland, New Zealand you are welcome to attend. The lectures are free, and are preceded by complimentary refreshments. Click here for further details, venue and times.

#Turing's #sunflowers

In addition to his many other talents - mathematician, codebreaker, electrical engineer, programmer, computer scientist, long distance runner - Alan Turing was also a pioneer of bioinformatics: the application of mathematics to biology. He was fascinated with the mathematical patterns found in plant stems, leaves and seeds, a study known as phyllotaxis.
   Turing noticed, for example, that the number of spirals in the seed patterns of sunflower heads often corresponds to a number in the mathematical series known as the Fibonacci sequence (0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89…). Turing set out to explain how this might help us to understand the growth of plants. To test this hypothesis the Manchester Science Festival wants us to gather lots of data… and sunflowers are perfect for the job, so long as you can grow enough of them!
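The Fibonacci rule is simple - each term is the sum of the two before it - and a few lines of Python reproduce the spiral counts (34, 55, 89) found on sunflower heads (the helper `fib` below is my own illustration):

```python
def fib(n):
    """Return the first n Fibonacci numbers."""
    seq = [0, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])  # each term is the sum of the previous two
    return seq[:n]

print(fib(12))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
```

The deep question Turing asked is why a growing seed head should count this way at all - his answer lay in the chemistry of plant growth, not in any arithmetic the plant performs.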
  Go over to their website and see how you can be involved by growing sunflowers this (northern hemisphere) summer. This is a great way of involving children in the Alan Turing Year.

Beyond Siri: the future of mobile devices and search

An interesting blog post from Lars Hard, the founder and CEO of the AI platform Expertmaker, called Beyond Siri: What AI means for the future of mobile devices and search. To make this future more real consider a few simple examples. Your intelligent agent on your phone, Siri 2.0 perhaps, says to you, "The Wine Boutique has just announced a special offer on the new vintage of your favorite label. Would you like me to reserve half a case for you?" Or Facebook might say, "Your sister is trying to organise a family get-together, you're free Sunday week, shall I tell her?" Or, perhaps Tripit 4.0 will say something like "There is a storm gathering over the Midwest. It looks like your flight will be delayed. Would you like me to book a hotel room now for this evening, and an alternate flight out first thing tomorrow?"
   These are all individually possible now, but collectively they could become very useful. It's not a new idea though. When Siri was launched, I commented in this post that Apple had envisaged this back in 1987. Of course, back then they didn't imagine the technology would be mobile and hence even more useful. Watch the video but imagine the Knowledge Navigator working on your phone.

Wednesday, April 4, 2012

BBC Horizon - The Hunt for AI (documentary)

The BBC's flagship science documentary series has just aired a documentary called The Hunt for AI in which, "Marcus Du Sautoy wants to find out how close we are to creating machines that can think like us: robots or computers that have artificial intelligence. His journey takes him to a strange and bizarre world where AI is now taking shape.
    Marcus meets two robots who are developing their own private language, and attempts to communicate to them. He discovers how a super computer beat humans at one of the toughest quiz shows on the planet, Jeopardy. And finds out if machines can have creativity and intuition like us.
   Marcus is worried that if machines can think like us, then he will be out of business. But his conclusion is that AI machines may surprise us with their own distinct way of thinking."
   I bet Alan Turing would have watched this show with a keen interest. You can watch this show via the BBC iPlayer in the UK.

Tuesday, April 3, 2012

2 new videos of Google's autonomous cars

Google has just released two new videos about its self-driving car research. The first demonstrates how reliable and almost unremarkable this technology has become. The car, with a legally blind person behind the wheel, effortlessly cruises suburban streets, visits a Taco Bell drive-thru, and delivers the blind driver safely home. Just a few years ago this would have seemed like science-fiction!



The second video was released by Google and NASCAR, and yes, it's an April Fool: it claims that in 2013 Google will release a self-driving stock car to compete in NASCAR. Sounds pretty unbelievable, except it really isn't - this could happen. It wouldn't do much for the sport, but it's obviously technically feasible. Not a very good April Fool.

Monday, April 2, 2012

Pioneering early electronic music in #HungerGames

I've always had an interest in electronic music, ever since I first came across Wendy Carlos and Switched-On Bach, so I'm always interested to discover someone new. Yesterday I stumbled across Laurie Spiegel, who worked at Bell Labs and, in addition to composing, created algorithmic composition software called Music Mouse.
   She's currently receiving attention because an early composition of hers, Sediment, features in the current hit movie The Hunger Games.


Sunday, April 1, 2012

I'm getting a bionic hand

I've been keeping this a big secret because it's fairly awesome. I research in artificial intelligence and robotics and have had a research project in development for a while now. It involves replacing my right hand with a robotic bionic hand. Needless to say I've had quite a mission getting this past Auckland University's Human Participants Ethics Committee. They were totally against it for months, but eventually realized that the only way I could develop the capabilities of the robotic hand and its neuro-computing control link was by replacing my own hand, given that I couldn't find any other willing volunteer locally.
   The hand is quite advanced and initially will be controlled by an Emotiv Epoc headset. The plan is to replace the headset with smaller and more discreet sensors on my forearm and to test case-based reasoning software that will enable the hand to learn how to move more precisely. To the right is an X-ray of my right hand and wrist, which was taken today and which my surgeon will be using for the amputation tomorrow. This research will be the first of its kind in New Zealand and is being conducted in collaboration with Johns Hopkins University Applied Physics Laboratory. Unsurprisingly, my wife is not very happy with my decision.
   More information about the project is available here.