Speakers Profile - Peter Shann Ford

Award-Winning Innovator, Enabler and Former Broadcaster

Travels From:

Fee Range: E

Peter Ford was the first Australian news anchor on CNN in Atlanta.

At the same time, he worked in a US Government Rehabilitation Research and Development Lab at that city's Veterans Administration. Both roles required new forms of communication.

In the lab, working with scientists from the Georgia Institute of Technology, Peter's mission was to create alternative ways of controlling a computer, enabling severely disabled people to communicate again without needing a keyboard or mouse.

In 2000 Peter was asked to consult with Prof. Bernard Brucker of the University of Miami School of Medicine in Florida regarding a patient who had been unable to move - not even blink - for more than a decade after a car accident. Peter coded a program that recognized specific patterns in the neuroelectric (EMG) signals in the patient's arm, enabling him to make the computer respond for the first time since his accident. That first sound was a simple 'beep' from the computer, but it marked the shattering of the walls isolating him from the world.

Immediately, Peter began evolving that first program into a more sophisticated system that enabled a person who was 'locked-in', unable to move or speak, to select text on a screen and have it 'spoken' in a computer-generated voice. He called the system NeuroSwitch.

In 2002 Prof. Stephen Hawking invited Peter to demonstrate the program at Cambridge University. The following year, Prof. Hawking asked Peter and his colleague - computer scientist James Schorey, CEO of Therapeutic Alliances, which made the EMG monitor that captured the data for Peter's software - to install a unit on his powered wheelchair.

The first phrase Prof. Hawking chose to select with his NeuroSwitch was, "I am always being mistaken for Stephen Hawking", which he thought was very funny. Prof. Hawking beta-tested every new generation of NeuroSwitch from 2002 to 2007.

Peter founded Control Bionics in Australia and the United States with Lindsay Phillips' investment group Nightingale Partners in 2005.

Today, the latest edition of this technology is NeuroNode™, a watch-sized, wearable, wireless communicator that sits on the surface of a person's skin and uses the neuroelectric signals in their muscle - even if that muscle is disabled - to control an iPhone, iPad, Mac, PC and other devices: to write text on the screen, have it spoken in a computer-generated voice (in a range of genders, ages and accents), send and receive text messages and emails, surf the internet, and drive telepresence robots and powered wheelchairs.

NeuroNode™ technology continues to evolve, using eye tracking and minute movements of the eye (EOG) to expand its reach to people with severe disabilities. It is funded in Australia by the NDIS, the Department of Veterans' Affairs, and the RAAF, and in the USA by the Veterans Administration, Medicare and Medicaid.

Control Bionics continues to push the envelope of communication with a simple mission:
'To produce the best technology to enable people with the toughest disabilities to regain communication, independence and dignity.'