What I'd like to talk to you about this morning is the "wearable technology revolution" or
probably more appropriately the "mobile technology revolution," but mobile in the broadest sense
- computing off the desk out of the office; and out and about and how it's really transforming
people's lives. So, ever since the mid 80s you know, we've had these PCs and it's been
absolutely revolutionary. It's been... for me personally and for many, many people with
an impairment in particular, absolutely life transforming. For me it came at exactly the
right time so that I could do my education, I could get a degree and be employable so
I'm hugely thankful for that. What we've got here is a keyboard, screen, mouse, tower,
all the usual components of what people traditionally thought of as a computer. But of course all
of these things are dispensable. The screen, I can't see a screen. Many computers talk
these days and not only for people who like myself have a vision impairment, but also
for people who are you know, temporarily visually impaired; for example maybe on a sunny day
you want to be able to have Siri speak back to you for example, or maybe you're jogging
and you want to be able to just quickly send or hear a text that has just come in, that
sort of thing, or perhaps you're driving the car and you want to have email or your tweets
read back to you for example. The keyboard, completely dispensable. Again, you know, I
use a keyboard quite heavily because that's my main input methodology. But for people
with a motor difficulty, they might be using voice recognition for example. There's the
touch screen device where there isn't usually a keyboard present, a physical keyboard that
is. And so there's so many other options and the thing that is really transforming about
technology is these options, these input and output options. And as we'll see later we're
seeing, layered on top of this, "Artificial Intelligence," which is really kind of helping speed up the
process and making things more efficient. And then we've got the mouse as well. And I for
one can't use the mouse because I can't see the pointer on the screen. Other people can't
use the mouse because of discomfort or some other physical disability. So, all of these
things are dispensable. Even the tower has shrunk and shrunk until now, you know. It's
either built into the screen, or it's built into the keyboard, or it's you know, on your
wrist or whatever it might be. So, all of these things are hugely valuable but we're
still at AbilityNet trying to really close this imagination gap where people still think
that computers somehow comprise some of these elements. And they find it difficult
to imagine people with a range of different abilities, impairments, disabilities being
able to work productively or function efficiently for example in the workplace. AbilityNet has
done many assessments of disabled individuals. And Professor Hawking, quite early on, he has,
you know, benefited hugely from technology, or rather the world has benefited as well from his scientific
contribution to various realms of science. And he's hugely inspirational as well. Someone
who has a very significant disability, who can only move a finger on one hand, and he's
able to quickly build up sentences, have conversations, write papers and books and that
sort of thing. And you know, his voice is very distinctive. Obviously there are more
modern sounding, human sounding synthesizers these days but he sticks with the old DynaVox
because that's his voice and we all have the right to have our own distinct voice. The
sort of thrust of the presentation I'd like to kind of get across to you is the mainstreaming
of inclusivity. So, here we've got right from sort of day one with Windows the ability to
customise the look and feel of Windows and of Office, to be able to change the mouse
acceleration, the double click speed, all of that sort of thing. And with each new version
of Windows, like here in Windows 7, there's the built-in speech recognition, and it's in Mac OS now,
and it's in smartphones. So, this idea of taking technology and mainstreaming it and
making it far more inclusive is what I'm hoping you guys will take away, because you're all
involved in technology, you're all making things, designing things. And this idea of
working for the 100% not just thinking about accessibility for example as something that
you bolt on to help those poor disabled people and it takes extra budget and extra effort
and when push comes to shove, often accessibility is kind of knocked out of the equation.
This idea of taking the personas which we all hold dear when we're developing, when
we're designing a UI, a UX, and thinking of them in a much more diverse way: think of
diverse personas. And AbilityNet have got a really good set of diverse personas that
weave into them a range of different impairments as well. And here's a picture of a phone.
We're going to look at how I use a smartphone in a second. And nothing has been more transformative,
you know, nothing more empowering since the advent of the PC back in the
80s than the smartphone revolution, because it's taken all that power, all of those
choices, all those input and output methodologies, and it's put them in your pocket, and it's
made them mobile, and in many cases wearable as well. So, I'm going to look at that and
then I'm going to finish off with... or rather the majority of my presentation is going...
this is just a preview by the way, this isn't what we're going to cover. It's going to be
sort of "blue sky" stuff and how mobile technology and AI is really benefiting people out and
about. And not just able-bodied people, but disabled people, the full spectrum. Let's
think of it as a spectrum, let's not think of it as different communities that you're
trying to cater for. I'm going to give you a quick blast on my phone. [Phone Menu]. So,
I don't know if you can see, I'm holding it upside down because that wire is really short.
So, I haven't got the screen on. If I do a three finger triple tap, (Screen curtain off.)
Now, the screen's on but I don't use the screen. (Screen curtain on.) So, I put the screen
curtain on. Just doubled my battery life right there. So, smartphones and particularly iPhones,
iOS, is very, very inclusive. It's not been an afterthought bolted on for people with
a disability. There are so many things. Have a look in the "Accessibility" options on an
iOS device and you'll be staggered. There's not just stuff for vision impairment, there's
stuff for people with a hearing impairment, there are drivers in here to directly drive
digital hearing aids. There are drivers in here to drive braille displays, you can have
mono audio out instead of stereo so that people who can only hear in one ear are able to get,
you know, both tracks, both sides of a stereo track. There's lots of custom gestures
that you can do for people that have difficulties doing... for example a three finger triple
tap you can reassign those. There are so many different really powerful ways of having options
on the input and output methodologies. So, I as a blind person use a touchscreen quite
happily by putting my finger anywhere on the screen. [Phone Menu] Do you want me to slow
down a bit? [Phone Menu] Let's have British and I'll go to... [Phone Menu] Speech rate
[Phone Menu] that's better. [Phone Menu] So, it said, "Mail 516... ... double tap to open."
So, I can just roam my finger around the screen and whatever it touches will be spoken out
to me, and then I've got hints on so it told me that I can double tap to open this particular
item. So, I could flick left and right and consecutively go through every item on a screen
for example. Now, they've really done a good job - "VoiceOver," which is the built-in screen
reading software. It's not a half-hearted gesture at a screen.
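For the developers in the room, what VoiceOver is reading out there is nothing exotic: the app simply exposes a label, a value, and a hint on each on-screen element. Here's a minimal sketch in Swift; the "Mail" tile and the unread count are illustrative, not Apple's actual code.

```swift
import UIKit

final class InboxViewController: UIViewController {
    // A hypothetical "Mail" tile; the unread count is illustrative.
    private let mailButton = UIButton(type: .system)
    private let unreadCount = 516

    override func viewDidLoad() {
        super.viewDidLoad()
        mailButton.setTitle("Mail", for: .normal)
        view.addSubview(mailButton)

        // These properties are what VoiceOver speaks when the user's finger
        // roams over this element, or when they flick to it.
        mailButton.isAccessibilityElement = true
        mailButton.accessibilityLabel = "Mail"
        mailButton.accessibilityValue = "\(unreadCount) unread messages"
        mailButton.accessibilityHint = "Double tap to open."  // only spoken when hints are on
    }
}
```

That's the whole contract: set those properties sensibly and the roaming and flicking behaviours I'm demonstrating come for free.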
(New notification from TweetList) Somebody's tweeting. (Laughter) (New notification from TweetList, TweetList.)
Shhh! Stop tweeting for a second, okay? (Laughter) So for example if I go into the Apple Maps
and I know that Apple Maps have left people stranded and starving and baking to death
back in Australia. But I won't hear a bad word said against it because I can go into an
Apple Map and I can literally move my finger along streets and it will tell me you know,
Waffen Road, North-South Road, and I can... (New notification from Clash of Clans) Shhh...
I can move my finger along it and it'll go "Bonk, bonk, bonk" as I'm going along the
road. And if I go off the road, it'll go "Bomp, bomp, bomp." So, I know that I can trace the
road or it will say, "Approaching junction with Littleton Road East-West Road. Now, adjoining
Littleton Road, East-West Road." And, as I say, I can go along that. I can literally explore
my neighbourhood. That isn't a bolt on, that isn't a half-hearted attempt. For example
the camera, you might think that you know, the camera is one thing that a blind person
on a smartphone would never use. Actually nothing could be further from the truth. You
know, the facial recognition which puts the boxes or a rectangle, I don't know, a circle
or something round on someone's face to say that they're in focus so that you can you
know, focus on someone's face. Well, when it detects that VoiceOver is running, it will
say, "One face top left," so I know that I can move it slightly in and it'd say "One
face centre," and then I can take the photograph and not make a hash of it.
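And if you wanted that same behaviour in a camera app of your own, the pattern is: check that VoiceOver is running, detect the face, post an announcement. A rough sketch using Apple's Vision framework; the thirds-based "top left" wording is my own approximation, not necessarily what Apple's Camera app does.

```swift
import UIKit
import Vision

/// Speaks where detected faces sit in the frame, but only for VoiceOver users.
func announceFaces(in image: CGImage) {
    guard UIAccessibility.isVoiceOverRunning else { return }

    let request = VNDetectFaceRectanglesRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation],
              let face = faces.first else {
            UIAccessibility.post(notification: .announcement, argument: "No faces")
            return
        }
        // Vision reports a normalised bounding box with a bottom-left origin.
        let x = face.boundingBox.midX, y = face.boundingBox.midY
        let horizontal = x < 0.33 ? "left" : (x > 0.66 ? "right" : "centre")
        let vertical = y > 0.66 ? "top" : (y < 0.33 ? "bottom" : "middle")
        let place = (horizontal == "centre" && vertical == "middle")
            ? "centre" : "\(vertical) \(horizontal)"
        let count = faces.count
        UIAccessibility.post(notification: .announcement,
                             argument: "\(count) face\(count == 1 ? "" : "s"), \(place)")
    }
    try? VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```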
So, these things are hugely powerful. I'll show you a quick app called (Flexi) "Flexi." (Double tap to
open.) I'll start with that. (30%, 25%) Let's try that, okay. Going to Flexi. So (Flexi.
Activate keyboard with a single tap before typing.) "Activate keyboard with a single
tap before typing." Now, I'd like to kind of emphasize the blurring between mainstream
and specialist. Flexi I think was designed to help people with a vision impairment type
more quickly on the iPhone. But now it's a free app. They're in talks with Apple about
making it an optional keyboard. And it's got huge mainstream appeal. I think most people
would like to be able to type as quickly as this. Let's have a go. Hang on a second. (Activate
keyboard with a single tap before typing and deleted it) Okay. (I am... ) I am demonstrating...
(... demonstrating a... )... a fantastic... (... fantastic)... typing... (... typing.
Sip, aap... )... app. (... called... ) called, Flexi (... feature... ) no... (... turkey...
Kosovo) (Laughter) (Flexi) (Capit... I am demonstrating a fantastic typing app called
Flexi.) "I am demonstrating a fantastic typing app called Flexi." So, as a blind person,
to be able to quickly input information accurately like that is a real boon.
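For those of you wondering how a keyboard like that plugs in system-wide: iOS custom keyboards go through UIInputViewController. A bare-bones sketch of just the mechanism, not Flexi's actual code:

```swift
import UIKit

// A bare-bones custom keyboard extension: real keyboards (Flexi included) lay
// out a full key grid and handle gestures, but the mechanism for getting text
// into the host app is just this.
final class MinimalKeyboard: UIInputViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let key = UIButton(type: .system)
        key.setTitle("A", for: .normal)
        key.frame = CGRect(x: 20, y: 20, width: 60, height: 44)
        key.addTarget(self, action: #selector(keyTapped), for: .touchUpInside)
        view.addSubview(key)
    }

    @objc private func keyTapped() {
        // textDocumentProxy is the channel into whatever text field
        // the user is editing in the host app.
        textDocumentProxy.insertText("A")
    }
}
```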
Because access is one thing, but just as UX is a real art, a real skill, simple access
isn't enough. Effective, productive, efficient access is what it's all about. And that's
where things like Siri come in. You've all seen Siri in action. We'll see a quick video comparing
Siri to Google's version of it in a moment; it's really, really powerful. (Home, Flexi) I
want to do one demo on the phone which is an app called "Money Reader." Let's... maybe
I'll launch it with Siri. (Open Money Reader.) Okay, I've got some money here. I don't know
what these notes are, let's have a look. I'm putting notes there, nobody touch them, okay.
So, here we've got a note. So, another example of why the camera is hugely beneficial... it's
going to tell me what that is. It's very folded. Let's try. (Twenty pounds) Okay. So, I don't
even have to take a photo. It just, you know, I just wave it underneath. And so that's an
example of sort of realtime object recognition, which, as a blind person, to be able to do that
on-the-fly you know, it's like having the camera on the phone which is always with you.
It's like having an intelligent eye always with you. And it can do product recognition,
package recognition, it'll tell me not only that it's you know, a packet of Kellogg's
Corn Flakes but it'll also tell me the nutritional information, it can give you serving instructions
about it you know, cookery suggestions, and it'll also tell you where you can get it cheaper
around the corner using GPS. (Laughter) So, just think you know, compare that with what
it must've been like for people before this sort of technology came along that wouldn't...
that aren't able to sort of access these sort of everyday convenient bits of information.
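Under the hood, an app like Money Reader follows a common pattern: stream the live camera frames into an image classifier and speak the top result the moment it changes, no shutter press required. Here's a sketch of that loop; the currency-recognition Core ML model passed in is a hypothetical stand-in, not the app's real implementation.

```swift
import AVFoundation
import UIKit
import Vision

final class LiveRecognizer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private var lastSpoken = ""
    // Wraps a hypothetical trained currency model supplied by the caller.
    private let request: VNCoreMLRequest

    init(model: VNCoreMLModel) {
        request = VNCoreMLRequest(model: model)
        super.init()
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.startRunning()
    }

    // Called for every camera frame; no shutter press needed.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixels = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        try? VNImageRequestHandler(cvPixelBuffer: pixels, options: [:]).perform([request])
        guard let top = (request.results as? [VNClassificationObservation])?.first,
              top.confidence > 0.8, top.identifier != lastSpoken else { return }
        lastSpoken = top.identifier
        // Speak "Twenty pounds" (or whatever the model saw) as soon as it changes.
        UIAccessibility.post(notification: .announcement, argument: top.identifier)
    }
}
```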
So, in many cases they are mainstream devices, mainstream apps, and in other cases there
are not. But we're seeing this blurring - the mainstreaming the inclusive nature of technology.
Here's a quick demo. We'll only look at the first couple of examples of comparing the
two AI virtual assistants and how quick and how well they do. (Music Playing in Background)
>> Voice from Video: Hey guys, Davey here from phonebuff.com and I'm just going to be
doing a comparison between Siri on the iPhone 5 and iOS 6 against Google Now on the Samsung
Galaxy SIII and Android 4.1.1 Jelly Bean to see which voice assistant comes out on top.
I will be asking each a series of 21 questions to see not only which one answers more accurately,
but also which one answers quicker. Because ultimately when you're using your voice, you
want a fast response otherwise you would type it in. It's worth noting that both devices
are connected to the same WiFi network. So, with that said let's go ahead and start the
test. What's the weather like in Las Vegas? (Google Now: It's 90 degrees in most of the
valley in Las Vegas.) (Siri: Okay, here's the weather for Las Vegas, Nevada through
Thursday.) Both answered accurately, however it's a little bit faster on Google Now. How
far away is Las Vegas from here? (Siri: Getting directions to Las Vegas.) Got our answer here.
Looks like navigation's opening up on the iPhone 5 on Siri. It got the answer, technically.
It took a little longer because it actually opened up you know, for navigation. I'm going
to hit end over there, so. Both were accurate however faster on Google Now again. Let's
ask another question. When is Mother's Day? (Google Now: Mother's Day is on Sunday May
12th 2013.) We got our answer already on Google Now. Looks like the iPhone 5 and Siri are
hung up right now. They're, you know, it's basically circling, searching for something
but hasn't given us a response just yet. >>ROBIN: I'll just pause it there. So, you
get the idea. They're both quite equivalent. I think Apple... well, Apple is always quite...
Siri is always quite laggy and I don't know whether that's the technology or whether it's
because more people are using Siri than are using Google Now. For me personally, the accessibility
on iOS is still quite significantly far ahead. So, I'm Siri all the way. Now, we're going
to look at a quick video of another really important use of the camera. This is for...
in this case, for people with a hearing impairment. This is using the iPhone 4. And so, we'll
see that you know...this came out pretty quickly after the iPhone 4 came out, and the really
very quick adoption of... in this case people with a hearing impairment using FaceTime to
do signing. >>Voice from Video: They're able to communicate
very easily with the iPhone 4. They're able to talk to each other. Now, if that's not
awesome I don't know what is. >>ROBIN: So, obviously a mainstream product but
being used in this case by people with an impairment. But you know, in America they have
a phrase: people without a disability, in the disabled community they call them "TABs" which
is Temporarily Able-Bodied, which means it's going to get you all in the end. So, but it's
true though, you know, we're all temporarily disabled at one point or other whether it's
you know, glaring sunshine or the fact that you have to use something Hands-Free for a
while because you're driving et cetera, et cetera, or maybe you've got tennis elbow or
something, so. This is a quick video clip of somebody using a communication... well
the iPhone as a communication aid. Now, is it a mainstream product, this iVoice app, or
is it aimed at people with a permanent communication need? In this case it's a blind date with someone
who's got laryngitis so they're temporarily sort of disabled and need a communication
device. [Music Playing in Background] >>Voices from Video: "Hey, I'm sorry I was late. Hey, I'm sorry I'm late. That
parking was crazy, I haven't been here before but." "That's okay, it's nice to finally meet
you." "It's good to meet you. That came out of your iPhone." "Yeah, I have laryngitis.
So, I will be communicating through my new iVoice application. I programmed all of my
responses for the night." "Okay, you mean you're writing it down somehow, and it's reading
it back to me?" "No, all my responses are pre-programmed for tonight." "So, you know
everything I'm going to say tonight?" "Well, I know how I'm going to respond to everything
you are going to say tonight." "Okay, then. What's my favourite colour?" "I don't know
what your favourite colour is, silly." "But you knew I was going to say what's my favourite
colour?" "I knew I was going to respond 'I don't know what your favourite colour is,
silly." "Well, it's blue in case you're wondering." "Oh, my brother's favourite colour is blue
too." "Okay, hold on. If you knew your brother... ."
>>ROBIN: Okay, I'm going to stop it there. Okay, so, again, is it mainstream? Is it specialist?
I don't care and I don't want you guys to care. I want you... what I want you to care
about is every project, every app, every marketing campaign, everything you do, it should be
for both. It should be inclusive, it should be done in such a way that there are options,
there are choices, and it's embracing the broadest most diverse set of audiences, customers
as possible. So, I'm going to talk about Project Glass now. And I'll quickly show you a demo
video first which came out last year. And everyone was extremely skeptical. But then
after that I'm going to show you a video which shows that actually they are delivering and
it's going to be coming out later this year. So, let's watch a quick video. It's through
the eyes of somebody who's wearing these glasses with their heads-up display, and their AI
built in, and their object recognition built in with a built-in camera, et cetera, et cetera,
et cetera, some of the things that we've been talking about earlier but now in the package
of a pair of glasses. [Music Playing in Background] [2min video with no caption]
So, obviously a very useful, very sexy device. We'll see what they look like in the flesh
in a moment. Very, very empowering for people you know, to be able to get all that information
instantly. There's a lot of artificial intelligence, real-life kind of object recognition going on there.
He looked up at the clock and it assumed that looking at the clock meant that he was interested
in time and so it gave him an appointment. He looked out of the window and it gave him the weather.
It could tell that he was going towards the subway and it told him automatically that
it was closed. So, it was using that realtime object recognition that I looked at a moment
ago with the Money Reader in a broader sense there. And it was using voice recognition,
you could say, "Music pause." He could do a video conferencing call with someone. So,
you know, it's a definite development to have that with you all the time. You won't be seeing
people walking down the street looking down at their phones like this, you will be seeing
them looking up into the top right corner. So, let's just watch a quick video of Joshua
Topolsky from The Verge tech blog, who a couple of weeks ago spent a day at Google Campus
looking at Project Glass. I won't show all of this but maybe the first minute or so [Music
Playing in Background] >>Voices from Video: "We're here at Google's
headquarters in New York City to see the company's Project Glass. The wearable computing device
that it's been working on for a couple of years. Nobody has really had the chance to
play with. I'm personally excited because I finally get to put it on and see what it's
like for the first time ever." "The people want to be connected and technology has enabled
that. But a problem is also that the technology can be a distraction and it takes
you out of the moment. What we think about on the team is how to solve that problem,
to have it there when you need it." "So, this is a fight against the, like, glancing your eyes at your smartphone."
"A key part of us solving that problem is bringing technology closer to your senses.
That's sort of a hunch we have. That if we did that, it would allow you to connect in
a faster way. But then a challenge we had, was the design challenge. How do we get technology
out of the way? If you're going to wear something on your face, how can you get technology
out of the way when you don't need it?" "I think when people see Glass, the big thing
is like, 'What is this weird object on everybody's face?' I mean it's... they're not... we call
them "Glass" but they're not glasses. I mean it's obviously something else." "So, what
were the design challenges of making this thing real?" "Well, I asked myself the same
question you know, 20 months ago when I first joined the team and I saw all these people
around. I came into the office and they were wearing this. So, what do you think?" "I think
it's good. I actually would prefer this to be honest with you. So, they do this I think."
"But this is the original prototype? This is the stuff that I was messing around with
when you got there?" "Yeah, and you know, it freaked me out a little bit I have to say.
But at the same time you know, the team is really, really passionate about this idea."
And I can sense that. And I could see the experiments of what was going on. And I you
know, I finally could wrap my head around this idea of... that we need to you know,
remove technology but still you know, wear it." "A little bit. Oh I see... what we really
want is, well I could see it fine. It says there... okay, 'Glass 325.' (Background Music
Stops Playing) It's oh... record a video. Okay, ohh... and I can see. So, this is seeing
what I'm seeing now? Like, because it is... if I had a desire to do that but I don't have
to." "No." "You don't have to." "You just live your life." "Okay Glass, take a picture."
"That's great, I mean that's very cool. So, you've got this kind of one size fits all
right now which is... ?" >>ROBIN: I'm going to pause it there I'm so
sorry. So, extremely useful, obviously as they say helps technology kind of get out
of the way but at the same time be omnipresent, probably even more present than it is when
it's in your pocket. But that for me is hugely exciting. Because okay, I can't access the
"heads-up display" but I'm hoping and we're advocating very strongly for this, that there
will be built-in speech output from within the glasses. Obviously there's multi-media
capability, it was playing music, you could hear the person on the other end of the video
calls, so there is a little mic, a little speaker in there, and I would like there to
be speech synthesis. Now I don't care if that's coming via Bluetooth from a smartphone, or
whether it's resident in the device. I just don't care, but I want it to be able to speak
to me so that I can use the omnipresent camera which is looking wherever I'm looking, to
tell me what's going on you know, when I'm in front of a particular store, when a road
is closed, you know, what package I've got in my hand. I won't even have to get my phone
out of my pocket for it to tell me those things. So, if it has speech synthesis built in, then
that's going to make it a more inclusive device for me.
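And the speaking half of that request is genuinely cheap. Once the camera and the AI have decided what to say, the synthesis is a few lines with the system synthesizer; the Bluetooth plumbing between the glasses and the phone is the part I'm waving my hands at.

```swift
import AVFoundation

// Minimal text-to-speech: all the companion app needs to do once the
// camera and recognition side has decided what to say.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-GB")
    synthesizer.speak(utterance)
}

// For example, driven by the object-recognition results:
speak("Approaching junction with Littleton Road.")
```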
For people, for example, with Asperger's or autism, who have difficulty interpreting facial expressions, here's realtime software
that, tests have actually shown, is more successful than humans at detecting or correctly recognising
people's emotions. Sixty-five percent of the time this software got it right compared to
fifty-six percent of the time for humans, which... I'm really surprised that we're so poor
at telling people's emotions. So, you could imagine: you've got your heads-up display, it's
looking at people's faces, doing the facial recognition; we know that there's good facial recognition
built into smart cameras these days. It's doing this analysis and, using the heads-up
display it's telling the person with Asperger's, "They're now getting upset. Please change
the subject," or you know, "They're happy, tell another one," you know. So, you could
imagine the truly empowering nature of this kind of technology.
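As a sketch of that pipeline, a camera frame feeding an emotion classifier feeding a one-line prompt for the heads-up display; the emotion model and the prompt wording here are entirely hypothetical.

```swift
import Vision

// The emotion classifier is a hypothetical Core ML model; a real implementation
// would first crop the frame to the detected face before classifying.
func hudPrompt(for frame: CVPixelBuffer, classifier: VNCoreMLModel) -> String? {
    let request = VNCoreMLRequest(model: classifier)
    try? VNImageRequestHandler(cvPixelBuffer: frame, options: [:]).perform([request])
    guard let top = (request.results as? [VNClassificationObservation])?.first,
          top.confidence > 0.7 else { return nil }
    switch top.identifier {          // label names depend on the model's training
    case "upset": return "They're getting upset. Change the subject."
    case "happy": return "They're happy. Tell another one."
    default:      return nil
    }
}
```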
So, it's obviously very sexy, got a huge mainstream application, but let's think about how we can
take these devices and make them truly inclusive. Similarly, it's obviously got a microphone built into
it for him to be able to issue voice commands. If that microphone had a slightly higher gain,
so that he could hear other people in the vicinity, then you could combine it with realtime
voice recognition which we have already, which is speaker-independent: you don't have to enroll,
you don't have to train it to understand your own voice, and then it could be picking up
the speech of people around you, it could be doing the realtime voice recognition and
it could be putting subtitles up on the screen for you as a deaf person. So, now you've got
realtime sort of subtitles for the rest of your life. So, that is hugely empowering if,
for example, the gain of the microphone is sufficient so that it doesn't just pick up your own voice.
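That subtitle idea maps almost directly onto speech APIs that already exist on phones. Here's a sketch using iOS's Speech framework as a stand-in for whatever Glass would use; updateSubtitle is a placeholder for drawing onto the heads-up display.

```swift
import AVFoundation
import Speech

// Live captioning sketch: microphone -> speaker-independent recognition -> display.
// updateSubtitle(_:) is a placeholder for drawing onto a heads-up display.
final class LiveCaptioner {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-GB"))!
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start(updateSubtitle: @escaping (String) -> Void) throws {
        // Real code must first ask permission via SFSpeechRecognizer.requestAuthorization.
        request.shouldReportPartialResults = true  // captions update word by word

        // Stream microphone buffers straight into the recognizer.
        let input = audioEngine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024,
                         format: input.outputFormat(forBus: 0)) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer.recognitionTask(with: request) { result, _ in
            if let text = result?.bestTranscription.formattedString {
                updateSubtitle(text)  // realtime subtitles for the wearer
            }
        }
    }
}
```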
So, those little things there would take Glass and make it a truly inclusive device. I heard
on the... a news article a while ago, a couple of weeks ago, that they have realtime dolphin
voice recognition as well. So, they dive with these things strapped to them - another example
of wearable technology. I haven't been able to find a video on it. And it translates what
you say into dolphin... . This isn't a joke. (Laughter) Okay. It translates what you...
It's not April first yet. ... into clicks and it translates them back for you. And
they showed it and they do some clicks and whistles, the dolphins, and then it said,
"I would like the blue ball." (Laughter) So, this stuff is you know, it's arrived. Here's
a quick video that I'm going to kind of play in the background. You all remember this,
right? [video starts] From a few months ago Felix Baumgartner went up in the balloon.
Let me skip forward to the bit where he actually sort of jumps out the... . (Music Playing
in Background) Does he jump yet? So, he jumps out of the balloon, at 25 miles up, and he
had a heads-up display just like... . He's going to jump in a minute. Put the volume
down so I can talk over the top. So, he jumps off this ledge, off his balloon, and plummets
to the earth, and he's got a heads-up display on his visor, a bit like Google Glass. But,
as soon as he starts plummeting down after he's got out of a spin that he managed to
get himself into, so, that could've been really nasty, but he managed to get out of that,
and then he was plummeting to earth. (Applause on video) And the thing that happened was
that his visor misted up. Obviously he couldn't sort of prepare for that aspect of it. His
breath misted up the visor so he couldn't use his heads-up display. So, what he did was
he pulled the cord very prematurely. And even though he got the world's highest free fall
he didn't get the world's longest free fall because he pulled it a lot sooner than he
would have liked. And it's just another example of where if you had redundancy, if you had
choices, say for example like I need Google Glass to speak to me because I won't be able
to use the heads-up display, if his visor had... his helmet had had the option of
voice output as well, then it wouldn't have mattered that his instruments had misted
up and he couldn't see his heads-up display because it would've told him his altitude
and he would have got both records as well. So, there's this blurring between mainstream
and specialist that I'd like you to really embrace. So, going to quickly look at a video
of some autonomous vehicles. So, Google Cars have been driving around the streets of America
for the last two and a half years without an accident using the technologies that have
been created by DARPA for autonomous vehicles in sort of a military environment and you
could appreciate why. That's something that we'd be interested in. But it's been... it's
matured, it's really been perfected, this technology. And then you're free... you're free
to drive autonomous vehicles in Nevada, they're licensed for Nevada. And here we've got a fleet
of taxis in Berlin. I'll just... . (Music Playing in Background)
>>Voice from Video: Autonomous Labs proudly presents: How to call on an autonomous taxi
in Berlin. >>ROBIN: So, he's in a hotel room, he calls
the taxi from his iPad app and the taxi you know, pulls over but without any driver in
there. And for me, this would be a mainstream device. Obviously its got huge applications
for everybody you know, to be able to drive around and to call taxis, call your own car
up to the drive. When you go to the city and you want to park somewhere and there's no
parking you can just send it off to find the nearest parking space et cetera. But for me
as a blind person, you know, I would definitely pay an extra £5,000 or £10,000 on the price of
a family car to be able to help... you know, participate in the driving, or at least be
driven somewhere without needing to have somebody else or to go by public transport. And this
would be a truly inclusive device or product if for example the iPad app was accessible
for me. If it was VoiceOver compatible then I would be able to call that taxi. But if
it wasn't then it's the whole you know, there's a... a chain, a link in the chain that's broken
for me and the whole thing doesn't work. When you get into the taxi, you know, when it takes
you from A to B, if there isn't voice feedback to tell me you know, "Approaching destination.
Okay, we're here now you can get out," that sort of thing, then if it just stops, I don't
know if we stopped at the destination or if we've just stopped at traffic lights and I'm
you know, reluctant to get out at traffic lights. When I've called the taxi and it's
kind of pulled up at the kerb, I need to know some way of, you know, "Where's my cab? How's
it going to tell me that it's arrived? Is it going to beep so that I can head off towards
it?" that sort of thing. And obviously for people you know, with a motor impairment,
you need to have a range of wheelchair-accessible taxis in that fleet of taxis as well.
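And to make that broken-link point concrete for the developers: the fixes are small, a spoken label on the call button and announcements as the journey progresses. A hedged sketch; the view controller, the asset name, and the status callbacks are all invented for illustration.

```swift
import UIKit

final class TaxiCallViewController: UIViewController {
    private let callButton = UIButton(type: .custom)

    override func viewDidLoad() {
        super.viewDidLoad()
        // "taxi-icon" is a hypothetical asset; an image-only button with no
        // label is exactly the kind of broken link in the chain described above.
        callButton.setImage(UIImage(named: "taxi-icon"), for: .normal)
        callButton.accessibilityLabel = "Call autonomous taxi"
        view.addSubview(callButton)
    }

    // Hypothetical callbacks from the vehicle's status feed.
    func taxiDidArrive() {
        UIAccessibility.post(notification: .announcement,
                             argument: "Your taxi has arrived at the kerb and is beeping.")
    }

    func taxiDidReachDestination() {
        UIAccessibility.post(notification: .announcement,
                             argument: "Approaching destination. It is now safe to get out.")
    }
}
```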
Going to quickly finish off with a couple of short videos of other examples of wearable technology
which in this case help overcome paraplegia. >>Voice from Video: Firstly, she's looking
good right? Standing tall. [Pause] As you can hear it's... there's not much sound here.
That's what's so important. >>ROBIN: I'll stop that one there and there's
another one where a woman was actually (Music Playing in the Background) doing the London
marathon. It took her 16 days but she made it in the end. Her husband was helping her.
>>Voices from the Video: "Although the finish line has long been cleared away and the crowds
have dispersed, Claire Lomas is still walking the London marathon."
>>ROBIN: So, again I tweet these links. >>Voices from Video: "Paralysed from the waist
down after a horse riding accident, she's making her way round the course with the help
of a bionic suit." "Crossing the roads makes it so much more difficult. I can't feel my
legs, so. It's looking down ... " >>ROBIN: Her husband's just sort of making
sure that she doesn't fall or anything. He's not actually helping her walk.
>>Voices from Video: "Umm, which I find quite challenging." "This is how Claire starts every
race day. Strapping herself into the robotic suit. Sensors are activated when she tilts
forward. And the legs start to move with her partner behind her for every step."
>>ROBIN: I'll just stop that one. So, this technology, it's here today. It's mature or
maturing at a rapid rate. And that for example was developed for again military purposes,
but we can see this application, this specialist application obviously very evident and very
powerful. AbilityNet are involved in several European projects. One of them is called "BrainAble."
And here's an example of how it's helping someone who can't even move any part of their
body to be able to use a robotic arm. (Music Playing in the Background)
>>Voices from Video: "You're watching the most advanced brain machine interface in action.
Cathy Hutchinson is paralysed and unable to speak but just by thinking she's able to control
the movements of this robotic arm and drink her morning coffee. She's part of a pioneering
study run by researchers at Brown University in the US." "People who are paralysed have
their brain disconnected from... ." >>ROBIN: And I'll stop it there, sorry. So, I just
wanted to give you a flavour of the breadth and range of different technologies and how
they're really transforming and empowering people's lives. Obviously, they have benefits
to everybody, a lot of what we've been showing you here. But they also have disproportionate
benefits. You know, I think I... I feel like I've disproportionately benefited from the
smartphone revolution and from the PC, the advent of the PC way back, more than other people have,
or maybe I just feel lucky, I don't know. But there are people out there who are...
who owe everything to this sort of technology. So, I just wanted to kind of help you bridge
that imagination gap and don't even think, don't even assume that there are any limitations
out there that good design can't overcome. Thank you. (Applause)