AUSTIN ROBISON: All right.
Good morning, everybody.
It's great to see a really packed house in here.
It looks like standing room only.
We're really excited today to talk
to you guys about Android Wear.
I'm Austin Robison.
I'm a product manager with the team.
JUSTIN KOH: And I'm Justin Koh.
I'm an engineering lead on the team.
AUSTIN ROBISON: Excellent.
And we have been so thrilled with the excitement
that we've gotten from you guys throughout the rest
of the conference.
I think the third floor area where the devices
are out there has been mobbed the entire time that I
have walked by.
So thank you for all of your support,
and we're really excited to talk about what
we're doing with Android Wear.
So I wanted to start with a quick overview of what
Android Wear is all about.
And we'll dive into some code and really show you
what it's like to build applications for Wear.
So we all love our smartphones.
They're rich and engaging, and they
can deliver really deep, engrossing experiences.
But I think this is a picture that we can all relate to,
sitting around, we're all focusing on our phones
and not on the people around us.
So technology can be distracting,
and here's where we saw an opportunity with Android Wear
to make this better.
Android Wear delivers information just when
you need it and just in a glance.
And really, that means you can be more connected
to the people around you and the people who
aren't with you at that time.
So if you look at that in a timeline,
this is something that I think we have all experienced.
You feel your phone buzz in your pocket, you pull it out,
and you're like, OK, well, I got some email notifications here.
You go to your inbox, and next thing you know,
you spend several minutes distracted
from the world around you, really engaged
with what you're doing, but not necessarily
present with what's going on.
So with Android Wear and having these glanceable notifications
on your wrist, you can have more interactions throughout the day,
each one shorter and less distracting
from the world around you, while still
getting more information about what's happening.
So to deliver this new interaction model,
we created an entirely new UI.
Now, what that is is a vertical stream of contextually
ranked cards.
So you have the most important information for you
at the top of the stream, ready for you
to glance down and take a look at it.
This is completely automatic from the Android perspective,
with your notifications bridged across.
It also has Google Now cards, and importantly, it
has your third party content, which
is what we're going to be talking about today.
So earlier this year, we launched
a developer preview that allowed you to get
started experimenting with what it would be like
to bring your app's experience to the wrist.
So this allowed you to take your notification objects
on your phone and really be able to augment
them to have additional pages of information,
to be able to bundle these things together in stacks,
and to be able to have voice replies.
So what we're making available today
is an SDK that allows you to extend those experiences even
further, where you're running code directly
on the wearable using all of the familiar Android tools
that you are already used to using.
So we're going to introduce a few APIs that
allow you to send data back and forth between the wearable
and your phone, as well as generate custom UI
and integrate with Voice Actions through intent filtering,
just like on the phone.
So we've been battle testing these APIs
with dozens of third party partners.
And we've had some really great apps be created.
You've seen several of them featured in the keynote.
You've seen several of them in previous sessions.
I'm going to walk through a couple more
here, just to give you an idea of the types of things that
are now possible with this SDK.
So Runtastic is a social fitness platform.
And their Wear integration helps users control those sessions
from their wrist.
So you can say, OK, Google, start running.
And they'll post a notification to that stream that
has a running timer of your current run.
They can also show new UI inside of these cards
with a new platform feature called Display Intents that
allow you to embed an activity directly
into these cards for drawing your own UI.
On the right here we also see a piece
of data that was generated on the phone, your route,
which can be synchronized across
automatically.
And we'll go into the details of how you do that.
Another example, Philips Hue.
They are connected light bulbs that
allow you to change their colors from your phone.
There's a display out on the third floor
here, really fantastic.
Maybe they're down on the second floor, sorry.
And what their app does is when you come home,
it posts this contextual card to the stream
that allows you to then control your lighting.
They also can launch a full screen activity
to draw their own UI to have a more immersive experience
while you're controlling that app,
and then get back to the stream.
So it's important to note here that we're really focusing apps
around micro-interactions.
So we want users to be able to get into your feature,
complete a task, and then get back
to that stream as quickly as possible.
If there's an action that requires deeper integration,
we want to forward that user back to the phone.
And we've also made that very easy for you.
One more example here, Glympse, which
is a social location sharing application.
So you can send your location to your friends.
On the wearable, we can send contact information
from your phone across, so you can have a list of contacts
to look through to be able to send those locations.
When you receive a Glympse, you can also then see in real time
the progress of your friend coming to you.
So let's take a look at what we're
going to be looking at for the rest of the talk here.
So Justin just got this adorable new puppy,
and so we've built a little slide control app here
for showing some pictures of that puppy.
So we've got a phone and a wearable up here.
When we start on this slide show,
we can post a notification to the stream on the right here.
As you scroll through the ViewPager on the left,
we're updating in real time on the right
the title of the slide, as well as the number of the slide.
We also have an action that allows you to go backwards.
Tapping on that first piece also allows
you to advance the slides on the phone.
So we're using it as a remote control.
And we're using all of these new APIs
to provide a seamless experience across both devices.
So if we take a look at what it takes
to build this application, when we write
an app for a phone or a tablet, typically we
have one APK per app.
But with Android Wear, we're extending that concept
to expand across both devices.
So we have an APK that runs directly on the phone,
and we have an APK that runs directly
on the wearable, and both are part of your app.
Google Play services is what brings these two things
together and where we're going to talk about a new set of APIs
that allow you to send data between the two
and facilitate that communication.
So the wearable services consist of three APIs, a Data API,
a Message API, and a Node API.
These are all callable from either side, so
from the wearable or from the phone.
And all of your data is private to your application
across the link.
So if we dig into here first and look at the Node API,
this is used for discovering when
nodes come into and out of a connected state.
So when we have a pairing established
between a watch and a phone, for example,
we can be notified when we come into connection range.
So you might use this in your app
to know when a wearable is present to toggle features
on and off.
We might display an icon to show that you're
connected to a wearable.
It's a very simple API for just being
aware of when the wearable is present.
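As a rough sketch, listening for that connectivity might look like this; the class name is illustrative, and the GoogleApiClient is assumed to already be connected with the Wearable API added:

```java
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.Node;
import com.google.android.gms.wearable.NodeApi;
import com.google.android.gms.wearable.Wearable;

public class WearableConnectionMonitor implements NodeApi.NodeListener {
    private final GoogleApiClient mClient;

    public WearableConnectionMonitor(GoogleApiClient connectedClient) {
        mClient = connectedClient;
        // Start receiving connect/disconnect callbacks for paired nodes.
        Wearable.NodeApi.addListener(mClient, this);
    }

    @Override
    public void onPeerConnected(Node node) {
        // A wearable came into range: enable wearable features, show a connected icon, etc.
    }

    @Override
    public void onPeerDisconnected(Node node) {
        // The wearable dropped out of range: hide the icon, disable those features.
    }
}
```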
Built on top of that is the Message API.
So this is a low latency mechanism
for sending byte arrays back and forth between the devices.
And they can be filtered on paths,
which Justin will explain in detail later.
So in the puppy example, when we are
tapping on the card in the stream
or tapping that Back button to go backwards,
we're calling sendMessage to send a simple control back
to the phone, which receives that message
and then moves the ViewPager.
The Data API is the biggest piece
of this new wearable service.
It's a replicated data store.
You can think about it as a shared cache.
So we have the ability to put data into this,
and take data out of it, be notified
when it changes on either side, and we take care
of synchronizing that data between devices.
So you don't need to worry about devices that are intermittently
connected, what to do when you have extended
periods of not being connected.
We do all of the hard work there.
And all you have to do from the outside is put data
into the store and take it out.
So in the puppy example, we might
put the title of the slide and that slide number
into the store.
Google Play services replicates that
across to the wearable, where we can then
update that notification in real time.
So we can have the data sent back and forth there.
So now I'm going to turn things over
to Justin for a little bit, who's
going to walk us through using these APIs
and really dig into some of the code, which
I know you're all excited about.
Take it away, Justin.
JUSTIN KOH: Thanks a lot, Austin.
Whoa, little drinking problem there.
So you'll have to excuse me.
We just got our new puppy, and I'm incredibly allergic to her.
So I've developed a bit of a cough.
So if you see me about to go like this,
everyone kinda ear muffs.
So let's take a look at the overall building blocks that
are going to make up our application.
On the left-hand side here, we see
we have our ViewPager that's going to have a couple
fragments, these slide fragments.
And when the user slides from left to right,
we're going to update our data item.
We're going to update the index from 1 to 2.
We push this into the data store.
And that event is going to get over to the wearable.
And we're going to update a notification when that happens.
So to begin, we need to download Android Studio
and create a new project.
So I've created a new project here called Slide Control.
You can see that Android Studio now supports multiple form
factors, all the form factors that we talked about yesterday.
Wear is one of those.
So you just click the check box here,
and Android Studio is going to take care of packaging up
this glanceable portion of your APK, or this watch APK,
into your phone application so that it can be distributed
through the Play Store.
Let's start by setting up the phone UI.
So like I said, this activity is a ViewPager,
and it has a couple fragments in it,
with a background image and the title of the slide.
And these are all the files that make up
the two different applications.
Again, on the left, we've got the phone APK files in green.
On the right, we've got the wearable APK files in blue.
Let's take a look at the presentation activity.
And this is the ViewPager one.
And we can see that the first thing we need to do
is to create a Google API client.
That's your entry point into all of these Google Play services.
So the new API here is the Wearable API.
So I'm going to build my client and stash it
away somewhere so I can use it throughout my application's
life cycle.
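That setup might look roughly like this minimal sketch, with error handling and connection callbacks trimmed for space:

```java
import android.app.Activity;
import android.os.Bundle;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.Wearable;

public class ViewPresentationActivity extends Activity {
    private GoogleApiClient mGoogleApiClient;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Build the client with the new Wearable API and keep it for the activity's lifetime.
        mGoogleApiClient = new GoogleApiClient.Builder(this)
                .addApi(Wearable.API)
                .build();
        mGoogleApiClient.connect();
    }

    @Override
    protected void onDestroy() {
        mGoogleApiClient.disconnect();
        super.onDestroy();
    }
}
```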
Next thing I want to do is wire up my ViewPager
so that when those events happen,
we can update the Data Store.
And I do that by creating an OnPageChangeListener
and calling updateDataItem.
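The wiring itself might look like this small sketch, assuming a ViewPager field named mViewPager; updateDataItem is the helper built up over the next few paragraphs:

```java
// Inside onCreate(), once the ViewPager (android.support.v4.view.ViewPager) is set up.
mViewPager.setOnPageChangeListener(new ViewPager.SimpleOnPageChangeListener() {
    @Override
    public void onPageSelected(int position) {
        // The user swiped to a new slide, so push the new state into the data store.
        updateDataItem(position);
    }
});
```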
Let's take a look at data items.
So like we said, URIs are everything in this Data API.
And we do that because it's really easy
to create them and parse them using existing Android classes.
So in this case, I am creating a data item
that's going to manage the state.
What is the current index and the current title
of the presentation slide that I'm on?
So I create this /state path.
And internally, the Data API is going to create this fully
qualified URI, wear://, nodeid, and state.
Now, you don't have to worry about that as much
as developers.
You can just worry about the paths, et cetera.
But knowing that this is the format of the URI
will help you later when it comes time
to reconstitute these URIs on the other side.
Data items are byte arrays at their base,
but we know that working with byte arrays
can be not that flexible.
It's really nice to be able to work at the key value pair
level, like a bundle.
So we've created a key value pair
object called the Data Map, which stores primitive types.
Unlike a bundle, we won't be storing anything
like parcelables or binders in there
because this item is going to be sent
to all of the nodes in our network.
So in this case, I create a data map,
and I'm going to put in these primitive types--
the current index, the total number of slides,
and the title of the current slide we're on.
Finally, I just insert it into the store.
And it's going to get synchronized
to all of the connected nodes.
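Assembled, updateDataItem might look something like this sketch; the key names and the presentation helper calls are illustrative assumptions, not the exact sample code:

```java
import com.google.android.gms.wearable.DataMap;
import com.google.android.gms.wearable.PutDataMapRequest;
import com.google.android.gms.wearable.Wearable;

private void updateDataItem(int index) {
    // Create (or overwrite) the data item at the /state path.
    PutDataMapRequest request = PutDataMapRequest.create("/state");
    DataMap dataMap = request.getDataMap();
    dataMap.putInt("index", index);                           // current slide
    dataMap.putInt("count", mPresentation.getSlideCount());   // total slides (hypothetical helper)
    dataMap.putString("title", mPresentation.getSlideTitle(index));
    // Insert it into the store; Google Play services syncs it to all connected nodes.
    Wearable.DataApi.putDataItem(mGoogleApiClient, request.asPutDataRequest());
}
```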
On the wearable side, we want to be able to receive this.
And we want to be running a service to do so.
Now, we don't want all of the different applications
to have to be running constant services all the time just
to listen for a message.
And so what we've done is we've created this abstract class,
the WearableListenerService, which
implements all of the different callback methods
that you're going to be using with these wearable APIs.
That's NodeListener, MessageListener,
and DataListener.
So I have subclassed this here in
a PresentationListenerService, and I just
insert that into my Android manifest.
By doing so, the Google Play services APK
is going to manage the life cycle of this service for me
so it's not always running and consuming
resources, precious resources, on my wearable.
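A minimal sketch of that subclass and its manifest registration (the manifest entry is shown as a comment so the whole snippet stays in one language):

```java
import com.google.android.gms.wearable.DataEventBuffer;
import com.google.android.gms.wearable.MessageEvent;
import com.google.android.gms.wearable.WearableListenerService;

public class PresentationListenerService extends WearableListenerService {
    // Override only the callbacks you need; Google Play services binds and
    // unbinds this service for you, so it isn't running all the time.
    @Override
    public void onDataChanged(DataEventBuffer dataEvents) { /* see below */ }

    @Override
    public void onMessageReceived(MessageEvent messageEvent) { /* optional */ }
}

// In AndroidManifest.xml:
// <service android:name=".PresentationListenerService">
//     <intent-filter>
//         <action android:name="com.google.android.gms.wearable.BIND_LISTENER" />
//     </intent-filter>
// </service>
```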
Let's go over here to PresentationListenerService.
And you'll see that the DataListener interface
has one method, onDataChanged.
When I get that, I'm going to check the path because there
may be multiple messages in my application.
And I want to make sure I'm doing
the right thing with the right data item.
So if the state, remember we set this up as the state,
is the data item's path, then I'm
going to either create or update the notification,
if it's just being changed.
And if it's being deleted, I want
to remove that notification from the stream
because it's no longer relevant to the user.
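That onDataChanged logic might be sketched like this, assuming a NOTIFICATION_ID constant and a handleStateChanged helper like the one described next:

```java
import android.support.v4.app.NotificationManagerCompat;
import com.google.android.gms.wearable.DataEvent;
import com.google.android.gms.wearable.DataEventBuffer;
import com.google.android.gms.wearable.DataMapItem;

@Override
public void onDataChanged(DataEventBuffer dataEvents) {
    for (DataEvent event : dataEvents) {
        // Only react to the /state item; other paths get their own handling.
        if (!"/state".equals(event.getDataItem().getUri().getPath())) {
            continue;
        }
        if (event.getType() == DataEvent.TYPE_CHANGED) {
            handleStateChanged(DataMapItem.fromDataItem(event.getDataItem()).getDataMap());
        } else if (event.getType() == DataEvent.TYPE_DELETED) {
            // The slide show ended on the phone, so the card is no longer relevant.
            NotificationManagerCompat.from(this).cancel(NOTIFICATION_ID);
        }
    }
}
```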
In handleStateChanged, we're going to extract our data map.
We're going to get all of those primitives out of there,
and we're going to post our notification.
Now, I know this is a lot of code.
I'm going to walk through it step by step.
What are the things that we want to do?
We want to get an intent back into our service
when the user taps that first card.
So to do so, it's just like Android.
I'm going to create a PendingIntent to our service.
And I'm going to use an action on this intent
to determine what the user wants to do.
Do they want to advance the slide,
or do they want to go previous?
Next thing I'm going to do is create
a NotificationManagerCompat object,
which is in the support library.
So building this basic notification
is just like building a notification on the phone.
I've got my title.
I've got my text.
And like Austin said, we're going
to setOngoing because this represents an ongoing process.
Finally, we're going to augment or extend our notification
with the new APIs that exist in the support library.
So I'm going to create a new action which
will be the advance action.
By default, if I were to have created a content
intent, on Android Wear we would render this as a blue circle.
But if you notice from the demos,
we wanted to be able to just tap the card and advance the slide.
So I've created my action here with a drawable and just
some text.
And I've called setContentAction.
That's going to hoist that action on to the card.
And if you remember from the design talk that you
may have seen, Designing for Wearables,
we're going to want to make that icon blue so
that the user knows that they can tap the card.
That's going to be the signal.
When they see blue, that's an actionable thing.
Finally, we extend our builder.
And that's going to create that notification that you see.
Now of course, with even more code,
we can add in the previous action.
We just create a different intent with a different action,
with a different action on the intent,
and then we create a different action on the notification
itself.
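Condensed into one sketch, handleStateChanged might look roughly like this; the action strings, resource ids, and key names are illustrative assumptions rather than the exact sample code:

```java
import android.app.PendingIntent;
import android.content.Intent;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;
import com.google.android.gms.wearable.DataMap;

private void handleStateChanged(DataMap state) {
    // Tapping the card should come back into this service as an "advance" command.
    Intent advance = new Intent(this, PresentationListenerService.class)
            .setAction("com.example.slidecontrol.ACTION_ADVANCE");
    PendingIntent advancePending = PendingIntent.getService(this, 0, advance, 0);

    // A second action, shown as a button on the next page, goes backwards.
    Intent previous = new Intent(this, PresentationListenerService.class)
            .setAction("com.example.slidecontrol.ACTION_PREVIOUS");
    PendingIntent previousPending = PendingIntent.getService(this, 1, previous, 0);

    NotificationCompat.Builder builder = new NotificationCompat.Builder(this)
            .setSmallIcon(R.drawable.ic_slides)
            .setContentTitle(state.getString("title"))
            .setContentText(state.getInt("index") + " of " + state.getInt("count"))
            .setOngoing(true)  // the slide show is an ongoing process
            .extend(new NotificationCompat.WearableExtender()
                    .addAction(new NotificationCompat.Action(
                            R.drawable.ic_advance, "Advance", advancePending))
                    .addAction(new NotificationCompat.Action(
                            R.drawable.ic_previous, "Previous slide", previousPending))
                    // Hoist the first action onto the card itself, so a tap advances the slide.
                    .setContentAction(0));

    NotificationManagerCompat.from(this).notify(NOTIFICATION_ID, builder.build());
}
```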
So as a reward for getting through all that code,
here's a picture of a puppy.
So on the phone side, we're going
to want to receive some messages.
So first of all, we have to send the message.
So in this case, we're going to see
that I'm going to tap on the Previous Slide button,
and it's going to go back there.
And that's still going to be in PresentationListenerService.
Remember, when that pending intent fires,
it's going to come back into our service.
And we check the action of the intent that came in.
And I'm going to just send a control
message using the Message API.
So I take a Google API client where
I've added the Wearable API, and I'm just going to send 1 byte.
So it's either going to be 0 to go forward or 1 to go back.
And I'm going to send this to a specific node, the node that
owns the data item.
Remember, I can get this by parsing
the path of the data item.
And I'm going to send this on its own specific path, which
we're going to call the control path.
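As a sketch, that might be handled in onStartCommand like this, assuming the service stashed a connected GoogleApiClient and the state data item's URI earlier:

```java
import android.content.Intent;
import com.google.android.gms.wearable.Wearable;

@Override
public int onStartCommand(Intent intent, int flags, int startId) {
    // Fired when one of the notification's PendingIntents comes back into the service.
    byte command = "com.example.slidecontrol.ACTION_ADVANCE".equals(intent.getAction())
            ? (byte) 0   // 0 = advance
            : (byte) 1;  // 1 = go back
    // The state data item's URI looks like wear://<nodeId>/state,
    // so the host portion is the node that owns it (the phone).
    String ownerNodeId = mStateUri.getHost();
    Wearable.MessageApi.sendMessage(
            mGoogleApiClient, ownerNodeId, "/control", new byte[]{command});
    return START_NOT_STICKY;
}
```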
Over on the phone side, we make ViewPresentationActivity also
implement a MessageListener interface.
That means that it's going to implement
this onMessageReceived.
So if it is a control message, we just take that first byte
and we decide whether we're going to advance the slide
or go to the previous slide.
When that happens, we update the ViewPager.
We just call setCurrentItem on the ViewPager.
And then that's going to fire the OnPageChangeListener, which
will update the data item and come all the way back around.
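On the activity, that handler might look something like this sketch, assuming the activity was registered with Wearable.MessageApi.addListener once the client connected:

```java
import com.google.android.gms.wearable.MessageEvent;

@Override
public void onMessageReceived(MessageEvent messageEvent) {
    if (!"/control".equals(messageEvent.getPath())) {
        return;
    }
    byte command = messageEvent.getData()[0];
    final int newIndex = (command == 0)
            ? mViewPager.getCurrentItem() + 1   // advance
            : mViewPager.getCurrentItem() - 1;  // previous
    // Make sure the view update happens on the UI thread.
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            // This fires the page-change listener, which calls updateDataItem
            // and closes the loop back to the wearable.
            mViewPager.setCurrentItem(newIndex);
        }
    });
}
```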
So this is a really good pattern.
You want to make sure that your data items are sort of owned
by one of the nodes in your system.
In this case, the data item is owned by the phone.
We could have tried to edit it on the wearable side
and have that synchronize back and forth.
That gets a little messy.
It's much better to have the data item owned by one of the nodes,
and then you send messages to update it,
and then you just have this virtuous cycle
going around and around.
So as we see here, when we get our message,
we are going to just go back to the previous slide.
So let's talk a little bit about extending this use case.
Now, if I'm giving a longer presentation,
it might be good to have this notification in the stream.
I want my watch to be able to time out back to the home screen
so I'm not burning power.
But if I'm giving a much shorter, a more
condensed presentation, I would maybe
want to control what activity is on top so that, as a user,
I can just kind of slide around and not really
have to worry about what I'm doing.
And to do this, I'm going to create a custom activity that's
running on the wearable.
Let's take a look at that.
So in the slide we see we can slide over,
and we have another action here called the Full screen.
And when I tap on it, I'm going to bring up
an activity on the wearable that has a ViewPager.
And these are synchronized.
So when I swipe over here, it's going
to update the ViewPager running on the phone.
It also has some other things that I
might want for my presentation like the current time,
and maybe I want to go red when I'm going over time,
or things like that.
So let's talk about how to get this data, the image
and the titles, over to the wearable.
We want to use the data layer.
We want to do it efficiently.
We could just keep updating our state data item,
but that means that every time it changes,
we're going to have to send all that data
over and over and over again.
So instead, I'm going to store my data in several data items.
And they're each going to have their own unique path.
So it's going to be wear://, the node ID, then /slide/ followed by the index.
And so you can see here I've got three different data items.
And over here on the wearable, I use
that to inflate some card fragments for my activity.
We're going to call this the SlideControlActivity.
So let's see how we do that.
In our SlideControlActivity, we have a very simple layout.
It's just a ViewPager.
But remember, I want to make sure
that this doesn't timeout to the home screen.
And I do that by setting android:keepScreenOn to true.
Again, just like you would do on a phone
if you had some sort of media playing application.
Now, remember, with great power comes great responsibility.
You want to be kind to people's batteries.
You don't want to be in a situation
where the user might accidentally end up
in this full screen activity and it doesn't efficiently
go to sleep.
So we want to watch out for-- with keepScreenOn, you probably
want to keep track of the user's timeout.
You might want to make it settable.
And if the user hasn't interacted
with the device for a certain period of time,
just finish your activity.
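One way to sketch that is a simple idle-timeout Handler inside the activity; the 30-second value here is just an example and could be made user-settable:

```java
import android.os.Handler;

private static final long IDLE_TIMEOUT_MS = 30000;  // example value; could be user-settable
private final Handler mIdleHandler = new Handler();
private final Runnable mFinishRunnable = new Runnable() {
    @Override
    public void run() {
        finish();  // hand the screen back to the watch face
    }
};

@Override
protected void onResume() {
    super.onResume();
    mIdleHandler.postDelayed(mFinishRunnable, IDLE_TIMEOUT_MS);
}

@Override
public void onUserInteraction() {
    super.onUserInteraction();
    // Any touch resets the countdown.
    mIdleHandler.removeCallbacks(mFinishRunnable);
    mIdleHandler.postDelayed(mFinishRunnable, IDLE_TIMEOUT_MS);
}

@Override
protected void onPause() {
    mIdleHandler.removeCallbacks(mFinishRunnable);
    super.onPause();
}
```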
We also want to key activities to a user's context.
So remember, with PresentationListenerService,
when the user's finished viewing the slide show on their phone,
we removed the notification.
Similarly, when the user's finished viewing the slide show
on their phone, we also want to exit this activity
if it's running.
And the final note is to watch out for wakelocks.
It's very important on wearables with the smaller battery
capacity to just really be cognizant
of your users' batteries.
So let's take a look over here in SlideControlActivity.
And we've set this up very much like
the PresentationListenerService.
So we've got our handleStateChanged method here.
We're going to extract our presentation data,
and we're just going to set the ViewPager that's
running on the wearable to be the same index
as the index on the phone.
We also want to be able to control it from the wearable.
So I create another OnPageChangeListener,
and now I'm going to send a different type of message.
Instead of that control forward and backwards message,
I'm going to send a new type of message
on a different path called slide.
And here the byte is just going to say,
what is the desired index that I want to have?
Back on the phone side in the ViewPresentationActivity,
if I see the message coming in on the slide path,
then I'm going to parse that out from the byte,
and I'm just going to send it there.
Once again, that will then trigger the OnPageChangeListener,
which will trigger updateDataItem, and everything goes around.
Virtuous cycle.
And here we see the results of that.
There we go.
So by default, Android Wear activities
can be dismissed by swiping left to right.
This even works with our ViewPager.
If we swipe over a few times and swipe back,
everything works the way you expect.
When I'm on the first slide, if I swipe from left to right,
I'm going to dismiss that activity.
But remember, I'm in the middle of a high stakes presentation
here, and I don't want to accidentally do
that if I'm moving around.
So when we want to override that behavior,
it's definitely possible.
And it's possible without a line of Java code.
It's just updating a style and your Android manifest.
So we come over here, and you create
a theme in your application that inherits
from the device default theme.
We're still running KitKat on the wearable,
and so we don't have these material themes yet.
So we're Theme.DeviceDefault.
You should definitely use those in your applications.
We have a new attribute here called
android:windowSwipeToDismiss.
By setting this to false, you're going
to disable that default behavior of being
able to swipe an activity away.
Then I just set that as my theme in the activity.
Now, I don't want the user to get locked into my activity,
though.
They should know how to get out.
By default, all Android Wear devices,
you can put your palm on the screen
to go back to the home screen, but that's not necessarily
as elegant as we'd like.
We'd like to establish a pattern for these full screen
applications.
And what we've come up with is a long press.
So when you long press, we're going
to bring up this red circle.
And then if you tap it, we're going to finish the activity.
We've created a support class to make it very easy for you
to do this in your own applications.
So I'm going to update my layouts,
and I'm going to create a DismissOverlayView so
that it just hangs out on top.
And it's going to be hidden by default.
Over here in SlideControlActivity,
I just need to get a reference to that overlay view
and then use a GestureDetector.
I could've used a long click listener
if I had a different type of view,
but with ViewPager, it's easier to use a GestureDetector.
So again, just like using it on the phone,
I create a GestureDetector.
I'm going to feed motion events to it,
and when I get the long press, I'll
show my DismissOverlayView.
DismissOverlayView is going to take care of everything else.
When the user taps on it, it's going to exit the activity
and go back to the stream.
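Pieced together inside SlideControlActivity, that might look like this sketch; the layout and view ids are assumed names:

```java
import android.support.wearable.view.DismissOverlayView;
import android.view.GestureDetector;
import android.view.MotionEvent;

private DismissOverlayView mDismissOverlay;
private GestureDetector mGestureDetector;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_slide_control);
    mDismissOverlay = (DismissOverlayView) findViewById(R.id.dismiss_overlay);
    // On long press, show the red dismiss button; tapping it finishes the activity.
    mGestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
        @Override
        public void onLongPress(MotionEvent e) {
            mDismissOverlay.show();
        }
    });
}

@Override
public boolean dispatchTouchEvent(MotionEvent ev) {
    // Feed every touch to the detector so long presses are caught anywhere on screen.
    return mGestureDetector.onTouchEvent(ev) || super.dispatchTouchEvent(ev);
}
```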
Final thing we want to talk about
are assets-- getting picture data over to the wearable.
Obviously, that's what we're going
to want to do to create these rich, awesome experiences
for our users.
So everything that we're doing is
in the name of efficiency-- efficiency
for battery life, et cetera.
So let's say I had an app that used a contact photo.
That's a very common use case.
And say that somebody else wanted
to use that exact same contact photo.
Well, we don't want to have to transmit multiple copies.
So by putting these assets into our data layer,
we will take a hash and only send one copy over.
Now on the other side of the link,
these applications don't really know that another application
was using that same asset.
They just see it as their own private copy.
So it's all secure.
It's all there.
It's great.
We're going to use those assets with our puppy app,
although who else is going to be using pictures of my puppy?
We're going to want to not send the full resolution image over,
however.
So the resolution on the Nexus 5 is much higher
than the one on the wearable.
So I'm going to want to make sure that
I'm sending only the image data that's
necessary for display on the smaller device.
So to do that, we go into ViewPresentationActivity.
And here we're going to call this function
for each of the slides in our presentation.
I'll probably do this in an asynchronous task
when I'm loading it up.
I'm going to create my data items at a known URI,
so in this case slide/index.
I'm going to put the title in.
And now I'm going to create an asset.
All of my slide objects know how to create
a smaller version of themselves, just as a byte array.
From there, I send that to Asset.createFromBytes,
and I put it into the map.
And then it's going to intelligently send that over
to the wearable.
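A sketch of that per-slide helper, assuming a shared Slide object with a hypothetical createThumbnailBytes method that produces a downscaled image:

```java
import com.google.android.gms.wearable.Asset;
import com.google.android.gms.wearable.DataMap;
import com.google.android.gms.wearable.PutDataMapRequest;
import com.google.android.gms.wearable.Wearable;

private void putSlideDataItem(int index, Slide slide) {
    // One data item per slide, at a known path: /slide/0, /slide/1, ...
    PutDataMapRequest request = PutDataMapRequest.create("/slide/" + index);
    DataMap dataMap = request.getDataMap();
    dataMap.putString("title", slide.getTitle());
    // Scale the image down first; the watch doesn't need a Nexus 5-sized bitmap.
    byte[] thumbnailBytes = slide.createThumbnailBytes();  // hypothetical helper
    dataMap.putAsset("image", Asset.createFromBytes(thumbnailBytes));
    // Play services hashes the asset, so identical bytes are only transferred once.
    Wearable.DataApi.putDataItem(mGoogleApiClient, request.asPutDataRequest());
}
```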
On the wearable side in the SlideControlActivity,
I'm going to again use an asynchronous task, or a loader,
to load this up.
And here's where we see the importance of the paths.
Since I know that it's going to be wear:// et cetera,
I can send these URIs around and build upon them in order
to derive queries for these data items.
And so here I'm going to load up each of the different data
items from the data layer.
I'm going to extract the asset.
And from those bytes, I'm going to create a bitmap
that I can then use to update my UI.
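That load, which must run off the main thread because await blocks, might be sketched like this:

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import com.google.android.gms.wearable.Asset;
import com.google.android.gms.wearable.DataApi;
import com.google.android.gms.wearable.DataItem;
import com.google.android.gms.wearable.DataMapItem;
import com.google.android.gms.wearable.Wearable;

// Call from an AsyncTask or a Loader, never from the main thread.
private Bitmap loadSlideBitmap(Uri slideUri) {
    DataApi.DataItemResult result =
            Wearable.DataApi.getDataItem(mGoogleApiClient, slideUri).await();
    DataItem item = result.getDataItem();
    if (item == null) {
        return null;
    }
    Asset asset = DataMapItem.fromDataItem(item).getDataMap().getAsset("image");
    // getFdForAsset resolves the hashed asset into a readable file descriptor.
    DataApi.GetFdForAssetResult fdResult =
            Wearable.DataApi.getFdForAsset(mGoogleApiClient, asset).await();
    return BitmapFactory.decodeStream(fdResult.getInputStream());
}
```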
And that's what we got.
So now I'm going to turn it back to Austin.
AUSTIN ROBISON: All right.
Thank you, Justin.
[APPLAUSE]
So we hope that's given you a taste of what's
possible with these new APIs and with this new form factor
and has sparked some ideas for your own apps,
or maybe even new apps.
So we're all very excited to see what you go and create.
For more information, you guys can visit
developer.android.com/wear, download the SDK,
and get started there.
There's an emulator.
You can use it with the devices that you got at the giveaway.
And you can get started there.
We also have even more information about these APIs
available as I/O Bytes.
Those are up on YouTube right now.
You can go check them out.
And with that, we will thank you, and we'll take questions.
There's microphones right in the aisle ways here.
Thank you.
[APPLAUSE]
Yes?
AUDIENCE: You mentioned that the wearable leverages
the Google Play services.
So does this mean that we can't connect
to Kindle devices and that kind of thing?
AUSTIN ROBISON: So the connection
is driven by Google Play services.
So we are compatible with any Android device running
4.3 or greater with the Play Store.
AUDIENCE: Thank you.
AUSTIN ROBISON: Let's go to this side.
AUDIENCE: Hi.
Is it possible to leave your custom activity on and let
the device go to sleep, or does that automatically
exit to the home screen when the wakelock is released?
JUSTIN KOH: It's going to automatically time out
after 30 seconds to the home screen.
AUDIENCE: Gotcha.
So there's no way to just leave the app on,
like say you're doing a presentation,
and you want to let it go to sleep while you're talking,
and then you open up and then you want to swipe again?
JUSTIN KOH: So that's where you would probably
use the notifications that we showed
in the first part of this.
AUDIENCE: Oh, gotcha.
Gotcha.
OK, great.
Thank you.
AUSTIN ROBISON: Mhm.
Over here.
AUDIENCE: You guys talked about the watch
as basically just a remote control and the UI,
but what about when the watch is providing like gyroscope,
or movements, and heartbeats and things like that?
How are you dealing with sensors?
JUSTIN KOH: So remember that it is
Android running on this wearable.
So you register a sensor listener for the type of sensor
that you want to consume.
And then remember that these APIs
are callable from both sides of the link.
So you can collect sensor data on the wearable
and then send it to your instance
of your app running on your phone.
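For instance, a rough sketch of relaying accelerometer samples from the wearable to the phone over the Message API; the path, payload format, and field names are just illustrative:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import java.nio.ByteBuffer;
import com.google.android.gms.wearable.Wearable;

// Inside a wearable-side activity or service with a connected GoogleApiClient
// (mGoogleApiClient) and the phone's node id (mPhoneNodeId) already in hand.
SensorManager sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
sensorManager.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        // Pack x, y, z into 12 bytes and ship them to the phone over the Message API.
        byte[] payload = ByteBuffer.allocate(12)
                .putFloat(event.values[0])
                .putFloat(event.values[1])
                .putFloat(event.values[2])
                .array();
        Wearable.MessageApi.sendMessage(mGoogleApiClient, mPhoneNodeId, "/sensors", payload);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}, accelerometer, SensorManager.SENSOR_DELAY_NORMAL);
```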
AUDIENCE: So you mentioned that you can install discrete apps
on the phone themselves.
Right?
So how do you switch between app to app?
Where's the menu, if you will, or where's
the listing of all the apps that you have on your phone?
AUSTIN ROBISON: So the UI is really
designed to be post-grid.
So it is intentionally missing that grid of apps.
So we encourage developers to think about putting cards
into that stream when they're relevant
or to register for Voice Actions.
You can also say, OK, Google, start app name,
and be able to bring that application up on demand.
You can also get to that through the cue card, which
is the touch menu when you tap on the device.
AUDIENCE: So just a follow-up question.
Are there preset voice commands, or can you
define your own voice command?
AUSTIN ROBISON: The voice commands
are part of the platform.
So on developer.android.com there's
some documentation that can tell you
how to register for the action intents that are available.
We also would love to hear your suggestions
for extending those actions to other phrases or other use
cases.
AUDIENCE: OK.
Thank you.
AUSTIN ROBISON: Mhm.
AUDIENCE: So it might be related to the sensor question.
Can I get access to voice time series?
JUSTIN KOH: For what sort of application?
AUDIENCE: Say I want to record memos.
JUSTIN KOH: You will want to make sure
that the codec is available on the device.
And there are different codecs that
are available on these devices than you
might expect from the phone.
So what we do for our voice is we actually
stream the bytes using the Message API over to the phone,
and then use the phone to process the data.
Just to be more efficient.
AUDIENCE: As a developer, I'm able to do that too?
JUSTIN KOH: Yeah.
So you can set up an audio record and get raw bytes
and send those.
AUDIENCE: Perfect.
AUSTIN ROBISON: You can also use the voice recognition
intent, if you just want to record
and have us do the transcription to get the string back,
but if you want the raw bytes, then you can get the raw bytes.
JUSTIN KOH: But watch out for the codecs.
AUSTIN ROBISON: Yes?
AUDIENCE: Hi.
So my question is about the way you
instantiate a lot of this stuff.
Everything you've shown here touches activities and passes
in this.
Notifications tend to mostly live
outside the activity in my mind, right?
So the context that you're passing in,
does it have to be an activity, or can I
run it from a service or an application?
JUSTIN KOH: It can run from a service,
or yeah, the application context.
In fact, using the application context
is usually a good thing.
AUDIENCE: OK.
So a quick follow-up then.
Also about the bitmaps.
I mean, in my experience, it's always
been so dangerous to play with the byte arrays,
and especially with BitmapFactory,
unless you're using something like Picasso
to handle a lot of that for you.
So in your recommendation, you're
saying go launch an AsyncTask to do the bitmap
stuff in the background.
Is there any potential that you're
going to be working towards offering an actual framework
around that to help us with memory management, as opposed
to us having to rely on outside sources?
JUSTIN KOH: Well, on developer.android.com
there are a series of developer articles
on dealing with large bitmaps.
For example, using LRU caches, et cetera.
And that's our current recommendation.
AUDIENCE: OK.
Thanks.
AUSTIN ROBISON: Yeah?
AUDIENCE: So my question is about Glass.
Glass kind of seems separate from all this wearable.
So if we're targeting Glass and Android wearables,
do we need to develop separately for that?
And GDK, the Glass SDK, it seems not as capable
as this, what we just saw.
So what's the plan to reconcile this?
AUSTIN ROBISON: Sure.
So the Wear team and the Glass team
work very closely together.
In fact, yesterday we announced that you'll
be able to take the Android Wear notifications,
so the augmenting APIs, and have those appear directly on Glass.
So we're absolutely working together
to make sure that we have a consistent set of APIs
across all of our wearables in the company.
And there'll be more coming soon.
AUDIENCE: OK.
So does Android Studio support for wearables [INAUDIBLE]
here in different form factors?
Is that going to extend to Glass as well
so we have one APK that runs or that
has components that support Glass as well?
JUSTIN KOH: It'll be one project, multiple APKs.
And in the slide that I showed, there is a check box for Glass.
AUDIENCE: OK.
Thank you so much.
JUSTIN KOH: And I forgot to mention this in my talk,
but yes, these are going to be-- all Android projects,
you can create an Android library
project to store shared code.
So in my example, I have a presentation object
and a slide object that I use on both sides.
I put that into an Android library project and link
both of those from my phone project
and from my wearable project.
AUDIENCE: Hi, yes.
I was wondering if there is going
to be future support for wearables
and connecting with Android TV, or Android Auto,
other of those components for connecting devices like that?
AUSTIN ROBISON: So the same notifications
that I just spoke about Glass will also
function with Android Auto.
So we are absolutely thinking about this
as a pan-Android compatibility.
So we definitely are working very closely
with all of those other teams.
We're working towards a really compelling feature
that we'll be able to tell you about in the future.
JUSTIN KOH: Right.
I think what's important to note here
is that, like we said yesterday in the keynote,
the phone is like your brain.
And so you can control your phone,
and then your phone's driving your TV,
or it's driving your car, and that's
what really links it all together.
AUSTIN ROBISON: Yeah?
AUDIENCE: Are we going to be able to hard set the sample
rate and the sensitivity for the IMU
that's on the wearable devices?
JUSTIN KOH: For the what on the devices?
AUDIENCE: The IMU.
Like the accelerometer, gyroscope, and compass.
JUSTIN KOH: You can use the Android APIs
for setting the update rate.
AUDIENCE: OK, but I mean the sensitivity itself.
Because the gyroscope and accelerometer
have different sensitivities built in,
and they are indicative for different sort
of movement states.
AUSTIN ROBISON: If you can do it on a phone,
you can do it on the wearable.
AUDIENCE: OK.
Thanks.
JUSTIN KOH: Can you do it on a phone?
AUDIENCE: No.
JUSTIN KOH: All right.
AUSTIN ROBISON: All right, feature request.
Yes?
AUDIENCE: Two small things.
Can you request resources directly
from the internet on the wearable,
or are they all just passed through the device?
JUSTIN KOH: We recommend passing it through the data layer.
And that's because we're not supporting a full HTTP
stack on the wearable.
AUDIENCE: I gotcha.
And do you get screen on, screen off events from the wearable
as well?
JUSTIN KOH: You do.
AUDIENCE: OK.
And you could just register for them
in the manifest or broadcast receiver or something?
JUSTIN KOH: You can.
What are you trying to do, though?
Sounds like you're trying to do something.
AUDIENCE: Yeah, I got an idea brewing.
JUSTIN KOH: All right.
AUDIENCE: Let you know after.
JUSTIN KOH: Sounds good.
Looking forward to it.
AUSTIN ROBISON: All right.
Yeah?
AUDIENCE: What do you recommend for testing?
Will Espresso have support for wearable for automated testing
or anything like that?
Or is there something coming up?
JUSTIN KOH: Is that like a UI automation package?
AUDIENCE: Yeah, the Espresso and things like that.
JUSTIN KOH: We do use uiautomators in our testing
internally.
So any of the same tools that would
work on a phone or a tablet would work on the wearable.
AUDIENCE: Do you need to have both the phone emulator
and the wearable running or just one of them?
JUSTIN KOH: You mean if you wanted
to set up like an integration test
with like phone talking to wearable?
AUDIENCE: [INAUDIBLE] What do you guys recommend
for testing, I guess, for those kind of things?
AUSTIN ROBISON: So there's a couple different ways.
You can, with the SDK, have your phone
connected to your computer over USB.
That can either then be communicating
with a wearable emulator, which is a really good testing set
up, or you can have it then communicating
with a real device, as well, that can be connected over ADB.
So the support of having all of those different devices
in different states is there in the SDK.
AUDIENCE: All right.
Thank you.
AUSTIN ROBISON: Yes?
AUDIENCE: Do you have a way to stream audio
directly from the wearable to the phone?
So if I want to make a telephony app, I can do that?
JUSTIN KOH: Well, in the devices that we announced yesterday,
there are no speakers.
So you'd only be able to talk but not get anything back.
But I think there was an earlier question about getting access
to the raw bytes.
AUDIENCE: Right.
But that won't be streaming.
Or can it be streamed?
JUSTIN KOH: Through [INAUDIBLE]?
We are streaming in our application,
using the Message API.
AUDIENCE: OK.
That's it.
Thank you.
AUSTIN ROBISON: Yes?
AUDIENCE: Could you talk a little bit more
about the existing new voice commands and the process
will be for applying for voice commands?
I was just trying to check it out online,
and one of your voice command pages gives a 404.
AUSTIN ROBISON: The developer pages
are being updated right now, or at least
they were before I got on stage.
There's a page on there that will
list all of the platform intents.
So there are I think about a dozen
in there that support things from call
a car, to start a run, to take a note.
You register for those as you register for any other intent.
If you'd like to request more of them,
you can either speak with your dev rel representative,
or you can go to the G+ community,
where we'll have a form that you can fill out to request
additional Voice Actions.
So that's +Android Wear Developers.
AUDIENCE: So is that going to be the same set of Voice
Actions we have on Glass as well?
Because they have their own voice
command application process.
Is that--
AUSTIN ROBISON: Today they are separate.
AUDIENCE: OK.
Thank you.
AUSTIN ROBISON: Mhm.
AUDIENCE: Hi.
What is the AOSP story for Android Wear?
AUSTIN ROBISON: You want to take that one?
You know that one?
All right, I know that one.
OK.
So the parts of Android Wear that are built on top of KitKat
and are built on top of the existing AOSP parts
will be open source, just like any other release of Android.
The Google Play services and our proprietary first party apps
will not be open source, so it's just like on the phone.
AUDIENCE: So I won't be able to build my own AOSP-based phone?
Watch, I mean.
AUSTIN ROBISON: Correct.
AUDIENCE: Thank you.
AUDIENCE: Hi, so I think this is great stuff,
and I appreciate the fact that Google is thinking
about disconnecting people from their devices
just so they can be more human, basically.
But my question is, this kind of wearable stuff still needs
the phone to be in proximity.
I want to know what happens if the phone is not in proximity.
So one scenario would be let's say I want to go for a run.
And I don't care about taking calls or anything
at that point, but if a friend texts me and says,
hey, I'm going to run late for the next thing,
then I might want to see that.
Or might want to reply that, hey, OK, I'm
going to take a longer run, in that case.
Whatever.
So how does that interaction happen
and if there is room to do that right now,
or if there is a plan for future?
AUSTIN ROBISON: So today the devices
use Bluetooth for connectivity.
So when you are out of Bluetooth range with your phone,
then you have no connectivity, but you can still
access the data items that are in your data store.
For example, you can still do a few actions
on the device itself through touch.
In the future, we definitely think
about a connected constellation of devices on your body.
So in the future, we will explore this.
I think that's the direction that the industry's going
in-- more and more connectivity.
There are power trade-offs to be made here.
But it's definitely something that we think about.
AUDIENCE: OK.
So just one question.
I mean, you said we can still do a few things when it's out
of Bluetooth range, but whatever I
do it's not going to-- I mean, if I, say,
send a text to someone, it's not going to actually send it
because it can't connect to the phone.
JUSTIN KOH: That's correct.
AUDIENCE: Yeah.
OK, awesome.
Thank you.
AUSTIN ROBISON: Mhm.
AUDIENCE: Yeah, so following on the HTTP stack question,
I know this is basically an Android Wear OEM agreement as
far as things like hardware they have to support.
Are you guys going to release-- can we expect a GPS?
I know heart rate is going to be on some devices
but not on others.
I'm just curious as to what specific hardware is going
to be guaranteed in every Wear-compatible device.
AUSTIN ROBISON: So in the beginning here,
having the accelerometer IMU package is part of it.
The heart rate is part of the platform.
It is optional.
And in the future, we will be narrowing that down
and releasing more information.
AUDIENCE: OK.
Is there anything you can mention today as far as GPS?
AUSTIN ROBISON: Not today.
AUDIENCE: Not today.
All right.
Thank you.
JUSTIN KOH: All right.
Last question, I think.
We're getting the wrap it up signal.
AUDIENCE: You're going to post the puppy app?
JUSTIN KOH: Yes.
I need to finish it.
I want to.
AUDIENCE: It's a good point to start from.
Thanks.
JUSTIN KOH: We have plenty of samples
as well, like with different parts of the puppy app.
The puppy app kind of draws them together,
but we have samples written by some of our great team members
sitting here in the front row which
will be targeted at some of the different things
that we're doing.
AUSTIN ROBISON: All right.
Last last question.
JUSTIN KOH: Super last question.
AUDIENCE: You mentioned no speakers on their thing,
but I'm wondering if you guys have integrated TalkBack
so that it could work on a device that
does have a headphone input.
AUSTIN ROBISON: Not today.
AUDIENCE: Will you be planning on any kind of accessibility?
AUSTIN ROBISON: Accessibility is definitely
something that is very important to the platform,
and we'll be improving upon that in future releases.
AUDIENCE: OK.
JUSTIN KOH: All right.
AUSTIN ROBISON: All right.
JUSTIN KOH: Thanks, everybody.
AUSTIN ROBISON: Thank you, everybody.
[APPLAUSE]