What’s New for App Developers in Android Auto (Android Dev Summit ’18)

[MUSIC PLAYING] KODLEE YIN: Hi. My name is Kodlee. RASEKH RIFAAT: And I’m Rasekh. And we’re both engineers
on the Android Auto team here to talk to you
about what’s new for app developers on Android Auto. Now we’re incredibly
excited at Google about the automotive
space right now because we see it going through
a huge transformation in connectivity,
electrification, interfaces and sensors, sharing,
and autonomy. Cars are rapidly turning into
full-blown computers on wheels. They’ve got high-speed
mobile connections, cameras, microphones, and screens of all
shapes and sizes everywhere. Android Auto is an
effort from Google and our automotive partners to
bring these advances together and create a safe and
seamless connected experience for drivers everywhere. Of course that’s
easier said than done. There are dozens of
different car platforms today, many different input
types from touchscreens to touchpads to
rotary controllers, many different screen shapes,
sizes, and resolutions. Today, you can see that
vision at work in any Android Auto-compatible car. Drivers have access
to their favorite apps right from their car’s display,
and developers build their app once without worrying about
different makes and models, input controls, and screens. Today, we’ll talk about
two of the most important app categories,
messaging and media. KODLEE YIN: Great. So first up is messaging. Messaging has come a long way
in both Android Auto and Android, the OS. When Android Auto started
supporting messaging, there wasn’t really a good
way for messaging apps to get their messaging
information over to the car. That’s where CarExtender
came into play. CarExtender allowed a
way for messaging apps to provide conversation
details and a way to reply to conversations
to Android Auto. But since Android N, apps could
stylize their notifications with something called
MessagingStyle. MessagingStyle is a huge
step up from CarExtender as it allows messaging apps
to provide conversation information directly
into the notification. Not only does it provide a nicer
UI for conversation details, but it provides affordances like replying and liking directly inline in the notification. Android Auto now
fully supports the use of MessagingStyle and
actions without the need for CarExtender. This also means Android Auto
and the Assistant both fully support group messaging. So for the price of
implementing MessagingStyle, apps not only gain a richer
mobile user experience but also gain the benefit
of automotive support. So let’s see how Android Auto
interfaces with this, starting on the messaging app side. From Android Auto’s
point of view, messaging apps have
three core functions– notifying users of messages,
marking those messages as read, and replying to those messages. Working backwards, apps can implement marking-as-read and replying with services. These services can be triggered internally with intents or externally, like via Android Auto, with pending intents.
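As a rough sketch of that wiring, the pending intent Android Auto fires might be built like this — ReplyService, the action constant, and conversationId are all hypothetical placeholders:

// Sketch: a PendingIntent targeting a hypothetical ReplyService.
// Android Auto fires this intent to trigger the app's reply path.
val replyIntent = Intent(context, ReplyService::class.java)
    .setAction("com.example.ACTION_REPLY") // hypothetical action
val replyPendingIntent = PendingIntent.getService(
    context,
    conversationId, // request code; keeps intents distinct per conversation
    replyIntent,
    PendingIntent.FLAG_UPDATE_CURRENT
)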
Notifying is done via an Android notification, and the messaging information is provided with the MessagingStyle. The mark-as-read and reply pending intents are wrapped in actions, and both are provided in the notification as well. Note here that the reply action has a remote input added that acts as a sort of
input field for the reply. And that’s the messaging
app’s architecture. Moving on to the other
side of the notification, we can see how Android Auto
leverages these objects. Android Auto will first
post an in-car notification and once tapped
on will read aloud the messages contained within. The mark-as-read pending intent is then fired. The user is given the choice
to respond and, if taken, a transcription of that response
is set in that remote input. The reply pending
intent is then fired. And that’s the entire
Android Auto flow, so let’s see how we
can put that into code. First, the app needs to declare support for Android Auto. To do that, it creates a new XML file that’s linked from the Android manifest. This XML file says that the app has notifications that Android Auto should take a look at. Note that for messaging apps that support SMS, MMS, or RCS, a uses-sms entry also needs to be added.
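A minimal sketch of that declaration — the file name automotive_app_desc.xml is a common convention, and the meta-data entry goes inside the manifest’s application element:

<!-- res/xml/automotive_app_desc.xml -->
<automotiveApp>
    <!-- Android Auto should look at this app's notifications. -->
    <uses name="notification" />
    <!-- Only for apps that handle SMS, MMS, or RCS directly. -->
    <uses name="sms" />
</automotiveApp>

<!-- AndroidManifest.xml, inside <application>: -->
<meta-data
    android:name="com.google.android.gms.car.application"
    android:resource="@xml/automotive_app_desc" />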
So now that Android Auto is taking a look at our messages, we can build up the MessagingStyle. We can’t really have a conversation without people, so the first person we have to add is the user of the device. To do that, we create
this new Person object. Person is used to set
things like the user’s name, their icon, and a
unique key in the event that multiple people
have the same name. So we create this deviceUser, and we create the messagingStyle with it.
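A sketch with the AndroidX compat APIs — the name, icon, and key values are placeholders:

// Sketch: the device user, then the MessagingStyle built around them.
val deviceUser = Person.Builder()
    .setName("Kodlee")        // the device user's display name
    .setIcon(deviceUserIcon)  // an IconCompat; assumed built elsewhere
    .setKey("device-user")    // unique key in case two people share a name
    .build()

val messagingStyle = NotificationCompat.MessagingStyle(deviceUser)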
We can then add our conversation information. So I’m from Seattle
and I love skiing, so I’m setting the conversation
title to “Ski group.” Because I’m taking multiple friends, this is a group conversation, so the messaging app needs to set it as such. Note here that the conversation title and whether or not the conversation is a group can be set independently. This is new in Android P and has been backported to older Android versions
in the Compat library. And finally, we can
add all the messages in this conversation in the
order they were received. In this case, my friend wants
to coordinate breakfast, so the messaging app provides
the text, the timestamp, and the sender in
the form of a person. With this conversation set up,
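Continuing the sketch — friendPerson stands in for a Person built the same way as deviceUser, and the message text is illustrative:

// Sketch: conversation details plus the messages, in the order received.
messagingStyle
    .setConversationTitle("Ski group")
    .setGroupConversation(true) // independent of the conversation title
    .addMessage(
        "Want to grab breakfast on the way up?", // message text
        System.currentTimeMillis(),              // timestamp received
        friendPerson                             // the sender, as a Person
    )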
With this conversation set up, it’s time to add the actions. For the reply action, we instantiate an Action.Builder and set the semantic action to SEMANTIC_ACTION_REPLY. We must also tell the OS that firing the reply pending intent won’t show any extra UI. This is especially important in Android Auto because we don’t want to be distracting drivers with extra pop-ups. Finally, the reply action is supplied with that remote input I talked about earlier.
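A sketch of that reply action, reusing the replyPendingIntent from earlier — the icon resource and RemoteInput result key are placeholders:

// Sketch: the reply action, with a RemoteInput as the "input field".
val remoteInput = RemoteInput.Builder("extra_reply_text")
    .setLabel("Reply")
    .build()

val replyAction = NotificationCompat.Action.Builder(
    R.drawable.ic_reply, "Reply", replyPendingIntent
)
    .setSemanticAction(NotificationCompat.Action.SEMANTIC_ACTION_REPLY)
    .setShowsUserInterface(false) // firing the intent shows no extra UI
    .addRemoteInput(remoteInput)  // holds the transcribed reply text
    .build()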
On the mark-as-read side, things are done about the same way. This time the semantic action is set to SEMANTIC_ACTION_MARK_AS_READ, and again we tell the OS that firing that pending intent won’t show extra UI. Note here that the mark-as-read action does not need a remote input.
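A matching sketch for mark-as-read, assuming a markAsReadPendingIntent built like the reply one:

// Sketch: the mark-as-read action; no RemoteInput needed here.
val markAsReadAction = NotificationCompat.Action.Builder(
    R.drawable.ic_mark_read, "Mark as read", markAsReadPendingIntent
)
    .setSemanticAction(NotificationCompat.Action.SEMANTIC_ACTION_MARK_AS_READ)
    .setShowsUserInterface(false) // again, no extra UI on firing
    .build()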
So that’s all three pieces. The notification can now be built. For reference, here are the three elements we created: the MessagingStyle, which holds all our conversation information; our reply action; and our mark-as-read action. To build the notification, some boilerplate is provided, and then we set the messaging style. We can then add our actions. Here is where the messaging app has some options. Note that the reply action is added as a regular visible action, and the mark-as-read action is added as invisible. This is purely stylistic; one can add both actions as visible or invisible, and this will just change how they show up in the mobile UI. On Android Auto, actions are never shown, but Android Auto will be able to read both visible and invisible actions. And finally, the messaging app can post the notification.
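Assembled, the boilerplate plus the pieces above might look like this — CHANNEL_ID, notificationId, and the small icon are assumed set up elsewhere:

// Sketch: building and posting the notification.
val notification = NotificationCompat.Builder(context, CHANNEL_ID)
    .setSmallIcon(R.drawable.ic_message)
    .setStyle(messagingStyle)             // all the conversation info
    .addAction(replyAction)               // visible action on mobile
    .addInvisibleAction(markAsReadAction) // hidden on mobile; Auto reads both
    .build()

NotificationManagerCompat.from(context).notify(notificationId, notification)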
And there we have it. My friends and I have planned breakfast on the road, and our ski trip is under way. RASEKH RIFAAT:
And now that we’ve coordinated with everybody,
let’s find something to listen to on the drive
out to the mountains. Media in the car is one of
our core user experiences, and getting drivers
access to their content should be front and center. I’m going to talk about several
new features we’re introducing today to enhance the
abilities of media apps to provide content
within Android Auto. In particular, we want to
make content more visually pleasing by adding additional
content-style hints and enabling additional search
results provided by the app. To start off, let’s go
over the architecture that an app has
when communicating with Android Auto. The first thing a media app
needs is a MediaBrowserService. It provides a tree of
playable and browsable items. Browsable items are basically
folders to organize app content instead of returning a giant
list of playable items. Media apps implement the onLoadChildren method, which loads a particular level of the tree. Here in our first call to onLoadChildren, our example service would return Home, Recently Played, Recommended, and Playlists.
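A sketch of that first level in a MediaBrowserServiceCompat subclass — ROOT_ID, buildBrowsableItem, and playableChildrenOf are hypothetical helpers:

// Sketch: onLoadChildren returning the top level of the browse tree.
override fun onLoadChildren(
    parentId: String,
    result: Result<List<MediaBrowserCompat.MediaItem>>
) {
    if (parentId == ROOT_ID) {
        // The top level of our example browse tree.
        result.sendResult(listOf(
            buildBrowsableItem("home", "Home"),
            buildBrowsableItem("recent", "Recently Played"),
            buildBrowsableItem("recommended", "Recommended"),
            buildBrowsableItem("playlists", "Playlists")
        ))
    } else {
        // Second (and final) level: the playable items themselves.
        result.sendResult(playableChildrenOf(parentId))
    }
}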
Now, since this is running in a car, we recommend that media apps provide only two levels in the tree, to avoid distracting drivers and making them click through multiple levels while they’re driving. Now once the user picks something playable from the browse tree, the media session is used to start playing music and to provide metadata and controls to show
what’s currently playing. For example, our media app
that we’re showing here supports play/pause, skip
forward, and skip back, and we show that in
the Android Auto UI. There’s also the
ability for media apps to provide their own
custom actions, maybe something like 30-second skip. And obviously we want to get
the user away from touching or doing things while
they’re driving, so we bring in the Assistant. The user might say something like, “Hey Google, play my ski jams.” The Google Assistant
performs speech recognition and can request that the
media session play the query, and music
starts playing. We’re going to take it
one step further today. We’re giving the
ability for media apps to implement an
additional function on the MediaBrowserService,
onSearch. And once the music has started
playing from a Google Assistant query, we’ll provide that
query to the media app, and they can provide
additional results. Here in this case, the media
app provided a ski trip playlist from this year as well
as one from last year. So let’s take a look at the
code needed to make this happen. For apps which already
support Android Auto, this should look
pretty familiar. This is the onSearch method. It takes the query
string, an extras bundle, and a result object which the
app fills in and sends back to Android Auto. First off, apps should
return an empty list if they get a query
they don’t support. Second, for queries that can’t
be answered synchronously, apps detach from
the results object, and that lets the
media framework know not to wait and not to send
anything back to Android Auto right away. This gives a chance for apps to
do extra work on a background thread before sending the
results to Android Auto. And finally, when the results are ready, the app sends them in the result object, and Android Auto will be notified and show the results on screen.
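Putting those three behaviors together, a sketch of onSearch — canHandle, searchExecutor, and doSearch are hypothetical:

// Sketch: onSearch in a MediaBrowserServiceCompat subclass.
override fun onSearch(
    query: String,
    extras: Bundle?,
    result: Result<List<MediaBrowserCompat.MediaItem>>
) {
    if (!canHandle(query)) {
        // Unsupported query: send back an empty list.
        result.sendResult(emptyList())
        return
    }
    // Can't answer synchronously: detach so the framework doesn't wait.
    result.detach()
    searchExecutor.execute {
        // Do the extra work on a background thread, then send the results.
        result.sendResult(doSearch(query))
    }
}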
Now all these code snippets come from the Universal Music Player, an open-source media app published by Google on GitHub. It can be easily cloned, compiled, and used as a great reference for building your own media app. So voila. Our media app returns a list of
items from the ski jams query. Notice it returned two
playlists and an album. It would be really
nice if Android Auto could group those items and
show them to the user as groups. Fortunately, we’re
introducing a way to do that in the onSearch results. Here’s an example function
which your media app might use to convert from an internal
representation of a media item into the MediaBrowserCompat
media item that Android Auto needs. We can annotate items
with a category extra, and Android Auto will
group any adjacent items with the same category
and show a heading. For the two ski trip playlists, we can annotate with “playlist,” and Android Auto will group them together and add the heading for you.
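A sketch of that conversion — Track is a hypothetical internal model, and the extra key shown is the content-style group-title hint as later documented for Android Auto (treat it as an assumption here):

// Sketch: converting an internal model into a MediaItem with a group hint.
fun toMediaItem(track: Track, groupTitle: String): MediaBrowserCompat.MediaItem {
    val extras = Bundle().apply {
        // Adjacent items sharing this title are grouped under one heading.
        putString("android.media.browse.CONTENT_STYLE_GROUP_TITLE_HINT", groupTitle)
    }
    val description = MediaDescriptionCompat.Builder()
        .setMediaId(track.id)
        .setTitle(track.title)
        .setExtras(extras)
        .build()
    return MediaBrowserCompat.MediaItem(
        description, MediaBrowserCompat.MediaItem.FLAG_PLAYABLE
    )
}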
We’re also adding some additional annotations on media items that would be
really useful on our trip. For example, I may be
heading out to the mountains with my family. I worry that maybe
a song comes on that has some explicit content. We now add the ability to say,
OK, this particular playlist or song has explicit
content, and Android Auto can show that in the UI. Similarly, out in
the mountains, I might not have great bandwidth. I’d love to know if the
playlist or songs have already been downloaded,
or maybe I don’t want to burn my data on
music that I’m playing. We can also annotate whether or not media items have been downloaded and are already on the device.
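A sketch of those two annotations on a media item’s description extras — the download-status constants come from MediaDescriptionCompat, while the explicit-content key and value are the ones later documented for Android Auto (an assumption here):

// Sketch: flagging explicit content and download status on a media item.
val extras = Bundle().apply {
    // Mark the item as containing explicit content.
    putLong("android.media.IS_EXPLICIT", 1L)
    // Mark the item as already downloaded to the device.
    putLong(
        MediaDescriptionCompat.EXTRA_DOWNLOAD_STATUS,
        MediaDescriptionCompat.STATUS_DOWNLOADED
    )
}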
Great. Looks like the Ski Trip 2018 playlist is already downloaded and doesn’t have any explicit content. Great choice for my trip
out to the mountains. There’s one more function
that needs updating. The MediaBrowserService
onGetRoot is called when a
media app is first connected to by Android Auto. In order for search and the additional styling hints to be enabled, you’ll need to add a couple of extras to let Android Auto know that you support those features.
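A sketch of onGetRoot with the search-support extra — BROWSABLE_ROOT is a hypothetical root ID, and the key is the one used in the Universal Music Player sample:

// Sketch: advertising onSearch support in the BrowserRoot extras.
override fun onGetRoot(
    clientPackageName: String,
    clientUid: Int,
    rootHints: Bundle?
): BrowserRoot? {
    val extras = Bundle().apply {
        // Tell Android Auto this service implements onSearch.
        putBoolean("android.media.browse.SEARCH_SUPPORTED", true)
    }
    return BrowserRoot(BROWSABLE_ROOT, extras)
}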
As I mentioned, we’re introducing the concept of additional content
styling, and Android Auto will be interpreting
the browse tree returned by apps in a much more
visually pleasing way. By default, items which are
browsable, like folders, will be interpreted as lists. This is how we do things today. But for playable items,
things like songs or albums or playlists, we’re going to
be showing them now as grids. Most of these items have much richer visual content that users can identify at a glance, which is easier than reading text and much safer when you’re in the car. There are, however, times when a list is better than a grid. For example, in a podcast app, each of the individual podcasts would probably have its own art that is visually representative, while the episodes would all have the same art but different titles, lengths, and statuses, so it would be much better to show them as a list. In the onGetRoot
function, the media app can provide a hint to Android Auto to say, “I prefer my browsable items to be grids and my playable items to be lists,” or vice versa.
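Those hints are also extras on the BrowserRoot; a sketch using the keys and values from the Universal Music Player sample (1 = list, 2 = grid):

// Sketch: content-style hints in the BrowserRoot extras.
val styleExtras = Bundle().apply {
    putBoolean("android.media.browse.CONTENT_STYLE_SUPPORTED", true)
    // Show browsable items (folders) as grids...
    putInt("android.media.browse.CONTENT_STYLE_BROWSABLE_HINT", 2)
    // ...and playable items as lists.
    putInt("android.media.browse.CONTENT_STYLE_PLAYABLE_HINT", 1)
}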
So the app has full control over how we’re showing the items. I already mentioned the Universal Music Player. I just want to reiterate: it’s
a great, comprehensive sample media app that’s available. It gives you a canonical
implementation of a media app that actually plays
music, and it also supports Android Auto as
well as other services like Wear and Android TV. And if you are
developing a media app, I also encourage you to
check out the Android Media Controller, another open-source
app hosted on GitHub. It will connect to
your app’s MediaSession and MediaBrowserService, and
it shows you information that your app is
presenting to Android Auto in a clear, semantic format. If you’re using whitelisting to
block apps other than Android Auto from accessing
your browse tree, it’s probably a good idea to
either add the Media Controller to the whitelist or disable
the whitelist while testing. So to sum up, we’ve
shown code samples for MessagingStyle,
notification actions, providing search results
with MediaBrowserServiceCompat.onSearch, attaching new extras for media item metadata, and declaring support for content styling and search in the onGetRoot extras. KODLEE YIN: So great. We look forward to seeing all
of your messaging and media apps in the car. Rasekh and I will be
available tomorrow morning at office hours to
answer any questions you have about Android Auto. Thank you all for watching. [MUSIC PLAYING]
