Testing Rebooted (with AndroidX Test) (Android Dev Summit ’18)


August 12, 2019


[MUSIC PLAYING] JONATHAN GERRISH: It’s great
to see so many people here interested in testing. My name’s Jonathan Gerrish,
and together with my colleague, Yuki Hamada, we’re going to be
presenting a session on testing APIs today. So just to get
started, just hands up, anyone who’s written unit tests. It’s not a trick question. Good. And what about
integration tests? You can be honest. Awesome. OK, let’s get started. So on Android, there’s
two kinds of tests you might be familiar with. There’s local unit
tests and then there’s instrumentation tests. So let’s start by looking
at local unit tests. So these are tests
that are executed on your local
developer’s workstation on the local VM running there. And because you don’t need to
run the entire Android build chain– you avoid dexing and packaging
and installing on a device– these tests are actually
really, really fast. For these kinds of tests, you can use a tool like Robolectric, which comes with its
own set of testing APIs for setting up the state of
your Android environment, or you can use a
tool like Mockito and you can stub
out the interactions with the Android framework. Either way, they allow you to write tests that set up the state of the environment to satisfy the preconditions you might want in your test case.
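To make the stubbing idea concrete, here is a minimal sketch of a local unit test in the Mockito style. `GreetingFormatter`, `R.string.hello`, and the greeting logic are hypothetical stand-ins invented for illustration, not code from the talk:

```kotlin
import android.content.Context
import org.junit.Assert.assertEquals
import org.junit.Test
import org.mockito.Mockito.mock
import org.mockito.Mockito.`when`

// Hypothetical class under test: it touches the Android framework
// only through the Context it is handed, so the framework can be stubbed.
class GreetingFormatter(private val context: Context) {
    fun greet(name: String): String =
        "${context.getString(R.string.hello)}, $name!"
}

class GreetingFormatterTest {
    @Test
    fun usesTheStubbedFrameworkString() {
        // Stub out the framework interaction instead of running on a device.
        val context = mock(Context::class.java)
        `when`(context.getString(R.string.hello)).thenReturn("Hello")

        assertEquals("Hello, Ada!", GreetingFormatter(context).greet("Ada"))
    }
}
```

Because nothing here needs a device, a test like this runs on the local JVM in milliseconds.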
The second kind of tests are instrumentation tests. Now, these are the
tests that will run on a virtual or real device. A real device could just
be a phone connected to your workstation or it
could be a farm of devices somewhere in the cloud. These kind of tests
run slower because you have to execute the whole
build chain and install an application onto
the device, but they have the advantage
of being a lot more accurate because the
real or virtual device is very similar, or in some
cases, identical to devices that your users will be
using out in the field. And this brings
you the confidence that your app is going
to behave as you expect. One criticism we have heard is
that on these kinds of tests, there’s actually a lack
of testing APIs available, which makes it difficult
for you to set up the state of your environment
in a way that satisfies certain preconditions or edge
cases that you might want to be testing. And so we’ve heard
you loud and clear, and this is something
we’re actively working on. So a bit of a little
history lesson. In 2017 at Google
I/O, we presented what we called the
Android testing story. So it was based loosely around
the software testing pyramid. And in this model,
we encouraged you to write lots and lots of
fast, scalable unit tests that test all your
exhaustive conditions. We encouraged you
to write a smaller number of instrumentation
tests that will actually prove that all these
units assembled together behave as you would
expect on a real device. And in some ways, this
was kind of a compromise. It was a compromise between the
advantages of one kind of test and the trade-offs of another. So it brings you a holistic
way of testing your app. And we showed how
this kind of approach can lead to test-driven development on Android. First of all, you would
start with a failing UI test. This would be an instrumentation
test, probably written with Espresso, and it would
test the UI of your component– your feature. And then you would satisfy that
feature by a series of units– classes– coming together
with their interactions. And you could test
drive these, as well, using a tool like
Robolectric or Mockito running as a local
test, and this gives you very fast development cycles. Finally, when you bring
them all together, you’re able to run the slower
running, but more faithful instrumentation test, and
hopefully, it goes green and you’re done. Well, we enter a
refactoring cycle because maybe your code
leaves a little to be desired and you want to do some cleanup. So you can spend some
refactoring cycles there, before coming around to the
beginning of the cycle, where, if you have any more
work on the feature, you might add another test– test another aspect
of that feature– and you’ll keep iterating until
you’re complete, at which time, you’re good to submit your code
and move on to the next task. So at Google I/O this year,
we realized there was somewhat of a test writing crisis. And because there’s so
many tools available, it’s not always clear
which one to use. And each of these tools all
have their own different styles and APIs and paradigms
for the same concepts that exist on Android. And the problem with
this is that tests written at different levels
are not portable across levels. Your test is kind of stuck. It’s coupled to the testing
tool and the environment that you’ve written it on. So this year at Google
I/O, we announced a beta of AndroidX Test. It brings testing as
a first class citizen as part of the tool
chain, as part of Jetpack. And we include some of
the existing libraries you’ve used before, some new
APIs, full Kotlin support, which allows you to write really
beautiful and concise tests, and it’s available
on and off device. Well, last week, AndroidX
Test moved out of beta, into the first full
and final release– 1.0. It’s also, as of last
week, fully open sourced, so we look forward to
welcoming your contributions. And all of the documentation
on developer.Androi d.com/training/testing has
all been revamped to show you the new styles of APIs. So please go and check that out. So let’s take a look inside. The first module
that we pulled across was the existing
JUnit4 support– the runner and the rules that
you may have used before. We’ve also added a new
module, which we call Core, and this includes some new APIs. ApplicationProvider–
as its name suggests, it’s a
quick and easy way to get a hold of the
application context. ActivityScenario, a brand-new API that provides coarse- and fine-grained APIs with which you can test your activities. And FragmentScenario, which was actually released just this week, providing a very similar set of testing features for fragments.
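As a brief sketch of how FragmentScenario mirrors ActivityScenario (ProfileFragment is a hypothetical fragment, not one from the talk):

```kotlin
import androidx.fragment.app.testing.launchFragmentInContainer
import androidx.lifecycle.Lifecycle
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Assert.assertFalse
import org.junit.Test
import org.junit.runner.RunWith

// ProfileFragment is a hypothetical fragment under test.
@RunWith(AndroidJUnit4::class)
class ProfileFragmentTest {
    @Test
    fun fragmentHandlesMoveToCreated() {
        // Launches the fragment inside an empty container activity.
        val scenario = launchFragmentInContainer<ProfileFragment>()

        // Like ActivityScenario, the scenario can drive lifecycle state.
        scenario.moveToState(Lifecycle.State.CREATED)

        scenario.onFragment { fragment ->
            // Inspect the fragment's state on the main thread.
            assertFalse(fragment.isResumed)
        }
    }
}
```

This requires the `androidx.fragment:fragment-testing` artifact on the test classpath.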
We've also brought Espresso into the Jetpack AndroidX family. Espresso, if you're
not aware, is a library with a set of view matching
APIs and a set of view actions. So it’ll allow you to match
and then interact with those UI elements. It also includes some other
things like the ability to capture and stub
intents for the system. And finally, we’ve also released
some Truth Android extensions. And Truth is Google’s
open-source fluent assertions library, and we've brought in a bunch of subjects for Android, which allow you to
for Android subjects, which allow you to
test your Android objects in a way that reads
beautifully and concisely. Those of you who’ve been
using Robolectric will know that we've had a version– 4.0– in beta for a while. And as of last week, we did a simultaneous release. That's now gone final. Robolectric 4 fully supports
all of the unified APIs that are in AndroidX
Test, as well as a number of its own new
features and improvements. OK. And so I’d like to
welcome Yuki on stage. He’s going to give you
a deeper dive into some of the APIs available. Yuki? [APPLAUSE] Here you go. YUKI HAMADA: Thanks. Thanks, Jonathan. Hi, everyone. So let me introduce
our new APIs. Let’s start with
ApplicationProvider, which is a new way of accessing a context from your test code. So when you work on Android testing, you need to handle two different context objects. The first one comes from the application under your test, and the second one comes from the instrumentation APK, where your test code is stored. With today's library, we have two different methods to access these context objects, and this makes your test code harder to understand, because the library uses confusing terminology: getTargetContext means the context from your application, while getContext means the context from your instrumentation APK. Also, it is not obvious which one to use for your test. So in our new API, we hide the instrumentation context from the public API, and ApplicationProvider only provides your application context, in the form of your application class. Let's take a look
at the example code. So here, let's say we have a test for our location tracker activity, and in the setup method, we get the target context in the old-fashioned way and typecast it to our location tracker application so that we can register our mock object for testing, which we do on the second line. This code is pretty simple and it actually works, but you could have used the wrong context, faced a runtime error, and ended up wasting your time on debugging. Now, with the new way, ApplicationProvider provides the context in the form of your application class, so you can do the exact same thing, but with much less chance of confusion or bugs.
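A minimal sketch of the before-and-after contrast described here; `LocationTrackerApplication` and its `locationSource` mock hook are hypothetical names standing in for the app classes shown on the slide:

```kotlin
import androidx.test.core.app.ApplicationProvider
import androidx.test.ext.junit.runners.AndroidJUnit4
import org.junit.Before
import org.junit.runner.RunWith

@RunWith(AndroidJUnit4::class)
class LocationTrackerActivityTest {
    @Before
    fun setUp() {
        // Old way: fetch the target context and cast it by hand --
        // easy to grab the wrong context and only fail at runtime.
        // val app = InstrumentationRegistry.getTargetContext()
        //     .applicationContext as LocationTrackerApplication

        // New way: ApplicationProvider hands back the application
        // already typed as your application class.
        val app = ApplicationProvider
            .getApplicationContext<LocationTrackerApplication>()

        // Register a test double on the application (hypothetical hook).
        app.locationSource = FakeLocationSource()
    }
}
```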
OK. Let me move on to the more complicated stuff– ActivityScenario. Actually, before we dive into the details, I have a few questions for you. How many of you have written your own activity and handled the lifecycle transitions yourself? Please raise your hand. Cool. And how many of you have shipped an activity with a bug related to lifecycles? Oh, many of you. And who did not like writing the tests for that? Cool. I see some hands up. And to be honest, I didn't, either. Writing tests for activity lifecycle transitions is pretty hard, and until now there was no good API for it in our testing libraries. So that's why Jonathan and I and our team sought a solution, and we developed ActivityScenario, which you can use to drive your activity to an arbitrary lifecycle state for testing. So let's revisit the
activity states first. So the created state is when the activity is created– I mean, instantiated– but not yet visible to users; an activity can also be in the created state while it is running in the background. The started state is when the activity is created and started: it is partially visible to users, but it is not the foreground activity. Activities running in picture-in-picture mode are also in this state. And the resumed state is when your activity is fully visible to users and running in the foreground. The framework can change the lifecycle state at any time in response to user interactions, so your activity has to handle those state transitions properly for a good user experience. Otherwise, you'll see some bugs. And ActivityScenario
provides a method– moveToState– with which you can drive your activity's lifecycle state to an arbitrary state for testing. Let's take a look at the example code. Here, we have our test for our location tracker activity, and we want to verify that the location listener is properly unregistered from the system when the activity moves to the created state. So at first, we launch the activity: ActivityScenario.launch takes your activity class, starts the activity, and waits until it reaches the resumed state. Then moveToState initiates the lifecycle transition and moves the lifecycle state to the created state. Also, all of the ActivityScenario methods work as blocking calls, so after this method returns, it is guaranteed that the activity's lifecycle state is created. And then you can inspect your activity's internal state by calling the onActivity method. Yes, that's easy. Now we have our API.
You can also use ActivityScenario for testing the recreation of your activity. Activity recreation happens when your activity has been running in the background for a long time and you come back to it later. Your activity has to save its internal state to the saved instance state bundle before it is destroyed; otherwise, you will lose that state. And ActivityScenario has a method– recreate– which you can use for testing such a scenario. Let's see the example code. So here in this test,
we want to make sure that input text is restored properly after the activity is destroyed and recreated. First, we have some test data, like "test user", as the input, and again we launch the activity, and then we fill in a text box using the Espresso library. As you can see, ActivityScenario works pretty nicely with Espresso. Then we call recreate, which actually destroys the activity, creates a new instance, and waits for the activity to reach the resumed state. And then, using Espresso again, we can make sure that the text is there, as we expected. Yes, it's all so simple.
Finally, I'd like to show you one more example, from the Truth extensions: IntentSubject. By using IntentSubject, you can verify your intent's values, and it produces really good, human-friendly error messages when an assertion fails. Let's see some example code. This time, we
want to make sure our data intent has an expected contact name in its extras. First, we build the data intent, and then we assert with three lines: one checks that the intent has the expected action, the second checks the type, and the third checks the extras bundle. And if a value doesn't match your expectation, you see this error. As you can see, you can immediately tell, in this example, that the intent action is not what you expected.
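A sketch of those three assertions with the Truth Intent subject; the MIME type, extra key, and values are hypothetical:

```kotlin
import android.content.Intent
import androidx.test.ext.truth.content.IntentSubject.assertThat
import org.junit.Test

class ContactIntentTest {
    @Test
    fun dataIntentCarriesContactName() {
        // Build the intent the code under test would produce.
        val intent = Intent(Intent.ACTION_VIEW).apply {
            type = "vnd.android.cursor.item/contact"
            putExtra("name", "Jane")
        }

        // Three fluent assertions: action, type, and the extras bundle.
        assertThat(intent).hasAction(Intent.ACTION_VIEW)
        assertThat(intent).hasType("vnd.android.cursor.item/contact")
        assertThat(intent).extras().string("name").isEqualTo("Jane")
    }
}
```

When an assertion fails, the error message names the intent field that did not match, rather than printing a generic comparison failure.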
And [INAUDIBLE] components come with AndroidX Test 1.0. I can't show you everything today, but for example, we have more assertions, and we also have Android builders with which you can create your testing data easily, and also scenarios for activities and fragments. You can take a look at this link or the documentation to see more. I hope you do try it
out after the talk.

OK. So this is our solution for the test writing crisis. With the unified API, you no longer need to consider whether to write an instrumentation test or a Robolectric test, because you can now just write an Android test, and that test runs nicely on both runtime environments. With the unified API, you can focus on what to test, and you can forget about where and how. And to ensure the consistency of our API's behavior, we have verification tests: we run the same tests locally with Robolectric, and we also run them on virtual devices, from API level 15 to the latest version. And let's go back to the
workflow we showed you earlier in this talk. So we can do test-driven development much more efficiently using device-agnostic tests written against the unified API. Our recommendation is that you write your tests and run them with Robolectric until your code is ready to submit, and then run the same tests on virtual devices before you actually submit, to maximize your confidence. Also, you can run the same tests as continuous integration tests against the [INAUDIBLE] binaries. With the upcoming Nitrogen [? toolchains, ?] you can set up such a configuration easily. If you want to know more about Project Nitrogen, we have a session tomorrow, and we highly recommend attending it. Cool. Thank you very
much for listening, and have a happy testing. [APPLAUSE] [MUSIC PLAYING]


  1. Google has to give presentations to people that are good at presenting. I admire a lot the development work that all Google engineers do, but not all engineers are good to make public presentations, and the whole point of presentations are teaching. If someone is not good at presenting the teaching goal is not well achieved.

    I don't want to disrespect anyone, by no means, but I do think that Google should evaluate better who's prepared to go on stage and give a nice, understandable, engaging talk, and who's not. Different people have different skills and there's nothing wrong with that.

  2. Great work. AndroidX Test is as exciting as when I saw the beautiful espresso API. It makes very good sense.
    Regarding the criticism on presentation skills, I find no problem with the presentation, except for the accent. However, software development is truly global and we should probably get used to different accents. I appreciate the courage of this developer to give out a talk to global audience in spite of his difficulties in English. Pretty sure he is working very hard on his English and Presentation Skills.

  3. Testing code should be simpler to write than the code of the system under test. The aligning and streamlining of different testing APIs is necessary to make TDD common practice in Android development. I am excited to work with AndroidX and hope to see more code labs and samples on how to better use all the frameworks together.
    P.S. I'm sure his next talk will be even better.

  4. Let him use this comment section as a StackOverflow site.

    15:23 –> This code throws NoSuchMethodError: No static method getIntentForActivity()

    Anyone else following this video to actually implement androidx Test core? Please share any git repo if you successfully implemented activityScenario.

  5. I guess I am still unsure about how to reuse the same test between instrumentation test and unit test. An example would be nice.
