Android Touch System — Part 3: MotionEvent Listeners | by Sherry Yuan | Apr, 2022


This is Part 3 of my Android Touch System series. Parts 1 and 2 take a deep dive into touch functions (dispatchTouchEvent(), onInterceptTouchEvent(), onTouchEvent()) and how touch events flow through the view hierarchy. This post will cover the main event listeners provided by the View class, as well as standalone gesture detector classes.

Most listener interfaces correspond one-to-one with a function in the View class — for example OnTouchListener with onTouchEvent() — so when should we use a listener instead of overriding the function directly? There are three considerations:

  1. Using a listener allows handling touch events without creating a new custom view. If a view has everything you need except event-handling, it’s less verbose to add a listener to your instance of it rather than extending it. For example, if you want a specific TextView to open a screen when clicked, setting an OnClickListener would be simpler than creating a CustomTextView that extends TextView and overrides onTouchEvent().
  2. Do other instances of the view require the same event handling? For example, if your app has an ImageButtonView that handles a tap differently for each usage, adding a listener to each instance would make sense. But if tapping an ImageButtonView should always open a specific screen, then overriding onTouchEvent() would be simpler.
  3. The listeners are generally called before the corresponding function, and if the listener returns true, the corresponding function won’t be called. Here’s some source code from View.dispatchTouchEvent():
if (li.mOnTouchListener != null
        && li.mOnTouchListener.onTouch(this, event)) {
    result = true;
}

if (!result && onTouchEvent(event)) {
    result = true;
}
mOnTouchListener.onTouch() will always run before onTouchEvent(). If you still want the view’s onTouchEvent() (or any other event-handling function) to be called, remember to return false from the listener.
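As a sketch (reusing the binding.viewA view and TimberLogger helper that appear in the demo code later in this post), a listener that only observes events without consuming them looks like:

    // Observe every touch event, but don't consume any of them.
    binding.viewA.setOnTouchListener { v, event ->
        TimberLogger.log("observed $event on $v")
        false // not consumed, so viewA's onTouchEvent() still runs
    }

Because the lambda returns false, the view's onTouchEvent() — and therefore its click handling — behaves exactly as if no listener were set.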

OnTouchListener corresponds to onTouchEvent() and is called in dispatchTouchEvent() right before onTouchEvent().

This is the general listener that receives all touch events and allows you to do your own processing on the events. If your view only cares about specific events like clicks or long presses, it’s simpler to use one of the more fine-grained listeners covered later instead.

Kotlin provides two ways of setting the listener. We can write a lambda:

binding.viewA.setOnTouchListener { v, event ->
    TimberLogger.log("$event in View A")
    true
}

Or implement the interface:

class DemoOnTouchListener(
    val viewName: String
) : View.OnTouchListener {
    override fun onTouch(v: View, event: MotionEvent): Boolean {
        TimberLogger.log("$event in $viewName")
        return true
    }
}

binding.viewA.setOnTouchListener(DemoOnTouchListener("View A"))

The benefit of the lambda approach is conciseness and better performance. A non-capturing lambda exists as a single object in the JVM no matter how many times it’s reused, whereas a class requires a new allocation for each usage. The class implementation approach is useful if you need to reuse the event-handling logic, or if you need more complex state management.

Most of the other listener interfaces also support both approaches. The code looks very similar for them, so for brevity’s sake I won’t provide sample code in their sections.

OnClickListener corresponds to performClick(), which is called in onTouchEvent().

If you’re only interested in click events, you can implement this interface instead of OnTouchListener. OnClickListener.onClick() returns void, whereas OnTouchListener.onTouch() returns a boolean, which requires developers to do additional event processing and figure out what to return.

It’s also easier to programmatically trigger OnClickListener.onClick() than OnTouchListener.onTouch(): you can call the parameter-less performClick(), whereas onTouchEvent(event: MotionEvent) requires a MotionEvent. You can also use callOnClick() to directly call any attached OnClickListener; unlike performClick(), this won’t perform any other clicking actions, such as reporting an accessibility event.
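A quick sketch of the difference, using the binding.viewA view from the earlier sample:

    // Runs the full click pipeline: accessibility event, click sound,
    // and then any attached OnClickListener.
    binding.viewA.performClick()

    // Invokes only the attached OnClickListener, skipping those side effects.
    binding.viewA.callOnClick()

In general, performClick() is the safer choice for simulating a user click, since it keeps accessibility services informed.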

However, if a view has both an OnTouchListener and an OnClickListener set, OnTouchListener.onTouch() gets called first. If it returns true, OnClickListener.onClick() will never be called.
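A sketch of that interaction, with both listeners set on the same view:

    binding.viewA.setOnClickListener {
        TimberLogger.log("clicked")
    }
    binding.viewA.setOnTouchListener { _, event ->
        TimberLogger.log("touched: $event")
        // Returning true here would consume the event, and onClick() would
        // never fire; returning false lets onTouchEvent() -> performClick()
        // proceed as usual.
        false
    }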

OnGenericMotionListener is an infrequently-used listener and corresponds to onGenericMotionEvent(). Generic motions include joystick movements, mouse hovers, trackpad touches, scroll wheel movements, and other input events.

The View source code includes:

public final boolean dispatchPointerEvent(MotionEvent event) {
    if (event.isTouchEvent()) {
        return dispatchTouchEvent(event);
    } else {
        return dispatchGenericMotionEvent(event);
    }
}
So generic motion events are actually mutually exclusive with touch events; they include ACTION_HOVER_MOVE, ACTION_HOVER_ENTER, ACTION_HOVER_EXIT, and ACTION_SCROLL, all of which don’t involve a finger/pointer touching the screen.

One use case for OnGenericMotionListener is handling scrolls on the rotating side button on Android Wear devices.
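A hedged sketch of that use case — the scrollableView name and the 50-pixels-per-tick scale are illustrative choices, not part of the article’s demo app:

    binding.scrollableView.setOnGenericMotionListener { v, event ->
        if (event.action == MotionEvent.ACTION_SCROLL &&
            event.isFromSource(InputDevice.SOURCE_ROTARY_ENCODER)
        ) {
            // AXIS_SCROLL reports rotation ticks; negate so rotating the
            // bezel toward you scrolls the content down.
            val delta = -event.getAxisValue(MotionEvent.AXIS_SCROLL) * 50f
            v.scrollBy(0, delta.toInt())
            true
        } else {
            false
        }
    }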

OnContextClickListener is an even more obscure listener and corresponds to performContextClick(), which is called in dispatchGenericMotionEvent(). It’s triggered by a subset of generic, non-touch motion events, namely stylus button presses and right mouse clicks.

I’m mostly mentioning this one for the sake of completeness and to clear up any confusion about which click listener to use. For any finger/pointer clicks, you’d want to use OnClickListener.
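For completeness, attaching one is a one-liner (a sketch, reusing the binding.viewA view from the earlier samples):

    // Fires on stylus button presses and right mouse clicks — not finger taps.
    binding.viewA.setOnContextClickListener { v ->
        TimberLogger.log("context click on $v")
        true // consume the context click
    }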

All the previous listeners are interfaces declared in the View class, but GestureDetector is a standalone class and requires additional setup. It detects common gestures that involve multiple motion events and would be tricky to implement the event processing for yourself, such as long presses (onLongPress()) and flings (onFling()).

GestureDetector’s constructor takes an OnGestureListener (the non-deprecated constructors also take a Context). We can either implement the GestureDetector.OnGestureListener interface and override each function, or extend the GestureDetector.SimpleOnGestureListener class. SimpleOnGestureListener implements GestureDetector.OnGestureListener, GestureDetector.OnDoubleTapListener, and GestureDetector.OnContextClickListener with default no-op implementations for each function, so we only need to override the ones we care about.

Here’s some sample code:

// Declare a GestureDetector
val gestureDetector = GestureDetector(
    context,
    object : GestureDetector.SimpleOnGestureListener() {
        override fun onDoubleTap(e: MotionEvent): Boolean {
            TimberLogger.log("double tap")
            return true
        }
    }
)

// We can use it in setOnTouchListener()
setOnTouchListener { _, event ->
    gestureDetector.onTouchEvent(event)
}

// Or in onTouchEvent() in a custom view
override fun onTouchEvent(event: MotionEvent): Boolean {
    if (gestureDetector.onTouchEvent(event)) {
        return true
    }
    return super.onTouchEvent(event)
}

ScaleGestureDetector is another standalone class and can detect scaling transformation gestures — i.e., pinch-to-zoom. It provides a getScaleFactor() function to help with any resizing calculations.

It’s similar in concept to GestureDetector, although the two classes aren’t related in terms of inheritance or composition. Their usages look similar, too:

val scaleGestureDetector = ScaleGestureDetector(
    context,
    object : ScaleGestureDetector.SimpleOnScaleGestureListener() {
        override fun onScale(detector: ScaleGestureDetector): Boolean {
            TimberLogger.log("scale ${detector.scaleFactor}")
            return true
        }
    }
)

setOnTouchListener { _, event ->
    scaleGestureDetector.onTouchEvent(event)
}

If you have a view that needs to handle both panning and pinching-to-zoom, you’ll need both detectors; OnGestureListener.onScroll() detects panning gestures and OnScaleGestureListener.onScale() detects the pinches.
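A sketch of wiring both detectors into a custom view’s onTouchEvent(), assuming the two detectors from the earlier samples are stored as fields on the view:

    override fun onTouchEvent(event: MotionEvent): Boolean {
        // Give both detectors a chance to inspect every event; a scale
        // gesture and a scroll are tracked independently.
        val handledScale = scaleGestureDetector.onTouchEvent(event)
        val handledGesture = gestureDetector.onTouchEvent(event)
        return handledScale || handledGesture || super.onTouchEvent(event)
    }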


Here are the links to Part 1: Touch Functions and the View Hierarchy and Part 2: Common Touch Event Scenarios again. Part 4 will look at touch in Jetpack Compose. This demo app on Github has all the sample code for my Android Touch System articles.

Thanks for reading and stay tuned for Part 4!
