Implementing touch-screen gesture recognition in Android application development

Gesture operations such as fling and scroll on the touch screen can greatly improve an application's user experience: scrolling a browser with the scroll gesture, turning pages in a reader with fling, and so on. In Android, gesture recognition is implemented through the GestureDetector.OnGestureListener interface. However, I searched the official Android documentation and found no complete example; the TouchPaint sample in the API Demos only covers handling of onTouch events and does not involve gestures.

Let's first clarify a few concepts. First, Android's event-handling mechanism is based on listeners; the touch-screen events we are discussing today are handled through OnTouchListener. Second, every subclass of View can register listeners for particular events via setOnTouchListener(), setOnKeyListener(), and so on. Third, a listener is generally provided as an interface containing one or more abstract methods; we implement these methods to carry out the work of onTouch(), onKey(), and so on. Thus, once we register an event listener on a View and implement its abstract methods, the program can respond appropriately through the callback when the corresponding event is dispatched to that View.

Take a simple example using the plainest TextView (in fact, it is hardly different from the skeleton generated by ADT).

We set an OnTouchListener on the TextView instance tv. Because our GestureTest class implements the OnTouchListener interface, we simply pass this as the argument. The onTouch() method implements the abstract method of OnTouchListener; as long as we add our logic here, we can respond when the user touches the screen, as we do here by printing a prompt message.
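The original code listing did not survive; the following is a minimal sketch of what the text describes, assuming a TextView with id tv in the layout (the id and log tag are illustrative). It is Android framework code and cannot run outside the SDK.

```java
// Minimal sketch of the setup described above; the layout id `tv`
// and the log tag are assumptions, not from the original listing.
public class GestureTest extends Activity implements OnTouchListener {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        TextView tv = (TextView) findViewById(R.id.tv);
        // Register this Activity as the touch listener for the TextView.
        tv.setOnTouchListener(this);
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // Print a prompt so we can see the touch event arriving.
        Log.i("GestureTest", "onTouch, action=" + event.getAction());
        return false;
    }
}
```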

Here, the getAction() method of MotionEvent tells us the type of the touch event: ACTION_DOWN, ACTION_MOVE, ACTION_UP, or ACTION_CANCEL. ACTION_DOWN means the touch screen was pressed, ACTION_MOVE means the contact point moved after the press, and ACTION_UP means the touch screen was released. ACTION_CANCEL is not triggered directly by the user (so it is outside the scope of today's discussion; see ViewGroup.onInterceptTouchEvent(MotionEvent)). By distinguishing the user's actions and combining them with the coordinates returned by getRawX(), getRawY(), getX(), and getY(), we can implement features such as dragging a button or dragging a scroll bar. For the rest, see the documentation of the MotionEvent class, and also the TouchPaint example.
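As a quick illustration of dispatching on getAction(), here is a small self-contained sketch; the integer constants mirror MotionEvent's documented values so that it can run outside the Android framework.

```java
// Self-contained sketch of dispatching on a MotionEvent action code.
// The constants mirror android.view.MotionEvent's documented values,
// so this compiles and runs without the Android SDK.
public class ActionDemo {
    static final int ACTION_DOWN = 0;   // finger pressed
    static final int ACTION_UP = 1;     // finger released
    static final int ACTION_MOVE = 2;   // finger moved while pressed
    static final int ACTION_CANCEL = 3; // gesture taken over by a parent view

    static String describe(int action) {
        switch (action) {
            case ACTION_DOWN:   return "press";
            case ACTION_MOVE:   return "move";
            case ACTION_UP:     return "release";
            case ACTION_CANCEL: return "cancel";
            default:            return "other";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(ACTION_DOWN)); // press
        System.out.println(describe(ACTION_MOVE)); // move
    }
}
```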

Back to today's focus: how do we identify the user's gesture once we have captured the touch operation? Here we need the help of the GestureDetector.OnGestureListener interface, so our GestureTest class now looks like this.
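The class skeleton itself is missing from the text; a minimal sketch of what it describes might be (the GestureDetector(OnGestureListener) constructor is the early-API form used in writeups of this era):

```java
// Sketch of the class skeleton described above: the Activity now also
// implements GestureDetector.OnGestureListener and owns a detector.
public class GestureTest extends Activity implements OnTouchListener,
        GestureDetector.OnGestureListener {
    // The detector that turns raw MotionEvents into gesture callbacks.
    private GestureDetector mGestureDetector = new GestureDetector(this);
}
```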

Then, in the onTouch() method, we call GestureDetector's onTouchEvent() method, handing the captured MotionEvent to the GestureDetector so it can analyze whether there is an appropriate callback to handle the user's gesture.
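A sketch of the delegation described (mGestureDetector is the detector field from the class skeleton):

```java
@Override
public boolean onTouch(View v, MotionEvent event) {
    // Hand the raw event to the detector, which invokes the matching
    // OnGestureListener callback if it recognizes a gesture.
    return mGestureDetector.onTouchEvent(event);
}
```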

Next, we implement the six abstract methods of the interface, of which the most useful are onFling(), onScroll(), and onLongPress(). The meaning of the gesture each method represents is given in the comments.
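The six stubs did not survive extraction; a sketch of the full set, with the gesture each one represents noted in comments:

```java
@Override
public boolean onDown(MotionEvent e) {
    // The user pressed the touch screen.
    return false;
}

@Override
public void onShowPress(MotionEvent e) {
    // The user pressed and has not yet moved or released.
}

@Override
public boolean onSingleTapUp(MotionEvent e) {
    // A tap: the user pressed and then released.
    return false;
}

@Override
public boolean onScroll(MotionEvent e1, MotionEvent e2,
        float distanceX, float distanceY) {
    // The user pressed and is dragging across the screen.
    return false;
}

@Override
public void onLongPress(MotionEvent e) {
    // The user pressed and held, without moving or releasing.
}

@Override
public boolean onFling(MotionEvent e1, MotionEvent e2,
        float velocityX, float velocityY) {
    // The user pressed, swiped quickly, and released.
    return false;
}
```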

Let's try handling the onFling() event; the meaning of each parameter of onFling() is given in the comments. Note that in the fling handler, besides the coordinates and other information carried by e1 (the initial ACTION_DOWN that triggered the fling) and e2 (the last ACTION_MOVE), we can also use the user's velocity along the x- or y-axis as a condition. For example, in the following code we only act when the user has moved more than 100 pixels and the velocity along the x-axis exceeds 200 pixels per second.
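The listing itself is missing; here is a sketch of the threshold test described. The 100-pixel and 200 px/s figures come from the text, while the direction convention (a leftward fling, e1 to the right of e2) is an assumption. It is written as a pure function so it can be tested outside Android; in a real handler the values come from onFling(e1, e2, velocityX, velocityY).

```java
// Sketch of the fling condition described in the text. In a real
// onFling() handler, e1x/e2x are e1.getX()/e2.getX() and velocityX
// is the reported x-axis velocity in pixels per second.
public class FlingCheck {
    static final float FLING_MIN_DISTANCE = 100; // pixels
    static final float FLING_MIN_VELOCITY = 200; // pixels per second

    /** True when the pointer travelled far and fast enough along x
     *  (here, a leftward fling: the start point is right of the end). */
    static boolean isLeftFling(float e1x, float e2x, float velocityX) {
        return e1x - e2x > FLING_MIN_DISTANCE
                && Math.abs(velocityX) > FLING_MIN_VELOCITY;
    }

    public static void main(String[] args) {
        System.out.println(isLeftFling(300f, 100f, -250f)); // true
        System.out.println(isLeftFling(300f, 250f, -250f)); // false: too short
    }
}
```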

The problem is that if we run the program now, we don't get the desired result at all: tracing the execution shows the onFling() event is never captured. This is what puzzled me at first. Why? I found the answer in a gesture-detection thread in the discussion group: besides calling tv.setOnTouchListener(this) in onCreate(), we also need to add the following code.
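The missing line is the long-clickable flag on the view:

```java
// Without this, the TextView never receives the stream of ACTION_MOVE
// events that a fling is built from, so onFling() is never called.
tv.setLongClickable(true);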

Only then can the View handle a hold (i.e., ACTION_MOVE, or multiple ACTION_DOWNs) that differs from a simple tap. We can achieve the same thing with android:longClickable in the layout definition.
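The equivalent layout attribute looks like this (the id and sizing attributes are illustrative):

```xml
<TextView
    android:id="@+id/tv"
    android:layout_width="fill_parent"
    android:layout_height="wrap_content"
    android:longClickable="true" />
```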

Going deeper, let's summarize the two situations:

1. Touch events on the Activity itself

2. Touch events on an individual View

First, the Activity case:

a. implements OnGestureListener

b. GestureDetector mGestureDetector = new GestureDetector(this);

c. Override the Activity's onTouchEvent() method

d. You can now capture the onFling event

Second, the individual-View case:

a. implements OnGestureListener, OnTouchListener

b. GestureDetector mGestureDetector = new GestureDetector(this); then register with tv.setOnTouchListener(this)

c. Override onTouch(), the method of OnTouchListener

d. You can now capture the onFling event

In both cases, the last step is to implement onFling().
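The Activity variant summarized above can be sketched as follows; unlike the View case, no OnTouchListener is needed because every touch on the Activity's window reaches its own onTouchEvent():

```java
// Sketch of the Activity-level variant: override the Activity's own
// onTouchEvent() and route everything to the detector.
public class FlingActivity extends Activity
        implements GestureDetector.OnGestureListener {
    private GestureDetector mGestureDetector = new GestureDetector(this);

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Every touch on the Activity's window goes to the detector.
        return mGestureDetector.onTouchEvent(event);
    }

    // Implement the six OnGestureListener callbacks here: onDown,
    // onShowPress, onSingleTapUp, onScroll, onLongPress, onFling.
}
```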

The content of this article was collected from the web and is provided for learning and reference; the copyright belongs to the original author.