iOS Application Development
Overview of Taps, Multitouch, and
Gestures
These are confidential sessions - please refrain from streaming, blogging, or taking pictures
Session 12
Vu Tran Lam
IAD-2013
• About Events in iOS
• Introduction to Gestures, Taps, and Touches
• Event Delivery: The Responder Chain
• Handling Multitouch Events
• Using Gesture Recognizers
• Detecting Swipe, Rotation, Pan, Long-Press, Tap, and Pinch Gestures
• Testing the Gesture Recognition Application
Today’s Topics
• Users manipulate their iOS devices in a number of ways, such as
touching the screen or shaking the device.
• iOS interprets when and how a user is manipulating the hardware
and passes this information to your app.
• The more your app responds to actions in natural and intuitive
ways, the more compelling the experience is for the user.
About Events in iOS
• Gesture is an umbrella term for any single interaction between the
touch screen and the user, beginning at the point that the screen is
touched (by one or more fingers) and ending when the last finger
leaves the surface of the screen.
• A tap, as the name suggests, occurs when the user touches the
screen with a single finger and then immediately lifts it from the
screen.
• A touch occurs when a finger establishes contact with the screen.
Introduction to Gestures, Taps, and Touches
• When you design an app, it’s likely that you want to respond to events
dynamically. For example, a touch can occur in many different objects
onscreen, and you have to decide which object you want to respond
to a given event and understand how that object receives the event.
• When a user-generated event occurs, UIKit creates an event object
containing the information needed to process the event. Then it
places the event object in the active app’s event queue.
• An event travels along a specific path until it is delivered to an object
that can handle it.
• The ultimate goal of these event paths is to find an object that can
handle and respond to an event. Therefore, UIKit first sends the event
to the object that is best suited to handle the event. For touch events,
that object is the hit-test view, and for other events, that object is the
first responder.
Event Delivery: The Responder Chain
• iOS uses hit-testing to find the view that is under a touch. Hit-
testing involves checking whether a touch is within the bounds of
any relevant view objects. If it is, it recursively checks all of that
view’s subviews. The lowest view in the view hierarchy that contains
the touch point becomes the hit-test view. After iOS determines
the hit-test view, it passes the touch event to that view for handling.
Hit-Testing Returns View Where a Touch Occurred
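• The recursion described above can be approximated in code. The following is a minimal sketch (not the actual UIKit implementation) of what a hitTest:withEvent: override in a custom UIView subclass might look like; it assumes only standard UIView properties.

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Views that are hidden, non-interactive, or nearly transparent
    // cannot be the hit-test view.
    if (self.hidden || !self.userInteractionEnabled || self.alpha < 0.01) {
        return nil;
    }
    // The touch must fall within this view's bounds.
    if (![self pointInside:point withEvent:event]) {
        return nil;
    }
    // Check subviews front to back; the deepest view containing the point wins.
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint convertedPoint = [subview convertPoint:point fromView:self];
        UIView *hitView = [subview hitTest:convertedPoint withEvent:event];
        if (hitView) {
            return hitView;
        }
    }
    // No subview claimed the touch, so this view becomes the hit-test view.
    return self;
}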
• The responder chain is a series of linked responder objects. It starts
with the first responder and ends with the application object. If the
first responder cannot handle an event, it forwards the event to the
next responder in the responder chain.
• A responder object is an object that can respond to and handle
events. The UIResponder class is the base class for all responder
objects, and it defines the programmatic interface not only for event
handling but also for common responder behavior.
• An object becomes the first responder by doing two things:
• Overriding the canBecomeFirstResponder method to return YES.
• Receiving a becomeFirstResponder message. If necessary, an object can
send itself this message.
Responder Chain Is Made Up of Responder Objects
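• As a minimal sketch of those two steps, a hypothetical UIView subclass (here called MyFirstResponderView) could claim first-responder status once it is added to a window:

@implementation MyFirstResponderView

// Step 1: declare willingness to become first responder.
- (BOOL)canBecomeFirstResponder
{
    return YES;
}

// Step 2: request first-responder status once the view is on screen.
- (void)didMoveToWindow
{
    [super didMoveToWindow];
    if (self.window) {
        [self becomeFirstResponder];
    }
}

@end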
• If the initial object (either the hit-test view or the first responder)
doesn’t handle an event, UIKit passes the event to the next
responder in the chain. Each responder decides whether it wants to
handle the event or pass it along to the object returned by its
nextResponder method (see the sketch below).
Responder Chain Follows a Specific Delivery Path
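• For example, a responder can decline a touch and hand it to its next responder explicitly. This sketch assumes a UIView subclass that only wants to handle single-finger touches itself:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (touches.count == 1) {
        // Handle the single-finger touch here...
    } else {
        // Pass everything else up the responder chain
        // (typically the superview or the view controller).
        [self.nextResponder touchesBegan:touches withEvent:event];
    }
}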
Multitouch Events
• Creating a Subclass of UIResponder
• Implementing Touch-Event Handling Methods in Subclass
• Tracking the Phase and Location of a Touch Event
• Retrieving and Querying Touch Objects
• Handling Tap Gestures
• Handling Swipe and Drag Gestures
Handling Multitouch Events
• For your app to implement custom touch-event handling, first
create a subclass of a responder class.
• Then, for instances of your subclass to receive multitouch events:
• Your subclass must implement the UIResponder methods for touch-event
handling.
• The view receiving touches must have its userInteractionEnabled
property set to YES.
• The view receiving touches must be visible; it can’t be transparent or
hidden.
Creating a Subclass of UIResponder
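• A minimal sketch of such a subclass (using a hypothetical TouchPadView class) follows; UIView already inherits from UIResponder, so only the touch methods need to be overridden:

@interface TouchPadView : UIView
@end

@implementation TouchPadView

- (instancetype)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Touches are delivered only while user interaction is enabled
        // (YES by default for UIView) and the view is visible.
        self.userInteractionEnabled = YES;
        // Opt in to receiving more than one touch at a time.
        self.multipleTouchEnabled = YES;
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"%lu touch(es) began", (unsigned long)[touches count]);
}

@end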
• iOS recognizes touches as part of a multitouch sequence.
• During a multitouch sequence, the app sends a series of event
messages to the target responder.
• To receive and handle these messages, the responder object’s class
must implement the following UIResponder methods:
Implementing Touch-Event Handling in Subclass
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;
• iOS tracks touches in a multitouch sequence.
• It records attributes for each of them, including the phase of the
touch, its location in a view, its previous location, and its timestamp.
Use these properties to determine how to respond to a touch.
• A touch object stores phase information in the phase property, and
each phase corresponds to one of the touch event methods.
Tracking the Phase and Location of a Touch Event
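• A minimal sketch that reads those attributes inside touchesMoved:withEvent: (assuming the handler lives in a UIView subclass):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        if (touch.phase == UITouchPhaseMoved) {
            CGPoint location = [touch locationInView:self];
            CGPoint previous = [touch previousLocationInView:self];
            NSLog(@"moved from %@ to %@ at %.3f",
                  NSStringFromCGPoint(previous),
                  NSStringFromCGPoint(location),
                  touch.timestamp);
        }
    }
}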
• Within an event handling method, you get information about the
event by retrieving touch objects from:
• The set object. The passed-in NSSet contains all touches that are new
or have changed in the phase represented by the method, such as
UITouchPhaseBegan for the touchesBegan:withEvent: method.
• The event object. The passed-in UIEvent object contains all of the
touches for a given multitouch sequence.
• If you want to know the location of a touch, use the locationInView:
method.
Retrieving and Querying Touch Objects
All Touches for a Given Touch Event
All Touches Belonging to a Specific Window
All Touches Belonging to a Specific View
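• The three queries above map onto UIEvent methods. A minimal sketch (assuming the handler is implemented in a UIView subclass) is:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // All touches for the event's multitouch sequence.
    NSSet *allTouches = [event allTouches];
    // All touches belonging to a specific window.
    NSSet *touchesInWindow = [event touchesForWindow:self.window];
    // All touches belonging to a specific view.
    NSSet *touchesInView = [event touchesForView:self];

    NSLog(@"event: %lu, window: %lu, view: %lu",
          (unsigned long)[allTouches count],
          (unsigned long)[touchesInWindow count],
          (unsigned long)[touchesInView count]);
}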
• Besides being able to recognize a tap gesture in your app, you’ll
probably want to distinguish a single tap, a double tap, or even a
triple tap. Use a touch’s tapCount property to determine the
number of times the user tapped a view.
Handling Tap Gestures
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *aTouch in touches)
    {
        if (aTouch.tapCount >= 2)
        {
            // The view responds to the tap
            [self respondToDoubleTapGesture:aTouch];
        }
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {}
• Horizontal and vertical swipes are a simple type of gesture that you
can track. To detect a swipe gesture, track the movement of the
user’s finger along the desired axis of motion.
• Then, decide whether the movement is a swipe by examining the
following questions for each gesture:
• Did the user’s finger move far enough?
• Did the finger move in a relatively straight line?
• Did the finger move quickly enough to call it a swipe?
Handling Swipe and Drag Gestures
Handling Swipe and Drag Gestures
// Tracking a swipe gesture in a view
#define H_SWIPE_DRAG_MIN 12
#define V_SWIPE_DRAG_MAX 4

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *aTouch = [touches anyObject];
    // startTouchPosition is a property
    self.startTouchPosition = [aTouch locationInView:self];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *aTouch = [touches anyObject];
    CGPoint currentTouchPosition = [aTouch locationInView:self];
    // Check if direction of touch is horizontal and long enough
    if (fabsf(self.startTouchPosition.x - currentTouchPosition.x) >= H_SWIPE_DRAG_MIN &&
        fabsf(self.startTouchPosition.y - currentTouchPosition.y) <= V_SWIPE_DRAG_MAX) {
        // If touch appears to be a swipe
        if (self.startTouchPosition.x < currentTouchPosition.x) {
            [self myProcessRightSwipe:touches withEvent:event];
        } else {
            [self myProcessLeftSwipe:touches withEvent:event];
        }
        self.startTouchPosition = CGPointZero;
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.startTouchPosition = CGPointZero;
}
Handling Swipe and Drag Gestures
// Dragging a view using a single touch
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *aTouch = [touches anyObject];
    CGPoint loc = [aTouch locationInView:self];
    CGPoint prevloc = [aTouch previousLocationInView:self];

    CGRect myFrame = self.frame;
    float deltaX = loc.x - prevloc.x;
    float deltaY = loc.y - prevloc.y;
    myFrame.origin.x += deltaX;
    myFrame.origin.y += deltaY;
    [self setFrame:myFrame];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {}
Multitouch Demo
Gesture Recognizers
• Introduction to Gesture Recognizers
• Using Gesture Recognizers to Simplify Event Handling
• Responding to Events with Gesture Recognizers
• Using Gesture Recognizers in iOS 7 App
Using Gesture Recognizers
Introduction to Gesture Recognizers
• Gesture recognizers convert low-level event-handling code into
higher-level actions. They are objects that you attach to a view,
which allows the view to respond to actions the way a control does.
• Gesture recognizers interpret touches to determine whether they
correspond to a specific gesture, such as a swipe, pinch, or
rotation.
• Built-in Gesture Recognizer Classes
• Gesture Recognizers Are Attached to a View
• Gestures Trigger Action Messages
Use Gesture Recognizers to Simplify Event Handling
Built-in Gesture Recognizer Classes
Gesture                                          | UIKit class
Tapping (any number of taps)                     | UITapGestureRecognizer
Pinching in and out (for zooming a view)         | UIPinchGestureRecognizer
Panning or dragging                              | UIPanGestureRecognizer
Swiping (in any direction)                       | UISwipeGestureRecognizer
Rotating (fingers moving in opposite directions) | UIRotationGestureRecognizer
Long press (also known as “touch and hold”)      | UILongPressGestureRecognizer
• The UIKit framework provides predefined gesture recognizers that
detect common gestures.
• It’s best to use a predefined gesture recognizer when possible,
because its simplicity reduces the amount of code you have to write
(see the configuration sketch below).
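• As a sketch of that gesture-specific configuration (handleLongPress: and handlePan: are assumed action methods, not part of the original sample):

UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleLongPress:)];
longPress.minimumPressDuration = 0.8;   // seconds before the press is recognized
longPress.numberOfTouchesRequired = 1;

UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePan:)];
pan.maximumNumberOfTouches = 2;         // track at most two fingers

[self.view addGestureRecognizer:longPress];
[self.view addGestureRecognizer:pan];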
• Every gesture recognizer is associated with one view.
• By contrast, a view can have multiple gesture recognizers, because
a single view might respond to many different gestures.
• For a gesture recognizer to recognize touches that occur in a
particular view, you must attach the gesture recognizer to that view.
• When a user touches that view, the gesture recognizer receives a
message that a touch occurred before the view object does.
• As a result, the gesture recognizer can respond to touches on
behalf of the view.
Gesture Recognizers Are Attached to a View
• When a gesture recognizer recognizes its specified gesture, it sends
an action message to its target.
Gestures Trigger Action Messages
• There are three things you do to add a built-in gesture recognizer to
your app:
• Create and configure a gesture recognizer instance. This step includes
assigning a target, action, and sometimes assigning gesture-specific
attributes (such as the numberOfTapsRequired).
• Attach the gesture recognizer to a view.
• Implement the action method that handles the gesture.
Use Gesture Recognizers to Simplify Event Handling
• Using Interface Builder to Add a Gesture Recognizer to Your App
• Adding a Gesture Recognizer Programmatically
• Responding to Discrete Gestures
• Responding to Continuous Gestures
Use Gesture Recognizers to Simplify Event Handling
• Within Interface Builder in Xcode, add a gesture recognizer to your
app the same way you add any object to your user interface—drag
the gesture recognizer from the object library to a view.
• After you create the gesture recognizer object, you need to create
and connect an action method. This method is called whenever the
connected gesture recognizer recognizes its gesture.
@interface APLGestureRecognizerViewController ()

@property (nonatomic, strong) IBOutlet UITapGestureRecognizer *tapRecognizer;

@end


@implementation APLGestureRecognizerViewController

- (IBAction)displayGestureForTapRecognizer:(UITapGestureRecognizer *)recognizer
{
    // Will implement method later...
}

@end
Using Interface Builder to Add a Gesture Recognizer
Using Interface Builder to Add a Gesture Recognizer
• Drag the gesture recognizer to the view controller
• Create an outlet for each gesture recognizer
• Add a delegate reference and an action to the gesture recognizer
• You can create a gesture recognizer programmatically by allocating
and initializing an instance of a concrete UIGestureRecognizer
subclass, such as UIPinchGestureRecognizer.
• When you initialize the gesture recognizer, specify a target object and
an action selector. Often, the target object is the view’s view controller.
• If you create a gesture recognizer programmatically, you need to
attach it to a view using the addGestureRecognizer: method.
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Create and initialize a tap gesture
    UITapGestureRecognizer *tapRecognizer = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(respondToTapGesture:)];
    // Specify that the gesture must be a single tap
    tapRecognizer.numberOfTapsRequired = 1;
    // Add the tap gesture recognizer to the view
    [self.view addGestureRecognizer:tapRecognizer];
    // Do any additional setup after loading the view, typically from a nib
}
Adding a Gesture Recognizer Programmatically
• Handling a double tap gesture
- (IBAction)showGestureForTapRecognizer:(UITapGestureRecognizer *)recognizer
{
    // Get the location of the gesture
    CGPoint location = [recognizer locationInView:self.view];
    // Display an image view at that location
    [self drawImageForGestureRecognizer:recognizer atPoint:location];
    // Animate the image view so that it fades out
    [UIView animateWithDuration:0.5 animations:^{
        self.imageView.alpha = 0.0;
    }];
}
Responding to Discrete Gestures
• Responding to a left or right swipe gesture
- (IBAction)showGestureForSwipeRecognizer:(UISwipeGestureRecognizer *)recognizer
{
    // Get the location of the gesture
    CGPoint location = [recognizer locationInView:self.view];
    // Display an image view at that location
    [self drawImageForGestureRecognizer:recognizer atPoint:location];

    // If gesture is a left swipe, specify an end location
    // to the left of the current location
    if (recognizer.direction == UISwipeGestureRecognizerDirectionLeft)
    {
        location.x -= 220.0;
    }
    else
    {
        location.x += 220.0;
    }

    // Animate the image view in the direction of the swipe as it fades out
    [UIView animateWithDuration:0.5 animations:^{
        self.imageView.alpha = 0.0;
        self.imageView.center = location;
    }];
}
Responding to Discrete Gestures
• Continuous gestures allow your app to respond to a gesture as it is
happening. For example, your app can zoom while a user is pinching
or allow a user to drag an object around the screen.
Responding to Continuous Gestures
// Respond to a rotation gesture
- (IBAction)showGestureForRotationRecognizer:(UIRotationGestureRecognizer *)recognizer
{
    // Get the location of the gesture
    CGPoint location = [recognizer locationInView:self.view];

    // Set the rotation angle of image view to match the rotation of gesture
    CGAffineTransform transform = CGAffineTransformMakeRotation([recognizer rotation]);
    self.imageView.transform = transform;

    // Display an image view at that location
    [self drawImageForGestureRecognizer:recognizer atPoint:location];

    // If the gesture has ended or is canceled, begin the animation
    // back to horizontal and fade out
    if (([recognizer state] == UIGestureRecognizerStateEnded) ||
        ([recognizer state] == UIGestureRecognizerStateCancelled)) {
        [UIView animateWithDuration:0.5 animations:^{
            self.imageView.alpha = 0.0;
            self.imageView.transform = CGAffineTransformIdentity;
        }];
    }
}
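• The same pattern applies to the other continuous recognizers. For example, a pinch handler could scale the same image view; this is a sketch by analogy with the rotation handler above, assuming the same self.imageView property and drawImageForGestureRecognizer:atPoint: helper:

- (IBAction)showGestureForPinchRecognizer:(UIPinchGestureRecognizer *)recognizer
{
    // Get the location of the gesture
    CGPoint location = [recognizer locationInView:self.view];

    // Scale the image view by the incremental pinch factor
    self.imageView.transform = CGAffineTransformScale(self.imageView.transform,
                                                      recognizer.scale,
                                                      recognizer.scale);
    // Reset the scale so the next callback reports only the change since this one
    recognizer.scale = 1.0;

    // Display an image view at that location
    [self drawImageForGestureRecognizer:recognizer atPoint:location];

    // When the gesture ends or is cancelled, fade out and restore the view
    if (recognizer.state == UIGestureRecognizerStateEnded ||
        recognizer.state == UIGestureRecognizerStateCancelled) {
        [UIView animateWithDuration:0.5 animations:^{
            self.imageView.alpha = 0.0;
            self.imageView.transform = CGAffineTransformIdentity;
        }];
    }
}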
• Prior to iOS 7, if a gesture recognizer required another gesture
recognizer to fail, you used requireGestureRecognizerToFail: to
set up a permanent relationship between the two objects at creation
time.
• In iOS 7, UIGestureRecognizerDelegate introduces two methods
that allow failure requirements to be specified at runtime by a gesture
recognizer delegate object:
• gestureRecognizer:shouldRequireFailureOfGestureRecognizer:
• gestureRecognizer:shouldBeRequiredToFailByGestureRecognizer:
Using Gesture Recognizers in iOS 7 App
Using Gesture Recognizers in iOS 7 App
UIScreenEdgePanGestureRecognizer *myScreenEdgePanGestureRecognizer;
...
myScreenEdgePanGestureRecognizer = [[UIScreenEdgePanGestureRecognizer alloc]
    initWithTarget:self action:@selector(handleScreenEdgePan:)];
myScreenEdgePanGestureRecognizer.delegate = self;
// Configure the gesture recognizer and attach it to the view.
...

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldBeRequiredToFailByGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    BOOL result = NO;
    if ((gestureRecognizer == myScreenEdgePanGestureRecognizer) &&
        [[otherGestureRecognizer view] isDescendantOfView:[gestureRecognizer view]])
    {
        result = YES;
    }
    return result;
}
Gesture Recognizers Demo
Event Handling Guide for iOS
https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/Introduction/Introduction.html

UIGestureRecognizer Class Reference
https://developer.apple.com/library/ios/documentation/UIKit/Reference/UIGestureRecognizer_Class/Reference/Reference.html

iOS 7 UI Transition Guide
https://developer.apple.com/library/ios/documentation/userexperience/conceptual/transitionguide/AppearanceCustomization.html
Documentation
many thanks to lamvt@fpt.com.vn
please say xin chào (hello)

Developer Center: https://developer.apple.com
Stanford University: http://www.stanford.edu/class/cs193p
Next: Working with Navigation and Tab Bar