Taddeüs Kroes / multitouch · Commits · 64da29f3

Commit 64da29f3 authored Jul 04, 2012 by Taddeüs Kroes
Wrote the appendix about gesture detection in the reference implementation.
parent 3182e73b
Showing 2 changed files with 168 additions and 14 deletions (+168 -14):
  docs/data/diagrams.tex   +57  -0
  docs/report.tex          +111 -14
docs/data/diagrams.tex (view file @ 64da29f3)

@@ -369,3 +369,60 @@
\label{fig:testappdiagram}
\end{figure}
}
+
+\def\transformationtracker{
+    \begin{figure}[h!]
+        \center
+        \tikzstyle{centroid} = [draw, shape=circle, minimum width=1.5em, fill]
+        \tikzstyle{finger} = [draw, shape=circle, minimum width=1.5em, fill=white]
+        \tikzstyle{prev} = [opacity=0.3]
+        \subfigure[Initial situation: three touch points are positioned on the
+                   touch surface.]{
+            \begin{tikzpicture}
+                \node[centroid] (centroid) at (0.33, -1) {};
+                \node[finger] (A) at (-1, -3) {} edge (centroid);
+                \node[finger] (B) at (0, 2) {} edge (centroid);
+                \node[finger] (C) at (2, -2) {} edge (centroid);
+            \end{tikzpicture}
+        }
+        \quad
+        \subfigure[One of the touch points is moved, triggering a
+                   \emph{point\_move} event. The ratio $d_2:d_1$ is used for a
+                   \emph{pinch} gesture, and the difference in angle $\alpha$
+                   is used for a \emph{rotate} gesture.]{
+            \begin{tikzpicture}
+                \node[centroid] (centroid) at (0.33, -1) {};
+                \node[finger] (A) at (-1, -3) {} edge (centroid);
+                \node[finger, prev] (B') at (90:2) {}
+                    edge [prev] node [right, opacity=1] {$d_1$} (centroid);
+                \node[finger] (B) at (110:1.8) {}
+                    edge node [left] {$d_2$} (centroid);
+                \node[finger] (C) at (2, -2) {} edge (centroid);
+                \draw[->] (87:1) arc (92:113:1);
+                \node[] at (96:0.8) {$\alpha$};
+            \end{tikzpicture}
+            \label{fig:pinchrotate}
+        }
+        \quad
+        \subfigure[The new centroid is calculated. The movement of the centroid
+                   is used for a \emph{drag} gesture.]{
+            \begin{tikzpicture}
+                \node[centroid, prev] (centroid') at (0.33, -1) {};
+                \node[centroid] (centroid) at (0.12, -1.) {};
+                \node[finger] (A) at (-1, -3) {} edge (centroid) edge [prev] (centroid');
+                \node[finger, prev] (B') at (90:2) {} edge [prev] (centroid');
+                \node[finger] (B) at (110:1.8) {} edge (centroid);
+                \node[finger] (C) at (2, -2) {} edge (centroid) edge [prev] (centroid');
+            \end{tikzpicture}
+        }
+        \caption{Example transformation using three touch points.}
+        \label{fig:transformationtracker}
+    \end{figure}
+}
docs/report.tex (view file @ 64da29f3)

@@ -588,11 +588,6 @@ have been implemented using an imperative programming style. Technical details
about the implementation of gesture detection are described in appendix
\ref{app:implementation-details}.

-%\section{Basic usage}
-% TODO
-% move the example usage from chapter 3 to here

\section{Full screen Pygame application}

%The goal of this application was to experiment with the TUIO
@@ -934,14 +929,116 @@ client application, as stated by the online specification

\chapter{Gesture detection in the reference implementation}
\label{app:implementation-details}
-Both rotation and pinch use the centroid of all touch points. A \emph{rotation}
-gesture uses the difference in angle relative to the centroid of all touch
-points, and \emph{pinch} uses the difference in distance. Both values are
-normalized using division by the number of touch points. A pinch event contains
-a scale factor, and therefore uses a division of the current by the previous
-average distance to the centroid.
-
-% TODO
-\emph{TODO: rotation and pinch will be described somewhat differently and in more detail.}
+The reference implementation contains three gesture tracker implementations,
+which are described in sections \ref{sec:basictracker} to
+\ref{sec:transformationtracker}. Section \ref{sec:handtracker} describes the
+custom ``hand tracker'' that is used by the test application from section
+\ref{sec:testapp}.
+
+\section{Basic tracker}
+\label{sec:basictracker}
+
+The ``basic tracker'' implementation exists only to provide access to low-level
+events in an application. Low-level events are only handled by gesture
+trackers, not by the application itself. Therefore, the basic tracker maps
+\emph{point\_\{down,move,up\}} events to identically named gestures that are
+handled by the application.
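
The mapping described above is essentially a pass-through. A minimal sketch of it, assuming a Python implementation; the class name, the handler names and the trigger() callback are illustrative assumptions, not names taken from this commit.

    # Illustrative sketch only: the class name, handler names and the
    # trigger() callback are assumed, not taken from the reference
    # implementation.
    class BasicTracker:
        """Maps low-level point_{down,move,up} events to identically named
        gestures that are handled by the application."""

        def __init__(self, trigger):
            # trigger(name, **params) delivers a gesture to the application.
            self.trigger = trigger

        def on_point_down(self, pid, x, y):
            self.trigger('point_down', x=x, y=y)

        def on_point_move(self, pid, x, y):
            self.trigger('point_move', x=x, y=y)

        def on_point_up(self, pid, x, y):
            self.trigger('point_up', x=x, y=y)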
+\section{Tap tracker}
+\label{sec:taptracker}
+
+The ``tap tracker'' detects three types of tap gestures:
+\begin{enumerate}
+    \item The basic \emph{tap} gesture is triggered when a touch point releases
+    the touch surface within a certain time and distance of its initial
+    position. When a \emph{point\_down} event is received, its location is
+    saved along with the current timestamp. On the next \emph{point\_up} event
+    of the touch point, the differences in time and position relative to the
+    saved values are compared with predefined thresholds to determine whether
+    a \emph{tap} gesture should be triggered.
+    \item A \emph{double tap} gesture consists of two sequential \emph{tap}
+    gestures that are located within a certain distance of each other, and
+    occur within a certain time window. When a \emph{tap} gesture is
+    triggered, the tracker saves it as the ``last tap'' along with the current
+    timestamp. When another \emph{tap} gesture is triggered, its location and
+    the current timestamp are compared with those of the ``last tap'' gesture
+    to determine whether a \emph{double tap} gesture should be triggered. If
+    so, the gesture is triggered at the location of the ``last tap'', because
+    the second tap may be less accurate.
+    \item A separate thread handles detection of \emph{single tap} gestures at
+    a rate of thirty times per second. When the time since the ``last tap''
+    exceeds the maximum time between two taps of a \emph{double tap} gesture,
+    a \emph{single tap} gesture is triggered.
+\end{enumerate}
+
+The \emph{single tap} gesture exists to make it possible to distinguish between
+single and double tap gestures. This distinction is not possible with the
+regular \emph{tap} gesture, since the first \emph{tap} gesture has already been
+handled by the application when the second \emph{tap} of a \emph{double tap}
+gesture is triggered.
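
The tap and double-tap checks described in the list above reduce to a few threshold comparisons. A minimal sketch, assuming a Python implementation; the class, attribute and threshold names (and the threshold values) are illustrative assumptions, and the background thread that promotes a stale "last tap" to a single tap gesture is omitted.

    # Illustrative sketch only: names and threshold values are assumed and do
    # not come from the reference implementation; the single-tap thread that
    # runs thirty times per second is omitted.
    import time

    MAX_TAP_TIME = 0.3         # max seconds between point_down and point_up
    MAX_TAP_DISTANCE = 10.0    # max movement for a tap
    MAX_DOUBLE_TAP_TIME = 0.4  # max seconds between the two taps of a double tap

    def _dist(ax, ay, bx, by):
        return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5

    class TapTracker:
        def __init__(self, trigger):
            self.trigger = trigger  # trigger(name, **params) delivers a gesture
            self.down = {}          # touch point id -> (x, y, timestamp)
            self.last_tap = None    # (x, y, timestamp) of the previous tap

        def on_point_down(self, pid, x, y):
            self.down[pid] = (x, y, time.time())

        def on_point_up(self, pid, x, y):
            start = self.down.pop(pid, None)
            if start is None:
                return
            x0, y0, t0 = start
            now = time.time()
            # Basic tap: released close to where it went down, quickly enough.
            if now - t0 > MAX_TAP_TIME or _dist(x, y, x0, y0) > MAX_TAP_DISTANCE:
                return
            self.trigger('tap', x=x, y=y)
            if self.last_tap is not None:
                lx, ly, lt = self.last_tap
                if (now - lt <= MAX_DOUBLE_TAP_TIME
                        and _dist(x, y, lx, ly) <= MAX_TAP_DISTANCE):
                    # Use the location of the first tap, which is assumed to
                    # be more accurate than the second.
                    self.trigger('double_tap', x=lx, y=ly)
                    self.last_tap = None
                    return
            self.last_tap = (x, y, now)

In a complete tracker, a separate loop would periodically compare the age of last_tap with MAX_DOUBLE_TAP_TIME and emit a single tap gesture, as the appendix describes.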
+\section{Transformation tracker}
+\label{sec:transformationtracker}
+
+The transformation tracker triggers \emph{rotate}, \emph{pinch}, \emph{drag}
+and \emph{flick} gestures. These gestures use the centroid of all touch points.
+A \emph{rotate} gesture uses the difference in angle relative to the centroid
+of all touch points, and \emph{pinch} uses the difference in distance. Both
+values are normalized using division by the number of touch points $N$. A
+\emph{pinch} gesture contains a scale factor, and therefore uses a division of
+the current by the previous average distance to the centroid. Any movement of
+the centroid is used for \emph{drag} gestures. When a dragged touch point is
+released, a \emph{flick} gesture is triggered in the direction of the
+\emph{drag} gesture. The application can use a \emph{flick} gesture to give
+momentum to a dragged widget so that it keeps moving for some time after the
+dragging stops.
+
+Figure \ref{fig:transformationtracker} shows an example situation in which a
+touch point is moved, triggering a \emph{pinch} gesture, a \emph{rotate}
+gesture and a \emph{drag} gesture.
+
+\transformationtracker
+
+The \emph{pinch} gesture in figure \ref{fig:pinchrotate} uses the ratio
+$d_2:d_1$ to calculate its $scale$ parameter. The difference in distance to the
+centroid must be divided by the number of touch points ($N$) used for the
+gesture, yielding the difference $\frac{d_2 - d_1}{N}$. The $scale$ parameter
+represents the scale relative to the previous situation, which results in the
+following formula:
+$$pinch.scale = \frac{d_1 + \frac{d_2 - d_1}{N}}{d_1}$$
+
+The angle used for the \emph{rotate} gesture is also divided by the number of
+touch points:
+$$rotate.angle = \frac{\alpha}{N}$$
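
As a worked example of the pinch formula above: with N = 3, a previous distance d_1 = 100 and a new distance d_2 = 130, the scale is (100 + 30/3) / 100 = 1.1. A minimal sketch of both computations, assuming a Python implementation; the helper names are illustrative assumptions, not names taken from this commit.

    # Illustrative sketch only: function names are assumed and do not come
    # from the reference implementation.
    from math import atan2, hypot

    def centroid(points):
        # points: list of (x, y) tuples
        n = len(points)
        return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

    def distance_and_angle(point, c):
        # Distance and angle of a touch point relative to the centroid c.
        dx, dy = point[0] - c[0], point[1] - c[1]
        return hypot(dx, dy), atan2(dy, dx)

    def pinch_scale(d1, d2, n):
        # pinch.scale = (d1 + (d2 - d1) / n) / d1, as in the appendix.
        return (d1 + (d2 - d1) / n) / d1

    def rotate_angle(alpha, n):
        # rotate.angle = alpha / n, as in the appendix.
        return alpha / n

    # Worked example: three touch points, the moved point was 100 units from
    # the centroid and is now 130 units away.
    assert abs(pinch_scale(100.0, 130.0, 3) - 1.1) < 1e-9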
+\section{Hand tracker}
+\label{sec:handtracker}
+
+The hand tracker sees each touch point as a finger. Based on a predefined
+distance threshold, each finger is assigned to a hand. Each hand consists of a
+list of finger locations, and the centroid of those locations.
+
+When a new finger is detected on the touch surface (a \emph{point\_down}
+event), the distance from that finger to all hand centroids is calculated. The
+hand with the shortest distance is the candidate hand that the finger belongs
+to. If that distance is larger than the predefined distance threshold, the
+finger is assumed to belong to a new hand and a \emph{hand\_down} gesture is
+triggered. Otherwise, the finger is assigned to the closest hand. In both
+cases, a \emph{finger\_down} gesture is triggered.
+
+Each touch point is assigned an ID by the reference implementation. When the
+hand tracker assigns a finger to a hand after a \emph{point\_down} event, its
+touch point ID is saved in a hash map\footnote{In computer science, a hash
+table or hash map is a data structure that uses a hash function to map
+identifying values, known as keys (e.g., a person's name), to their associated
+values (e.g., their telephone number). Source:
+\url{http://en.wikipedia.org/wiki/Hashmap}} together with the \texttt{Hand}
+object. When a finger moves (a \emph{point\_move} event) or releases the touch
+surface (\emph{point\_up}), the corresponding hand is loaded from the hash map
+and triggers a \emph{finger\_move} or \emph{finger\_up} gesture. If a released
+finger is the last of a hand, that hand is removed with a \emph{hand\_up}
+gesture.
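
The hand assignment described above is a nearest-centroid search with a distance cut-off. A minimal sketch, assuming a Python implementation; the class names, the trigger() callback and the threshold value are illustrative assumptions, not names taken from this commit.

    # Illustrative sketch only: names and the threshold value are assumed and
    # do not come from the reference implementation.
    from math import hypot

    HAND_DISTANCE_THRESHOLD = 300.0  # assumed cut-off distance

    class Hand:
        def __init__(self):
            self.fingers = {}  # touch point id -> (x, y)

        def centroid(self):
            xs = [x for x, _ in self.fingers.values()]
            ys = [y for _, y in self.fingers.values()]
            return (sum(xs) / len(xs), sum(ys) / len(ys))

    class HandTracker:
        def __init__(self, trigger):
            self.trigger = trigger
            self.hands = []    # all currently detected hands
            self.hand_of = {}  # hash map: touch point id -> Hand

        def on_point_down(self, pid, x, y):
            # Find the hand whose centroid is closest to the new finger.
            best, best_dist = None, None
            for hand in self.hands:
                cx, cy = hand.centroid()
                d = hypot(x - cx, y - cy)
                if best is None or d < best_dist:
                    best, best_dist = hand, d
            if best is None or best_dist > HAND_DISTANCE_THRESHOLD:
                # Too far from any existing hand: start a new one.
                best = Hand()
                self.hands.append(best)
                self.trigger('hand_down')
            best.fingers[pid] = (x, y)
            self.hand_of[pid] = best
            self.trigger('finger_down', x=x, y=y)

        def on_point_up(self, pid, x, y):
            hand = self.hand_of.pop(pid, None)
            if hand is None:
                return
            del hand.fingers[pid]
            self.trigger('finger_up', x=x, y=y)
            if not hand.fingers:
                self.hands.remove(hand)
                self.trigger('hand_up')

Keeping the id-to-hand mapping in a dictionary mirrors the hash map mentioned in the appendix, so point_move and point_up events can find their hand in constant time.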
\end{document}