
Wrote the appendix about gesture detection in the reference implementation.

Taddeus Kroes 13 years ago
parent
commit
64da29f33a
2 changed files with 168 additions and 14 deletions
  1. docs/data/diagrams.tex (+57 -0)
  2. docs/report.tex (+111 -14)

+ 57 - 0
docs/data/diagrams.tex

@@ -369,3 +369,60 @@
         \label{fig:testappdiagram}
     \end{figure}
 }
+
+\def\transformationtracker{
+    \begin{figure}[h!]
+        \center
+        \tikzstyle{centroid} = [draw, shape=circle, minimum width=1.5em, fill]
+        \tikzstyle{finger} = [draw, shape=circle, minimum width=1.5em, fill=white]
+        \tikzstyle{prev} = [opacity=0.3]
+        \subfigure[
+            Initial situation: three touch points are positioned on the touch
+            surface.
+        ]{
+            \begin{tikzpicture}
+                \node [centroid] (centroid) at (0.33, -1) {};
+                \node [finger] (A) at (-1, -3) {} edge (centroid);
+                \node [finger] (B) at (0, 2) {} edge (centroid);
+                \node [finger] (C) at (2, -2) {} edge (centroid);
+            \end{tikzpicture}
+        }
+        \quad
+        \subfigure[
+            One of the touch points is moved, triggering a \emph{point\_move}
+            event. The ratio $d_2:d_1$ is used for a \emph{pinch} gesture, and
+            the difference in angle $\alpha$ is used for a \emph{rotate}
+            gesture.
+        ]{
+            \begin{tikzpicture}
+                \node [centroid] (centroid) at (0.33, -1) {};
+                \node [finger] (A) at (-1, -3) {} edge (centroid);
+                \node [finger, prev] (B') at (90:2) {}
+                    edge [prev] node [right, opacity=1] {$d_1$} (centroid);
+                \node [finger] (B) at (110:1.8) {} edge node [left] {$d_2$} (centroid);
+                \node [finger] (C) at (2, -2) {} edge (centroid);
+                \draw [->] (87:1) arc (92:113:1);
+                \node [] at (96:0.8) {$\alpha$};
+            \end{tikzpicture}
+            \label{fig:pinchrotate}
+        }
+        \quad
+        \subfigure[
+            The new centroid is calculated. The movement of the centroid is
+            used for a \emph{drag} gesture.
+        ]{
+            \begin{tikzpicture}
+                \node [centroid, prev] (centroid') at (0.33, -1) {};
+                \node [centroid] (centroid) at (0.12, -1) {};
+                \node [finger] (A) at (-1, -3) {} edge (centroid) edge [prev] (centroid');
+                \node [finger, prev] (B') at (90:2) {} edge [prev] (centroid');
+                \node [finger] (B) at (110:1.8) {} edge (centroid);
+                \node [finger] (C) at (2, -2) {} edge (centroid) edge [prev] (centroid');
+            \end{tikzpicture}
+        }
+        \caption{
+            Example transformation using three touch points.
+        }
+        \label{fig:transformationtracker}
+    \end{figure}
+}

+ 111 - 14
docs/report.tex

@@ -588,11 +588,6 @@ have been implemented using an imperative programming style. Technical details
 about the implementation of gesture detection are described in appendix
 \ref{app:implementation-details}.
 
-%\section{Basic usage}
-
-% TODO
-% example usage uit H3 hierheen halen
-
 \section{Full screen Pygame application}
 
 %The goal of this application was to experiment with the TUIO
@@ -934,14 +929,116 @@ client application, as stated by the online specification
 \chapter{Gesture detection in the reference implementation}
 \label{app:implementation-details}
 
-Both rotation and pinch use the centroid of all touch points. A \emph{rotation}
-gesture uses the difference in angle relative to the centroid of all touch
-points, and \emph{pinch} uses the difference in distance.  Both values are
-normalized using division by the number of touch points. A pinch event contains
-a scale factor, and therefore uses a division of the current by the previous
-average distance to the centroid.
-
-% TODO
-\emph{TODO: rotatie en pinch gaan iets anders/uitgebreider worden beschreven.}
+The reference implementation contains three gesture tracker implementations,
+which are described in sections \ref{sec:basictracker} to
+\ref{sec:transformationtracker}. Section \ref{sec:handtracker} describes the
+custom ``hand tracker'' that is used by the test application from section
+\ref{sec:testapp}.
+
+\section{Basic tracker}
+\label{sec:basictracker}
+
+The ``basic tracker'' implementation exists only to provide access to low-level
+events in an application. Low-level events are only handled by gesture
+trackers, not by the application itself. Therefore, the basic tracker maps
+\emph{point\_\{down,move,up\}} events to equally named gestures that are
+handled by the application.
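+
+The following Python fragment sketches this mapping. The class and method
+names are hypothetical and do not necessarily match the reference
+implementation:
+
+\begin{verbatim}
+# Minimal sketch of a basic tracker; names are hypothetical.
+class BasicTracker(object):
+    def __init__(self, trigger):
+        # 'trigger' is a callback that delivers a gesture to the
+        # application, e.g. trigger('point_down', x, y).
+        self.trigger = trigger
+
+    def on_point_down(self, x, y):
+        self.trigger('point_down', x, y)
+
+    def on_point_move(self, x, y):
+        self.trigger('point_move', x, y)
+
+    def on_point_up(self, x, y):
+        self.trigger('point_up', x, y)
+\end{verbatim}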
+
+\section{Tap tracker}
+\label{sec:taptracker}
+
+The ``tap tracker'' detects three types of tap gestures:
+
+\begin{enumerate}
+    \item The basic \emph{tap} gesture is triggered when a touch point releases
+        the touch surface within a certain time and distance of its initial
+        position. When a \emph{point\_down} event is received, its location is
+        saved along with the current timestamp. On the next \emph{point\_up}
+        event of the touch point, the difference in time and position with its
+        saved values are compared with predefined thresholds to determine
+        whether a \emph{tap} gesture should be triggered.
+    \item A \emph{double tap} gesture consists of two sequential \emph{tap}
+        gestures that are located within a certain distance of each other, and
+        occur within a certain time window. When a \emph{tap} gesture is
+        triggered, the tracker saves it as the ``last tap'' along with the
+        current timestamp. When another \emph{tap} gesture is triggered, its
+        location and the current timestamp are compared with those of the
+        ``last tap'' gesture to determine whether a \emph{double tap} gesture
+        should be triggered. If so, the gesture is triggered at the location of
+        the ``last tap'', because the second tap may be less accurate.
+    \item A separate thread handles detection of \emph{single tap} gestures at
+        a rate of thirty times per second. When the time since the ``last tap''
+        exceeds the maximum time between two taps of a \emph{double tap}
+        gesture, a \emph{single tap} gesture is triggered.
+\end{enumerate}
+
+The \emph{single tap} gesture exists to make it possible to distinguish between
+single and double tap gestures. This distinction is not possible with the
+regular \emph{tap} gesture, since the first \emph{tap} gesture has already been
+handled by the application when the second \emph{tap} of a \emph{double tap}
+gesture is triggered.
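+
+The detection of the basic \emph{tap} gesture can be sketched in Python as
+follows. The names and threshold values are hypothetical, and \emph{double
+tap} detection compares consecutive taps in a similar way:
+
+\begin{verbatim}
+import time
+
+MAX_TAP_TIME = 0.3       # hypothetical threshold (seconds)
+MAX_TAP_DISTANCE = 10.0  # hypothetical threshold (pixels)
+
+class TapTracker(object):
+    def __init__(self, trigger):
+        self.trigger = trigger
+        self.downs = {}  # touch point ID -> (x, y, timestamp)
+
+    def on_point_down(self, point_id, x, y):
+        # Save the initial location and the current timestamp.
+        self.downs[point_id] = (x, y, time.time())
+
+    def on_point_up(self, point_id, x, y):
+        x0, y0, t0 = self.downs.pop(point_id)
+        dt = time.time() - t0
+        dist = ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5
+        # Compare time and distance with the thresholds.
+        if dt <= MAX_TAP_TIME and dist <= MAX_TAP_DISTANCE:
+            self.trigger('tap', x0, y0)
+\end{verbatim}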
+
+\section{Transformation tracker}
+\label{sec:transformationtracker}
+
+The transformation tracker triggers \emph{rotate}, \emph{pinch}, \emph{drag}
+and \emph{flick} gestures. These gestures use the centroid of all touch points.
+A \emph{rotate} gesture uses the difference in angle relative to the centroid,
+and \emph{pinch} uses the difference in distance to it. Both
+values are normalized using division by the number of touch points $N$. A
+\emph{pinch} gesture contains a scale factor, and therefore divides the
+current average distance to the centroid by the previous one. Any movement of
+the centroid is used for \emph{drag} gestures. When a dragged touch point is
+released, a \emph{flick} gesture is triggered in the direction of the
+\emph{drag} gesture. The application can use a \emph{flick} gesture to give
+momentum to a dragged widget so that it keeps moving for some time after the
+dragging stops.
+
+Figure \ref{fig:transformationtracker} shows an example situation in which a
+touch point is moved, triggering a \emph{pinch} gesture, a \emph{rotate}
+gesture and a \emph{drag} gesture.
+
+\transformationtracker
+
+The \emph{pinch} gesture in figure \ref{fig:pinchrotate} uses the ratio
+$d_2:d_1$ to calculate its $scale$ parameter. The difference in distance to the
+centroid must be divided by the number of touch points ($N$) used for the
+gesture, yielding the difference $\frac{d_2 - d_1}{N}$. The $scale$ parameter
+represents the scale relative to the previous situation, which results in the
+following formula:
+
+$$pinch.scale = \frac{d_1 + \frac{d_2 - d_1}{N}}{d_1}$$
+
+The angle used for the \emph{rotate} gesture is also divided by the number of
+touch points:
+$$rotate.angle = \frac{\alpha}{N}$$
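+
+In Python, the computation of both parameters might look as follows. This is
+a sketch that assumes the centroid of the previous situation is used, as in
+figure \ref{fig:pinchrotate}; the function and parameter names are
+hypothetical:
+
+\begin{verbatim}
+from math import atan2, hypot
+
+def pinch_and_rotate(prev, cur, center, n):
+    # 'prev' and 'cur' are the previous and current positions of
+    # the moved touch point, 'center' is the centroid and 'n' the
+    # number of touch points.
+    cx, cy = center
+    d1 = hypot(prev[0] - cx, prev[1] - cy)
+    d2 = hypot(cur[0] - cx, cur[1] - cy)
+    alpha = atan2(cur[1] - cy, cur[0] - cx) \
+          - atan2(prev[1] - cy, prev[0] - cx)
+    scale = (d1 + (d2 - d1) / n) / d1
+    angle = alpha / n
+    return scale, angle
+\end{verbatim}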
+
+\section{Hand tracker}
+\label{sec:handtracker}
+
+The hand tracker sees each touch point as a finger. Based on a predefined
+distance threshold, each finger is assigned to a hand. Each hand consists of a
+list of finger locations, and the centroid of those locations.
+
+When a new finger is detected on the touch surface (a \emph{point\_down} event),
+the distance from that finger to all hand centroids is calculated. The hand
+whose centroid is closest is the candidate hand to which the finger may belong.
+If that distance is larger than the predefined distance threshold, the finger
+is assumed to belong to a new hand and a \emph{hand\_down} gesture is
+triggered. Otherwise,
+the finger is assigned to the closest hand. In both cases, a
+\emph{finger\_down} gesture is triggered.
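+
+This assignment can be sketched in Python as follows. The threshold value and
+all names are hypothetical:
+
+\begin{verbatim}
+from math import hypot
+
+HAND_DISTANCE_THRESHOLD = 200.0  # hypothetical value (pixels)
+
+class Hand(object):
+    def __init__(self):
+        self.fingers = {}  # touch point ID -> (x, y)
+
+    def centroid(self):
+        n = float(len(self.fingers))
+        return (sum(x for x, y in self.fingers.values()) / n,
+                sum(y for x, y in self.fingers.values()) / n)
+
+def assign_to_hand(hands, x, y):
+    # Find the hand whose centroid is closest to the new finger.
+    closest, closest_dist = None, None
+    for hand in hands:
+        cx, cy = hand.centroid()
+        d = hypot(x - cx, y - cy)
+        if closest is None or d < closest_dist:
+            closest, closest_dist = hand, d
+    if closest is None or closest_dist > HAND_DISTANCE_THRESHOLD:
+        # Too far from any existing hand: this is where a
+        # 'hand_down' gesture would be triggered.
+        closest = Hand()
+        hands.append(closest)
+    return closest
+\end{verbatim}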
+
+Each touch point is assigned an ID by the reference implementation. When the
+hand tracker assigns a finger to a hand after a \emph{point\_down} event, its
+touch point ID is mapped to the \texttt{Hand} object in a hash
+map\footnote{In computer science, a hash table or hash map is a data
+structure that uses a hash function to map identifying values, known as keys
+(e.g., a person's name), to their associated values (e.g., their telephone
+number). Source: \url{http://en.wikipedia.org/wiki/Hashmap}}. When a finger
+moves (a \emph{point\_move} event) or releases the touch surface
+(\emph{point\_up}), the corresponding hand is loaded from the hash map and
+triggers a \emph{finger\_move} or \emph{finger\_up} gesture. If a released
+finger is the last of a hand, that hand is removed with a \emph{hand\_up}
+gesture.
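+
+Continuing the sketch above, the hash map lookup for \emph{point\_move} and
+\emph{point\_up} events might look like this:
+
+\begin{verbatim}
+hand_of_point = {}  # touch point ID -> Hand (the hash map)
+
+def on_point_move(point_id, x, y):
+    hand = hand_of_point[point_id]
+    hand.fingers[point_id] = (x, y)
+    # ...trigger a 'finger_move' gesture on this hand.
+
+def on_point_up(point_id):
+    hand = hand_of_point.pop(point_id)
+    del hand.fingers[point_id]
+    # ...trigger a 'finger_up' gesture; if this was the hand's
+    # last finger, remove the hand and trigger 'hand_up'.
+\end{verbatim}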
 
 \end{document}