Commit a2f08297 authored by Taddeüs Kroes

Started writing new 'Design' chapter.

parent 562d1db3
@@ -49,8 +49,9 @@
\end{tikzpicture}
}
\newcommand{\basicdiagram}[1]{
\begin{figure}[H]
\center
\architecture{
\node[block, dashed, below of=driver] (arch) {Architecture}
@@ -58,11 +59,73 @@
\node[block, below of=arch] {Application}
edge[linefrom] node[right] {gestures} (arch);
}
\caption{#1}
\label{fig:basicdiagram}
\end{figure}
}
\newcommand{\driverdiagram}[1]{
\begin{figure}[H]
\center
\architecture{
\node[block, below of=driver] (eventdriver) {Event driver}
edge[linefrom] node[right, near end] {driver-specific messages} (driver);
\node[block, below of=eventdriver, dashed] (analysis) {Event analysis}
edge[linefrom] node[right] {events} (eventdriver);
\node[block, below of=analysis] {Application}
edge[linefrom] node[right, near start] {gestures} (analysis);
\node[right of=eventdriver, xshift=2em] (dummy) {};
\group{eventdriver}{eventdriver}{dummy}{analysis}{Architecture}
}
\caption{#1}
\label{fig:driverdiagram}
\end{figure}
}
\newcommand{\widgetdiagram}[1]{
\begin{figure}[H]
\center
\architecture{
\node[block, below of=driver] (eventdriver) {Event driver}
edge[linefrom] node[right, near end] {driver-specific messages} (driver);
\node[block, below of=eventdriver] (widget) {Widget tree}
edge[linefrom] node[right] {events} (eventdriver);
\node[block, right of=widget, xshift=7em, dashed] (analysis) {Event analysis}
edge[linefrom, bend right=10] node[above] {events} (widget)
edge[lineto, bend left=10] node[] {gestures} (widget);
\node[block, below of=widget] {Application}
edge[linefrom] node[right, near start] {gestures} (widget);
\group{eventdriver}{eventdriver}{analysis}{widget}{Architecture}
}
\caption{#1}
\label{fig:widgetdiagram}
\end{figure}
}
\newcommand{\trackerdiagram}[1]{
\begin{figure}[H]
\center
\architecture{
\node[block, below of=driver] (eventdriver) {Event driver}
edge[linefrom] node[right, near end] {driver-specific messages} (driver);
\node[block, below of=eventdriver] (widget) {Widget tree}
edge[linefrom] node[right] {events} (eventdriver);
\node[block, right of=widget, xshift=7em] (tracker) {Gesture trackers}
edge[linefrom, bend right=10] node[above] {events} (widget)
edge[lineto, bend left=10] node[] {gestures} (widget);
\node[block, below of=widget] {Application}
edge[linefrom] node[right, near start] {gestures} (widget);
\group{eventdriver}{eventdriver}{tracker}{widget}{Architecture}
}
\caption{#1}
\label{fig:trackerdiagram}
\end{figure}
}
\newcommand{\examplediagrams}{
\begin{figure}[H]
\hspace{-2.3em}
\subfigure[Architecture using a single widget, demonstration gesture
...
@@ -82,7 +82,7 @@ Python.
should allow for extensions to be added to any implementation.
The reference implementation is a Proof of Concept that translates TUIO
messages to some simple touch gestures that are used by some test
applications.
%Being a Proof of Concept, the reference implementation itself does not
%necessarily need to meet all the requirements of the design.
@@ -118,7 +118,6 @@ Python.
\section{Gesture recognition software for Windows 7}
% TODO
The online article \cite{win7touch} presents a Windows 7 application,
written in Microsoft's .NET. The application shows detected gestures in a
canvas. Gesture trackers keep track of stylus locations to detect specific
@@ -160,7 +159,182 @@ Python.
of gesture detection code, thus keeping a code library manageable and
extendable, is to use different gesture trackers.
% FIXME: change title below
\chapter{Design - new}
% Diagrams are defined in a separate file
\input{data/diagrams}
\section{Introduction}
% TODO: rewrite intro, reference to experiment appendix
This chapter describes a design for a generic multi-touch gesture detection
architecture. The architecture consists of multiple components, each with
a specific set of tasks. Naturally, the design is based on a number of
requirements. The first three sections each describe a requirement and a
solution that meets it. The sections that follow show how the different
components of the architecture work together.
To test multi-touch interaction properly, a multi-touch device is required.
The University of Amsterdam (UvA) has provided access to a multi-touch
table from PQlabs. The table uses the TUIO protocol \cite{TUIO} to
communicate touch events. See appendix \ref{app:tuio} for details regarding
the TUIO protocol.
\subsection*{Position of architecture in software}
The input of the architecture comes from a multi-touch device driver.
For example, the table used in the experiments uses the TUIO protocol.
The task of the architecture is to translate this input into multi-touch
gestures that can be used by an application, as illustrated in figure
\ref{fig:basicdiagram}. At the end of this chapter, the diagram is
extended with the different components of the architecture.
\basicdiagram{A diagram showing the position of the architecture
relative to a multi-touch application.}
\section{Supporting multiple drivers}
The TUIO protocol is an example of a touch driver that can be used by
multi-touch devices. Other drivers do exist, which should also be supported
by the architecture. Therefore, there must be some translation of
driver-specific messages to a common format in the arcitecture. Messages in
this common format will be called \emph{events}. Events can be translated
to multi-touch \emph{gestures}. The most basic set of events is
${point\_down, point\_move, point\_up}$.
A more extended set could also contain more complex events. For example,
an object can have a rotational property, like the ``fiducial'' type in
the TUIO protocol. This results in $\{point\_down, point\_move, point\_up,
object\_down, object\_move, object\_up, object\_rotate\}$.
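As a minimal Python sketch, such an event could be represented by a simple
object; the class and attribute names below are illustrative assumptions,
not part of the architecture itself:
\begin{verbatim}
# Minimal sketch of a driver-independent event (illustrative names).
class Event(object):
    def __init__(self, event_type, x, y, angle=None):
        self.type = event_type  # e.g. 'point_down', 'point_move', 'point_up'
        self.x = x              # screen coordinates of the touch object
        self.y = y
        self.angle = angle      # only used by rotatable objects
        self.propagation_stopped = False

    def stop_propagation(self):
        # Reserve the event for the widget that is currently analyzing it
        # (propagation is explained later in this chapter).
        self.propagation_stopped = True
\end{verbatim}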
The component that translates driver-specific messages to events is called
the \emph{event driver}. The event driver runs in a loop, receiving and
analyzing driver messages. Which event driver is used in an application
depends on the driver supported by the multi-touch device.
When a sequence of messages is analyzed as an event, the event driver
delegates the event to other components in the architecture for translation
to gestures.
\driverdiagram{Extension of the diagram from figure \ref{fig:basicdiagram},
showing the position of the event driver in the architecture.}
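To illustrate how an event driver could work, the sketch below translates
incoming driver messages to the events defined above; the message format
and method names are assumptions for illustration, not an actual driver
API. The events are delegated to a root widget, anticipating the widget
tree described in the next section.
\begin{verbatim}
# Sketch of an event driver (the message format is an assumption).
class EventDriver(object):
    def __init__(self, root_widget):
        self.root = root_widget  # receives the translated events

    def handle_message(self, message):
        # Called from the receive loop for every driver-specific message;
        # here, 'message' is assumed to be a dictionary.
        types = {'down': 'point_down', 'move': 'point_move', 'up': 'point_up'}
        event = Event(types[message['type']], message['x'], message['y'])
        # Delegate the event to the rest of the architecture.
        self.root.delegate_event(event)
\end{verbatim}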
\section{Restricting gestures to a screen area}
An application programmer should be able to bind a gesture handler to some
element on the screen. For example, a button tap\footnote{A ``tap'' gesture
is triggered when a touch object releases the screen within a certain time
and distance from the point where it initially touched the screen.} should
only occur on the button itself, and not in any other area of the screen. A
solution to this problem is the use of \emph{widgets}. The button from the
example can be represented as a rectangular widget with a position and
size. The position and size are compared with event coordinates to
determine whether an event occurred within the button.
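As a rough sketch, such a rectangular widget could look as follows (the
class and method names are assumptions):
\begin{verbatim}
# Sketch of a rectangular widget with a position and size.
class Widget(object):
    def __init__(self, x, y, width, height):
        self.x, self.y = x, y
        self.width, self.height = width, height

    def contains(self, event):
        # Compare the event coordinates with the widget area.
        return (self.x <= event.x <= self.x + self.width
                and self.y <= event.y <= self.y + self.height)
\end{verbatim}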
\subsection*{Widget tree}
A problem occurs when widgets overlap. If a button is placed over a
container and an event occurs inside the button, should the
button handle the event first? And should the container receive the
event at all, or should it be reserved for the button?
The solution to this problem is to save widgets in a tree structure.
There is one root widget, whose size is limited by the size of the
touch screen. Being the leaf widget, and thus the widget that is
actually touched when an object touches the device, the button widget
should receive an event before its container does. However, events
occur on a screen-wide level and thus at the root level of the widget
tree. Therefore, an event is delegated in the tree before any analysis
is performed. Delegation stops at the ``lowest'' widget in the three
containing the event coordinates. That widget then performs some
analysis of the event, after which the event is released back to the
parent widget for analysis. This release of an event to a parent widget
is called \emph{propagation}. To be able to reserve an event to some
widget or analysis, the propagation of an event can be stopped during
analysis.
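The delegation and propagation described above could be sketched as
follows, building on the rectangular widget from the previous section;
event analysis itself is left empty here and is the subject of a later
section:
\begin{verbatim}
# Sketch of event delegation and propagation in the widget tree.
class TreeWidget(Widget):
    def __init__(self, x, y, width, height):
        Widget.__init__(self, x, y, width, height)
        self.parent = None
        self.children = []

    def add_child(self, widget):
        widget.parent = self
        self.children.append(widget)

    def delegate_event(self, event):
        # Delegate down to the lowest widget containing the coordinates.
        for child in self.children:
            if child.contains(event):
                child.delegate_event(event)
                return
        # This is the lowest widget: analyze the event here, then
        # propagate it to the parent widgets unless propagation is stopped.
        widget = self
        while widget is not None and not event.propagation_stopped:
            widget.analyze(event)
            widget = widget.parent

    def analyze(self, event):
        pass  # event analysis is added in a later section
\end{verbatim}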
% TODO: inspired by JavaScript DOM
% TODO: add GTK to bibliography
Many GUI frameworks, like GTK \cite{GTK}, also use a tree structure to
manage their widgets. This makes it easy to connect the architecture to
such a framework. For example, the programmer can define a
\texttt{GtkTouchWidget} that synchronizes the position of a touch
widget with that of a GTK widget, using GTK signals.
\subsection*{Callbacks}
\label{sec:callbacks}
When an event is propagated by a widget, it is first used for event
analysis on that widget. The event analysis can then trigger a gesture
in the widget, which has to be handled by the application. To handle a
gesture, the widget should provide a callback mechanism: the
application binds a handler for a specific type of gesture to a widget.
When a gesture of that type is triggered by event analysis, the
widget calls the bound handler.
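A sketch of this callback mechanism, again building on the widget sketches
above (the \texttt{bind} and \texttt{trigger} names are assumptions):
\begin{verbatim}
# Sketch of the callback mechanism on a widget.
class CallbackWidget(TreeWidget):
    def __init__(self, x, y, width, height):
        TreeWidget.__init__(self, x, y, width, height)
        self.handlers = {}  # gesture type -> handler function

    def bind(self, gesture_type, handler):
        self.handlers[gesture_type] = handler

    def trigger(self, gesture):
        # Called by event analysis when a gesture has been detected.
        if gesture.type in self.handlers:
            self.handlers[gesture.type](gesture)

# Application code could then bind a handler like this:
# button.bind('tap', lambda gesture: print('button tapped'))
\end{verbatim}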
\subsection*{Position of widget tree in architecture}
\widgetdiagram{Extension of the diagram from figure
\ref{fig:driverdiagram}, showing the position of widgets in the
architecture.}
\section{Event analysis}
The events that are delegated to widgets must be analyzed in some way to
form gestures. This analysis is specific to the type of gesture being
detected. For example, the detection of a ``tap'' gesture is very different
from the detection of a ``rotate'' gesture. The .NET implementation from
\cite{win7touch} separates the detection of different gestures
into different \emph{gesture trackers}. This keeps the different pieces of
detection code manageable and extendable. Therefore, the architecture also
uses gesture trackers to separate the analysis of events. A single gesture
tracker detects a specific set of gesture types, given a sequence of
events. An example of a possible gesture tracker implementation is a
``transformation tracker'' that detects rotation, scaling and translation
gestures.
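A gesture tracker could be sketched as follows; the class names are
assumptions, and the actual detection logic is omitted:
\begin{verbatim}
# Sketch of a gesture tracker: it analyzes events for one widget and
# triggers gestures on that widget when a pattern is recognized.
class GestureTracker(object):
    gesture_types = []  # gesture types this tracker can detect

    def __init__(self, widget):
        self.widget = widget

    def on_event(self, event):
        raise NotImplementedError

class TransformationTracker(GestureTracker):
    gesture_types = ['rotate', 'pinch', 'drag']

    def on_event(self, event):
        # The detection logic is omitted here; when a gesture is
        # recognized it is handed back to the widget, e.g.:
        # self.widget.trigger(gesture)
        pass
\end{verbatim}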
\subsection*{Assignment of a gesture tracker to a widget}
As explained in section \ref{sec:callbacks}, events are delegated from
a widget to some event analysis. The analysis component of a widget
consists of a list of gesture trackers, each tracking a specific set of
gestures. No two trackers in the list should be tracking the same
gesture type.
When a handler for a gesture is ``bound'' to a widget, the widget
asserts that it has a tracker that is tracking this gesture. Thus, the
programmer does not create gesture trackers manually. Figure
\ref{fig:trackerdiagram} shows the position of gesture trackers in the
architecture.
\trackerdiagram{Extension of the diagram from figure
\ref{fig:widgetdiagram}, showing the position of gesture trackers in
the architecture.}
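This behaviour could be sketched by extending the \texttt{bind} method from
the callback sketch: when a handler is bound, the widget looks up a tracker
class that tracks the requested gesture type and creates it if no such
tracker exists yet. The tracker registry below is an assumption.
\begin{verbatim}
# Sketch: binding a handler implicitly creates the required tracker.
TRACKER_CLASSES = [TransformationTracker]  # e.g. plus a tap tracker

class GestureWidget(CallbackWidget):
    def __init__(self, x, y, width, height):
        CallbackWidget.__init__(self, x, y, width, height)
        self.trackers = []

    def bind(self, gesture_type, handler):
        CallbackWidget.bind(self, gesture_type, handler)
        # Assert that some tracker in the list tracks this gesture type.
        if not any(gesture_type in t.gesture_types for t in self.trackers):
            for cls in TRACKER_CLASSES:
                if gesture_type in cls.gesture_types:
                    self.trackers.append(cls(self))
                    break

    def analyze(self, event):
        # Delegate the event to each tracker for analysis.
        for tracker in self.trackers:
            tracker.on_event(event)
\end{verbatim}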
\section{Example usage}
% FIXME: Delete the 2 following chapters
\chapter{Experiments}
\label{chapter:requirements}
% test implementation with taps, rotation and pinch. From this it appeared:
@@ -174,72 +348,14 @@ Python.
% possibly needed in another program to use only one hand, and thus
% to choose points close to each other (solution: windows).
% TODO: Move content into the following sections:
\section{Introduction}
\section{Supporting multiple drivers}
\section{Restricting gestures to a screen area}
\section{Separating and extending code}
\section{Introduction}
% TODO
To test multi-touch interaction properly, a multi-touch device is required.
The University of Amsterdam (UvA) has provided access to a multi-touch
table from PQlabs. The table uses the TUIO protocol \cite{TUIO} to
communicate touch events. See appendix \ref{app:tuio} for details regarding
the TUIO protocol.
\section{Experimenting with TUIO and event bindings}
\label{sec:experimental-draw}
When designing a software library, its API should be understandable and
easy to use for programmers. To find out the basic requirements for a
usable API, an experimental program has been written based on the
Processing code from \cite{processingMT}. The program receives TUIO events
and translates them to point \emph{down}, \emph{move} and \emph{up} events.
These events are then interpreted as (double or single) \emph{tap},
\emph{rotation} or \emph{pinch} gestures. A simple drawing program then
draws the current state to the screen using the PyGame library. The output
of the program can be seen in figure \ref{fig:draw}.
\begin{figure}[H]
\center
\includegraphics[scale=0.4]{data/experimental_draw.png}
\caption{Output of the experimental drawing program. It draws the touch
points and their centroid on the screen (the centroid is used
as center point for rotation and pinch detection). It also
draws a green rectangle which responds to rotation and pinch
events.}
\label{fig:draw}
\end{figure}
One of the first observations is the fact that TUIO's \texttt{SET} messages
use the TUIO coordinate system, as described in appendix \ref{app:tuio}.
The test program multiplies these by its own dimensions, thus showing the
entire screen in its window. Also, the implementation only works using the
TUIO protocol. Other drivers are not supported.
Though using relatively simple math, the rotation and pinch events work
surprisingly well. Both rotation and pinch use the centroid of all touch
points. A \emph{rotation} gesture uses the difference in angle relative to
the centroid of all touch points, and \emph{pinch} uses the difference in
distance. Both values are normalized using division by the number of touch
points. A pinch event contains a scale factor, and therefore uses a
division of the current by the previous average distance to the centroid.
There is a flaw in this implementation. Since the centroid is calculated
using all current touch points, there cannot be two or more rotation or
pinch gestures simultaneously. On a large multi-touch table, it is
desirable to support interaction with multiple hands, or multiple persons,
at the same time. This kind of application-specific requirement should be
defined in the application itself, whereas the experimental implementation
defines detection algorithms based on its test program.
Also, the different detection algorithms are all implemented in the same
file, making it complex to read or debug, and difficult to extend.
\section{Summary of observations}
\label{sec:observations}
@@ -412,10 +528,6 @@ Python.
\section{Diagrams}
\input{data/diagrams}
\simplediagram
\completediagrams
\section{Example usage}
This section describes an example that illustrates the communication
@@ -450,12 +562,6 @@ Python.
start server
\end{verbatim}
\chapter{Reference implementation}
% TODO
% only window.contains on point down, not on move/up
% a few simple windows and trackers
\chapter{Test applications}
% TODO
@@ -534,4 +640,60 @@ client application, as stated by the online specification
values back to the actual screen dimension.
\end{quote}
\chapter{Experimental program}
\label{app:experiment}
% TODO: rewrite intro
When designing a software library, its API should be understandable and easy to
use for programmers. To find out the basic requirements for a usable API, an
experimental program has been written based on the Processing code
from \cite{processingMT}. The program receives TUIO events and translates them
to point \emph{down}, \emph{move} and \emph{up} events. These events are then
interpreted as (double or single) \emph{tap}, \emph{rotation} or
\emph{pinch} gestures. A simple drawing program then draws the current state to
the screen using the PyGame library. The output of the program can be seen in
figure \ref{fig:draw}.
\begin{figure}[H]
\center
\includegraphics[scale=0.4]{data/experimental_draw.png}
\caption{Output of the experimental drawing program. It draws the touch
points and their centroid on the screen (the centroid is used as center
point for rotation and pinch detection). It also draws a green
rectangle which responds to rotation and pinch events.}
\label{fig:draw}
\end{figure}
One of the first observations is the fact that TUIO's \texttt{SET} messages use
the TUIO coordinate system, as described in appendix \ref{app:tuio}. The test
program multiplies these by its own dimensions, thus showing the entire
screen in its window. Also, the implementation only works using the TUIO
protocol. Other drivers are not supported.
Though using relatively simple math, the rotation and pinch events work
surprisingly well. Both rotation and pinch use the centroid of all touch
points. A \emph{rotation} gesture uses the difference in angle relative to the
centroid of all touch points, and \emph{pinch} uses the difference in distance.
Both values are normalized using division by the number of touch points. A
pinch event contains a scale factor, and therefore uses a division of the
current by the previous average distance to the centroid.
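As an illustration of the math described above, the sketch below computes
the centroid and the average angle and distance of a list of touch points;
the function names are illustrative and not taken from the experimental
program itself:
\begin{verbatim}
from math import atan2, hypot

def tuio_to_pixels(x, y, width, height):
    # TUIO coordinates are normalized to 0..1 (see appendix on TUIO);
    # the test program scales them by its window dimensions.
    return x * width, y * height

def centroid(points):
    n = float(len(points))
    return (sum(x for x, y in points) / n,
            sum(y for x, y in points) / n)

def average_angle_and_distance(points):
    # Average angle and distance of all points relative to the centroid,
    # normalized by the number of touch points.
    cx, cy = centroid(points)
    n = float(len(points))
    angle = sum(atan2(y - cy, x - cx) for x, y in points) / n
    distance = sum(hypot(x - cx, y - cy) for x, y in points) / n
    return angle, distance

# rotation = current_angle - previous_angle
# scale    = current_distance / previous_distance
\end{verbatim}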
There is a flaw in this implementation. Since the centroid is calculated using
all current touch points, there cannot be two or more rotation or pinch
gestures simultaneously. On a large multi-touch table, it is desirable to
support interaction with multiple hands, or multiple persons, at the same time.
This kind of application-specific requirement should be defined in the
application itself, whereas the experimental implementation defines detection
algorithms based on its test program.
Also, the different detection algorithms are all implemented in the same file,
making it complex to read or debug, and difficult to extend.
\chapter{Reference implementation in Python}
\label{app:implementation}
% TODO
% only window.contains on point down, not on move/up
% a few simple windows and trackers
\end{document}