Commit 2967cab6 authored by Taddeus Kroes

Worked on report.

parent 7d79785b
@@ -95,6 +95,13 @@
year = "2012"
}
@misc{GTK,
author = "Mattis, Peter and team, the GTK+",
howpublished = "\url{http://www.mathematik.uni-ulm.de/help/gtk+-1.1.3/gtk.html}",
title = "{GIMP Toolkit}",
year = "1998"
}
@electronic{qt,
added-at = "2012-04-05T10:52:23.000+0200",
author = "{Nokia Corp.}",
@@ -107,4 +114,3 @@
x-fetchedfrom = "Bibsonomy",
year = 2012
}
@@ -2,7 +2,7 @@
\usepackage[english]{babel}
\usepackage[utf8]{inputenc}
\usepackage{hyperref,graphicx,float,tikz}
% Link colors
\hypersetup{colorlinks=true,linkcolor=black,urlcolor=blue,citecolor=DarkGreen}
@@ -32,13 +32,13 @@
% TODO: put Qt link in bibtex
Multi-touch devices enable a user to interact with software using intuitive
hand gestures, rather than with interaction tools like mouse and keyboard. With
the increasing use of touch screens in phones and tablets, multi-touch
interaction is becoming increasingly common. The driver of a touch device
provides low-level events. The most basic representation of these low-level
events consists of \emph{down}, \emph{move} and \emph{up} events.
More complex gestures must be designed in such a way that they can be
represented by a sequence of basic events. For example, a ``tap'' gesture can
be represented as a \emph{down} event that is followed by an \emph{up} event
within a certain time.
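
To illustrate how such a sequence of basic events might be interpreted, the
following sketch detects a tap in a list of events. The \texttt{Event} type and
the timeout value are purely illustrative assumptions and are not taken from
any implementation discussed in this thesis.

\begin{verbatim}
from collections import namedtuple

# Illustrative event type (an assumption for this sketch): a basic
# event with a kind ("down", "move" or "up") and a timestamp in seconds.
Event = namedtuple('Event', ['kind', 'time'])

TAP_TIMEOUT = 0.3  # assumed maximum duration of a tap, in seconds

def detect_tap(events):
    """Return True if the sequence contains a down/up pair within
    TAP_TIMEOUT seconds (distance checks omitted for brevity)."""
    down_time = None
    for event in events:
        if event.kind == 'down':
            down_time = event.time
        elif event.kind == 'up' and down_time is not None:
            if event.time - down_time <= TAP_TIMEOUT:
                return True
            down_time = None
    return False
\end{verbatim}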
@@ -71,26 +71,16 @@ To design such an architecture properly, the following questions are relevant:
% TODO: are the questions below still relevant? better to rephrase as
% "Design"-related questions?
\item How can the architecture be used by different programming languages?
A generic architecture should not be limited to one language.
\item How can the architecture serve multiple applications at the same
time?
\end{itemize}
% Scope
The scope of this thesis includes the design of a generic multi-touch detection
architecture, a reference implementation of this design, and the integration of
the reference implementation in a test case application.
\section{Structure of this document}
@@ -134,7 +124,9 @@ for details).
An important observation in this application is that different gestures are
detected by different gesture trackers, thus separating gesture detection
code into maintainable parts. The architecture adopts this design by also
using separate gesture trackers for different gesture types.
\section{Processing implementation of simple gestures in Android}
@@ -142,14 +134,13 @@ for details).
gestures (tap, double tap, rotation, pinch and drag) using
Processing\footnote{Processing is a Java-based development environment with
an export possibility for Android. See also \url{http://processing.org/}.}
can be found in a forum on the Processing website \cite{processingMT}. The
implementation is fairly simple, but it yields some very appealing results.
The detection logic of all gestures is combined in a single class. This
does not allow for extensibility, because the complexity of this class
would increase to an undesirable level (as predicted by the GART article
\cite{GART}). However, the detection logic itself is partially re-used in
the reference implementation of the generic gesture detection architecture.
\section{Analysis of related work}
@@ -178,14 +169,13 @@ for details).
architecture as a diagram of relations between different components.
Sections \ref{sec:driver-support} to \ref{sec:event-analysis} define
requirements for the architecture, and extend the diagram with components
that meet these requirements. Section \ref{sec:example} describes an
example usage of the architecture in an application.
\subsection*{Position of architecture in software}
The input of the architecture comes from some multi-touch device
driver. The task of the architecture is to translate this input to
multi-touch gestures that are used by an application, as illustrated in
figure \ref{fig:basicdiagram}. In the course of this chapter, the
diagram is extended with the different components of the architecture.
@@ -196,9 +186,9 @@ for details).
\section{Supporting multiple drivers}
\label{sec:driver-support}
The TUIO protocol \cite{TUIO} is an example of a touch driver that can be
used by multi-touch devices. Other drivers exist as well, and these should
also be supported by the architecture. Therefore, there must be some
translation of driver-specific messages to a common format in the
architecture. Messages in this common format will be called \emph{events}.
Events can be translated to multi-touch \emph{gestures}. The most basic set
of events is
@@ -206,9 +196,10 @@ for details).
object with only an (x, y) position on the screen.
A more extended set could also contain more complex events. An object can
also have a rotational property, like the ``fiducials''\footnote{A fiducial
is a pattern used by some touch devices to identify objects.} type
in the TUIO protocol. This results in $\{point\_down, point\_move,\\
point\_up, object\_down, object\_move, object\_up, object\_rotate\}$.
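
As a rough sketch of what this common format could look like in Python (the
language of the reference implementation), the names below mirror the two
event sets described above; the class and attribute names are assumptions,
not a prescribed interface.

\begin{verbatim}
# Sketch of the two event sets described above; names are assumptions.
BASIC_EVENTS = {'point_down', 'point_move', 'point_up'}
EXTENDED_EVENTS = BASIC_EVENTS | {
    'object_down', 'object_move', 'object_up', 'object_rotate',
}

class Event(object):
    """A driver-independent event in the common format."""

    def __init__(self, name, x, y, angle=None):
        assert name in EXTENDED_EVENTS
        self.name = name
        self.x = x          # screen position of the touch object
        self.y = y
        self.angle = angle  # rotational property, only for object events
\end{verbatim}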
The component that translates driver-specific messages to events is called
the \emph{event driver}. The event driver runs in a loop, receiving and
@@ -224,15 +215,19 @@ for details).
\section{Restricting gestures to a screen area}
Touch input devices are unaware of the graphical input widgets rendered on
screen and therefore generate events that simply identify the screen
location at which an event takes place. In order to be able to direct a
gesture to a particular widget on screen, an application programmer should
be able to bind a gesture handler to some element on the screen. For
example, a button tap\footnote{A ``tap'' gesture is triggered when a touch
object releases the screen within a certain time and distance from the
point where it initially touched the screen.} should only occur on the
button itself, and not in any other area of the screen. A solution to this
problem is the use of \emph{widgets}. The button from the example can be
represented as a rectangular widget with a position and size. The position
and size are compared with event coordinates to determine whether an event
should occur within the button.
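
A minimal sketch of such a rectangular widget is given below; the class and
method names are illustrative assumptions rather than the actual interface of
the reference implementation.

\begin{verbatim}
class RectangularWidget(object):
    """Illustrative rectangular screen area with a position and size."""

    def __init__(self, x, y, width, height):
        self.x, self.y = x, y
        self.width, self.height = width, height

    def contains(self, event_x, event_y):
        """Compare event coordinates with the widget's position and
        size to decide whether the event occurs within the widget."""
        return (self.x <= event_x <= self.x + self.width and
                self.y <= event_y <= self.y + self.height)

# Example: a button only receives events that occur inside its area.
button = RectangularWidget(10, 10, 100, 40)
assert button.contains(50, 30)
assert not button.contains(200, 30)
\end{verbatim}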
\subsection*{Widget tree}
@@ -257,7 +252,6 @@ for details).
analysis.
% TODO: inspired by JavaScript DOM
Many GUI frameworks, like GTK \cite{GTK}, also use a tree structure to
manage their widgets. This makes it easy to connect the architecture to
such a framework. For example, the programmer can define a
@@ -285,17 +279,16 @@ for details).
\label{sec:event-analysis}
The events that are delegated to widgets must be analyzed in some way to
form gestures. This analysis is specific to the type of gesture being
detected. For example, the detection of a ``tap'' gesture is very different
from the detection of a ``rotate'' gesture. The implementation described in
\cite{win7touch} separates the detection of different gestures into
different \emph{gesture trackers}. This keeps the different pieces of
detection code manageable and extensible. Therefore, the architecture also
uses gesture trackers to separate the analysis of events. A single gesture
tracker detects a specific set of gesture types, given a sequence of
events. An example of a possible gesture tracker implementation is a
``transformation tracker'' that detects rotation, scaling and translation
gestures.
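
To make the idea of a gesture tracker more concrete, the sketch below
outlines a strongly simplified tracker that only detects the translation
part of a transformation (single-finger dragging). The class layout, the
event attributes and the \texttt{handle\_gesture} callback on the widget are
assumptions made for this example, not the interface of the reference
implementation.

\begin{verbatim}
class TranslationTracker(object):
    """Simplified gesture tracker: turns a sequence of point events
    into 'translate' (drag) gestures. Rotation and scaling, which
    require multiple touch points, are omitted for brevity."""

    def __init__(self, widget):
        self.widget = widget
        self.last_position = None

    def on_event(self, event):
        # 'event' is assumed to expose 'name', 'x' and 'y' attributes.
        if event.name == 'point_down':
            self.last_position = (event.x, event.y)
        elif event.name == 'point_move' and self.last_position:
            dx = event.x - self.last_position[0]
            dy = event.y - self.last_position[1]
            self.last_position = (event.x, event.y)
            # Hypothetical callback that triggers the gesture handlers
            # bound to the widget.
            self.widget.handle_gesture('translate', dx=dx, dy=dy)
        elif event.name == 'point_up':
            self.last_position = None
\end{verbatim}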
\subsection*{Assignment of a gesture tracker to a widget}
@@ -315,6 +308,10 @@ for details).
\ref{fig:widgetdiagram}, showing the position of gesture trackers in
the architecture.}
\section{Serving multiple applications}
% TODO
\section{Example usage}
\label{sec:example}
@@ -360,6 +357,14 @@ for details).
\chapter{Test applications}
To test multi-touch interaction properly, a multi-touch device is required. The
University of Amsterdam (UvA) has provided access to a multi-touch table from
PQlabs. The table uses the TUIO protocol \cite{TUIO} to communicate touch
events. See appendix \ref{app:tuio} for details regarding the TUIO protocol.
The reference implementation is a Proof of Concept that translates TUIO
messages to some simple touch gestures (see appendix \ref{app:implementation}
for details).
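
As an impression of the input that such a test application receives, the
sketch below listens for TUIO 2D cursor messages. Since TUIO is transported
over OSC, it uses the third-party \texttt{python-osc} package; the package
choice and the handler are assumptions for illustration and do not reflect
the actual test programs.

\begin{verbatim}
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def handle_2dcur(address, *args):
    # A TUIO "set" message carries a session id followed by the
    # normalised cursor position (velocity/acceleration are ignored).
    if args and args[0] == 'set':
        session_id, x, y = args[1], args[2], args[3]
        print('cursor %s at (%.2f, %.2f)' % (session_id, x, y))

dispatcher = Dispatcher()
dispatcher.map('/tuio/2Dcur', handle_2dcur)

# TUIO trackers commonly send OSC bundles to UDP port 3333.
server = BlockingOSCUDPServer(('0.0.0.0', 3333), dispatcher)
server.serve_forever()
\end{verbatim}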
% TODO
% test programs with PyGame/Cairo