Commit 7b535751 authored by Taddeüs Kroes

Addressed some feedback comments on report.

parent 86f81f02
@@ -4,3 +4,4 @@ Code:
Report/appendix reference gesture detection:
- Point_leave (+point_enter) is not possible -> a flaw of the system/driver?
- "gesture detection component" -> "gesture tracker"
@@ -43,49 +43,38 @@
]
\newcommand{\architecture}[1]{
\begin{tikzpicture}[node distance=6em, auto]
\node[block] (driver) {Device driver};
#1
\end{tikzpicture}
}
\def\fulldiagram{
\begin{figure}[h!]
\center
\architecture{
\node[block, below of=driver] (eventdriver) {Event driver}
edge[linefrom] node[right, near end] {device-specific messages} (driver);
\node[block, below of=eventdriver] (area) {Event areas}
edge[linefrom] node[right] {1} (eventdriver);
\node[block, right of=area, xshift=7em] (tracker) {Gesture trackers}
edge[linefrom, bend right=10] node[below=2pt] {2} (area)
edge[lineto, bend left=10, dotted] (area);
\node[block, below of=area] {Application}
edge[linefrom, dotted] node[right, near start] {3} (area);
\group{eventdriver}{eventdriver}{tracker}{area}{Architecture}
}
\caption{
Components of the architecture design. The \emph{event driver}
translates device-specific messages to low-level ``events''. These
events are delegated to a number of \emph{event areas} (1), which
restrict events to an area on the screen. \emph{Gesture trackers}
translate low-level events to high-level ``gestures'' (2), which
are handled by the application (3). Dotted arrows represent a flow
of gestures, solid arrows represent a flow of events (unless labeled
otherwise).
}
\label{fig:fulldiagram}
\end{figure}
}
@@ -93,81 +82,37 @@
\begin{figure}[H]
\center
\begin{tikzpicture}[node distance=6em]
\node[block] (driver) {Device driver};
\node[block, below of=driver] (eventdriver) {Event driver}
edge[linefrom] (driver);
\node[block, right of=driver, xshift=2em] (seconddriver) {Device driver};
\node[block, below of=seconddriver] (secondeventdriver) {Event driver}
edge[linefrom] node[right, near end] {device-specific messages} (seconddriver);
\node[block, below of=eventdriver] (areas) {Event areas}
edge[linefrom] (eventdriver)
edge[linefrom] node[right=5pt] {events} (secondeventdriver);
\node[block, right of=areas, xshift=7em] (tracker) {Gesture trackers}
edge[linefrom, bend right=10] (areas)
edge[lineto, bend left=10, dotted] (areas);
\node[block, below of=areas] {Application}
edge[linefrom, dotted] node[right, near start] {gestures} (areas);
\node[right of=seconddriver, xshift=2em] (dummy) {};
\group{eventdriver}{eventdriver}{dummy}{areas}{Architecture}
\end{tikzpicture}
\caption{Multiple event drivers running simultaneously.}
\label{fig:multipledrivers}
\end{figure}
}
\def\examplediagram{
\begin{figure}[h!]
\center
\architecture{
\node[block, below of=driver] (eventdriver) {Event driver}
edge[linefrom] node[right, near end] {device-specific messages} (driver);
\node[block, below of=eventdriver] (rootarea) {Root area}
edge[linefrom] (eventdriver);
@@ -352,7 +297,7 @@
\center
\architecture{
\node[block, below of=driver] (eventdriver) {Event driver}
edge[linefrom] node[right, near end] {device-specific messages} (driver);
\node[block, below of=eventdriver] (rootarea) {Screen area}
edge[linefrom] (eventdriver);
@@ -230,3 +230,19 @@
year = 2010
}
@article{PIP,
author = "Sutherland, Ivan E. and Sproull, Robert F. and Schumacker, Robert A.",
title = "A Characterization of Ten Hidden-Surface Algorithms",
journal = "ACM Comput. Surv.",
volume = 6,
number = 1,
pages = "13-16",
url = "http://doi.acm.org/10.1145/356625.356626",
year = 1974
}
@@ -194,19 +194,12 @@ detection for every new gesture-based application.
at the same time.
This chapter describes a design for such an architecture. The architecture
is represented as a diagram of relations between different components. These
components are shown in figure \ref{fig:fulldiagram}. Sections
\ref{sec:multipledrivers} to \ref{sec:daemon} explain the use of all
components in detail.

The input of the architecture comes from a multi-touch device driver.
The task of the architecture is to translate this input to multi-touch
gestures that are used by an application.

\fulldiagram
\newpage
\section{Supporting multiple drivers}
\label{sec:multipledrivers}
@@ -216,10 +209,10 @@ detection for every new gesture-based application.
low-level touch events (see appendix \ref{app:tuio} for more details).
These messages are specific to the API of the TUIO protocol. Other drivers
may use different message types. To support more than one driver in the
architecture, there must be some translation from device-specific messages
to a common format for primitive touch events. After all, the gesture
detection logic in a ``generic'' architecture should not be implemented
based on device-specific messages. The event types in this format should be
chosen so that multiple drivers can trigger the same events. If each
supported driver were to add its own set of event types to the common
format, the purpose of it being ``common'' would be defeated.
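As an illustration, such a common format can be as small as an event type
tag plus screen coordinates and a touch point identifier. The following
Python sketch uses hypothetical names; the report does not fix a concrete
format:

\begin{verbatim}
from dataclasses import dataclass
from enum import Enum

class EventType(Enum):
    """Common event types that any supported driver can trigger."""
    POINT_DOWN = 1   # a new touch point appears on the surface
    POINT_MOVE = 2   # an existing touch point moves
    POINT_UP = 3     # a touch point is released

@dataclass
class Event:
    """Device-independent representation of a primitive touch event."""
    type: EventType
    x: float         # screen coordinates
    y: float
    point_id: int    # identifies the touch point across events
\end{verbatim}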
@@ -237,14 +230,11 @@ detection for every new gesture-based application.
TUIO protocol. Another driver that can distinguish rotated objects from
simple touch points could also trigger them.

The component that translates device-specific messages to common events
will be called the \emph{event driver}. The event driver runs in a loop,
receiving and analyzing driver messages. When a sequence of messages is
analyzed as an event, the event driver delegates the event to other
components in the architecture for translation to gestures.
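The receive-and-translate loop can be sketched as follows; the
\texttt{receive\_message} and \texttt{translate} methods are hypothetical
placeholders for the driver-specific parts of an implementation:

\begin{verbatim}
class EventDriver:
    """Translates device-specific messages to common events (sketch)."""

    def __init__(self, delegate):
        # 'delegate' is called with each translated event, e.g. the
        # root of the event area tree
        self.delegate = delegate

    def receive_message(self):
        """Blocking read of one device-specific message."""
        raise NotImplementedError  # driver-specific

    def translate(self, message):
        """Map a device-specific message to a list of common events."""
        raise NotImplementedError  # driver-specific

    def run(self):
        while True:
            for event in self.translate(self.receive_message()):
                self.delegate(event)
\end{verbatim}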
Support for a touch driver can be added by writing a new event driver
implementation. The choice of event driver implementation that is used in an
@@ -277,13 +267,13 @@ detection for every new gesture-based application.
the architecture should offer a solution to this problem, or leave the task
of assigning gestures to application widgets to the application developer.
If the architecture does not provide a solution, the ``gesture detection''
component in figure \ref{fig:fulldiagram} receives all events that occur on
the screen surface. The gesture detection logic thus uses all events as
input to detect a gesture. This leaves no possibility for a gesture to
occur at multiple screen positions at the same time. The problem is
illustrated in figure \ref{fig:ex1}, where two widgets on the screen can be
rotated independently. The rotation detection component that detects
rotation gestures receives all four fingers as input. If the two groups of
finger events are not separated by cluster detection, only one rotation
event will occur.
@@ -304,7 +294,7 @@ detection for every new gesture-based application.
covered by a widget, before passing them on to a gesture detection
component. Different gesture detection components can then detect gestures
simultaneously, based on different sets of input events. An area of the
screen surface is represented by an \emph{event area}. An event area
filters input events based on their location, and then delegates events to
gesture detection components that are assigned to the event area. Events
which are located outside the event area are not delegated to its gesture
@@ -312,7 +302,11 @@ detection for every new gesture-based application.
In the example of figure \ref{fig:ex1}, the two rotatable widgets can be
represented by two event areas, each having a different rotation detection
component. Each event area can consist of four corner locations of the
square it represents. To detect whether an event is located inside a
square, the event areas use a point-in-polygon (PIP) test \cite{PIP}. It is
the task of the client application to update the corner locations of the
event area with those of the widget.
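A common choice of PIP algorithm is the ray casting test: count how many
polygon edges a horizontal ray from the point crosses; an odd count means
the point lies inside. A Python sketch, under the assumption that an event
area stores its corner locations as a list of $(x, y)$ tuples:

\begin{verbatim}
def point_in_polygon(x, y, corners):
    """Ray casting PIP test for a polygon given by corner locations."""
    inside = False
    j = len(corners) - 1
    for i in range(len(corners)):
        xi, yi = corners[i]
        xj, yj = corners[j]
        # does edge (j, i) cross the horizontal ray at height y?
        if (yi > y) != (yj > y):
            # x coordinate at which the edge intersects the ray
            x_cross = xi + (y - yi) * (xj - xi) / (yj - yi)
            if x < x_cross:
                inside = not inside
        j = i
    return inside

# a square widget, represented by its four corner locations
square = [(0, 0), (100, 0), (100, 100), (0, 100)]
assert point_in_polygon(50, 50, square)
assert not point_in_polygon(150, 50, square)
\end{verbatim}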
\subsection{Callback mechanism}
@@ -324,10 +318,6 @@ detection for every new gesture-based application.
callback mechanism to handle gestures in an application. Callback handlers
are bound to event areas, since event areas control the grouping of
events and thus the occurrence of gestures in an area of the screen.
\subsection{Area tree}
\label{sec:tree}
@@ -337,12 +327,11 @@ detection for every new gesture-based application.
event area that contains the event coordinates.
If the architecture were to be used in combination with an application
framework, each widget that responds to gestures should have a mirroring
event area that synchronizes its location with that of the widget. Consider
a panel with five buttons that all listen to a ``tap'' event. If the
location of the panel changes as a result of movement of the application
window, the positions of all buttons have to be updated too.
This process is simplified by the arrangement of event areas in a tree
structure. A root event area represents the panel, containing five other
@@ -354,7 +343,10 @@ detection for every new gesture-based application.
If the GUI toolkit provides an API for requesting the position and size of
a widget, a recommended first step when developing an application is to
create a subclass of the area that automatically synchronizes with the
position of a widget from the GUI framework. For example, the test
application described in section \ref{sec:testapp} extends the GTK
\cite{GTK} application window widget with the functionality of a
rectangular event area, to direct touch events to an application window.
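As an illustration, such a subclass can connect to the widget's
\texttt{size-allocate} signal (which GTK emits whenever a widget is moved
or resized) and mirror the allocated rectangle. The event area API below
is a hypothetical sketch, not the report's actual implementation:

\begin{verbatim}
class RectangularEventArea:
    """Event area that follows the position and size of a GTK widget."""

    def __init__(self, widget):
        self.corners = []
        # re-synchronize whenever the widget is moved or resized
        widget.connect("size-allocate", self.on_size_allocate)

    def on_size_allocate(self, widget, alloc):
        # note: the allocation is relative to the widget's parent; a
        # real implementation must translate it to the coordinate
        # space in which touch events are expressed
        x, y, w, h = alloc.x, alloc.y, alloc.width, alloc.height
        self.corners = [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]

    def contains(self, event):
        return point_in_polygon(event.x, event.y, self.corners)
\end{verbatim}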
\subsection{Event propagation}
\label{sec:eventpropagation}
@@ -394,13 +386,12 @@ detection for every new gesture-based application.
An additional type of event propagation is ``immediate propagation'', which
indicates propagation of an event from one gesture detection component to
another. This is applicable when an event area uses more than one gesture
detection component. When regular propagation is stopped, the event is
propagated to other gesture detection components first, before actually
being stopped. One of the components can also stop the immediate
propagation of an event, so that the event is not passed to the next
gesture detection component, nor to the ancestors of the event area.
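The interplay of the two propagation levels can be summarized in a short
dispatch sketch (the attribute names are hypothetical):

\begin{verbatim}
def dispatch(area, event):
    """Deliver an event to an area's gesture detection components and,
    via regular propagation, to the ancestors of the area."""
    while area is not None:
        for component in area.components:
            component.handle(event)
            if event.immediate_propagation_stopped:
                return  # skip remaining components and all ancestors
        if event.propagation_stopped:
            return  # all components of this area have seen the event
        area = area.parent  # regular propagation up the area tree
\end{verbatim}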
\newpage
\eventpropagationfigure
The concept of an event area is based on the assumption that the set of
@@ -467,13 +458,9 @@ detection for every new gesture-based application.
detection component defines a simple function that compares event
coordinates.

When a gesture tracker detects a gesture, this gesture is triggered in the
corresponding event area. The event area then calls the callbacks that are
bound to the gesture type by the application.
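For example, a minimal tap detector, reusing the hypothetical event format
sketched earlier in this chapter, could be written as follows:

\begin{verbatim}
import time

class TapTracker:
    """Gesture tracker sketch: a "tap" is a touch point that is
    released close to where it appeared, within a short time window."""
    MAX_DISTANCE = 10.0  # pixels
    MAX_DURATION = 0.3   # seconds

    def __init__(self, area):
        self.area = area  # the event area this tracker is assigned to
        self.down = {}    # point_id -> (x, y, press time)

    def handle(self, event):
        if event.type is EventType.POINT_DOWN:
            self.down[event.point_id] = (event.x, event.y, time.time())
        elif event.type is EventType.POINT_UP and \
                event.point_id in self.down:
            x0, y0, t0 = self.down.pop(event.point_id)
            moved = max(abs(event.x - x0), abs(event.y - y0))
            if moved <= self.MAX_DISTANCE and \
                    time.time() - t0 <= self.MAX_DURATION:
                # trigger the gesture in the corresponding event area,
                # which calls the handlers bound by the application
                self.area.trigger("tap", event.x, event.y)
\end{verbatim}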
The use of gesture trackers as small detection units makes the architecture
extensible. A developer can write a custom gesture tracker and
@@ -643,6 +630,7 @@ the entire touch surface. The output of the application can be seen in figure
\end{figure}
\section{GTK+/Cairo application}
\label{sec:testapp}
The second test application uses the GIMP toolkit (GTK+) \cite{GTK} to create
its user interface. Since GTK+ defines a main event loop that is started in