Commit 346cf020 authored by Taddeüs Kroes

Added report improvements in comments, to be worked out later.

parent d509d424
@@ -248,19 +248,75 @@ goal is to test the effectiveness of the design and detect its shortcomings.
\section{Restricting gestures to a screen area}
% TODO: in introduction: gestures are composed of multiple primitives
Touch input devices are unaware of the graphical input widgets rendered on
screen and therefore generate events that simply identify the screen
location at which an event takes place. In order to be able to direct a
gesture to a particular widget on screen, an application programmer must
restrict the occurrence of a gesture to the area of the screen covered by
that widget. An important question is whether the architecture should offer
a solution to this problem, or leave it to the programmer to assign
gestures to a widget.

% TODO: first: leave this to the developer, referring to the previous
% diagram. Then: consider the following example: two squares that both
% listen for rotation (add a figure as illustration). If they are rotated
% simultaneously, only a single global event occurs. So: do not restrict
% gestures to an area, but events. Then a separate piece of detection logic
% can be attached to each square, taking the events at that location as
% input. In other words: this cannot be left to the developer, because the
% input of the detection logic has to change (which the developer has no
% influence on). Conclusion: it must be possible to restrict events to an
% ``area'' of the screen. At this point the diagram can already be
% extended.
% Then: the simplest approach is a list of areas; if an event falls inside
% one, it is delegated to it. Problem (illustrate with an example of nested
% widgets that both listen for a tap): when areas overlap, certain events
% should be reserved for certain pieces of detection logic.
% Solution: store the areas in a tree structure and use event propagation
% -> an area inside a parent area can propagate events to that parent, and
% detection logic can stop the propagation. To propagate up the tree, an
% event must first arrive at a leaf, so it is first delegated down to the
% lowest leaf node that contains it.
% Special case: overlapping areas in the same layer of the tree. In that
% case, the area that was added last (the right sibling) is assumed to lie
% on top of the sibling to its left and therefore receives the event first.
% If propagation is stopped in the topmost (right) area, the (left) sibling
% behind it does not receive the event either.
% An additional advantage of the tree structure is that it is easy to
% integrate with e.g. GTK, which also uses a tree structure for its widgets
% -> create an area for every widget that receives touch events.
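The delegation and propagation scheme sketched in the notes above can be
summarised in a short Python sketch. The class and method names
(\texttt{Area}, \texttt{delegate}, \texttt{propagate}) are illustrative
assumptions and do not necessarily correspond to the actual implementation.
\begin{verbatim}
class Area(object):
    """Illustrative area node in a tree of screen areas."""

    def __init__(self, parent=None):
        self.parent = parent
        self.children = []
        self.analyses = []  # pieces of detection logic bound to this area

        if parent is not None:
            parent.children.append(self)

    def contains(self, event):
        """Test whether the event coordinates fall inside this area."""
        raise NotImplementedError

    def delegate(self, event):
        """Delegate an event to the lowest leaf node that contains it.

        Later-added siblings are assumed to lie on top of earlier ones,
        so they are checked first (reversed order).
        """
        for child in reversed(self.children):
            if child.contains(event):
                return child.delegate(event)

        # No child contains the event: this area handles it and then
        # propagates it towards the root of the tree.
        self.propagate(event)

    def propagate(self, event):
        """Run this area's detection logic, then move up the tree unless
        one of the analyses stops the propagation."""
        stop = False

        for analysis in self.analyses:
            stop = analysis.handle(event) or stop

        if not stop and self.parent is not None:
            self.parent.propagate(event)
\end{verbatim}
The special case of overlapping siblings, where the rear sibling may still
receive the event as long as propagation is not stopped, is omitted from
this sketch for brevity.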
Gestures are composed of primitive events using detection logic. If a
particular gesture should only occur within some area of the screen, it
should be composed only of events that occur within that area. Events that
occur outside the area are not likely to be relevant to the gesture. In
other words, the gesture detection logic is affected by the area in which
the gestures should be detected. Since the detection logic is part of the
architecture, the architecture must be able to restrict the set of events
that is delegated to the particular piece of detection logic for the
gesture being detected in the area.
For example, a button tap\footnote{A ``tap'' gesture is triggered when a
touch object releases a touch surface within a certain time and distance
from the point where it initially touched the surface.} should only occur
on the button itself, and not in any other area of the screen. A solution
to this problem is the use of \emph{widgets}. The button from the example
can be represented as a rectangular widget with a position and size. The
position and size are compared with event coordinates to determine whether
an event falls within the button.
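Building on the \texttt{Area} sketch above, such a rectangular widget area
could implement the containment test as a simple bounding box check. Again,
the names used here (including the \texttt{event.x} and \texttt{event.y}
attributes) are assumptions rather than the actual API.
\begin{verbatim}
class RectangularArea(Area):
    """Illustrative rectangular screen area defined by position and size."""

    def __init__(self, x, y, width, height, parent=None):
        super(RectangularArea, self).__init__(parent)
        self.x, self.y = x, y
        self.width, self.height = width, height

    def contains(self, event):
        # An event is delegated to this area only if its coordinates fall
        # inside the rectangle covered by the widget.
        return (self.x <= event.x <= self.x + self.width and
                self.y <= event.y <= self.y + self.height)
\end{verbatim}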
\subsection*{Callbacks}
\label{sec:callbacks}
When an event is propagated by a widget, it is first used for event
analysis on that widget. The event analysis can then trigger a gesture
in the widget, which has to be handled by the application. To handle a
gesture, the widget should provide a callback mechanism: the
application binds a handler for a specific type of gesture to a widget.
When a gesture of that type is triggered after event analysis, the
widget triggers the callback.
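A minimal sketch of such a callback mechanism is given below; the method
names \texttt{bind} and \texttt{trigger} and the \texttt{gesture.type}
attribute are chosen for illustration and are not necessarily those of the
actual implementation.
\begin{verbatim}
class Widget(object):
    """Illustrative widget that maps gesture types to application handlers."""

    def __init__(self):
        self.handlers = {}

    def bind(self, gesture_type, handler):
        # The application registers a handler for a gesture type, e.g. 'tap'.
        self.handlers.setdefault(gesture_type, []).append(handler)

    def trigger(self, gesture):
        # Called by the event analysis when it detects a gesture on this
        # widget; every handler bound to that gesture type is invoked.
        for handler in self.handlers.get(gesture.type, []):
            handler(gesture)
\end{verbatim}
An application could then bind a handler with something like
\texttt{button.bind('tap', on\_tap)}.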
\subsection*{Widget tree}
@@ -283,7 +339,7 @@ goal is to test the effectiveness of the design and detect its shortcomings.
is called \emph{propagation}. To be able to reserve an event for some
widget or analysis, the propagation of an event can be stopped during
analysis.
% TODO: inspired by JavaScript DOM
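In terms of the earlier \texttt{Area} sketch, an analysis reserves an event
simply by signalling that propagation should stop. The return value
convention used here (\texttt{True} meaning ``stop'') is an assumption of
this example.
\begin{verbatim}
class ReservingAnalysis(object):
    """Illustrative analysis that reserves events for its own widget."""

    def handle(self, event):
        # ... gesture detection logic for this widget goes here ...

        # Returning True stops the propagation, so the detection logic of
        # enclosing widgets never receives this event.
        return True
\end{verbatim}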
Many GUI frameworks, like GTK \cite{GTK}, also use a tree structure to
manage their widgets. This makes it easy to connect the architecture to
@@ -291,17 +347,6 @@ goal is to test the effectiveness of the design and detect its shortcomings.
\texttt{GtkTouchWidget} that synchronises the position of a touch
widget with that of a GTK widget, using GTK signals.
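For instance, such a glue object might listen to GTK's
\texttt{size-allocate} signal to keep the touch area aligned with the
widget's geometry. The sketch below uses PyGTK and a hypothetical
\texttt{set\_geometry} method on the touch area; it is not the actual
\texttt{GtkTouchWidget} implementation.
\begin{verbatim}
import gtk

class GtkTouchWidget(object):
    """Illustrative glue object: keeps a touch area aligned with a GTK widget."""

    def __init__(self, gtk_widget, touch_area):
        self.area = touch_area
        # 'size-allocate' is emitted whenever the widget is moved or resized.
        gtk_widget.connect('size-allocate', self.sync_position)

    def sync_position(self, widget, allocation):
        # Copy the widget's rectangle into the touch area (hypothetical API).
        self.area.set_geometry(allocation.x, allocation.y,
                               allocation.width, allocation.height)
\end{verbatim}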
\subsection*{Position of widget tree in architecture}
\widgetdiagram{Extension of the diagram from figure
@@ -311,6 +356,12 @@ goal is to test the effectiveness of the design and detect its shortcomings.
\section{Event analysis}
\label{sec:event-analysis}
% TODO: the essence should be that gesture trackers split the detection
% logic into manageable pieces and are assigned to a single area, so that
% multiple trackers can run simultaneously on different parts of the
% screen. A formal definition of gestures would perhaps be better, but is
% not given in this thesis (it is discussed in future work).
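A minimal sketch of such a gesture tracker is shown below, using tap
detection as an example. The class name \texttt{TapTracker}, the timing
threshold, the event attributes and the \texttt{trigger\_gesture} call are
illustrative assumptions, not the actual implementation.
\begin{verbatim}
import time

class TapTracker(object):
    """Illustrative gesture tracker: detects taps within a single area."""

    MAX_TAP_TIME = 0.2  # seconds (assumed threshold)

    def __init__(self, area):
        self.area = area      # the tracker only sees this area's events
        self.down_times = {}  # touch id -> time of the 'down' event

    def handle(self, event):
        if event.type == 'down':
            self.down_times[event.touch_id] = time.time()
        elif event.type == 'up':
            started = self.down_times.pop(event.touch_id, None)
            if started is not None and time.time() - started < self.MAX_TAP_TIME:
                # Tap detected: notify the widgets bound to this area.
                self.area.trigger_gesture('tap', event)
        return False  # do not stop propagation in this sketch
\end{verbatim}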
The events that are delegated to widgets must be analyzed in some way to
detect gestures. This analysis is specific to the type of gesture being
detected. E.g. the detection of a ``tap'' gesture is very different from
detection of
@@ -353,6 +404,7 @@ goal is to test the effectiveness of the design and detect its shortcomings.
The button is located inside an application window, which can be resized
using pinch gestures.
% TODO: remove the comments, write this out in pseudocode
\begin{verbatim}
initialize GUI, creating a window
@@ -404,10 +456,11 @@ for details).
\chapter{Suggestions for future work}
% TODO
% - network protocol (ZeroMQ) for multiple languages and simultaneous
%   processes
% - use a more formal definition of gestures instead of explicit detection
%   logic, e.g. a state machine
% - next step: create a library that contains multiple drivers and complex
%   gestures
\bibliographystyle{plain}
\bibliography{report}{}
@@ -450,7 +503,8 @@ complex objects such as fiducials, arguments like rotational position and
acceleration are also included.
ALIVE and SET messages can be combined to create ``point down'', ``point move''
and ``point up'' events (as used by the Windows 7 implementation
\cite{win7touch}).
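One possible way to derive these events from TUIO messages is sketched
below: the set of session identifiers in each ALIVE message is compared
against the previously known set, while SET messages yield move events.
The function and variable names are illustrative only.
\begin{verbatim}
def tuio_to_point_events(alive_ids, known_ids):
    """Illustrative translation of a TUIO ALIVE message into point events.

    alive_ids: session ids listed in the current ALIVE message
    known_ids: session ids that were alive after the previous message
    Returns a list of (event_type, session_id) tuples.
    """
    events = []

    for sid in alive_ids - known_ids:
        events.append(('point down', sid))  # new id: touch object appeared

    for sid in known_ids - alive_ids:
        events.append(('point up', sid))    # id disappeared: object released

    return events

# SET messages update the position of an already known session id and are
# translated into 'point move' events for that id.
\end{verbatim}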
TUIO coordinates range from $0.0$ to $1.0$, with $(0.0, 0.0)$ being the left
top corner of the screen and $(1.0, 1.0)$ the right bottom corner. To focus