multitouch / Commits / 3752d526

Commit 3752d526, authored Jun 20, 2012 by Taddeüs Kroes
parent 4fdba920

    Wrote section on second test application.

Showing 3 changed files, with 134 additions and 33 deletions (+134 −33).
docs/data/diagrams.tex    +61  −1
docs/data/testapp.png     +0   −0
docs/report.tex           +73  −32
docs/data/diagrams.tex

@@ -198,7 +198,7 @@
     \caption{Diagram representation of an extended example, showing the
         flow of events and gestures in the architecture. The root area represents
-        an application windows that can be resized using \emph{pinch} gestures.
+        an application window that can be resized using \emph{pinch} gestures.
         The window contains a draggable circle, and a button that listens to
         \emph{tap} gestures. Dotted arrows represent a flow of gestures, regular
         arrows represent events (unless labeled otherwise).}
@@ -346,3 +346,63 @@
     \label{fig:daemon}
 \end{figure}
 }
+
+\def\testappdiagram{
+    \begin{figure}[h!]
+        \center
+        \architecture{
+            \node[block, below of=driver] (eventdriver) {Event driver}
+                edge[linefrom] node[right, near end] {driver-specific messages} (driver);
+            \node[block, below of=eventdriver] (rootarea) {Screen area}
+                edge[linefrom] (eventdriver);
+            \node[block, below of=rootarea, xshift=-5em] (appwindow) {Application window area}
+                edge[lineto, <->] (rootarea);
+            \node[block, left of=appwindow, xshift=-4em, text width=7em] {Transformation tracker}
+                edge[lineto, dotted, bend right=10] (appwindow)
+                edge[linefrom, bend left=10] (appwindow);
+            \node[block, below of=rootarea, xshift=5em] (overlay) {Overlay area}
+                edge[lineto, <->] (rootarea);
+            \node[block, right of=overlay, xshift=4em] (tracker) {Hand tracker}
+                edge[lineto, dotted, bend left=10] (overlay)
+                edge[linefrom, bend right=10] (overlay);
+            \node[block, below of=appwindow, xshift=-5em] (rectangle) {Rectangle area}
+                edge[lineto, <->] (appwindow);
+            \node[block, left of=rectangle, xshift=-4em, yshift=2em, text width=7em] (recttracker)
+                {Transformation tracker}
+                edge[lineto, dotted, bend left=10] (rectangle)
+                edge[linefrom, bend right=10] (rectangle);
+            \node[block, left of=rectangle, xshift=-4em, yshift=-2em, text width=7em] {Tap tracker}
+                edge[lineto, dotted, bend right=10] (rectangle)
+                edge[linefrom, bend left=10] (rectangle);
+            \node[block, below of=appwindow, xshift=5em] (triangle) {Triangle area}
+                edge[lineto, <->] (appwindow);
+            \node[block, right of=triangle, xshift=4em, yshift=2em, text width=7em] {Transformation tracker}
+                edge[lineto, dotted, bend right=10] (triangle)
+                edge[linefrom, bend left=10] (triangle);
+            \node[block, right of=triangle, xshift=4em, yshift=-2em, text width=7em] (taptracker) {Tap tracker}
+                edge[lineto, dotted, bend left=10] (triangle)
+                edge[linefrom, bend right=10] (triangle);
+            \node[block, below of=rootarea, yshift=-12em] {Application}
+                edge[linefrom, dotted, bend left=25] (appwindow)
+                edge[linefrom, dotted] (rectangle)
+                edge[linefrom, dotted] (triangle)
+                edge[linefrom, dotted, bend right=25] (overlay);
+            \group{recttracker}{eventdriver}{tracker}{taptracker}{Architecture}
+        }
+        \caption{Diagram representation of the second test application. A full
+            screen event area contains an application window and a full screen
+            overlay. The application window contains a rectangle and a triangle.
+            The application window and its children can be transformed, and thus
+            each have a ``transformation tracker''. The rectangle and triangle
+            also have a ``tap tracker'' that detects double tap gestures. Dotted
+            arrows represent a flow of gestures, regular arrows represent events
+            (unless labeled otherwise).}
+        \label{fig:testappdiagram}
+    \end{figure}
+}
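The event-area hierarchy in the diagram above maps naturally onto code. The following is a minimal Python sketch of that tree, the screen area containing an application window and an overlay, with trackers attached per area; the class and method names here are illustrative only, not the reference implementation's actual API:

```python
# Hypothetical sketch of the event area tree from the diagram above.
# Class and method names are illustrative, not the project's real API.

class EventArea:
    def __init__(self, name):
        self.name = name
        self.children = []   # child event areas, in delegation order
        self.trackers = []   # gesture trackers attached to this area

    def add_child(self, child):
        self.children.append(child)
        return child

    def add_tracker(self, tracker):
        self.trackers.append(tracker)

# Root: the full screen area, fed by the event driver.
screen = EventArea('screen')

# The application window and the overlay are siblings; the overlay is
# added last so it receives delegated events first.
window = screen.add_child(EventArea('application window'))
overlay = screen.add_child(EventArea('overlay'))

rectangle = window.add_child(EventArea('rectangle'))
triangle = window.add_child(EventArea('triangle'))

# Each transformable area gets its own transformation tracker; the
# rectangle and triangle also detect (double) taps, and the overlay
# hosts the custom hand tracker.
window.add_tracker('transformation')
overlay.add_tracker('hand')
for shape in (rectangle, triangle):
    shape.add_tracker('transformation')
    shape.add_tracker('tap')
```

The ordering of siblings matters here for the same reason it does in the diagram: the overlay must see events before a transformation tracker can stop their propagation.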
docs/data/testapp.png  (new file, mode 100644, 27.7 KB)
docs/report.tex

@@ -271,7 +271,7 @@ detection for every new gesture-based application.
 therefore generate events that simply identify the screen location at which
 an event takes place. User interfaces of applications that do not run in
 full screen mode are contained in a window. Events which occur outside the
-application window should not be handled by the program in most cases.
+application window should not be handled by the application in most cases.
 What's more, widgets within the application window itself should be able to
 respond to different gestures. E.g. a button widget may respond to a
 ``tap'' gesture to be activated, whereas the application window responds to
@@ -561,8 +561,9 @@ start the GUI main loop in the current thread
...
@@ -561,8 +561,9 @@ start the GUI main loop in the current thread
A reference implementation of the design has been written in Python. Two test
A reference implementation of the design has been written in Python. Two test
applications have been created to test if the design ``works'' in a practical
applications have been created to test if the design ``works'' in a practical
application, and to detect its flaws. One application is mainly used to test
application, and to detect its flaws. One application is mainly used to test
the gesture tracker implementations. The other program uses multiple event
the gesture tracker implementations. The other application uses multiple event
areas in a tree structure, demonstrating event delegation and propagation.
areas in a tree structure, demonstrating event delegation and propagation. Teh
second application also defines a custom gesture tracker.
To test multi-touch interaction properly, a multi-touch device is required. The
To test multi-touch interaction properly, a multi-touch device is required. The
University of Amsterdam (UvA) has provided access to a multi-touch table from
University of Amsterdam (UvA) has provided access to a multi-touch table from
@@ -585,14 +586,6 @@ The reference implementation is written in Python and available at
     $(x, y)$ position.
 \end{itemize}
-\textbf{Gesture trackers}
-\begin{itemize}
-    \item Basic tracker, supports $point\_down,~point\_move,~point\_up$ gestures.
-    \item Tap tracker, supports $tap,~single\_tap,~double\_tap$ gestures.
-    \item Transformation tracker, supports $rotate,~pinch,~drag$ gestures.
-    \item Hand tracker, supports $hand\_down,~hand\_up$ gestures.
-\end{itemize}
 \textbf{Event areas}
 \begin{itemize}
     \item Circular area
@@ -601,6 +594,13 @@ The reference implementation is written in Python and available at
     \item Full screen area
 \end{itemize}
+\textbf{Gesture trackers}
+\begin{itemize}
+    \item Basic tracker, supports $point\_down,~point\_move,~point\_up$ gestures.
+    \item Tap tracker, supports $tap,~single\_tap,~double\_tap$ gestures.
+    \item Transformation tracker, supports $rotate,~pinch,~drag,~flick$ gestures.
+\end{itemize}
 The implementation does not include a network protocol to support the daemon
 setup as described in section \ref{sec:daemon}. Therefore, it is only usable in
 Python programs. The two test programs are also written in Python.
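To illustrate what a tracker in this list does, here is a hedged Python sketch of tap detection: it derives $tap$ and $double\_tap$ gestures from point down/up timestamps. The thresholds, names, and the omission of $single\_tap$ (which needs a delayed decision) are simplifications for this sketch, not the reference implementation:

```python
class TapTracker:
    """Illustrative tap detector: emits 'tap' when a point goes down and
    up within TAP_TIME seconds, and 'double_tap' when two such taps occur
    within DOUBLE_TIME seconds of each other. Thresholds and gesture
    names are assumptions for this sketch."""

    TAP_TIME = 0.3      # max seconds between point_down and point_up
    DOUBLE_TIME = 0.4   # max seconds between two consecutive taps

    def __init__(self, on_gesture):
        self.on_gesture = on_gesture  # callback receiving gesture names
        self.down_at = None
        self.last_tap = None

    def point_down(self, t):
        self.down_at = t

    def point_up(self, t):
        if self.down_at is not None and t - self.down_at <= self.TAP_TIME:
            if self.last_tap is not None and t - self.last_tap <= self.DOUBLE_TIME:
                self.on_gesture('double_tap')
                self.last_tap = None
            else:
                self.on_gesture('tap')
                self.last_tap = t
        self.down_at = None

# Usage with synthetic timestamps (seconds):
events = []
tracker = TapTracker(events.append)
tracker.point_down(0.0); tracker.point_up(0.1)   # first quick tap
tracker.point_down(0.2); tracker.point_up(0.3)   # second tap, close enough
```

After these four events, `events` holds `['tap', 'double_tap']`: the first down/up pair is a plain tap, and the second pair follows within the double-tap window.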
@@ -611,11 +611,11 @@ have been implemented using an imperative programming style. Technical details
 about the implementation of gesture detection are described in appendix
 \ref{app:implementation-details}.
-\section{Full screen Pygame program}
+\section{Full screen Pygame application}
-%The goal of this program was to experiment with the TUIO
+%The goal of this application was to experiment with the TUIO
 %protocol, and to discover requirements for the architecture that was to be
-%designed. When the architecture design was completed, the program was rewritten
+%designed. When the architecture design was completed, the application was rewritten
 %using the new architecture components. The original variant is still available
 %in the ``experimental'' folder of the Git repository \cite{gitrepos}.
@@ -623,10 +623,10 @@ An implementation of the detection of some simple multi-touch gestures (single
 tap, double tap, rotation, pinch and drag) using Processing\footnote{Processing
 is a Java-based programming environment with an export possibility for Android.
 See also \cite{processing}.} can be found in a forum on the Processing website
-\cite{processingMT}. The program has been ported to Python and adapted to
+\cite{processingMT}. The application has been ported to Python and adapted to
 receive input from the TUIO protocol. The implementation is fairly simple, but
 it yields some appealing results (see figure \ref{fig:draw}). In the original
-program, the detection logic of all gestures is combined in a single class
+application, the detection logic of all gestures is combined in a single class
 file. As predicted by the GART article \cite{GART}, this leads to over-complex
 code that is difficult to read and debug.
@@ -635,16 +635,13 @@ architecture. The detection code is separated into two different gesture
 trackers, which are the ``tap'' and ``transformation'' trackers mentioned in
 section \ref{sec:implementation}.
-The application receives TUIO events and translates them to \emph{point\_down},
-\emph{point\_move} and \emph{point\_up} events. These events are then
-interpreted to be \emph{single tap}, \emph{double tap}, \emph{rotation} or
-\emph{pinch} gestures. The positions of all touch objects are drawn using the
+The positions of all touch objects and their centroid are drawn using the
 Pygame library. Since the Pygame library does not provide support to find the
 location of the display window, the root event area captures events in the
 entire screen's surface. The application can be run either full screen or in
 windowed mode. If windowed, screen-wide gesture coordinates are mapped to the
 size of the Pygame window. In other words, the Pygame window always represents
-the entire touch surface. The output of the program can be seen in figure
+the entire touch surface. The output of the application can be seen in figure
 \ref{fig:draw}.
 \begin{figure}[h!]
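The mapping from screen-wide gesture coordinates to window coordinates described in this hunk amounts to a linear rescale. A small sketch (the function name and pixel units are assumptions, not the application's actual code):

```python
def screen_to_window(x, y, screen_size, window_size):
    """Map a touch position on the full touch surface to a position in
    the Pygame window, so the window always represents the entire
    screen. All coordinates and sizes are in pixels."""
    screen_w, screen_h = screen_size
    window_w, window_h = window_size
    return x * window_w / screen_w, y * window_h / screen_h

# Example: a touch at the center of a 1600x1200 surface lands at the
# center of a 400x300 window.
pos = screen_to_window(800, 600, (1600, 1200), (400, 300))
```

Here `pos` is `(200.0, 150.0)`, i.e. the center of the window, as expected for a touch at the center of the surface.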
@@ -652,26 +649,70 @@ the entire touch surface. The output of the program can be seen in figure
     \includegraphics[scale=0.4]{data/pygame_draw.png}
     \caption{Output of the experimental drawing program. It draws all touch
         points and their centroid on the screen (the centroid is used for rotation
         and pinch detection). It also draws a green rectangle which responds to
         rotation and pinch events.}
     \label{fig:draw}
 \end{figure}
-\section{GTK/Cairo program}
+\section{GTK+/Cairo application}
 The second test application uses the GIMP toolkit (GTK+) \cite{GTK} to create
 its user interface. Since GTK+ defines a main event loop that is started in
 order to use the interface, the architecture implementation runs in a separate
-thread. The application creates a main window, whose size and position are
-synchronized with the root event area of the architecture.
-% TODO
-\emph{TODO: expand and add screenshots (this program is not yet finished)}
+thread.
+
+The application creates a main window, whose size and position are synchronized
+with the root event area of the architecture. The synchronization is handled
+automatically by a \texttt{GtkEventWindow} object, which is a subclass of
+\texttt{gtk.Window}. This object serves as a layer that connects the event area
+functionality of the architecture to GTK+ windows.
+
+The main window contains a number of polygons which can be dragged, resized and
+rotated. Each polygon is represented by another event area to allow
+simultaneous interaction with different polygons. The main window also responds
+to transformation, by transforming all polygons. Additionally, double tapping
+on a polygon changes its color.
+
+An ``overlay'' event area is used to detect all fingers currently touching the
+screen. The application defines a custom gesture tracker, called the ``hand
+tracker'', which is used by the overlay. The hand tracker uses distances
+between detected fingers to detect which fingers belong to the same hand. The
+application draws a line from each finger to the hand it belongs to, as visible
+in figure \ref{fig:testapp}.
+
+\begin{figure}[h!]
+    \center
+    \includegraphics[scale=0.35]{data/testapp.png}
+    \caption{Screenshot of the second test application. Two polygons can be
+        dragged, rotated and scaled. Separate groups of fingers are recognized
+        as hands; each hand is drawn as a centroid with a line to each finger.}
+    \label{fig:testapp}
+\end{figure}
+
+To manage the propagation of events used for transformations, the application
+arranges its event areas in a tree structure as described in section
+\ref{sec:tree}. Each transformable event area has its own ``transformation
+tracker'', which stops the propagation of events used for transformation
+gestures. Because the propagation of these events is stopped, overlapping
+polygons do not cause a problem. Figure \ref{fig:testappdiagram} shows the tree
+structure used by the application.
+
+Note that the overlay event area, though covering the whole screen surface, is
+not the root event area. The overlay event area is placed on top of the
+application window (being a rightmost sibling of the application window event
+area in the tree). This is necessary, because the transformation trackers stop
+event propagation. The hand tracker needs to capture all events to be able to
+give an accurate representation of all fingers touching the screen. Therefore,
+the overlay should delegate events to the hand tracker before they are stopped
+by a transformation tracker. Placing the overlay over the application window
+forces the screen event area to delegate events to the overlay event area
+first.
+
+\testappdiagram
-\section{Discussion}
-% TODO
-\emph{TODO: point out shortcomings that emerge from the tests}
+%\section{Discussion}
+%
+%\emph{TODO: point out shortcomings that emerge from the tests}
 % Different devices/drivers emit a different kind of primitive events.
 % A translation of these device-specific events to a general format of
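The hand tracker added in this commit clusters fingers into hands by inter-finger distance. A possible shape for that logic is greedy centroid clustering, sketched below in Python; the function name, the distance threshold, and the greedy strategy are all assumptions for illustration, not the application's actual tracker:

```python
import math

def group_hands(fingers, max_dist=150.0):
    """Greedily assign finger positions to hands: a finger joins the
    first existing hand whose centroid lies within max_dist (pixels),
    otherwise it starts a new hand. Returns a list of hands, each a
    list of (x, y) finger positions."""
    hands = []
    for x, y in fingers:
        for hand in hands:
            # Centroid of the fingers already assigned to this hand.
            cx = sum(px for px, _ in hand) / len(hand)
            cy = sum(py for _, py in hand) / len(hand)
            if math.hypot(x - cx, y - cy) <= max_dist:
                hand.append((x, y))
                break
        else:
            hands.append([(x, y)])
    return hands

# Two fingers close together and two fingers far away yield two hands.
hands = group_hands([(0, 0), (50, 0), (500, 500), (520, 480)])
```

With these sample points, `hands` contains two groups of two fingers each; the per-hand centroid is also what the application would draw lines from, per the figure caption.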