Taddeüs Kroes / multitouch / Commits

Commit 2967cab6, authored 12 years ago by Taddeus Kroes

Worked on report.

parent 7d79785b
Changes: 2 changed files, with 73 additions and 62 deletions

docs/report.bib: 7 additions, 1 deletion
docs/report.tex: 66 additions, 61 deletions
docs/report.bib (+7 -1)
@@ -95,6 +95,13 @@
   year = "2012"
 }
+@misc{GTK,
+  author = "Mattis, Peter and team, the GTK+",
+  howpublished = "\url{http://www.mathematik.uni-ulm.de/help/gtk+-1.1.3/gtk.html}",
+  title = "{GIMP Toolkit}",
+  year = "1998"
+}
 @electronic{qt,
   added-at = "2012-04-05T10:52:23.000+0200",
   author = "{Nokia Corp.}",
@@ -107,4 +114,3 @@
   x-fetchedfrom = "Bibsonomy",
   year = 2012
 }
docs/report.tex (+66 -61)
@@ -2,7 +2,7 @@
 \usepackage[english]{babel}
 \usepackage[utf8]{inputenc}
-\usepackage{hyperref,graphicx,float,tikz,subfigure}
+\usepackage{hyperref,graphicx,float,tikz}
 % Link colors
 \hypersetup{colorlinks=true,linkcolor=black,urlcolor=blue,citecolor=DarkGreen}
@@ -32,13 +32,13 @@
 % TODO: put Qt link in bibtex
 Multi-touch devices enable a user to interact with software using intuitive
-body gestures, rather than with interaction tools like mouse and keyboard.
-With the upcoming use of touch screens in phones and tablets, multi-touch
-interaction is becoming increasingly common. The driver of a touch device
-provides low-level events. The most basic representation of these low-level
-event consists of \emph{down}, \emph{move} and \emph{up} events.
+hand gestures, rather than with interaction tools like mouse and keyboard.
+With the increasing use of touch screens in phones and tablets, multi-touch
+interaction is becoming increasingly common. The driver of a touch device
+provides low-level events. The most basic representation of these low-level
+events consists of \emph{down}, \emph{move} and \emph{up} events.
-Multi-touch gestures must be designed in such a way, that they can be
+More complex gestures must be designed in such a way that they can be
 represented by a sequence of basic events. For example, a ``tap'' gesture can
 be represented as a \emph{down} event that is followed by an \emph{up} event
 within a certain time.
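The tap representation described in this hunk (a \emph{down} event followed by an \emph{up} event within a certain time) can be sketched in Python, the language of the thesis's reference implementation. The class name and the thresholds are illustrative assumptions, not taken from the actual implementation:

```python
# Hypothetical sketch: a tap is a down event followed by an up event within
# a time and distance threshold. Names and threshold values are assumptions.
TAP_MAX_SECONDS = 0.3   # assumed time threshold
TAP_MAX_DISTANCE = 10   # assumed distance threshold, in pixels


class TapDetector:
    def __init__(self, max_seconds=TAP_MAX_SECONDS, max_distance=TAP_MAX_DISTANCE):
        self.max_seconds = max_seconds
        self.max_distance = max_distance
        self.downs = {}  # touch id -> (x, y, timestamp)

    def on_down(self, touch_id, x, y, t):
        # Remember where and when this touch object hit the screen.
        self.downs[touch_id] = (x, y, t)

    def on_up(self, touch_id, x, y, t):
        """Return True if this up event completes a tap gesture."""
        if touch_id not in self.downs:
            return False
        x0, y0, t0 = self.downs.pop(touch_id)
        dist = ((x - x0) ** 2 + (y - y0) ** 2) ** 0.5
        return (t - t0) <= self.max_seconds and dist <= self.max_distance
```

A slow release or one far from the down position simply yields no tap, matching the time-and-distance definition given in the footnote later in the chapter.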
@@ -71,26 +71,16 @@ To design such an architecture properly, the following questions are relevant:
 % TODO: are the questions below still relevant? Better rephrase as
 % "Design"-related questions?
 \item How can the architecture be used by different programming languages?
-      A generic architecture should not be limited to be used in only one
-      language.
-\item Can events be used by multiple processes at the same time? For
-      example, a network implementation could run as a service instead of
-      within a single application, triggering events in any application that
-      needs them.
+      A generic architecture should not be limited to one language.
 \item How can the architecture serve multiple applications at the same
       time?
 \end{itemize}
 % Scope
 The scope of this thesis includes the design of a generic multi-touch detection
-architecture, a reference implementation of this design written in Python, and
-the integration of the reference implementation in a test case application. To
-test multi-touch interaction properly, a multi-touch device is required. The
-University of Amsterdam (UvA) has provided access to a multi-touch table from
-PQlabs. The table uses the TUIO protocol \cite{TUIO} to communicate touch
-events. See appendix \ref{app:tuio} for details regarding the TUIO protocol.
-The reference implementation is a Proof of Concept that translates TUIO
-messages to some simple touch gestures (see appendix \ref{app:implementation}
-for details).
+architecture, a reference implementation of this design, and the integration of
+the reference implementation in a test case application.
 \section{Structure of this document}
@@ -134,7 +124,9 @@ for details).
 An important observation in this application is that different gestures are
 detected by different gesture trackers, thus separating gesture detection
-code into maintainable parts.
+code into maintainable parts. The architecture has adopted this design
+feature by also using different gesture trackers to track different gesture
+types.
 \section{Processing implementation of simple gestures in Android}
@@ -142,14 +134,13 @@ for details).
 gestures (tap, double tap, rotation, pinch and drag) using Processing
 \footnote{Processing is a Java-based development environment with
 an export possibility for Android. See also \url{http://processing.org/}.}
-can be found found in a forum on the Processing website \cite{processingMT}.
-The implementation is fairly simple, but it yields
-some very appealing results. The detection logic of all gestures is
-combined in a single class. This does not allow for extendability, because
-the complexity of this class would increase to an undesirable level (as
-predicted by the GART article \cite{GART}). However, the detection logic
-itself is partially re-used in the reference implementation of the
-generic gesture detection architecture.
+can be found in a forum on the Processing website \cite{processingMT}. The
+implementation is fairly simple, but it yields some very appealing results.
+The detection logic of all gestures is combined in a single class. This
+does not allow for extensibility, because the complexity of this class
+would increase to an undesirable level (as predicted by the GART article
+\cite{GART}). However, the detection logic itself is partially re-used in
+the reference implementation of the generic gesture detection architecture.
 \section{Analysis of related work}
@@ -178,14 +169,13 @@ for details).
 architecture as a diagram of relations between different components.
 Sections \ref{sec:driver-support} to \ref{sec:event-analysis} define
 requirements for the architecture, and extend the diagram with components
-that meet these requirements. Section \ref{sec:example} desicribes an
+that meet these requirements. Section \ref{sec:example} describes an
 example usage of the architecture in an application.
 \subsection*{Position of architecture in software}
 The input of the architecture comes from some multi-touch device
-driver. For example, the table used in the experiments uses the TUIO
-protocol. The task of the architecture is to translate this input to
+driver. The task of the architecture is to translate this input to
 multi-touch gestures that are used by an application, as illustrated in
 figure \ref{fig:basicdiagram}. In the course of this chapter, the
 diagram is extended with the different components of the architecture.
@@ -196,9 +186,9 @@ for details).
 \section{Supporting multiple drivers}
 \label{sec:driver-support}
-The TUIO protocol is an example of a touch driver that can be used by
+The TUIO protocol \cite{TUIO} is an example of a touch driver that can be used by
 multi-touch devices. Other drivers do exist, which should also be supported
 by the architecture. Therefore, there must be some translation of
 driver-specific messages to a common format in the architecture. Messages in
 this common format will be called \emph{events}. Events can be translated
 to multi-touch \emph{gestures}. The most basic set of events is
@@ -206,9 +196,10 @@ for details).
 object with only an (x, y) position on the screen.
 A more extended set could also contain more complex events. An object can
-also have a rotational property, like the ``fiducials'' type in the TUIO
-protocol. This results in $\{point\_down, point\_move, \\ point\_up,
-object\_down, object\_move, object\_up, object\_rotate\}$.
+also have a rotational property, like the ``fiducials''\footnote{A fiducial
+is a pattern used by some touch devices to identify objects.} type
+in the TUIO protocol. This results in $\{point\_down, point\_move, \\
+point\_up, object\_down, object\_move, object\_up, object\_rotate\}$.
 The component that translates driver-specific messages to events is called
 the \emph{event driver}. The event driver runs in a loop, receiving and
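The event driver idea from this hunk, a loop that translates driver-specific messages into the common event format, can be sketched as follows. The TUIO message layout, the mapping table and all names here are invented for illustration; the thesis only defines the concept:

```python
from collections import namedtuple

# Common event format: one of the basic point_* types plus coordinates.
Event = namedtuple("Event", "type x y")

# Assumed mapping from hypothetical driver-specific message names to the
# common event types named in the text.
TUIO_TO_EVENT = {
    "tuio_add": "point_down",
    "tuio_update": "point_move",
    "tuio_remove": "point_up",
}


class EventDriver:
    def __init__(self):
        self.listeners = []  # callables receiving common-format events

    def translate(self, message):
        """Translate one driver-specific (name, x, y) message to an event."""
        name, x, y = message
        return Event(TUIO_TO_EVENT[name], x, y)

    def run(self, messages):
        # A real event driver would loop endlessly on a socket; here the
        # sketch iterates over a finite message list so it stays testable.
        for message in messages:
            event = self.translate(message)
            for listener in self.listeners:
                listener(event)
```

Supporting another driver then means supplying another translation table (or `translate` method) while listeners keep seeing the same common format.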
@@ -224,15 +215,19 @@ for details).
 \section{Restricting gestures to a screen area}
-An application programmer should be able to bind a gesture handler to some
-element on the screen. For example, a button tap\footnote{A ``tap'' gesture
-is triggered when a touch object releases the screen within a certain time
-and distance from the point where it initially touched the screen.} should
-only occur on the button itself, and not in any other area of the screen. A
-solution to this problem is the use of \emph{widgets}. The button from the
-example can be represented as a rectangular widget with a position and
-size. The position and size are compared with event coordinates to
-determine whether an event should occur within the button.
+Touch input devices are unaware of the graphical input widgets rendered on
+screen and therefore generate events that simply identify the screen
+location at which an event takes place. In order to be able to direct a
+gesture to a particular widget on screen, an application programmer should
+be able to bind a gesture handler to some element on the screen. For
+example, a button tap\footnote{A ``tap'' gesture is triggered when a touch
+object releases the screen within a certain time and distance from the
+point where it initially touched the screen.} should only occur on the
+button itself, and not in any other area of the screen. A solution to this
+problem is the use of \emph{widgets}. The button from the example can be
+represented as a rectangular widget with a position and size. The position
+and size are compared with event coordinates to determine whether an event
+should occur within the button.
 \subsection*{Widget tree}
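The rectangular-widget hit test described in this hunk (comparing event coordinates with a widget's position and size) can be sketched in a few lines; the class and method names are illustrative assumptions:

```python
# Hypothetical sketch of the rectangular widget hit test: an event occurs
# "within" a widget when its coordinates fall inside the rectangle spanned
# by the widget's position and size. Names are assumptions, not the API.
class RectangularWidget:
    def __init__(self, x, y, width, height):
        self.x, self.y = x, y
        self.width, self.height = width, height

    def contains(self, event_x, event_y):
        """Compare event coordinates with the widget's position and size."""
        return (self.x <= event_x < self.x + self.width
                and self.y <= event_y < self.y + self.height)
```

An event delegation step would walk the widget tree and hand the event only to widgets whose `contains` check succeeds, so a button tap never fires outside the button.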
@@ -257,7 +252,6 @@ for details).
 analysis.
 % TODO: inspired by JavaScript DOM
-% TODO: add GTK to bibliography
 Many GUI frameworks, like GTK \cite{GTK}, also use a tree structure to
 manage their widgets. This makes it easy to connect the architecture to
 such a framework. For example, the programmer can define a
@@ -285,17 +279,16 @@ for details).
 \label{sec:event-analysis}
 The events that are delegated to widgets must be analyzed in some way to
-form gestures. This analysis is specific to the type of gesture being
-detected. E.g. the detection of a ``tap'' gesture is very different from
-detection of a ``rotate'' gesture. The \cite[.NET implementation]{win7touch}
-separates the detection of different gestures into different \emph{gesture
-trackers}. This keeps the different pieces of detection code managable and
-extandable. Therefore, the architecture also uses gesture trackers to
-separate the analysis of events. A single gesture tracker detects a
-specific set of gesture types, given a sequence of events. An example of a
-possible gesture tracker implementation is a ``transformation tracker''
-that detects rotation, scaling and translation gestures.
+form gestures. This analysis is specific to the type of gesture being detected.
+E.g. the detection of a ``tap'' gesture is very different from detection of
+a ``rotate'' gesture. The implementation described in \cite{win7touch}
+separates the detection of different gestures into different \emph{gesture
+trackers}. This keeps the different pieces of detection code manageable and
+extendable. Therefore, the architecture also uses gesture trackers to
+separate the analysis of events. A single gesture tracker detects a
+specific set of gesture types, given a sequence of events. An example of a
+possible gesture tracker implementation is a ``transformation tracker''
+that detects rotation, scaling and translation gestures.
 \subsection*{Assignment of a gesture tracker to a widget}
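The ``transformation tracker'' named in this hunk can be sketched as a function that derives rotation and scaling from the previous and current positions of two touch points. The interface is invented for illustration; the thesis only names the tracker concept:

```python
import math

# Hypothetical sketch of a transformation tracker's core computation:
# given the previous and current positions of two touch points, derive
# the rotation (radians) and scale factor between the two frames.
def transformation(prev_a, prev_b, cur_a, cur_b):
    """Return (rotation, scale) of the segment a-b between two frames."""
    def vec(p, q):
        return (q[0] - p[0], q[1] - p[1])

    (vx0, vy0) = vec(prev_a, prev_b)
    (vx1, vy1) = vec(cur_a, cur_b)
    rotation = math.atan2(vy1, vx1) - math.atan2(vy0, vx0)
    scale = math.hypot(vx1, vy1) / math.hypot(vx0, vy0)
    return rotation, scale
```

A tracker object would feed consecutive \emph{move} events of two touch points into this computation and emit rotate, pinch or drag gestures when the derived values cross some threshold.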
@@ -315,6 +308,10 @@ for details).
 \ref{fig:widgetdiagram}, showing the position of gesture trackers in
 the architecture.}
+\section{Serving multiple applications}
+% TODO
 \section{Example usage}
 \label{sec:example}
@@ -360,6 +357,14 @@ for details).
 \chapter{Test applications}
+To test multi-touch interaction properly, a multi-touch device is required. The
+University of Amsterdam (UvA) has provided access to a multi-touch table from
+PQlabs. The table uses the TUIO protocol \cite{TUIO} to communicate touch
+events. See appendix \ref{app:tuio} for details regarding the TUIO protocol.
+The reference implementation is a Proof of Concept that translates TUIO
+messages to some simple touch gestures (see appendix \ref{app:implementation}
+for details).
 % TODO
 % test programs with PyGame/Cairo