Taddeüs Kroes / multitouch · Commits

Commit 7b535751, authored Jun 25, 2012 by Taddeüs Kroes
Addressed some feedback comments on report.

parent 86f81f02

4 changed files with 90 additions and 140 deletions (+90 −140)
TODO.txt                +1  −0
docs/data/diagrams.tex  +37 −92
docs/report.bib         +16 −0
docs/report.tex         +36 −48
TODO.txt

@@ -4,3 +4,4 @@ Code:
 Report/appendix reference gesture detection:
 - Point_leave(+point_enter) is not possible -> flaw of the system/driver?
+- "gesture detection component" -> "gesture tracker"
docs/data/diagrams.tex

@@ -43,49 +43,38 @@
 ]
 \newcommand{\architecture}[1]{
     \begin{tikzpicture}[node distance=6em, auto]
-        \node[block] (driver) {Driver};
+        \node[block] (driver) {Device driver};
         #1
     \end{tikzpicture}
 }
-\def\basicdiagram{
-    \begin{figure}[h]
+\def\fulldiagram{
+    \begin{figure}[h!]
         \center
         \architecture{
-            \node[block, dashed, below of=driver] (arch) {Architecture}
-                edge[linefrom] node[right] {driver-specific messages} (driver);
-            \node[block, below of=arch] {Application}
-                edge[linefrom] node[right] {gestures} (arch);
-        }
-        \caption{A diagram showing the position of the architecture relative to
-        the device driver and a multi-touch application. The input of the
-        architecture is given by a touch device driver. The output is
-        translated to complex interaction gestures and passed to the
-        application that is using the architecture.}
-        \label{fig:basicdiagram}
-    \end{figure}
-}
-
-\def\driverdiagram{
-    \begin{figure}[H]
-        \center
-        \architecture{
-            \node[block, below of=driver] (eventdriver) {Event driver}
-                edge[linefrom] node[right, near end] {driver-specific messages} (driver);
-            \node[block, below of=eventdriver, dashed] (analysis) {Event analysis}
-                edge[linefrom] node[right] {events} (eventdriver);
-            \node[block, below of=analysis] {Application}
-                edge[linefrom] node[right, near start] {gestures} (analysis);
-            \node[right of=eventdriver, xshift=2em] (dummy) {};
-            \group{eventdriver}{eventdriver}{dummy}{analysis}{Architecture}
-        }
-        \caption{Extension of the diagram from figure \ref{fig:basicdiagram},
-        showing the position of the event driver in the architecture. The
-        event driver translates driver-specific to a common set of events,
-        which are delegated to analysis components that will interpret them
-        as more complex gestures.}
-        \label{fig:driverdiagram}
+            \node[block, below of=driver] (eventdriver) {Event driver}
+                edge[linefrom] node[right, near end] {device-specific messages} (driver);
+            \node[block, below of=eventdriver] (area) {Event areas}
+                edge[linefrom] node[right] {1} (eventdriver);
+            \node[block, right of=area, xshift=7em] (tracker) {Gesture trackers}
+                edge[linefrom, bend right=10] node[below=2pt] {2} (area)
+                edge[lineto, bend left=10, dotted] (area);
+            \node[block, below of=area] {Application}
+                edge[linefrom, dotted] node[right, near start] {3} (area);
+            \group{eventdriver}{eventdriver}{tracker}{area}{Architecture}
+        }
+        \caption{Components of the architecture design. The \emph{event driver}
+        translates device-specific messages to low-level ``events''. These
+        events are delegated to a number of \emph{event areas} (1), which
+        restrict events to an area on the screen. \emph{Gesture trackers}
+        translate low-level events to high-level ``gestures'' (2), which
+        are handled by the application (3). Dotted arrows represent a flow
+        of gestures, regular arrows represent events (unless labeled
+        otherwise).}
+        \label{fig:fulldiagram}
     \end{figure}
 }
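For orientation: the `\architecture` macro above takes a single argument holding the node code, so each figure definition only supplies its own nodes and edges. A minimal hypothetical caller (the name `\minidiagram`, its label, and its caption are illustrative only, not from the repository) would look like:

```latex
% Hypothetical minimal usage of the \architecture macro from this hunk.
\def\minidiagram{
    \begin{figure}[h!]
        \center
        \architecture{
            % #1 of \architecture: nodes hung below the predefined (driver) node
            \node[block, below of=driver] (app) {Application}
                edge[linefrom] node[right] {events} (driver);
        }
        \caption{Hypothetical minimal diagram built with \architecture.}
        \label{fig:minidiagram}
    \end{figure}
}
```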
...
@@ -93,81 +82,37 @@
 \begin{figure}[H]
     \center
     \begin{tikzpicture}[node distance=6em]
-        \node[block] (driver) {Driver};
+        \node[block] (driver) {Device driver};
         \node[block, below of=driver] (eventdriver) {Event driver}
             edge[linefrom] (driver);
-        \node[block, right of=driver, xshift=2em] (seconddriver) {Driver};
+        \node[block, right of=driver, xshift=2em] (seconddriver) {Device driver};
         \node[block, below of=seconddriver] (secondeventdriver) {Event driver}
-            edge[linefrom] node[right, near end] {driver-specific messages} (seconddriver);
-        \node[block, below of=eventdriver, dashed] (analysis) {Event analysis}
+            edge[linefrom] node[right, near end] {device-specific messages} (seconddriver);
+        \node[block, below of=eventdriver] (areas) {Event areas}
             edge[linefrom] (eventdriver)
             edge[linefrom] node[right=5pt] {events} (secondeventdriver);
-        \node[block, below of=analysis] {Application}
-            edge[linefrom] node[right, near start] {gestures} (analysis);
+        \node[block, right of=area, xshift=7em] (tracker) {Gesture trackers}
+            edge[linefrom, bend right=10] (areas)
+            edge[lineto, bend left=10, dotted] (areas);
+        \node[block, below of=areas] {Application}
+            edge[linefrom, dotted] node[right, near start] {gestures} (areas);
         \node[right of=seconddriver, xshift=2em] (dummy) {};
-        \group{eventdriver}{eventdriver}{dummy}{analysis}{Architecture}
+        \group{eventdriver}{eventdriver}{dummy}{areas}{Architecture}
     \end{tikzpicture}
     \caption{Multiple event drivers running simultaneously.}
     \label{fig:multipledrivers}
 \end{figure}
 }
-\def\areadiagram{
-    \begin{figure}[h]
-        \center
-        \architecture{
-            \node[block, below of=driver] (eventdriver) {Event driver}
-                edge[linefrom] node[right, near end] {driver-specific messages} (driver);
-            \node[block, below of=eventdriver] (area) {Event areas}
-                edge[linefrom] node[right] {events} (eventdriver);
-            \node[block, right of=area, xshift=7em, dashed] (analysis) {Gesture detection}
-                edge[linefrom, bend right=10] node[above] {events} (area)
-                edge[lineto, bend left=10] node[] {gestures} (area);
-            \node[block, below of=area] {Application}
-                edge[linefrom] node[right, near start] {gestures through callback function} (area);
-            \group{eventdriver}{eventdriver}{analysis}{area}{Architecture}
-        }
-        \caption{Extension of the diagram from figure \ref{fig:driverdiagram},
-        with event areas. An event area delegates events to a gesture detection
-        component that triggers a gesture. The event area then calls the
-        handlers that are bound to the gesture type by the application.}
-        \label{fig:areadiagram}
-    \end{figure}
-}
-
-\def\trackerdiagram{
-    \begin{figure}[h!]
-        \center
-        \architecture{
-            \node[block, below of=driver] (eventdriver) {Event driver}
-                edge[linefrom] node[right, near end] {driver-specific messages} (driver);
-            \node[block, below of=eventdriver] (area) {Event area tree}
-                edge[linefrom] node[right] {events} (eventdriver);
-            \node[block, right of=area, xshift=7em] (tracker) {Gesture trackers}
-                edge[linefrom, bend right=10] node[above] {events} (area)
-                edge[lineto, bend left=10] node[] {gestures} (area);
-            \node[block, below of=area] {Application}
-                edge[linefrom] node[right, near start] {gestures} (area);
-            \group{eventdriver}{eventdriver}{tracker}{area}{Architecture}
-        }
-        \caption{Extension of the diagram from figure \ref{fig:areadiagram}
-        with gesture trackers. Gesture trackers analyze detect high-level
-        gestures from low-level events.}
-        \label{fig:trackerdiagram}
-    \end{figure}
-}
 
 \def\examplediagram{
     \begin{figure}[h!]
         \center
         \architecture{
             \node[block, below of=driver] (eventdriver) {Event driver}
-                edge[linefrom] node[right, near end] {driver-specific messages} (driver);
+                edge[linefrom] node[right, near end] {device-specific messages} (driver);
             \node[block, below of=eventdriver] (rootarea) {Root area}
                 edge[linefrom] (eventdriver);
...
@@ -352,7 +297,7 @@
     \center
     \architecture{
         \node[block, below of=driver] (eventdriver) {Event driver}
-            edge[linefrom] node[right, near end] {driver-specific messages} (driver);
+            edge[linefrom] node[right, near end] {device-specific messages} (driver);
         \node[block, below of=eventdriver] (rootarea) {Screen area}
             edge[linefrom] (eventdriver);
...
docs/report.bib

@@ -230,3 +230,19 @@
     year = 2010
 }
+
+@article{PIP,
+    added-at = "2011-12-05T00:00:00.000+0100",
+    author = "Sutherland, Ivan E. and Sproull, Robert F. and Schumacker, Robert A.",
+    interhash = "7c3ac13951889d07f968ca7c0398c34d",
+    intrahash = "2bfef4fbc31892de2ab1bf8607514e2b",
+    journal = "ACM Comput. Surv.",
+    keywords = "dblp",
+    number = 1,
+    pages = "13-16",
+    title = "{A Characterization of Ten Hidden-Surface Algorithms.}",
+    url = "http://dblp.uni-trier.de/db/journals/csur/csur6.html#SutherlandSS74; http://doi.acm.org/10.1145/356625.356626; http://www.bibsonomy.org/bibtex/22bfef4fbc31892de2ab1bf8607514e2b/dblp",
+    volume = 6,
+    x-fetchedfrom = "Bibsonomy",
+    year = 1974
+}
docs/report.tex

@@ -194,19 +194,12 @@ detection for every new gesture-based application.
 at the same time.
 
 This chapter describes a design for such an architecture. The architecture
-is represented as diagram of relations between different components.
-Sections \ref{sec:multipledrivers} to \ref{sec:daemon} define requirements
-for the architecture, and extend this diagram with components that meet
-these requirements. Section \ref{sec:example} describes an example usage of
-the architecture in an application.
-
-The input of the architecture comes from a multi-touch device driver.
-The task of the architecture is to translate this input to multi-touch
-gestures that are used by an application, as illustrated in figure
-\ref{fig:basicdiagram}. In the course of this chapter, the diagram is
-extended with the different components of the architecture.
-
-\basicdiagram
+components are shown by figure \ref{fig:fulldiagram}. Sections
+\ref{sec:multipledrivers} to \ref{sec:daemon} explain the use of all
+components in detail.
+
+\fulldiagram
+
+\newpage
 
 \section{Supporting multiple drivers}
 \label{sec:multipledrivers}
...
@@ -216,10 +209,10 @@ detection for every new gesture-based application.
 low-level touch events (see appendix \ref{app:tuio} for more details).
 These messages are specific to the API of the TUIO protocol. Other drivers
 may use different messages types. To support more than one driver in the
-architecture, there must be some translation from driver-specific messages
+architecture, there must be some translation from device-specific messages
 to a common format for primitive touch events. After all, the gesture
 detection logic in a ``generic'' architecture should not be implemented
-based on driver-specific messages. The event types in this format should be
+based on device-specific messages. The event types in this format should be
 chosen so that multiple drivers can trigger the same events. If each
 supported driver would add its own set of event types to the common format,
 the purpose of it being ``common'' would be defeated.
...
@@ -237,14 +230,11 @@ detection for every new gesture-based application.
 TUIO protocol. Another driver that can keep apart rotated objects from
 simple touch points could also trigger them.
 
-The component that translates driver-specific messages to common events,
+The component that translates device-specific messages to common events,
 will be called the \emph{event driver}. The event driver runs in a loop,
 receiving and analyzing driver messages. When a sequence of messages is
 analyzed as an event, the event driver delegates the event to other
-components in the architecture for translation to gestures. This
-communication flow is illustrated in figure \ref{fig:driverdiagram}.
-
-\driverdiagram
+components in the architecture for translation to gestures.
 
 Support for a touch driver can be added by adding an event driver
 implementation. The choice of event driver implementation that is used in an
...
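The event-driver loop described in the hunk above (receive device-specific messages, translate them to a common event format, delegate onward) can be sketched as follows. The class name, message names, and event type strings here are illustrative assumptions, not the project's actual API:

```python
class Event:
    """Common event format shared by all event drivers (hypothetical)."""
    def __init__(self, event_type, x, y):
        self.event_type = event_type  # e.g. 'point_down', 'point_move', 'point_up'
        self.x = x
        self.y = y


class TuioEventDriver:
    """Translates TUIO-style messages to common events (illustrative only)."""

    # Device-specific message name -> common event type.
    TRANSLATION = {
        'cursor_add': 'point_down',
        'cursor_set': 'point_move',
        'cursor_del': 'point_up',
    }

    def __init__(self, delegate):
        self.delegate = delegate  # receives each translated common event

    def handle_message(self, name, x, y):
        # Messages with no common translation are dropped rather than
        # forwarded, so the "common" format stays common.
        event_type = self.TRANSLATION.get(name)
        if event_type is not None:
            self.delegate(Event(event_type, x, y))


received = []
driver = TuioEventDriver(received.append)
driver.handle_message('cursor_add', 0.4, 0.6)
driver.handle_message('vendor_extension', 0.0, 0.0)  # unknown: not translated
```

In the architecture the delegate would be the event-area layer rather than a plain list; the sketch only shows the translation boundary.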
@@ -277,13 +267,13 @@ detection for every new gesture-based application.
 the architecture should offer a solution to this problem, or leave the task
 of assigning gestures to application widgets to the application developer.
 
-If the architecture does not provide a solution, the ``Event analysis''
-component in figure \ref{fig:multipledrivers} receives all events that
-occur on the screen surface. The gesture detection logic thus uses all
-events as input to detect a gesture. This leaves no possibility for a
-gesture to occur at multiple screen positions at the same time. The problem
-is illustrated in figure \ref{fig:ex1}, where two widgets on the screen can
-be rotated independently. The rotation detection component that detects
+If the architecture does not provide a solution, the ``gesture detection''
+component in figure \ref{fig:fulldiagram} receives all events that occur on
+the screen surface. The gesture detection logic thus uses all events as
+input to detect a gesture. This leaves no possibility for a gesture to
+occur at multiple screen positions at the same time. The problem is
+illustrated in figure \ref{fig:ex1}, where two widgets on the screen can be
+rotated independently. The rotation detection component that detects
 rotation gestures receives all four fingers as input. If the two groups of
 finger events are not separated by cluster detection, only one rotation
 event will occur.
...
@@ -304,7 +294,7 @@ detection for every new gesture-based application.
 covered by a widget, before passing them on to a gesture detection
 component. Different gesture detection components can then detect gestures
 simultaneously, based on different sets of input events. An area of the
-screen surface will be represented by an \emph{event area}. An event area
+screen surface is represented by an \emph{event area}. An event area
 filters input events based on their location, and then delegates events to
 gesture detection components that are assigned to the event area. Events
 which are located outside the event area are not delegated to its gesture
...
@@ -312,7 +302,11 @@ detection for every new gesture-based application.
 In the example of figure \ref{fig:ex1}, the two rotatable widgets can be
 represented by two event areas, each having a different rotation detection
-component.
+component. Each event area can consist of four corner locations of the
+square it represents. To detect whether an event is located inside a
+square, the event areas use a point-in-polygon (PIP) test \cite{PIP}. It is
+the task of the client application to update the corner locations of the
+event area with those of the widget.
 
 \subsection{Callback mechanism}
...
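The point-in-polygon test cited in the new text above is commonly implemented with even-odd ray casting; a minimal sketch (not the project's actual code) that handles the four-corner areas described in the hunk:

```python
def point_in_polygon(x, y, corners):
    """Even-odd ray casting: count how many polygon edges a ray running
    in the +x direction from (x, y) crosses; an odd count means inside."""
    inside = False
    n = len(corners)
    for i in range(n):
        x1, y1 = corners[i]
        x2, y2 = corners[(i + 1) % n]
        # The edge straddles the horizontal line through y, and the
        # crossing point lies strictly to the right of the test point.
        if (y1 > y) != (y2 > y) and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside


# A square event area given by its four corner locations.
square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
```

Because the test needs only the corner list, an event area can follow its widget by simply overwriting the corners when the widget moves.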
@@ -324,10 +318,6 @@ detection for every new gesture-based application.
 callback mechanism to handle gestures in an application. Callback handlers
 are bound to event areas, since events areas controls the grouping of
 events and thus the occurrence of gestures in an area of the screen.
-Figure \ref{fig:areadiagram} shows the position of areas in the
-architecture.
-
-\areadiagram
 
 \subsection{Area tree}
 \label{sec:tree}
...
@@ -337,12 +327,11 @@ detection for every new gesture-based application.
 event area that contains the event coordinates.
 
 If the architecture were to be used in combination with an application
-framework like GTK \cite{GTK}, each GTK widget that responds to gestures
-should have a mirroring event area that synchronizes its location with that
-of the widget. Consider a panel with five buttons that all listen to a
-``tap'' event. If the location of the panel changes as a result of movement
-of the application window, the positions of all buttons have to be updated
-too.
+framework, each widget that responds to gestures should have a mirroring
+event area that synchronizes its location with that of the widget. Consider
+a panel with five buttons that all listen to a ``tap'' event. If the
+location of the panel changes as a result of movement of the application
+window, the positions of all buttons have to be updated too.
 
 This process is simplified by the arrangement of event areas in a tree
 structure. A root event area represents the panel, containing five other
...
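The tree arrangement in the hunk above (moving the panel implicitly moves its buttons) falls out naturally if child areas store positions relative to their parent. A hypothetical sketch, not the project's actual class:

```python
class EventArea:
    """Rectangular event area in a tree; coordinates are relative to the
    parent area, so moving an ancestor moves the whole subtree.
    (Hypothetical sketch for illustration.)"""

    def __init__(self, x, y, w, h, parent=None):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.children = []
        self.parent = parent
        if parent is not None:
            parent.children.append(self)

    def origin(self):
        # Absolute position: sum of relative offsets up to the root.
        px, py = self.parent.origin() if self.parent else (0, 0)
        return px + self.x, py + self.y

    def contains(self, x, y):
        ox, oy = self.origin()
        return ox <= x < ox + self.w and oy <= y < oy + self.h


panel = EventArea(100, 100, 500, 80)
button = EventArea(10, 10, 80, 60, parent=panel)  # relative to the panel
panel.x = 300  # moving the panel moves the button with it
```

With five buttons under one panel, a window move updates a single offset instead of five.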
@@ -354,7 +343,10 @@ detection for every new gesture-based application.
 If the GUI toolkit provides an API for requesting the position and size of
 a widget, a recommended first step when developing an application is to
 create a subclass of the area that automatically synchronizes with the
-position of a widget from the GUI framework.
+position of a widget from the GUI framework. For example, the test
+application described in section \ref{sec:testapp} extends the GTK
+\cite{GTK} application window widget with the functionality of a
+rectangular event area, to direct touch events to an application window.
 
 \subsection{Event propagation}
 \label{sec:eventpropagation}
...
@@ -394,13 +386,12 @@ detection for every new gesture-based application.
 An additional type of event propagation is ``immediate propagation'', which
 indicates propagation of an event from one gesture detection component to
 another. This is applicable when an event area uses more than one gesture
-detection component. One of the components can stop the immediate
+detection component. When regular propagation is stopped, the event is
+propagated to other gesture detection components first, before actually
+being stopped. One of the components can also stop the immediate
 propagation of an event, so that the event is not passed to the next
 gesture detection component, nor to the ancestors of the event area.
-When regular propagation is stopped, the event is propagated to other
-gesture detection components first, before actually being stopped.
 
+\newpage
 \eventpropagationfigure
 
 The concept of an event area is based on the assumption that the set of
...
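The two propagation rules in the hunk above can be made concrete with a small dispatch sketch (the `STOP` constants and the `deliver` function are hypothetical, not the project's API): stopping regular propagation still lets the remaining components of the same event area see the event, while stopping immediate propagation halts delivery at once.

```python
STOP, STOP_IMMEDIATE = 'stop', 'stop_immediate'


def deliver(event, area_chain):
    """Deliver an event to a chain of event areas (innermost first).
    Each area is a list of gesture-detection callbacks; a callback's
    return value may stop one of the two propagation types."""
    for components in area_chain:
        stop_after_this_area = False
        for handle in components:
            result = handle(event)
            if result == STOP_IMMEDIATE:
                return  # skip remaining components and ancestor areas
            if result == STOP:
                stop_after_this_area = True  # finish this area first
        if stop_after_this_area:
            return  # do not propagate to ancestor areas


log = []

def tracker(name, result=None):
    # Helper: records the call, then returns a propagation verdict.
    def handle(event):
        log.append(name)
        return result
    return handle


log = []
deliver('e1', [[tracker('tap'), tracker('hold')], [tracker('drag')]])
no_stop = list(log)

log = []
deliver('e2', [[tracker('tap', STOP), tracker('hold')], [tracker('drag')]])
regular_stop = list(log)

log = []
deliver('e3', [[tracker('tap', STOP_IMMEDIATE), tracker('hold')], [tracker('drag')]])
immediate_stop = list(log)
```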
@@ -467,13 +458,9 @@ detection for every new gesture-based application.
 detection component defines a simple function that compares event
 coordinates.
 
-\trackerdiagram
-
 When a gesture tracker detects a gesture, this gesture is triggered in the
 corresponding event area. The event area then calls the callbacks which are
-bound to the gesture type by the application. Figure
-\ref{fig:trackerdiagram} shows the position of gesture trackers in the
-architecture.
+bound to the gesture type by the application.
 
 The use of gesture trackers as small detection units provides extendability
 of the architecture. A developer can write a custom gesture tracker and
...
@@ -643,6 +630,7 @@ the entire touch surface. The output of the application can be seen in figure
 \end{figure}
 
 \section{GTK+/Cairo application}
+\label{sec:testapp}
 
 The second test application uses the GIMP toolkit (GTK+) \cite{GTK} to create
 its user interface. Since GTK+ defines a main event loop that is started in
...