\documentclass[twoside,openright]{uva-bachelor-thesis}
\usepackage[english]{babel}
\usepackage[utf8]{inputenc}
\usepackage{hyperref,graphicx,float,tikz}
\usetikzlibrary{shapes,arrows}

% Link colors
\hypersetup{colorlinks=true,linkcolor=black,urlcolor=blue,citecolor=DarkGreen}

% Title Page
\title{A universal detection mechanism for multi-touch gestures}
\author{Taddeüs Kroes}
\supervisors{Dr. Robert G. Belleman (UvA)}
\signedby{Dr. Robert G. Belleman (UvA)}

\begin{document}

% Title page
\maketitle

\begin{abstract}
% TODO
\end{abstract}

% Set paragraph indentation
\parindent 0pt
\parskip 1.5ex plus 0.5ex minus 0.2ex

% Table of contents on separate page
\tableofcontents

\chapter{Introduction}
% Rough problem statement
Multi-touch interaction is becoming increasingly common, mostly due to the wide
use of touch screens in phones and tablets. When programming applications using
this method of interaction, the programmer needs an abstraction of the raw data
provided by the touch driver of the device. This abstraction exists in several
multi-touch application frameworks like Nokia's
Qt\footnote{\url{http://qt.nokia.com/}}. However, applications that do not use
these frameworks have no access to their multi-touch events.

% Motivation
This problem was observed during an attempt to create a multi-touch
``interactor'' class for the Visualization Toolkit \cite[VTK]{VTK}. Because VTK
already provides the application framework here, it is undesirable to adopt an
entire framework like Qt alongside it only for its multi-touch support.

% Rough goal
The goal of this project is to define a universal multi-touch event triggering
mechanism. To test the definition, a reference implementation is written in
Python.

% Setting
To test multi-touch interaction properly, a multi-touch device is required.
The University of Amsterdam (UvA) has provided access to a multi-touch table
from PQlabs. The table uses the TUIO protocol \cite{TUIO} to communicate touch
events.
\section{Definition of the problem}

% Main question
The goal of this thesis is to create a multi-touch event triggering mechanism
for use in a VTK interactor. The design of the mechanism must be universal.

% Subquestions
To design such a mechanism properly, the following questions are relevant:
\begin{itemize}
\item What is the input of the mechanism? Different touch drivers have
different APIs. To be able to support different drivers (which is
highly desirable), there should probably be a translation from the
driver API to a fixed input format.
\item How can extensibility be accomplished? The set of supported events
should not be limited to a single implementation; an application
should be able to define its own custom events.
\item How can the mechanism be used from different programming languages?
A universal mechanism should not be limited to use in only one
language.
\item Can events be shared with multiple processes at the same time? For
example, a network implementation could run as a service instead of
within a single application, triggering events in any application that
needs them.
% FIXME: are we still going to do something with the item below?
%\item Is performance an issue? For example, an event loop with rotation
% detection could swallow up more processing resources than desired.
\item How can the mechanism be integrated in a VTK interactor?
\end{itemize}

% Scope
The scope of this thesis includes the design of a universal multi-touch
triggering mechanism, a reference implementation of this design, and its
integration into a VTK interactor. To be successful, the design should
allow for extensions to be added to any implementation.
The reference implementation is a Proof of Concept that translates TUIO
events to some simple touch gestures that are used by a VTK interactor.
Being a Proof of Concept, the reference implementation itself does not
necessarily need to meet all the requirements of the design.

\section{Structure of this document}
% TODO: to be written once the rest of the document is finished
\chapter{Related work}

\section{Gesture and Activity Recognition Toolkit}
The Gesture and Activity Recognition Toolkit (GART) \cite{GART} is a
toolkit for the development of gesture-based applications. Its authors
state that the best way to classify gestures is to use machine learning.
The programmer trains a program to recognize gestures using the machine
learning library from the toolkit. The toolkit contains a callback
mechanism that the programmer uses to execute custom code when a gesture
is recognized.
Though multi-touch input is not directly supported by the toolkit, the
level of abstraction does allow for it to be implemented in the form of a
``touch'' sensor.

The stated reason to use machine learning is that gesture detection
``is likely to become increasingly complex and unmanageable'' when using a
set of predefined rules to detect whether some sensor input can be seen as
a specific gesture. This statement is not necessarily true. If the
programmer is given a way to separate the detection of different types of
gestures and flexibility in rule definitions, over-complexity can be
avoided.
% solution: trackers, e.g. a separate TapTracker and TransformationTracker
\section{Gesture recognition software for Windows 7}
% TODO
The online article \cite{win7touch} presents a Windows 7 application,
written in Microsoft's .NET. The application shows detected gestures on a
canvas. Gesture trackers keep track of stylus locations to detect specific
gestures. The event types required to track a touch stylus are ``stylus
down'', ``stylus move'' and ``stylus up'' events. A
\texttt{GestureTrackerManager} object dispatches these events to gesture
trackers. The application supports a limited number of predefined
gestures.

An important observation in this application is that different gestures are
detected by different gesture trackers, thus separating gesture detection
code into maintainable parts.
\section{Processing implementation of simple gestures in Android}
An implementation of a detection mechanism for some simple multi-touch
gestures (tap, double tap, rotation, pinch and drag) using
Processing\footnote{Processing is a Java-based development environment with
an export possibility for Android. See also \url{http://processing.org/}.}
can be found in a forum on the Processing website
\cite{processingMT}. The implementation is fairly simple, but it yields
some very appealing results. The detection logic of all gestures is
combined in a single class. This does not allow for extensibility, because
the complexity of this class would increase to an undesirable level (as
predicted by the GART article \cite{GART}). However, the detection logic
itself is partially re-used in the reference implementation of the
universal gesture detection mechanism.
\chapter{Preliminary}

\section{The TUIO protocol}
\label{sec:tuio}
The TUIO protocol \cite{TUIO} defines a way to geometrically describe
tangible objects, such as fingers or fiducials on a multi-touch table. The
table used for this thesis uses the protocol in its driver. Object
information is sent to the TUIO UDP port (3333 by default).

For efficiency reasons, the TUIO protocol is encoded using the Open Sound
Control \cite[OSC]{OSC} format. An OSC server/client implementation is
available for Python: pyOSC \cite{pyOSC}.
A Python implementation of the TUIO protocol also exists: pyTUIO
\cite{pyTUIO}. However, the execution of an example script yields an error
regarding Python's built-in \texttt{socket} library. Therefore, the
reference implementation uses the pyOSC package to receive TUIO messages.

The two most important message types of the protocol are ALIVE and SET
messages. An ALIVE message contains the list of session IDs that are
currently ``active'', which in the case of a multi-touch table means that
they are touching the screen. A SET message provides geometric information
for a session ID, such as position, velocity and acceleration.
Each session ID represents an object. The only type of object on the
multi-touch table is what the TUIO protocol calls ``2DCur'', which is an
(x, y) position on the screen.

ALIVE messages can be used to determine when an object touches and releases
the screen. For example, if a session ID was in the previous message but
not in the current one, the object it represents has been lifted from the
screen.
SET messages provide information about movement. In the case of simple
(x, y) positions, only the movement vector of the position itself can be
calculated. For more complex objects such as fiducials, attributes such as
rotational position are also included.
ALIVE and SET messages can be combined to create ``point down'', ``point
move'' and ``point up'' events (as used by the .NET application
\cite{win7touch}).
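
To make this concrete, the following sketch (in Python, the language of the
reference implementation) shows how two consecutive ALIVE messages can be
compared to derive such point events. The function and variable names are
illustrative and are not taken from the reference implementation.
\begin{verbatim}
# Sketch: deriving point events from TUIO ALIVE and SET messages.
# All names are illustrative; thresholds and error handling are omitted.

previous_alive = set()  # session IDs listed in the previous ALIVE message
positions = {}          # latest (x, y) per session ID, taken from SET messages

def on_alive(session_ids):
    """Compare a new ALIVE list with the previous one."""
    global previous_alive
    current = set(session_ids)

    for sid in current - previous_alive:  # new IDs: object placed on the table
        trigger('point_down', sid)
    for sid in previous_alive - current:  # missing IDs: object lifted
        trigger('point_up', sid, positions.pop(sid, None))

    previous_alive = current

def on_set(sid, x, y):
    """A SET message updates the position of an object."""
    positions[sid] = (x, y)
    trigger('point_move', sid, (x, y))

def trigger(event_type, sid, position=None):
    print(event_type, sid, position)  # placeholder for actual event handling
\end{verbatim}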
TUIO coordinates range from $0.0$ to $1.0$, with $(0.0, 0.0)$ being the
top left corner of the screen and $(1.0, 1.0)$ the bottom right corner. To
focus events within a window, a translation to window coordinates is
required in the client application, as stated by the online specification
\cite{TUIO_specification}:
\begin{quote}
In order to compute the X and Y coordinates for the 2D profiles a TUIO
tracker implementation needs to divide these values by the actual
sensor dimension, while a TUIO client implementation consequently can
scale these values back to the actual screen dimension.
\end{quote}
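
As an illustration, a minimal Python sketch of this translation from
normalized TUIO coordinates to pixel coordinates relative to a window; the
window position and screen size used in the example are hypothetical.
\begin{verbatim}
def tuio_to_window(x, y, window_pos, screen_size):
    """Scale normalized TUIO coordinates (0.0-1.0) to screen pixels and
    translate them to coordinates relative to the given window position."""
    screen_w, screen_h = screen_size
    win_x, win_y = window_pos
    return (x * screen_w - win_x, y * screen_h - win_y)

# Hypothetical numbers: a 1920x1080 screen and a window located at (100, 50).
print(tuio_to_window(0.5, 0.5, (100, 50), (1920, 1080)))  # (860.0, 490.0)
\end{verbatim}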
\section{The Visualization Toolkit}
\label{sec:vtk}
% TODO
\chapter{Experiments}
% Test implementation with taps, rotation and pinch. This showed:
% - that there are different ways to detect e.g. "rotation" (and that it
%   must be possible to distinguish between them)
% - that detection of different kinds of gestures must be separable,
%   otherwise the code becomes chaotic.
% - A number of choices were made when designing the gestures, e.g. that
%   rotation uses ALL fingers for the centroid. In another program it may
%   be necessary to use only one hand, and thus to pick points that are
%   close together (solution: windows).
% Drawing program that draws the current points + centroid and with which
% transformations can be tested. Link to appendix "supported events".
% Proof of Concept: VTK interactor
\section{Experimenting with TUIO and event bindings}
\label{sec:experimental-draw}
When designing a software library, its API should be understandable and
easy to use for programmers. To determine the basic requirements for a
usable API, an experimental program has been written based on the
Processing code from \cite{processingMT}. The program receives TUIO events
and translates them to point \emph{down}, \emph{move} and \emph{up} events.
These events are then interpreted as (double or single) \emph{tap},
\emph{rotation} or \emph{pinch} gestures. A simple drawing program then
draws the current state to the screen using the PyGame library. The output
of the program can be seen in figure \ref{fig:draw}.

\begin{figure}[H]
\centering
\includegraphics[scale=0.4]{data/experimental_draw.png}
\caption{Output of the experimental drawing program. It draws the touch
points and their centroid on the screen (the centroid is used
as center point for rotation and pinch detection). It also
draws a green rectangle which responds to rotation and pinch
events.}
\label{fig:draw}
\end{figure}
One of the first observations is the fact that TUIO's \texttt{SET} messages
use the TUIO coordinate system, as described in section \ref{sec:tuio}.
The test program multiplies these coordinates by its own window dimensions,
thus showing the entire screen area in its window. Also, the implementation
only works with the TUIO protocol; other drivers are not supported.

Though using relatively simple math, the rotation and pinch events work
surprisingly well. Both rotation and pinch use the centroid of all touch
points. A \emph{rotation} gesture uses the difference in angle relative to
the centroid of all touch points, and \emph{pinch} uses the difference in
distance. Both values are normalized using division by the number of touch
points. A pinch event contains a scale factor, and therefore uses a
division of the current by the previous average distance to the centroid.
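
The essence of this computation can be sketched as follows in Python. This
is a simplified reconstruction of the approach described above, not the
actual code of the experimental program; for brevity it uses the centroid
of the current points for both snapshots.
\begin{verbatim}
from math import atan2, hypot

def centroid(points):
    n = len(points)
    return (sum(x for x, y in points) / n, sum(y for x, y in points) / n)

def rotation_and_pinch(previous, current):
    """Return (angle difference, scale factor) between two snapshots of the
    same touch points (same order assumed in both lists)."""
    cx, cy = centroid(current)
    n = len(current)

    # Average change in angle relative to the centroid (rotation).
    angle = sum(atan2(y1 - cy, x1 - cx) - atan2(y0 - cy, x0 - cx)
                for (x0, y0), (x1, y1) in zip(previous, current)) / n

    # Current average distance to the centroid divided by the previous
    # average distance (pinch scale factor).
    prev_dist = sum(hypot(x - cx, y - cy) for x, y in previous) / n
    cur_dist = sum(hypot(x - cx, y - cy) for x, y in current) / n
    scale = cur_dist / prev_dist if prev_dist else 1.0

    return angle, scale
\end{verbatim}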
There is a flaw in this implementation. Since the centroid is calculated
using all current touch points, there cannot be two or more rotation or
pinch gestures simultaneously. On a large multi-touch table, it is
desirable to support interaction with multiple hands, or multiple persons,
at the same time.

Also, the different detection algorithms are all implemented in the same
file, making it complex to read or debug, and difficult to extend.

\section{VTK interactor}
% TODO
% VTK has its own pipeline; the mechanism must run alongside it

\section{Summary of observations}
\label{sec:observations}
\begin{itemize}
\item The TUIO protocol uses a distinctive coordinate system and set of
messages.
\item Touch events occur outside of the application window.
\item Gestures that use multiple touch points use all touch
points (not a subset of them).
\item Code complexity increases when detection algorithms are added.
\item % TODO: VTK interactor observations
\end{itemize}

% -------
% Results
% -------
\chapter{Design}

\section{Requirements}
\label{sec:requirements}
From the observations in section \ref{sec:observations}, a number of
requirements can be specified for the design of the event mechanism:
\begin{itemize}
% translate driver-specific events to a common format
\item To be able to support multiple input drivers, there must be a
translation from driver-specific messages to some common format
that can be used in gesture detection algorithms.
% assign events to a GUI window (windows)
\item An application GUI window should be able to receive only events
occurring within that window, and not outside of it.
% separate groups of touch points for different gestures (windows)
\item To support multiple objects that are performing different
gestures at the same time, the mechanism must be able to perform
gesture detection on a subset of the active touch points.
% separate detection code for different gesture types
\item To avoid an increase in code complexity when adding new detection
algorithms, detection code of different gesture types must be
separated.
\end{itemize}
\section{Components}
Based on the requirements from section \ref{sec:requirements}, a design
for the mechanism has been created. The design consists of a number of
components, each having a specific set of tasks.

\subsection{Event server}
% translation from driver messages to point down, move, up
% translation to screen pixel coordinates
% TUIO in reference implementation
The \emph{event server} is an abstraction for driver-specific server
implementations, such as a TUIO server. It receives driver-specific
messages and translates these to a common set of events and a common
coordinate system.

A minimal example of a common set of events is $\{point\_down,
point\_move, point\_up\}$. This is the set used by the reference
implementation. Respectively, these events represent an object being
placed on the screen, moving along the surface of the screen, and being
released from the screen.
A more extended set could also contain the same three events for a
surface touching the screen. However, a surface can have a rotational
property, like the ``fiducials'' type in the TUIO protocol. This
results in the set $\{point\_down, point\_move, point\_up, surface\_down,
surface\_move, surface\_up,\\surface\_rotate\}$.

An important note here is that similar events triggered by different
event servers must have the same event type and parameters. In other
words, the output of the event servers should be determined by the
gesture server (not the other way around).

The output of an event server implementation should also use a common
coordinate system, namely the coordinate system used by the gesture
server. For example, the reference implementation uses screen
coordinates in pixels, where (0, 0) is the upper left corner of the
screen.

The abstract class definition of the event server should provide some
functionality to detect which driver-specific event server
implementation should be used.
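
A minimal sketch of what such an abstract event server could look like in
Python is given below; the class and method names are illustrative and are
not prescribed by the design.
\begin{verbatim}
# Sketch of an abstract event server; all names are illustrative.
class EventServer(object):
    """Translates driver-specific messages into point_down, point_move and
    point_up events in screen pixel coordinates."""

    def __init__(self, gesture_server):
        self.gesture_server = gesture_server

    def start(self):
        raise NotImplementedError

    @staticmethod
    def create(gesture_server):
        """Select a driver-specific implementation, e.g. the first subclass
        that reports its driver as available."""
        for cls in EventServer.__subclasses__():
            if cls.available():
                return cls(gesture_server)
        raise RuntimeError('no supported touch driver found')

class TUIOEventServer(EventServer):
    @staticmethod
    def available():
        return True  # e.g. check whether TUIO messages arrive on UDP port 3333

    def start(self):
        pass  # receive ALIVE/SET messages and trigger point events
\end{verbatim}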
\subsection{Gesture trackers}
A \emph{gesture tracker} detects a single gesture type, given a set of
touch points. If one group of points on the screen is assigned to one
tracker and another group to another tracker, multiple gestures can be
detected at the same time. For this assignment, the mechanism uses
windows. These will be described in the next section.

% event binding/triggering
A gesture tracker triggers a gesture event by executing a callback.
Callbacks are ``bound'' to a tracker by the application. Because
multiple gesture types can have very similar detection algorithms, a
tracker can detect multiple different types of gestures. For instance,
the rotation and pinch gestures from the experimental program in
section \ref{sec:experimental-draw} both use the centroid of all touch
points.
If no callback is bound for a particular gesture type, no detection of
that type is needed. A tracker implementation can use this knowledge
for code optimization.

% separation of detection logic
A tracker implementation defines the gesture types it can trigger, and
the detection algorithms to trigger them. Consequently, detection
algorithms can be separated into different trackers. Different
trackers can be saved in different files, reducing the complexity of
the code in a single file.

% extensibility
Because each tracker defines its own set of gesture types, the application
developer can define application-specific trackers (by extending a base
\texttt{GestureTracker} class, for example). In fact, any built-in
gesture trackers of an implementation are also created this way. This
allows for a plugin-like way of programming, which is very desirable if
someone would want to build a library of gesture trackers. Such a
library can easily be extended by others.
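
To illustrate, a minimal sketch of such a base class and a derived tracker
in Python; the names are illustrative, and the time and distance thresholds
a real tap detector would need are omitted.
\begin{verbatim}
# Sketch of a gesture tracker base class; names are illustrative.
class GestureTracker(object):
    gesture_types = []  # gesture types a subclass can trigger

    def __init__(self):
        self.handlers = {}

    def bind(self, gesture_type, handler):
        self.handlers.setdefault(gesture_type, []).append(handler)

    def detects(self, gesture_type):
        """True if a callback is bound, i.e. detection is actually needed."""
        return bool(self.handlers.get(gesture_type))

    def trigger(self, gesture_type, **params):
        for handler in self.handlers.get(gesture_type, []):
            handler(gesture_type, **params)

    # Point events delegated by a window; subclasses override these.
    def on_point_down(self, point): pass
    def on_point_move(self, point): pass
    def on_point_up(self, point): pass

class TapTracker(GestureTracker):
    """Triggers 'tap' when a point is released (thresholds omitted)."""
    gesture_types = ['tap']

    def on_point_up(self, point):
        if self.detects('tap'):
            self.trigger('tap', position=point)
\end{verbatim}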
\subsection{Windows}
A \emph{window} represents a subset of the entire screen surface. The
goal of a window is to restrict the detection of certain gestures to
certain areas. A window contains a list of touch points, and a list of
trackers. The gesture server (defined in the next section) assigns touch
points to a window, but the window itself defines the functionality to
check whether a touch point is inside the window. This way, new windows
can be defined to fit over any 2D object used by the application.

The first and most obvious use of a window is to restrict touch events
to a single application window. However, windows can be used in much
more powerful ways.
Consider, for example, an application that contains an image with a
transparent background that can be dragged around. The user can only
drag the image by touching its foreground. To accomplish this, the
application programmer can define a window type that uses a bitmap to
determine whether a touch point is on the visible image surface. The
tracker which detects drag gestures is then bound to this window,
limiting the occurrence of drag events to the image surface.
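
A sketch of a rectangular window and such a bitmap-based window type is
shown below (Python, illustrative names; the bitmap format is an assumption
made for the example).
\begin{verbatim}
# Sketch of window types; names and bitmap format are illustrative.
class Window(object):
    """Rectangular screen area with its own touch points and trackers."""

    def __init__(self, x, y, width, height):
        self.x, self.y, self.width, self.height = x, y, width, height
        self.points = []
        self.trackers = []

    def contains(self, x, y):
        return (self.x <= x < self.x + self.width and
                self.y <= y < self.y + self.height)

class BitmapWindow(Window):
    """Only accepts touch points on visible (non-transparent) pixels, as in
    the draggable image example above. The bitmap is assumed to be a 2D
    list of booleans, True meaning visible."""

    def __init__(self, x, y, bitmap):
        super(BitmapWindow, self).__init__(x, y, len(bitmap[0]), len(bitmap))
        self.bitmap = bitmap

    def contains(self, x, y):
        if not Window.contains(self, x, y):
            return False
        return self.bitmap[int(y - self.y)][int(x - self.x)]
\end{verbatim}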
% assigning events to part of the screen:
% TUIO coordinates span the whole screen and range from 0.0 to 1.0, so
% they must be translated to pixel coordinates within a ``window''
% TODO

\subsection{Gesture server}
% listens for point down, move, up events
The \emph{gesture server} delegates events from the event server to the
set of windows that contain the touch points related to the events.
% assignment of a point (down) to window(s)
The gesture server contains a list of windows. When the event server
triggers an event, the gesture server ``asks'' each window whether it
contains the related touch point. If so, the window updates its gesture
trackers, which can then trigger gestures.
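
A sketch of this delegation, reusing the window and tracker sketches from
the previous subsections, could look as follows (illustrative names, not
the reference implementation):
\begin{verbatim}
# Sketch of the gesture server's delegation of point events to windows.
class GestureServer(object):
    def __init__(self):
        self.windows = []

    def add_window(self, window):
        self.windows.append(window)

    def on_event(self, event_type, x, y):
        """Called by the event server with 'point_down', 'point_move' or
        'point_up' and a position in screen pixel coordinates."""
        for window in self.windows:
            if window.contains(x, y):
                # The window forwards the event to its gesture trackers,
                # which may in turn trigger bound gesture callbacks.
                for tracker in window.trackers:
                    getattr(tracker, 'on_' + event_type)((x, y))
\end{verbatim}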
\section{Diagram of component relations}
\begin{figure}[H]
\input{data/diagram}
% TODO: caption
\end{figure}
\section{Example usage}
This section describes an example that illustrates the communication
between the different components. The example application listens to tap
events in a GUI window.
\begin{verbatim}
# Create a gesture server that will be started later
server = new GestureServer object

# Add a new window to the server, representing the GUI
window = new Window object
set window position and size to that of the GUI window
add window to server

# Define a handler that must be triggered when a tap gesture is detected
begin function handler(gesture)
    # Do something
end function

# Create a tracker that detects tap gestures
tracker = new TapTracker object  # Where TapTracker is an implementation of
                                 # the abstract Tracker
add tracker to window
bind handler to tracker.tap

# If the GUI toolkit allows it, bind window movement and resize handlers
# that alter the position and size of the window object

# Start the gesture server (which in turn starts a driver-specific event
# server)
start server
\end{verbatim}
\section{Network protocol}
% TODO
% Use ZeroMQ for communication between multiple processes (in
% different languages)

\chapter{Reference implementation}
% TODO
% only window.contains on point down, not on move/up

\chapter{Integration in VTK}
% VTK interactor

%\chapter{Conclusions}
% TODO
% Windows are a way to assign global events to application windows
% Trackers are an effective way to detect gestures
% Trackers are extensible through object orientation

\chapter{Suggestions for future work}
% TODO
% Network protocol (ZeroMQ) for multiple languages and simultaneous processes
% Also: an extra layer that creates gesture windows corresponding to the
% windows of the window manager
% State machine
% Windows in a tree structure for efficiency

\bibliographystyle{plain}
\bibliography{report}{}
%\appendix
\end{document}