\documentclass[twoside,openright]{uva-bachelor-thesis}
\usepackage[english]{babel}
\usepackage[utf8]{inputenc}
\usepackage{hyperref,graphicx,float}

% Link colors
\hypersetup{colorlinks=true,linkcolor=black,urlcolor=blue,citecolor=DarkGreen}

% Title page
\title{Universal multi-touch event mechanism}
\author{Taddeüs Kroes}
\supervisors{Dr. Robert G. Belleman (UvA)}
\signedby{Dr. Robert G. Belleman (UvA)}

\begin{document}

% Title page
\maketitle

\begin{abstract}
% TODO
\end{abstract}

% Set paragraph indentation
\parindent 0pt
\parskip 1.5ex plus 0.5ex minus 0.2ex

% Table of contents on a separate page
\tableofcontents
\chapter{Introduction}

% Rough problem statement
Multi-touch interaction is becoming increasingly common, mostly due to the wide
use of touch screens in phones and tablets. When programming applications that
use this method of interaction, the programmer needs an abstraction of the raw
data provided by the touch driver of the device. This abstraction exists in
several multi-touch application frameworks, such as Nokia's
Qt\footnote{\url{http://qt.nokia.com/}}. However, applications that do not use
these frameworks have no access to their multi-touch events.

% Motivation
This problem was observed during an attempt to create a multi-touch
``interactor'' class for the Visualization Toolkit (VTK \cite{VTK}). Since VTK
already provides the application framework in this case, it is undesirable to
add an entire framework like Qt alongside it solely for its multi-touch
support.

% Rough goal
The goal of this project is to define a universal multi-touch event triggering
mechanism. To test the definition, a reference implementation is written in
Python.

% Setting
To test multi-touch interaction properly, a multi-touch device is required.
The University of Amsterdam (UvA) has provided access to a multi-touch table
from PQlabs. The table uses the TUIO protocol \cite{TUIO} to communicate touch
events.
\section{Definition of the problem}

% Main question
The goal of this thesis is to create a multi-touch event triggering mechanism
for use in a VTK interactor. The design of the mechanism must be universal.

% Subquestions
To design such a mechanism properly, the following questions are relevant:
\begin{itemize}
    \item What is the input of the mechanism? Different touch drivers have
        different APIs. To be able to support different drivers (which is
        highly desirable), there should probably be a translation from the
        driver API to a fixed input format.
    \item How can extensibility be accomplished? The set of supported events
        should not be limited to a single implementation; an application
        should be able to define its own custom events.
    \item Can events be shared with multiple processes at the same time? For
        example, a network implementation could run as a service instead of
        within a single application, triggering events in any application that
        needs them.
    \item Is performance an issue? For example, an event loop with rotation
        detection could consume more processing resources than desired.
\end{itemize}

% Scope
The scope of this thesis includes the design of a multi-touch event triggering
mechanism, a reference implementation of this design, and its integration into
a VTK interactor. To be successful, the design should allow for extensions to
be added to any implementation. The reference implementation is a Proof of
Concept that translates TUIO events to some simple touch gestures that are
used by a VTK interactor.
\section{Structure of this document}
% TODO

\chapter{Related work}

\section{Gesture recognition software for Windows 7}
% TODO
\cite[test]{win7touch}

\section{The TUIO protocol}
The TUIO protocol \cite{TUIO} defines a way to geometrically describe tangible
objects, such as fingers or fiducials on a multi-touch table. The driver of
the table used for this thesis communicates using this protocol. Object
information is sent to the TUIO UDP port (3333 by default).

For efficiency reasons, the TUIO protocol is encoded using the Open Sound
Control (OSC)\footnote{\url{http://opensoundcontrol.org/specification}}
format. An OSC server/client implementation is available for Python:
pyOSC\footnote{\url{https://trac.v2.nl/wiki/pyOSC}}. A Python implementation
of the TUIO protocol also exists:
pyTUIO\footnote{\url{http://code.google.com/p/pytuio/}}. However, running one
of its example scripts yields an error regarding Python's built-in
\texttt{socket} library. Therefore, the reference implementation uses the
pyOSC package to receive TUIO messages.
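
For illustration, a minimal sketch of how TUIO messages could be received with
pyOSC is shown below. The handler name is made up for this example, and the
callback signature is an assumption based on pyOSC's message handler
interface; this is a sketch, not code from the reference implementation.
\begin{verbatim}
import OSC

def handle_2dcur(addr, tags, data, client_address):
    # The first argument of a /tuio/2Dcur message indicates its type:
    # 'alive', 'set' or 'fseq'.
    if data[0] == 'alive':
        print('active session ids: %s' % str(data[1:]))
    elif data[0] == 'set':
        session_id, x, y = data[1:4]
        print('cursor %d at (%.3f, %.3f)' % (session_id, x, y))

server = OSC.OSCServer(('localhost', 3333))
server.addMsgHandler('/tuio/2Dcur', handle_2dcur)
server.serve_forever()
\end{verbatim}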
The two most important message types of the protocol are ALIVE and SET
messages. An ALIVE message contains the list of session ids that are currently
``active'', which in the case of a multi-touch table means that they are
touching the screen. A SET message provides geometric information about a
session id, such as position, velocity and acceleration.

Each session id represents an object. The only type of object on the
multi-touch table is what the TUIO protocol calls ``2DCur'', an $(x, y)$
position on the screen.

ALIVE messages can be used to determine when an object touches and releases
the screen. For example, if a session id was present in the previous message
but not in the current one, the object it represents has been lifted from the
screen.
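
As a sketch of this idea (the function and variable names are illustrative and
not taken from the reference implementation), ``down'' and ``up'' events can
be derived by comparing the session ids of two consecutive ALIVE messages:
\begin{verbatim}
previous = set()

def handle_alive(session_ids):
    """Compare the session ids of the current ALIVE message with those
    of the previous one to detect new and lifted objects."""
    global previous
    current = set(session_ids)
    for sid in current - previous:
        print('point down: %d' % sid)  # object started touching the screen
    for sid in previous - current:
        print('point up: %d' % sid)    # object was lifted from the screen
    previous = current
\end{verbatim}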
SET messages provide information about movement. In the case of simple
$(x, y)$ positions, only the movement vector of the position itself can be
calculated. For more complex objects such as fiducials, attributes like
rotational position are also included.
TUIO coordinates range from $0.0$ to $1.0$, with $(0.0, 0.0)$ being the top
left corner of the screen and $(1.0, 1.0)$ the bottom right corner. To focus
events within a window, a translation to window coordinates is required in the
client application, as stated by the online specification
\cite{TUIO_specification}:
\begin{quote}
    In order to compute the X and Y coordinates for the 2D profiles a TUIO
    tracker implementation needs to divide these values by the actual
    sensor dimension, while a TUIO client implementation consequently can
    scale these values back to the actual screen dimension.
\end{quote}
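
As an example, the translation from normalized TUIO coordinates to pixel
coordinates relative to a window could look as follows. The screen dimensions
and the window position used here are hypothetical values; this is a sketch of
the idea, not the code of the reference implementation.
\begin{verbatim}
SCREEN_WIDTH, SCREEN_HEIGHT = 1920, 1080  # assumed screen size in pixels

def tuio_to_window(x, y, win_x, win_y):
    """Translate normalized TUIO coordinates (0.0-1.0 over the entire
    screen) to pixel coordinates relative to a window whose top left
    corner is at (win_x, win_y) on the screen."""
    screen_x = x * SCREEN_WIDTH
    screen_y = y * SCREEN_HEIGHT
    return screen_x - win_x, screen_y - win_y
\end{verbatim}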
% TODO

\chapter{Experiments}
% Test implementation with taps, rotation and pinch. This showed:
% - that there are several ways to detect e.g. "rotation" (and that it
%   must be possible to distinguish between them)
% - that the detection of different kinds of gestures must be separable,
%   otherwise it becomes chaotic.
% - A number of choices were made while designing the gestures, e.g. that
%   rotation uses ALL fingers for the centroid. Another program might need
%   to use only one hand, and thus to pick points close to each other
%   (solution: windows).
% Drawing program that draws the current points + centroid, and with which
% the transformations can be tested. Link to appendix "supported events"
% Proof of Concept: VTK interactor
% -------
% Results
% -------
\chapter{Design}
% TODO: link to appendix with schema

\section{Input server}
% TODO
% translation of driver messages to point down, move and up events
% TUIO in reference implementation

\section{Gesture server}

\subsection{Windows}
% TODO
% assigning events to a part of the screen:
% TUIO coordinates cover the entire screen and range from 0.0 to 1.0, so
% they have to be translated to pixel coordinates within a ``window''

\subsection{Trackers}
% TODO
% event binding/triggering
% extensibility
\chapter{Reference implementation}
% TODO
% draw.py
% VTK interactor

\chapter{Conclusions}
% TODO
% Windows are a way to assign global events to application windows
% Trackers are an effective way to detect gestures
% Trackers are extensible through object orientation

\chapter{Suggestions for future work}
% TODO: Network protocol (ZeroMQ)

\bibliography{report}{}
\bibliographystyle{plain}
  169. \chapter{Schema of mechanism structure}
  170. \label{app:schema}
  171. \begin{figure}[H]
  172. \hspace{-14em}
  173. \includegraphics{data/server_scheme.pdf}
  174. \caption{}
  175. %TODO: caption
  176. \end{figure}
  177. \chapter{Supported events in reference implementation}
  178. \label{app:supported-events}
  179. % TODO
  180. \end{document}