In recent years, touchscreens have become the most common input device for a wide range of computers. While touchscreens are truly pervasive, commercial devices reduce the richness of touch input to two-dimensional positions on the screen. Recent work proposed interaction techniques that extend the input vocabulary using the finger's orientation. Approaches for determining a finger's orientation on off-the-shelf capacitive touchscreens proposed in previous work already enable compelling use cases. However, the low estimation accuracy limits the usability and restricts finger orientation to non-precise input. With this paper, we provide a ground-truth data set for capacitive touchscreens recorded with a high-precision motion capture system. Using this data set, we show that a convolutional neural network (CNN) can outperform the approaches proposed in previous work. Instead of relying on hand-crafted features, we trained the model on the raw capacitive images. Thereby, we reduce the pitch error by 9.8% and the yaw error by 45.7%.
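As a concrete illustration of the approach described above, here is a minimal PyTorch sketch of a CNN that regresses pitch and yaw directly from a raw capacitive image. The 15x27 input resolution, the layer sizes, and the plain two-value regression head are illustrative assumptions, not the architecture from the paper.

```python
import torch
import torch.nn as nn

class FingerOrientationCNN(nn.Module):
    """Minimal CNN regressing finger pitch and yaw from a raw capacitive image.

    The 15x27 input resolution and all layer sizes are illustrative
    assumptions, not the architecture used in the paper.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # raw capacitance values, single channel
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),  # make the head independent of the exact sensor grid size
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64),
            nn.ReLU(),
            nn.Linear(64, 2),  # outputs: [pitch, yaw]
        )

    def forward(self, x):
        return self.head(self.features(x))

# Example: a batch of 8 single-channel 15x27 capacitive images
model = FingerOrientationCNN()
capacitive_images = torch.randn(8, 1, 15, 27)
pitch_yaw = model(capacitive_images)  # shape: (8, 2)
```

One caveat with this sketch: yaw is periodic (it wraps around at 360°), so a plain regression head penalizes predictions near the wrap-around point even when they are angularly close; encoding the angle as its sine and cosine is a common way to sidestep this.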
Below is the BibTeX entry for citing this work:
@inproceedings{Mayer:2017:Orientation,
  title     = {Estimating the Finger Orientation on Capacitive Touchscreens Using Convolutional Neural Networks},
  author    = {Sven Mayer and Huy Viet Le and Niels Henze},
  year      = {2017},
  date      = {2017-10-17},
  booktitle = {Proceedings of the 2017 International Conference on Interactive Surfaces and Spaces},
  volume    = {17},
  publisher = {ACM},
  address   = {New York, NY, USA},
  series    = {ISS'17},
  keywords  = {Finger orientation; touchscreen; mobile device; capacitive sensing},
  doi       = {10.1145/3132272.3134130},
  url       = {https://doi.org/10.1145/3132272.3134130}
}