Journal of System Simulation ›› 2018, Vol. 30 ›› Issue (6): 2027-2035. doi: 10.16182/j.issn1004731x.joss.201806004
Zhao Xincan, Pan Shihao, Wang Yaping, Tie Yun

Received: 2016-04-27
Revised: 2016-08-03
Online: 2018-06-08
Published: 2018-06-14

About the authors: Zhao Xincan (1972-), male, from Caoxian, Shandong; PhD, associate professor; research interests: augmented reality, human-computer interaction. Pan Shihao (1990-), male, from Pingdingshan, Henan; master's student; research interests: virtual reality, human-computer interaction.
Abstract: In large immersive virtual environments, human-computer interaction depends entirely on body movements and is inefficient. To address this, 3D gaze tracking is used to obtain the user's point of regard and drive interaction, providing a natural, bidirectional interaction modality for immersive environments. A Leap Motion is innovatively repurposed for pupil position tracking, a passive optical tracking device captures the user's head motion, and a mapping equation obtained from an initial calibration estimates the user's 3D gaze point while the user moves freely over a large space. Experiments show that when the user moves freely within a 3.0 m × 3.2 m × 2.0 m space, the integrated system estimates the 3D gaze point at up to 60 Hz with an error below 45 mm, laying a foundation for the wide application of gaze tracking in immersive virtual environments.
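The pipeline in the abstract — pupil position plus tracked head pose fed through a calibration-derived mapping equation — can be sketched as follows. This is a minimal illustration, not the paper's actual method: the second-order polynomial mapping, the function names, and the coefficient layout are all assumptions, since the paper's mapping equation is not reproduced here.

```python
import numpy as np

def poly_features(px, py):
    """Second-order polynomial features of the 2D pupil position."""
    return np.array([1.0, px, py, px * py, px**2, py**2])

def gaze_direction_head(px, py, cx, cy):
    """Map a pupil position to yaw/pitch (radians) in the head frame.

    cx, cy: 6-element coefficient vectors, assumed fitted by least
    squares from the initial calibration points.
    """
    f = poly_features(px, py)
    yaw, pitch = f @ cx, f @ cy
    # Unit gaze direction in the head coordinate frame.
    return np.array([np.cos(pitch) * np.sin(yaw),
                     np.sin(pitch),
                     np.cos(pitch) * np.cos(yaw)])

def gaze_ray_world(px, py, cx, cy, R_head, t_head):
    """Rotate the head-frame gaze direction into world space using the
    head pose (R_head, t_head) from the optical tracking system.
    Returns the ray origin and unit direction."""
    d = R_head @ gaze_direction_head(px, py, cx, cy)
    return t_head, d / np.linalg.norm(d)
```

With one such ray per eye, a 3D fixation point could then be taken as the point of closest approach between the two rays; the coordination of pupil tracking with the 60 Hz optically tracked head pose is what allows estimation under free movement in the large tracked volume.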
Zhao Xincan, Pan Shihao, Wang Yaping, Tie Yun. Eye Gaze Tracking in 3D Immersive Environments[J]. Journal of System Simulation, 2018, 30(6): 2027-2035.