Journal of System Simulation ›› 2016, Vol. 28 ›› Issue (10): 2378-2385.


360-Degree Virtual Fitting Based on Kinect

Zhang Xiaoli, Yao Junfeng, Huang Ping   

  1. Center for Digital Media Computing, Software School, Xiamen University, Xiamen 361005, China
  • Received: 2016-05-31  Revised: 2016-07-11  Online: 2016-10-08  Published: 2020-08-13

Abstract: Many virtual fitting systems address only human-computer interaction and clothing simulation; they cannot make the clothing model rotate 360 degrees along with the human body. To solve this problem, an improved Kinect-based virtual fitting method using motion prediction was proposed. With the help of Kinect, the user's skeletal feature points were obtained and tracked in real time. Using the obtained head joint point and the color image, the face was detected to judge whether the user was facing toward or away from the camera. The motion trajectories of the left and right shoulder joint points were predicted with a gray model, and when the depth coordinate of a joint point varied sharply, the data obtained by Kinect were corrected. The proposed method has the following advantages: sense of reality, as the system realizes real-time 360-degree virtual fitting; and real-time performance, as the gray forecast produces predictions quickly, achieving real-time rotation of the clothing model with the human body. Experimental results show that the 3D virtual fitting system achieves a good fitting effect.
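To illustrate the gray-prediction step described above, the following is a minimal sketch of GM(1,1) grey-model forecasting applied to a short sequence of joint coordinates. The function name, window length, and the example data are illustrative assumptions, not the paper's exact formulation or parameters.

```python
# Minimal GM(1,1) grey-prediction sketch (illustrative, not the paper's code).
import numpy as np

def gm11_predict(x0, steps=1):
    """Predict the next `steps` values of a short 1-D sequence x0
    (e.g. a shoulder joint's depth coordinate over recent frames)."""
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                        # accumulated generating sequence
    z1 = 0.5 * (x1[1:] + x1[:-1])             # mean of consecutive accumulated values
    B = np.column_stack((-z1, np.ones(n - 1)))
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]  # developing coefficient a, grey input b

    def x1_hat(k):                            # fitted accumulated value at 0-based index k
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    # Inverse accumulation recovers predictions in the original series.
    return np.array([x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)])

# Hypothetical usage: predict the next depth value of a shoulder joint from
# five recent frames; if the new Kinect reading jumps far from this prediction,
# the reading could be replaced by the predicted value.
recent_depth = [2.01, 2.03, 2.02, 2.05, 2.06]
print(gm11_predict(recent_depth, steps=1))
```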

Key words: Kinect, skeletal feature points, face detection, gray prediction, 360-degree rotation
