We proposed an IR-based facial expression tracking sensor for head-mounted displays (HMDs). The proposed sensor exploits the lateral propagation characteristics of IR light on human skin to capture the degree of compression or stretching of the facial skin. We derived a semi-empirical equation modeling the lateral propagation of vertically incident IR light in human skin, with parameters fitted to the measured data. All IR emitters and receivers were integrated into the foam interface of a commercial VR headset. Using the implemented prototype headset, we experimentally verified the tracking performance on four basic facial expressions. As future work, we will improve the proposed sensor to recognize the emotional expressions of the HMD user by applying machine learning techniques, aiming to enable immersive human-to-human (or human-to-avatar) communication in cyberspace.
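As an illustration of the parameter-fitting step described above, the sketch below fits a simple exponential decay model of received IR intensity versus lateral emitter-receiver distance on skin. The model form I(d) = I0 · exp(-d/δ), the measurement values, and the variable names are illustrative assumptions for this sketch, not the paper's actual semi-empirical equation or data.

```python
import math

# Hypothetical measurements (arbitrary units), NOT from the paper:
# received IR intensity at several lateral distances from the emitter.
distances_mm = [2.0, 4.0, 6.0, 8.0, 10.0]
intensities = [0.81, 0.40, 0.20, 0.10, 0.05]

def fit_exponential(xs, ys):
    """Fit I(d) = I0 * exp(-d / delta) via linear least squares on log(I).

    Returns (I0, delta); delta is the characteristic decay length.
    """
    logs = [math.log(y) for y in ys]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(logs) / n
    # Closed-form simple linear regression on (d, log I).
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, logs))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return math.exp(intercept), -1.0 / slope

I0, delta = fit_exponential(distances_mm, intensities)
print(f"I0 = {I0:.3f}, delta = {delta:.3f} mm")
```

Skin deformation (compression or stretching) would then show up as a change in the fitted decay length, which is the quantity such a sensor could track.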