The Kinect sensor has gained popularity in a large number of applications beyond its original design purpose as a 3D human interface device, including indoor navigation of pushcart and backpack sensor platforms. In this study, a performance analysis is provided based on a series of indoor tests for which sufficient control was available. This investigation aims to estimate and evaluate the total error budget of the trajectory recovery process, which is based on using both optical (2D) and depth (3D) imagery. The sensor error budget defines a lower bound for the trajectory estimation errors, i.e., what can be achieved under ideal conditions. The total error budget additionally includes the object space dependency: the error introduced by the scene content, namely the geometry and texture that can be exploited to identify matching features in the image sequence. While it is difficult to encapsulate the impact of the object space in a rigorous sense, tendencies can be identified through statistical evaluation of data acquired under typical object space scenarios. The test data used in this study was acquired at the Department of Civil, Environmental and Geodetic Engineering of The Ohio State University, using the updated prototype of the personal navigator developed earlier at the OSU Satellite Positioning and Inertial Navigation (SPIN) Laboratory.