We introduce a hybrid geodesically convex optimization framework, motivated by two problems: computing the log-optimal portfolio and computing the Augustin information. The framework considers objective functions that are geodesically convex with respect to one Riemannian metric and have geodesically Lipschitz gradients with respect to another. Under this hybrid condition, we establish a non-asymptotic convergence guarantee for Riemannian gradient descent, proving that it converges at a rate of O(1/T), where T denotes the number of iterations. We then apply the framework to the two motivating problems.
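
As a rough sketch of the setting (the notation here is illustrative, not taken verbatim from the paper): let $g_1$ and $g_2$ be two Riemannian metrics on a manifold $\mathcal{M}$, and suppose the objective $f$ is geodesically convex with respect to $g_1$ while its gradient is geodesically Lipschitz with respect to $g_2$. Riemannian gradient descent then takes the form
\[
x_{t+1} = \exp_{x_t}\!\bigl( -\eta \, \operatorname{grad} f(x_t) \bigr), \qquad t = 0, 1, \ldots, T-1,
\]
where $\exp_{x_t}$ and $\operatorname{grad} f$ are taken with respect to $g_2$, and the step size $\eta$ is set according to the Lipschitz constant. The O(1/T) guarantee stated above may then be read as $f(x_T) - \min_{x \in \mathcal{M}} f(x) = O(1/T)$.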