The conjugate gradient method is a popular choice for solving optimization problems owing to its simplicity and low storage requirements. In this paper, we propose a new conjugate gradient method. The proposed method possesses the sufficient descent property under the strong Wolfe line search. Under mild conditions, we prove that the method with the strong Wolfe line search is globally convergent even when the objective function is nonconvex. Finally, we present numerical experiments that demonstrate the efficiency of the proposed method.
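For context, the abstract does not specify the paper's conjugate gradient parameter, so the sketch below uses a standard PR+ update (not the proposed method) together with SciPy's strong-Wolfe line search, to illustrate the general algorithmic framework the paper builds on:

```python
import numpy as np
from scipy.optimize import line_search  # satisfies the strong Wolfe conditions

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic nonlinear CG sketch: PR+ beta (illustrative, not the
    paper's formula) with a strong Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:  # line search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PR+ parameter, truncated at zero to help keep descent directions
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic `f(x) = 0.5 x^T A x - b^T x`, whose unique minimizer solves `A x = b`, the iteration drives the gradient norm below the tolerance in a handful of steps.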