We study the solution of minimax problems $\min_x \max_y\, G(x) + \langle K(x), y\rangle - F^*(y)$ in finite-dimensional Hilbert spaces. The functionals G and F* are assumed to be convex, but the operator K is allowed to be nonlinear. We formulate a natural extension of the modified primal-dual hybrid gradient method, originally proposed for linear K by Chambolle and Pock. We prove the local convergence of the method, provided various technical conditions are satisfied; these include, in particular, the Aubin property of the inverse of a monotone operator at the solution. Of particular interest to us is the case arising from Tikhonov-type regularization of inverse problems with nonlinear forward operators. We are mainly interested in total variation and second-order total generalized variation priors. For such problems, we show that our general local convergence result holds when the noise level of the data f is low and the regularization parameter α is correspondingly small. We verify the numerical performance of the method by applying it to problems from magnetic resonance imaging (MRI) in chemical engineering and medicine; the specific applications are in diffusion tensor imaging and MR velocity imaging. These numerical studies show very promising performance. © 2014 IOP Publishing Ltd.