We show that, over a large region of parameter space, a model reaction-diffusion system with two species in a monostable regime produces Turing patterns coexisting with a limit cycle that cannot be discerned from the linear analysis. As a consequence, the patterns oscillate in time. When a single parameter is varied, a series of bifurcations leads to period-doubled, quasiperiodic, and chaotic oscillations without modifying the underlying Turing pattern, and a Ruelle-Takens-Newhouse route to chaos is identified. We also examine the Turing conditions for obtaining a diffusion-driven instability and show that, for certain values of the diffusion coefficients, the resulting patterns are not necessarily stationary. These results demonstrate the limitations of linear analysis for reaction-diffusion systems. © 2012 American Physical Society.
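The Turing conditions mentioned above can be sketched numerically. The paper's specific model is not given in this abstract, so the following is a minimal illustration using the standard linear conditions for a generic two-species system u_t = f(u,v) + D_u ∇²u, v_t = g(u,v) + D_v ∇²v, with the classic Brusselator as an assumed example; all parameter values here are hypothetical.

```python
# Standard Turing (diffusion-driven instability) conditions for a
# two-species reaction-diffusion system, checked on the Jacobian
# (fu, fv; gu, gv) of the kinetics at a homogeneous steady state.

def turing_unstable(fu, fv, gu, gv, Du, Dv):
    """Return True if the steady state is linearly stable without
    diffusion but destabilized by unequal diffusion (Turing instability)."""
    tr = fu + gv                      # trace of the Jacobian
    det = fu * gv - fv * gu           # determinant of the Jacobian
    stable_without_diffusion = tr < 0 and det > 0
    cross = Dv * fu + Du * gv         # diffusion-weighted cross term
    diffusion_destabilizes = cross > 0 and cross**2 > 4 * Du * Dv * det
    return stable_without_diffusion and diffusion_destabilizes

# Illustrative (assumed) model: the Brusselator, steady state (a, b/a),
# with Jacobian entries fu = b - 1, fv = a**2, gu = -b, gv = -a**2.
a, b = 2.0, 3.0
fu, fv, gu, gv = b - 1, a**2, -b, -a**2
print(turing_unstable(fu, fv, gu, gv, Du=1.0, Dv=10.0))  # → True (unequal diffusion)
print(turing_unstable(fu, fv, gu, gv, Du=1.0, Dv=1.0))   # → False (equal diffusion)
```

Note that these conditions come from the linear analysis alone; as the abstract stresses, they do not determine whether the resulting pattern is stationary or oscillating.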