In this paper we develop a family of randomized projection methods (RPM) for solving the convex feasibility problem. Our approach is based on several new ideas and tools, including stochastic approximation of convex sets, stochastic reformulations, and conditioning of the convex feasibility problem. In particular, we propose four equivalent stochastic reformulations: stochastic smooth and nonsmooth optimization problems, the stochastic fixed point problem, and the stochastic feasibility problem. The last reformulation asks for a point which belongs to a certain random set with probability one. In this setting, RPM can be interpreted as follows: we sample a batch of random sets in an independent and identically distributed (i.i.d.) fashion, project the current iterate onto each sampled set, and then combine these projections by averaging, together with a carefully designed extrapolated step size. We prove that under a stochastic linear regularity condition, RPM converges linearly, with a rate that has a natural interpretation as a condition number of the stochastic optimization reformulation and that depends explicitly on the number of sampled sets. In doing so, we extend the concept of condition number to general convex feasibility problems. This condition number depends on the linear regularity constant and on an additional key constant which can be interpreted as a Lipschitz constant of the gradient of the stochastic optimization reformulation. Besides providing a general framework for the design and analysis of randomized projection schemes, our results resolve an open problem in the literature concerning the theoretical explanation of the observed practical efficiency of extrapolated parallel projection methods. In addition, we prove that RPM converges sublinearly when the stochastic linear regularity condition does not hold. Preliminary numerical experiments also show that our extrapolated step-size scheme outperforms its constant step-size counterpart.
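To make the sample-project-average pattern described above concrete, the following is a minimal Python sketch for the special case where each convex set is a hyperplane (one row of a consistent linear system $Ax = b$). The uniform row sampling, the batch size, and the extrapolation rule used here (the classic ratio of mean squared displacement to squared mean displacement) are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def project_hyperplane(x, a, b):
    # Exact projection of x onto the hyperplane {y : a @ y = b}.
    return x - ((a @ x - b) / (a @ a)) * a

def rpm(A, b, x0, batch_size=5, iters=2000, seed=0):
    """Sketch of a randomized projection method for the feasibility
    problem "find x with A x = b", each row defining one convex set.
    Hypothetical illustration, not the paper's exact algorithm."""
    rng = np.random.default_rng(seed)
    m = A.shape[0]
    x = x0.astype(float).copy()
    for _ in range(iters):
        # Sample a batch of sets i.i.d. (uniformly over the rows here).
        idx = rng.integers(0, m, size=batch_size)
        # Project the current iterate onto every sampled set.
        proj = np.array([project_hyperplane(x, A[i], b[i]) for i in idx])
        avg_step = proj.mean(axis=0) - x
        # Extrapolated step size: mean squared displacement over the
        # squared norm of the averaged displacement (always >= 1 by
        # Jensen's inequality); fall back to 1 near feasibility.
        num = np.mean(np.sum((proj - x) ** 2, axis=1))
        den = np.sum(avg_step ** 2)
        lam = num / den if den > 1e-16 else 1.0
        x = x + lam * avg_step
    return x
```

With a batch of one, `lam` equals 1 and the update reduces to an exact projection onto the single sampled hyperplane; larger batches make `lam > 1`, which is the extrapolation effect whose practical efficiency the paper analyzes.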