Motivated by applications to adaptive filtering involving joint parameter and state estimation in hidden Markov models, we describe a new approach to MCMC that uses sequential state substitutions for its Metropolis-Hastings-type transitions. The basic idea is to approximate the target distribution by the empirical distribution of N representative atoms, chosen sequentially by an MCMC scheme so that the empirical distribution converges weakly to the target distribution as the number K of iterations approaches infinity. We develop the asymptotic theory of this scheme by using coupling arguments together with bounds on a weighted total variation norm of the signed measure given by the difference between the target distribution and the empirical measure induced by the sample paths of the MCMC scheme. In particular, we prove the asymptotic normality (as both K and N become infinite) of estimates of functionals of the target distribution obtained by the new MCMC method, provide consistent estimates of their standard errors, and derive oracle properties that establish their asymptotic optimality. Implementation details and applications, particularly to image analysis with uncertainty quantification, and to Bayesian structural equation modeling and generalized diagnostic classification for adaptive testing in psychometrics, are also given.
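To convey the basic idea, the following is a minimal illustrative sketch, not the paper's actual algorithm: it maintains N atoms and, at each iteration, attempts a Metropolis-Hastings substitution of each atom, so that the empirical distribution of the atoms approximates the target. The target density (a standard normal, via `log_target`), the proposal scale `step`, and all parameter values are assumptions chosen only for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Toy target: standard-normal log-density, up to an additive constant.
    # (Assumption for illustration; any log-density evaluable pointwise works.)
    return -0.5 * x**2

N = 200        # number of representative atoms
K = 2000       # number of MCMC iterations (sweeps)
burn_in = 1000
step = 1.0     # random-walk proposal scale (assumed)

atoms = rng.normal(size=N)   # initial atom positions
pooled = []                  # post-burn-in atom states

for k in range(K):
    # One sweep: propose a substitution for every atom and accept or
    # reject it with the usual Metropolis-Hastings probability.
    proposals = atoms + step * rng.normal(size=N)
    log_accept = log_target(proposals) - log_target(atoms)
    accept = np.log(rng.uniform(size=N)) < log_accept
    atoms = np.where(accept, proposals, atoms)
    if k >= burn_in:
        pooled.append(atoms.copy())

# Functionals of the target are estimated from the empirical measure.
samples = np.concatenate(pooled)
est_mean = samples.mean()
est_std = samples.std()
```

Here the empirical mean and standard deviation of the pooled atoms should approach the target's values (0 and 1) as K and N grow, mirroring the weak-convergence property the abstract describes.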