We investigate the effect of symbolic encoding applied to time series consisting of a deterministic signal and additive noise, as well as to time series given by a deterministic signal with randomly distributed initial conditions, as a model of event-related brain potentials. We introduce an estimator of the signal-to-noise ratio (SNR) of the system by means of time averages of running complexity measures such as Shannon and Rényi entropies, and prove its asymptotic equivalence to the linear SNR in the case of Shannon entropies of symbol distributions. An SNR improvement factor is defined; it exhibits a maximum at intermediate noise amplitudes, in analogy to stochastic resonance phenomena. We demonstrate that the maximum of the SNR improvement factor can be shifted toward smaller noise amplitudes by using higher-order Rényi entropies instead of the Shannon entropy. To further improve the SNR, a half-wave encoding of noisy time series is introduced. Finally, we discuss the effect of noisy phases on both the linear SNR and the SNR defined by symbolic dynamics, and show that longer symbol sequences improve the latter.
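The abstract refers to running complexity measures computed from symbol sequences of a noisy deterministic signal. The following minimal Python sketch illustrates one plausible realization of this idea: a quantile-based binary symbolization, Shannon entropies of symbol words, and a running entropy over sliding windows. The encoding, word length, window size, and the precise entropy-based SNR definition used in the paper may differ; the function names (`symbolize`, `word_entropy`, `running_entropy`) are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def symbolize(x, n_symbols=2):
    """Encode amplitudes into symbols via equiprobable (quantile) bins."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

def word_entropy(symbols, word_length=3):
    """Shannon entropy (in bits) of the distribution of symbol words."""
    words = np.lib.stride_tricks.sliding_window_view(symbols, word_length)
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def running_entropy(symbols, window=200, word_length=3):
    """Time-resolved ("running") word entropy over sliding windows."""
    return np.array([word_entropy(symbols[i:i + window], word_length)
                     for i in range(len(symbols) - window + 1)])

# Toy example: sinusoidal signal with additive Gaussian noise of varying amplitude.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
signal = np.sin(2 * np.pi * t)
for sigma in (0.1, 0.5, 2.0):
    s = symbolize(signal + sigma * rng.standard_normal(t.size))
    h = running_entropy(s)
    # Time average and fluctuation of the running entropy; an entropy-based
    # SNR estimate would be built from such quantities (assumed form only).
    print(f"sigma={sigma:.1f}  mean H={h.mean():.3f} bits  std H={h.std():.3f}")
```

Replacing the Shannon word entropy with a Rényi entropy of order q, \(H_q = \frac{1}{1-q}\log_2\sum_i p_i^q\), fits into the same sketch by changing only the `word_entropy` function.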