Distributed vector representations are a key bridging point between connectionist and symbolic representations of cognition. It is unclear, however, how uncertainty should be modelled in systems that use such representations. One may place vector-valued distributions over vector representations, although doing so may assign non-zero probability to vector symbols that cannot occur. In this paper we discuss how bundles of symbols in Vector Symbolic Architectures (VSAs) can be understood as defining an object closely related to a probability distribution, and how statements in VSAs can be read as analogues of probabilistic statements. We also sketch novel designs for networks that compute entropy and mutual information. We restrict our attention to operators proposed for Holographic Reduced Representations and to the representation of real-valued data. However, we suggest that these methods should translate to any VSA in which the dot product between fractionally bound symbols induces a valid kernel.
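As a minimal illustration of the kernel property appealed to in the final sentence, the sketch below assumes an FHRR-style encoding in which a base symbol is a vector of unit-modulus complex phasors with phases drawn uniformly from (-pi, pi), and fractional binding exponentiates those phases; the dimensionality d = 2048 and the helper names fractional_bind and similarity are illustrative choices, not definitions taken from the paper. Under these assumptions the empirical dot product between two fractionally bound symbols approximates a sinc kernel in the difference of the encoded values.

```python
import numpy as np

# Sketch: fractional binding and the kernel induced by the dot product.
# Assumptions (not from the paper): FHRR-style phasor encoding, d = 2048,
# phases drawn uniformly from (-pi, pi).

rng = np.random.default_rng(seed=0)
d = 2048
phases = rng.uniform(-np.pi, np.pi, size=d)  # random base symbol (Fourier view)

def fractional_bind(x: float) -> np.ndarray:
    """Bind the base symbol to itself a fractional number of times x."""
    return np.exp(1j * phases * x)

def similarity(x: float, y: float) -> float:
    """Normalized dot product between two fractionally bound symbols."""
    return float(np.real(np.vdot(fractional_bind(x), fractional_bind(y))) / d)

# With uniformly distributed phases, the expected similarity is
# E[cos(theta * (y - x))] = sinc(y - x), i.e. a shift-invariant kernel
# in the difference of the encoded real values.
for delta in [0.0, 0.5, 1.0, 2.0]:
    print(f"delta={delta:.1f}  dot={similarity(0.0, delta):+.3f}  "
          f"sinc={np.sinc(delta):+.3f}")
```

Because the sketch only relies on the dot product between fractionally bound symbols behaving as a valid kernel, any VSA with that property could be substituted for the phasor encoding used here.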