The free energy principle (FEP) forms the basis of one of the prominent theories of perception and learning in biological agents. Based on a generative model (GM) and beliefs over hidden states, the FEP enables an agent to sense and act by minimizing a free energy bound on Bayesian surprise. Including prior beliefs about desired states in the GM leads to active inference (ActInf). In this work, we apply ActInf to general state estimation and control. Suitable encodings of prior beliefs recover the solutions of standard cost-based optimization frameworks. In contrast to standard cost- and constraint-based solutions, ActInf gives rise to a minimization problem that includes both an information-theoretic surprise term and a model-predictive-control-type cost term; ActInf thus subsumes classical stochastic control as a special case. We illustrate the performance of ActInf under varying system parameters and compare it to classical solutions for estimation and control.
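The "free energy bound on Bayesian surprise" invoked above can be sketched with the standard variational identity (notation is illustrative and not necessarily that of this paper): for a variational belief $q(s)$ over hidden states $s$ and a generative model $p(o, s)$ over observations $o$ and states,

```latex
F[q] \;=\; \mathbb{E}_{q(s)}\!\left[\ln q(s) - \ln p(o, s)\right]
      \;=\; \underbrace{D_{\mathrm{KL}}\!\left[q(s)\,\big\|\,p(s \mid o)\right]}_{\;\ge\, 0}
      \;-\; \ln p(o)
      \;\;\ge\;\; -\ln p(o).
```

Minimizing $F$ with respect to $q$ thus tightens an upper bound on the surprise $-\ln p(o)$, driving $q(s)$ toward the Bayesian posterior $p(s \mid o)$; encoding desired states as priors in $p$ turns the same minimization into a control objective.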