Bayesian inference in nonconjugate models such as Bayesian Poisson regression often relies on computationally expensive Monte Carlo methods. This paper introduces Q-conjugacy, a generalization of classical conjugacy that enables efficient closed-form variational inference in certain nonconjugate models. Q-conjugacy is a condition under which the solution minimizing the Kullback-Leibler divergence between a variational distribution and the product of two potentially unnormalized distributions admits a closed-form update scheme. Leveraging Q-conjugacy within a local message passing framework makes it possible to derive analytic inference update equations for nonconjugate models. The effectiveness of this approach is demonstrated on Bayesian Poisson regression and on a model with a gamma-distributed latent variable whose logarithm is observed under Gaussian noise. Results show that Q-conjugate triplets, such as (Gamma, LogNormal, Gamma), provide better speed-accuracy trade-offs than Markov chain Monte Carlo.
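To make the objective behind Q-conjugacy concrete, the sketch below numerically minimizes the Kullback-Leibler divergence KL(q || p̃) between a Gamma variational distribution q and the product p̃ of a Gamma prior and a LogNormal factor, mirroring the (Gamma, LogNormal, Gamma) triplet. This is only an illustration of the objective: the paper's contribution is a closed-form update, whereas here the fit is obtained by generic numerical optimization, and all parameter values (a0, b0, mu, sig) are arbitrary assumptions.

```python
import numpy as np
from scipy import stats, optimize
from scipy.integrate import trapezoid

# Hypothetical factors: a Gamma(a0, b0) prior times a LogNormal(mu, sig) message.
a0, b0 = 2.0, 1.0
mu, sig = 0.5, 0.4

def log_p_tilde(x):
    # Unnormalized log target: sum of the two factors' log densities.
    return (stats.gamma.logpdf(x, a0, scale=1.0 / b0)
            + stats.lognorm.logpdf(x, s=sig, scale=np.exp(mu)))

# Fixed grid for trapezoidal quadrature of the KL integral.
x = np.linspace(1e-4, 20.0, 4000)

def kl(params):
    # KL(q || p_tilde) for q = Gamma(a, b); optimize in log space for positivity.
    a, b = np.exp(params)
    log_q = stats.gamma.logpdf(x, a, scale=1.0 / b)
    q = np.exp(log_q)
    return trapezoid(q * (log_q - log_p_tilde(x)), x)

res = optimize.minimize(kl, x0=np.log([2.0, 1.0]), method="Nelder-Mead")
a_hat, b_hat = np.exp(res.x)
print(f"fitted Gamma shape={a_hat:.3f}, rate={b_hat:.3f}")
```

In the Q-conjugate setting, the optimum found here by iterative search would instead be available analytically, which is what makes the message passing updates cheap.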