We introduce low-complexity bounds on mutual information for efficient privacy-preserving feature selection with secure multi-party computation (MPC). For a discrete feature with N possible values and a discrete label with M possible values, our approach requires O(N) multiplications, as opposed to O(NM) in a direct MPC implementation of mutual information. Our experimental results show that, for regression tasks, we achieve a computation speedup of over 1,000× compared to a straightforward MPC implementation of mutual information, while achieving similar accuracy for the downstream machine learning model.
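As context for the complexity claim, the following is a minimal plaintext (non-MPC) sketch of the baseline: mutual information between a discrete feature with N values and a discrete label with M values, computed over the full N×M joint-probability table. The double loop mirrors the O(NM) multiplication count of a direct implementation; it is an illustrative baseline, not the paper's protocol.

```python
import numpy as np

def mutual_information(feature, label):
    """Plaintext mutual information (in bits) between a discrete
    feature and a discrete label. Iterates over all N x M cells of
    the joint-probability table, so the number of multiplications
    scales as O(NM) -- the cost a direct MPC port would inherit."""
    feature = np.asarray(feature)
    label = np.asarray(label)
    mi = 0.0
    for x in np.unique(feature):          # N feature values
        p_x = np.mean(feature == x)
        for y in np.unique(label):        # M label values
            p_y = np.mean(label == y)
            p_xy = np.mean((feature == x) & (label == y))
            if p_xy > 0:                  # 0 * log(0) treated as 0
                mi += p_xy * np.log2(p_xy / (p_x * p_y))
    return mi
```

In plaintext this loop is cheap, but under MPC each multiplication and each evaluation of the logarithm on secret-shared values requires interactive cryptographic protocols, which is why reducing the multiplication count from O(NM) to O(N) matters.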