Decentralized machine learning tackles the problem of learning useful models when data is distributed among several clients. The most prevalent decentralized setting today is federated learning (FL), where a central server orchestrates the learning among clients. In this work, we contribute to the relatively understudied sub-field of generative modelling in the FL framework. We study the task of how to train generative adversarial networks (GANs) when training data is heterogeneously distributed (non-iid) over clients and cannot be shared. Our objective is to train a generator that is able to sample from the collective data distribution centrally, while the client data never leaves the clients and user privacy is respected. We show, using standard benchmark image datasets, that existing approaches fail in this setting, experiencing so-called client drift when the number of local epochs becomes too large and local parameters drift too far away in parameter space. To tackle this challenge, we propose a novel approach named EFFGAN: Ensembles of fine-tuned federated GANs. Being an ensemble of local expert generators, EFFGAN is able to learn the data distribution over all clients and mitigate client drift. It is able to train with a large number of local epochs, making it more communication efficient than previous works.
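To make the ensemble idea concrete, the following is a minimal sketch (not the paper's reference implementation) of how sampling from an ensemble of fine-tuned client generators could look: each client holds a generator fine-tuned from a shared federated model, and samples from the collective distribution are drawn by routing noise to a randomly chosen ensemble member. The toy architecture and the names `latent_dim`, `client_generators`, and `sample_effgan` are illustrative assumptions.

```python
import random
import torch
import torch.nn as nn

latent_dim = 100

class Generator(nn.Module):
    """Toy generator mapping noise vectors to 28x28 images."""
    def __init__(self, latent_dim: int = latent_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256),
            nn.ReLU(),
            nn.Linear(256, 28 * 28),
            nn.Tanh(),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z).view(-1, 1, 28, 28)

# Stand-in for the fine-tuning stage: in practice each client would copy the
# federated global generator and continue training it on its local (non-iid) data.
client_generators = [Generator() for _ in range(10)]

@torch.no_grad()
def sample_effgan(n_samples: int) -> torch.Tensor:
    """Sample from the ensemble by picking a fine-tuned client generator per sample."""
    samples = []
    for _ in range(n_samples):
        g = random.choice(client_generators)
        z = torch.randn(1, latent_dim)
        samples.append(g(z))
    return torch.cat(samples, dim=0)

images = sample_effgan(16)  # (16, 1, 28, 28) samples drawn from the collective distribution
```

Because every sample is produced by a single local expert, no averaging of drifted generator weights is required at inference time, which is one way the ensemble can mitigate client drift under many local epochs.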