The mainstream approach in NLP is to train systems on large amounts of data. Such passive learning contrasts with the way language is learnt by humans. Human language is acquired within communities; it is culturally transmitted and changes dynamically. These evolutionary mechanisms have been extensively studied in the field of Language Evolution. Despite limited prior interaction between the fields, such mechanisms are now increasingly incorporated into NLP systems. Such models have the potential both to study the evolution of language in multi-agent simulations with state-of-the-art (deep) learning systems in more naturalistic settings and to improve NLP systems by having language emerge organically. We examine how findings from a model by Havrylov & Titov (2017) compare to those from traditional Language Evolution models, and we quantify the emerging compositionality using an existing Language Evolution method (Tamariz, 2011). This approach reveals novel insights into the generated data, the applied methodology and the nature of compositionality.