Purpose: Life cycle assessment (LCA) is a data-intensive methodology; experts therefore usually focus data collection efforts on a few key activities, while generic data on the remaining activities are taken from databases. Although the growing availability of databases has facilitated the uptake of LCA, assuring data quality is fundamental to obtaining meaningful results and a reliable interpretation.

Methods: Ecoinvent has become a global reference for inventory data. Its current version offers three system models for partitioning impacts among co-products—the cut-off (recycled content), the “allocation at the point of substitution” (APOS), and the consequential models—and choosing the appropriate one is crucial for yielding a meaningful assessment. Tutorials and manuals describe the allocation algorithm behind each system model, to ground decision-making on the best fit for a study’s goals. We performed a systematic literature review (SLR) to investigate how transparently the authors of papers published in the International Journal of Life Cycle Assessment (IJLCA) addressed the choice of system model.

Results and discussion: About 70% of LCA practitioners continued to use earlier versions of ecoinvent after version 3 was launched in 2013; the number of papers using versions 3.x only showed an increased growth trend two years later. Eighty-three papers adopted the newest version of the database, and of those, only 29 clearly stated the adopted system model. Our SLR also suggests a trend in the authorship profile of LCA-related studies: studies conducted by practitioners aware of the intricacies of soundly modeling background and foreground data may have been outnumbered by those conducted by non-LCA specialists who use LCA as a supporting tool for investigations in applied fields and merely scratch the surface of its methodology.
Conclusions: Our results point to a necessary caveat: ecoinvent users must take the time to understand the general concept behind each system model and to practice one of the most important actions when performing an LCA—stating methodological choices clearly.