Activity recognition has attracted considerable interest given its direct use in applications such as ambient assisted living, and it has been empowered by the increasing ubiquity of sensors embedded in everyday objects (e.g., clothes, smartphones, watches). The machine learning approach to activity recognition consists in finding the signatures that characterize the activities to be recognized, with the aim of identifying them (pattern matching) within the stream of sensor data. Finding those signatures can be very complex, so many approaches segment the streams of sensor data into sections, or "time-windows," before processing them with a feature extraction procedure. The problem then becomes associating features with class labels. In this paper, we propose the use of the Gamma Growing Neural Gas algorithm to discover, in an unsupervised manner, templates in a recording containing gestures performed by a person in a home environment. The system performs vector quantization on the time series produced by a single accelerometer and finds salient patterns (i.e., templates) in the signal. These templates integrate information not only from single time-windows but also from the recent history of the incoming signal (i.e., multiple time-windows). Those templates are then associated with activity classes through supervised learning. Our experiments show that the resulting performance exceeds previous benchmarks on the same database.
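The pipeline described above can be sketched as follows: segment the accelerometer stream into overlapping time-windows, then quantize those windows with an online codebook whose input is blended with a leaky memory of recent windows. This is a minimal illustration under stated assumptions, not the Gamma-GNG algorithm itself; the codebook size, learning rate, and `context_decay` memory term are all hypothetical simplifications (a real Growing Neural Gas also grows and prunes units and maintains edges between them).

```python
import numpy as np

def sliding_windows(signal, width, step):
    """Segment a 1-D sensor stream into overlapping time-windows."""
    return np.array([signal[i:i + width]
                     for i in range(0, len(signal) - width + 1, step)])

def online_vq(windows, n_codes=4, lr=0.1, context_decay=0.5, seed=0):
    """Toy online vector quantizer with a leaky context vector that mixes
    each window with the recent history (gamma-memory flavour).
    Simplification: fixed codebook size, no unit growth/pruning as in GNG."""
    rng = np.random.default_rng(seed)
    dim = windows.shape[1]
    codebook = rng.normal(size=(n_codes, dim))
    context = np.zeros(dim)
    for w in windows:
        # Blend the current window with the previous blended input,
        # so each code vector reflects more than a single time-window.
        x = (1 - context_decay) * w + context_decay * context
        # Move the winning (nearest) code vector toward the blended input.
        best = np.argmin(np.linalg.norm(codebook - x, axis=1))
        codebook[best] += lr * (x - codebook[best])
        context = x
    return codebook
```

The resulting code vectors play the role of templates; in a full system each labeled window would be mapped to its nearest template, and a supervised classifier would then associate template activations with activity classes.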