Automatic recognition of user context is essential for a variety of emerging applications, such as context-dependent content delivery, telemonitoring of medical patients, or quantified life-logging. Although not explicitly observable in the way that, e.g., activities are, an important aspect of understanding user context lies in the affective state of mood. While significant work has been done to assess mood, most approaches require customized sensors and controlled laboratory settings. In this work, we engineer a recognition pipeline that identifies daily activities from commercially popularized wearable electronics. In turn, we use the predicted activities to learn a regression model capable of assessing user mood. By relying only on commercially popularized wearable devices, we enable the potential for seamless deployment to the general public. Conducting a real-world study with a prototype system, we collect and evaluate data from 18 users, who provide over 93 user-days of labelled activity data. Regressing for mood based on predicted daily activities, we infer mood angles with a mean absolute error of 0.24π radians on the Circumplex Model of Affect. Compared with benchmark approaches, our approach outperforms with statistical significance and is validated for robustness against noise from activity misclassification.