We present texture operators encoding class-specific local organizations of image directions (LOIDs) in a rotation-invariant fashion. The LOIDs are key for visual understanding and underlie the success of popular approaches such as local binary patterns (LBPs) and the scale-invariant feature transform (SIFT). Whereas LBPs and SIFT yield hand-crafted image representations, we propose to learn data-specific representations of the LOIDs in a rotation-invariant fashion. The image operators are based on steerable circular harmonic wavelets (CHWs), offering a rich yet compact initial representation for characterizing natural textures. The joint location and orientation information required to encode the LOIDs is preserved by using moving-frame (MF) texture representations built from locally-steered image gradients that are invariant to rigid motions. In a second step, we use support vector machines to learn a multi-class shaping matrix for the initial CHW representation, yielding data-driven MFs called steerable wavelet machines (SWMs). The SWM forward function is composed of linear operations (i.e., convolutions and weighted combinations) interleaved with non-linear steermax operations. We experimentally demonstrate the effectiveness of the proposed operators for classifying natural textures: our scheme outperforms recent approaches on several test suites of the Outex and CUReT databases.
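To make the steermax idea concrete, the sketch below illustrates one plausible reading of the operation under the steerability property of circular harmonics: steering a harmonic of order n by an angle θ multiplies its coefficient by exp(-inθ), so a weighted combination of CHW responses can be maximized over orientation in closed form on a discretized angle grid. The function name, weights, and toy inputs are hypothetical illustrations, not the paper's implementation.

```python
import numpy as np

def steermax(coeffs, weights, orders, n_angles=64):
    """Maximize a weighted CHW response over a discretized orientation grid.

    coeffs  : complex CHW responses c_n at one image location (hypothetical toy input)
    weights : real combination weights (stand-in for a learned shaping row)
    orders  : harmonic orders n of each coefficient
    Returns the maximal combined response and the angle attaining it.
    """
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    # Steering by theta multiplies each order-n coefficient by exp(-1j*n*theta);
    # the real part of the weighted sum is the steered filter response.
    responses = np.real(
        np.sum(weights[:, None] * coeffs[:, None]
               * np.exp(-1j * np.outer(orders, thetas)), axis=0))
    k = int(np.argmax(responses))
    return responses[k], thetas[k]
```

For a single order-1 harmonic with unit coefficient, the steered response is cos(θ), so the maximum is attained at θ = 0; the magnitude of the maximal response is invariant to a global rotation of the input, which is the rotation-invariance property the operators rely on.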