Image retrieval approaches can assist radiologists by finding similar images in databases as a means of providing decision support. In general, images are indexed using low-level imaging features, and a distance function is used to find the best matches in the feature space. However, capturing the appearance of diseases with low-level features is challenging, and the semantic gap between these features and the high-level visual concepts used in radiology may impair system performance. We present a semantic framework that retrieves similar images based on high-level semantic image annotations. This framework relies on (1) an automatic approach that predicts the annotations as semantic terms from Riesz texture image features and (2) a distance function that compares images by considering both texture-based and radiodensity-based similarities among image annotations. Experiments performed on CT images demonstrate the relevance of this framework.
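The annotation-based comparison described above can be sketched as follows. This is a minimal, hypothetical illustration, assuming each image is annotated with a set of semantic terms and that two pairwise term-similarity tables (one texture-based, one radiodensity-based) are available; all function names, term names, and similarity values are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of an annotation-based image distance.
# Assumes two symmetric term-to-term similarity tables in [0, 1]:
# one texture-based, one radiodensity-based. Values are illustrative.

def term_similarity(t1, t2, table):
    """Look up a symmetric pairwise similarity in [0, 1]."""
    if t1 == t2:
        return 1.0
    return table.get((t1, t2), table.get((t2, t1), 0.0))

def image_distance(terms_a, terms_b, tex_table, dens_table, alpha=0.5):
    """Distance in [0, 1]: one minus the average best match between
    annotation sets, blending texture- and radiodensity-based similarity
    with weight alpha."""
    if not terms_a or not terms_b:
        return 1.0
    sims = []
    for ta in terms_a:
        best = max(
            alpha * term_similarity(ta, tb, tex_table)
            + (1 - alpha) * term_similarity(ta, tb, dens_table)
            for tb in terms_b
        )
        sims.append(best)
    return 1.0 - sum(sims) / len(sims)

# Illustrative similarity tables between two semantic terms
tex = {("ground-glass", "reticulation"): 0.4}
dens = {("ground-glass", "reticulation"): 0.6}

d_same = image_distance({"ground-glass"}, {"ground-glass"}, tex, dens)
d_diff = image_distance({"ground-glass"}, {"reticulation"}, tex, dens)
```

Identically annotated images get distance 0, while images whose terms are only partly similar get an intermediate distance, so ranking by this function surfaces semantically closest cases first.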