Is it helpful for a medical chatbot focused on physical health to show empathy? And how can a chatbot convey empathy based only on short text conversations? We investigated these questions by building two medical assistant chatbots, each designed to provide a diagnosis for a physical health problem based on a short conversation with the user. One chatbot was advice-only: it asked only the questions necessary for the diagnosis and did not respond to the user's emotions. The other chatbot was designed to show empathy, responding in a more supportive manner by analyzing the user's emotions and generating appropriate responses with high empathic accuracy. Using the RoPE scale, a questionnaire for measuring perceived empathy in human-robot interaction, the empathic chatbot was rated significantly better at showing empathy and was preferred by a majority of the participants in our preliminary study (N=12).