I just can't trust dentists any longer. I feel like they lie.
In 2016 I'd had it with the dentist I was going to. After getting a filling, he told me I needed another one on a front tooth. I didn't believe him, so I didn't go back. Things went on longer than expected, and today I went to a new dentist for a cleaning and exam.
This dentist didn't seem to think ANYTHING was wrong with that front tooth.
But she suggested I "should" have a root canal on a back tooth. I asked what that meant, but she just left me more confused. I believe she sees some decay, and since there isn't a lot of tooth left, she thinks I should do what I can now. But it isn't "necessary." This is the second time this has happened, and last time the root canal specialist said a root canal wasn't necessary.
She also made some vague claims about decay on other teeth, but it wasn't until I got to the front desk that I found out I was scheduled to have cavities filled.
Mostly, I don't feel like I can trust them. Every dentist wants to do work on teeth that have never hurt me. How can I justify that?