How did that happen? The short answer is: Doctors don’t work for you anymore. They work for big business. Over the last several decades, health care has undergone a radical transformation.
Thinking about getting a doctor's note may make you feel like you're in grade school again. These notes can leave you with the uneasy sense of being monitored like a child rather than trusted as an adult.