We can put bandaids over our symptoms, but they will only return with a vengeance. It's the equivalent of scooping water out of a sinking boat instead of just patching the hole. 🤔
Western medicine is not interested in the why or the root.
They're trying to manage a disease instead of produce health.
Depressed? Oh, you have an SSRI deficiency. Here's a drug. High cholesterol? Statin deficiency. Here's a drug. Recurring UTIs? Here's more rounds of antibiotics. And so on and so on. LET'S JUST DRUG IT! Sure, there is a place for medication. We are fortunate to live in a time when lifesaving drugs are available when needed. But these things are not without consequence. The skill of critical thinking has been completely lost in western medicine. Just sit through a 30-minute TV show in America and count how many pharmaceutical commercials you see. It's sickening. You know that isn't the norm in most other countries, right?
The pharmaceutical industry PROFITS off of the SICK. They make no money when you change your diet and heal holistically.
I'm sick of hearing "well, I'll trust my doctor because they went to medical school" or "your opinion is not valid because you are self-taught." I'm sorry, but what year is it?! We live in the age of technology. Information is literally at our fingertips. So what makes my research null and void compared to a doctor's training? Just because you have MD (or RN 🙋🏻♀️) behind your name does not give you the right to dismiss someone's concerns or label them as crazy for questioning the status quo.
Start thinking for yourself instead of just believing something because someone told you it was true.
News flash: science is never settled. ✌🏻