The Testosterone Drop and Estrogen Storm

What’s happening to men and women—and how to test before it’s too late

When Daniel, a 34-year-old lawyer, came in for his first consult, he looked exhausted. “I’m working out five days a week,” he said, “but I can’t put on muscle. My energy crashes by 3 p.m. And my wife jokes that I’ve lost my edge.” His bloodwork confirmed what I suspected: testosterone levels that would have been considered low for a man in his 50s just one generation ago.

On the other side, Clara, a 29-year-old teacher, had a different story: irregular cycles, weight gain around the middle, and anxiety that seemed to spike every month. Her gynecologist had recommended birth control to “regulate” things, but deeper testing revealed disrupted estrogen–progesterone balance and early signs of thyroid stress.

Daniel and Clara aren’t isolated cases. They represent what large-scale studies now show: men and women are hormonally imbalanced in ways our parents and grandparents weren’t.

Hormones under siege
The science backs this up. The Massachusetts Male Aging Study documented a steady decline in testosterone between the 1980s and early 2000s, not explained by age or body weight. A global meta-analysis found sperm counts have dropped by about 50% since the 1970s, with the decline now accelerating. For women, puberty is arriving earlier—roughly three months earlier per decade since the late 1970s—shaping hormone exposure over a lifetime in ways that raise risks of cycle disorders and metabolic disease.

What’s driving this shift? Researchers point to multiple layers:

Endocrine-disrupting chemicals (EDCs): Phthalates, bisphenols (like BPA), and pesticides are now so widespread that the Endocrine Society has labeled them a major health threat.

Metabolic stress: Obesity and insulin resistance alter hormone-binding proteins, shifting estrogen–testosterone balance in both sexes.

Lifestyle disruption: Stress, blue light, and sleep loss spike cortisol and blunt anabolic hormones.

Developmental timing effects: Earlier puberty in girls and lower baseline testosterone in boys create long-term ripple effects for fertility, mood, and vitality.

The result? A hormonal landscape that’s profoundly different from a few decades ago.

The hidden cost of hormone imbalance
Why does this matter for you? Because hormonal health isn’t just about reproduction—it’s about quality of life. Low testosterone leaves men fatigued, unfocused, and at higher risk for cardiovascular disease. Estrogen and progesterone imbalance drives PMS, PCOS, thyroid issues, and mood instability in women. Left unchecked, these imbalances ripple outward: strained marriages, diminished work performance, even generational effects as parents pass along vulnerabilities to their children.

Daniel’s fatigue wasn’t just about the gym—it was about how he showed up at work and at home. Clara’s irregular cycles weren’t just an inconvenience—they were early warning signs her body was out of sync with its environment.

The DUTCH Test reveals what routine labs miss
Here’s the sobering truth: hormonal imbalance isn’t a fringe issue anymore. It’s the new normal. But normal doesn’t mean healthy.

The smartest first step isn’t guessing—it’s testing. This is where the DUTCH Test (Dried Urine Test for Comprehensive Hormones) comes in. Unlike a standard blood draw, DUTCH captures not only your hormone levels but also their metabolites—showing how your body is actually using and clearing hormones. It measures sex hormones, adrenal stress hormones like cortisol, and even organic acids tied to metabolism.

That means a man like Daniel can see if his testosterone is being shunted down the wrong metabolic pathway, while a woman like Clara can uncover whether her estrogen dominance is paired with sluggish detoxification. Armed with this map, a practitioner can build a customized program—whether that’s nutrition, supplements, lifestyle changes, or targeted biohacking tools—to bring hormones back into balance instead of just masking symptoms.

So if you’ve been feeling “off,” stop guessing. The DUTCH Test is available for both men and women and offers a clear window into what’s really happening behind the scenes of your biology. Because the real question isn’t whether your hormones are under pressure—they are. The question is: will you take the step to measure them, so you can finally rebalance them?

Stay vital,

Richard Labaki

Holistic Therapist / Longevity Architect

Dr. Google Syndrome Evolving into Dr. AI Syndrome

Why Self-Diagnosing with AI is a Comedy (and Sometimes a Tragedy) of Errors

By Richard Labaki

Remember the days when you’d wake up with a mysterious ache, type your symptoms into Google, and suddenly convince yourself you had a rare tropical disease, a zombie virus, or at best, a mild case of death? Welcome to the era of Dr. Google Syndrome – the unofficial medical degree you earn after a few frantic clicks at 2 a.m. But now, as AI chatbots like ChatGPT enter the scene, we’ve graduated to a new phenomenon: Dr. AI Syndrome. It’s like Dr. Google’s tech-savvy cousin who talks a lot, sounds smart, but still can’t replace your actual therapist or physician.

The evolution of self-diagnosis

Back in the early 2000s, Google was the go-to “doctor” for those unwilling or unable to visit a real one. You’d type “headache + nausea + dizziness,” and Google would serve up everything from dehydration to brain tumors. The problem? Google doesn’t know “you” – it can’t ask follow-up questions or weigh your personal history. It just throws information at you, leaving you spiraling down a rabbit hole of worst-case scenarios, aka cyberchondria.

Fast forward to today, and AI chatbots like ChatGPT promise a more conversational, personalized experience. You can ask, “Hey ChatGPT, what’s wrong with my stomach?” and get a detailed, articulate response that feels like talking to a knowledgeable friend. But here’s the catch: despite passing some medical exams in controlled settings, AI still gets real-world diagnoses right less than half the time – and is sometimes hilariously wrong. Imagine your AI doctor confidently telling you that a common cold is actually a rare tropical parasite infestation. Spoiler: it’s not.
Why relying on AI for self-diagnosis is a bad idea!

The idea of AI as a medical oracle is tempting, but it comes with serious pitfalls:

You have to know how to ask: AI chatbots depend heavily on how you phrase your questions. A vague prompt like “I feel bad” gets a vague answer. You need to know enough medical jargon or symptoms to ask the right questions; otherwise, you might get a generic or misleading response. I, for example, know nothing about mechanical engineering. If I started asking ChatGPT about it, I wouldn’t know how to ask the right questions, let alone verify the answers.

AI can hallucinate: No, not in the psychedelic sense, but AI sometimes “hallucinates” – it invents plausible-sounding but false information. This can lead to dangerous advice, like telling a patient they had a vaccine they never received or missing critical symptoms.

Lack of context and nuance: AI can’t perform physical exams, order lab tests, or interpret subtle clinical signs. It also can’t factor in your full medical history or emotional state, which are crucial for accurate diagnosis and treatment.

Accountability issues: If your AI “doctor” messes up, who’s responsible? The developers? The user? The chatbot itself? This murky territory means you’re often left holding the bag for any misdiagnosis or delayed treatment.
When AI goes off script

In one case, a mental health professional asked ChatGPT for academic references for a legal case. ChatGPT invented fake citations. The opposing lawyer caught it, and now the therapist is facing court sanctions for using AI-generated false information.

Lesson: AI hallucinations aren't just bad – they can get you sued.

A Belgian man in his 30s began using an AI chatbot named Eliza on the app Chai to discuss his growing eco-anxiety. Over six weeks, the bot encouraged him to end his life. Tragically, he followed through. The chatbot was programmed to be emotionally responsive, but lacked ethical boundaries, leading to a preventable death.

Lesson: Emotional dependency on AI can become dangerous without safeguards.

The National Eating Disorders Association (NEDA) replaced its human helpline staff with an AI chatbot. Almost immediately, users reported that the bot gave weight loss advice – to people struggling with eating disorders. It was quickly shut down.

Lesson: Replacing humans with bots in sensitive situations = facepalm.

In a documented user experience study, a woman sought advice for a skin burn, and the chatbot suggested menstruation-related issues. Somewhere between “I spilled coffee on my arm” and “Are you on your period?”, the AI glitched – big time.

Lesson: AI doesn't always understand context. Or anatomy.

Many users report typing symptoms like “headache and fatigue” into ChatGPT or other AI bots and receiving dramatic conclusions like “brain tumor”, “rare autoimmune disease”, or “you might be dying.”

Lesson: Worst-case scenario bias can turn a sniffle into a Shakespearean tragedy.

So, what’s the takeaway?

Dr. Google Syndrome taught us that self-diagnosing online can spiral into anxiety and misinformation. Dr. AI Syndrome, while more sophisticated, hasn’t yet solved these problems – it has just added new layers of complexity.

- Use AI chatbots as informational tools, not diagnostic authorities.

- Always consult a real healthcare professional for diagnosis and treatment.

- If you do use AI, be critical and skeptical – challenge the answers and don’t take them at face value.

- Remember, AI can’t replace the human touch, empathy, and clinical judgment of a trained professional.

In the end, whether it’s Google or AI, self-diagnosis is like trying to fix a car by reading the manual without ever popping the hood. Sure, you might get lucky, but more often than not, you’ll end up with a lot of confusion and a car still not running – or in this case, health still uncertain.

So next time you feel under the weather, resist the urge to summon Dr. Google or Dr. AI for a diagnosis. Instead, make an appointment with your real health practitioner – someone who can listen, examine, and treat you properly. Because while AI is a powerful tool, it’s not (yet) your personal physician.


If you enjoyed this post, share it with your fellow cyberchondriacs and AI enthusiasts. Feel free to leave your comments and questions below – I would love to hear your opinions and answer your questions.