Monthly Archives: January 2016

DHEA and the Menopause.

DHEA is often the forgotten hormone. I regularly test for this important hormone, which is essential for good health as we age. It is also known as “the mother of all hormones”.
Climacteric. 2013 Apr;16(2):205-13. doi: 10.3109/13697137.2012.733983. Epub 2012 Nov 5.

DHEA and intracrinology at menopause, a positive choice for evolution of the human species.

Abstract

Menopause has been chosen by evolution as the convergence of three factors, namely cessation of ovarian function (reproduction and estrogen secretion), high circulating dehydroepiandrosterone (DHEA), and intracrine enzymes able to convert DHEA into active sex steroids in peripheral tissues. The arrest of estrogen secretion by the ovaries at menopause causes a decrease of circulating estradiol below the threshold of biological activity, thus eliminating stimulation of the endometrium and risk of endometrial cancer. As much as the arrest of secretion of estradiol by the ovaries is essential to protect the uterus, it is of major importance that sex steroids continue to be made available in most other tissues which need estrogens and/or androgens for their normal functioning. Evolution, through 500 million years, has progressively provided the peripheral tissues with the enzymes able to make androgens and estrogens while high levels of DHEA, the precursor of all sex steroids, have appeared much later with the primates approximately 20 million years ago. All elements were thus in place for the functioning of intracrinology or the cell-specific formation of estrogens and androgens in peripheral tissues from the inactive precursor DHEA, with no significant release of active sex steroids in the circulation, thus eliminating the risks of adverse effects in the other tissues, especially the uterus. The presence of subthreshold levels of circulating estradiol combined with the formation of sex steroids from DHEA in specific peripheral tissues (intracrinology) makes menopause a positive characteristic supporting many years of good-quality postmenopausal life, useful for taking care of children and grandchildren. DHEA, however, decreases with age and is present at very different concentrations between different women, with the consequence that approximately 75% of postmenopausal women have too low circulating DHEA levels and suffer from symptoms/signs of hormone deficiency.

Sometimes less is better – so why don’t doctors ‘deintensify’ medical treatment?

October 27, 2015 3.38am AEDT

Do you still need to take that? Lucas Jackson/Reuters

Doctors know a lot about when to start medications to treat disease. But sometimes our focus on starting medicines means we can confuse providing more care with providing better care. And better care sometimes means fewer medicines, not more.

For instance, patients with high blood pressure who have lost weight or are exercising more may find that they may no longer need blood pressure pills. Patients with heartburn who take proton-pump inhibitors (such as Nexium) may do just as well with a lower dose or occasional therapy. Patients who take medications for osteoporosis may be candidates for “drug holidays.”

And as we age, our bodies process medications differently and we become susceptible to different side effects. What may have been the right treatment for a patient when she was 50 can turn out to be dangerous at 80.

That may mean many patients can have their treatment deintensified – changing or stopping medicines when they are no longer needed. But it turns out, doctors often don’t do this, even though it means patients risk fewer side effects and can avoid extra health costs. So why, and when, should a person’s drugs be deintensified?

Who benefits from having treatment deintensified?

Diabetes makes a great case study for deintensification, because patients often need treatment over the course of a lifetime. For decades, doctors have focused on treating diabetes intensively to lower their patients’ risks of developing kidney disease and other complications. But we now know that intensive treatment for diabetes, like nearly all medical treatments, can also cause serious harm, such as low blood sugar levels, which can lead to falls, memory problems and even death.

Many patients with diabetes may benefit from deintensification. Older patients, in particular, are more likely to experience drug side effects, and patients taking more than one medicine run a risk of harmful drug interactions. Older patients also have less to gain from intensive treatment of their diabetes because they have fewer years to develop the long-term effects of diabetes on their bodies. And as a person’s health status changes, they may need fewer – not more – medicines to manage their diabetes.

That doesn’t mean intensive treatment is bad – it just means that not every patient needs it, and some patients may need it for only a certain amount of time. For example, intensive treatment to lower blood sugar in younger people lowers their risk of developing kidney and eye disease, and other harmful long-term effects of diabetes.

So drug choices need to be individualized based on what a person stands to gain from intensive treatment, balanced against their risk of treatment side effects. Deintensifying treatment means finding the sweet spot between too much and too little medicine.

Even though many clinical practice guidelines already recognize that goals for diabetes control and other chronic conditions should be based on a patient’s individual risk and benefits of treatment, this message hasn’t gotten through to all doctors and patients. And none of these guidelines specify who should have treatment deintensification and when that should happen.

Older diabetes patients are often overtreated

Several studies have found that older patients with diabetes are often overtreated – meaning that they are taking more medications, or medications at too high doses, than they need to achieve a safe level of sugar control.

Recently, we reported that doctors deintensified medications for only a quarter of nearly 25,000 older patients with diabetes who were treated to potentially dangerously low levels of sugar control. Deintensification rates barely budged even when patients had experienced low blood sugar multiple times or had severely limited life expectancy.

Among patients with low blood sugar whose treatment was not deintensified, 40% did not even have their diabetes control values rechecked within six months. This means that the majority of overtreated patients continued to take medications they did not need, or at doses that were too high.

Why don’t clinicians deintensify treatment?

Doctors usually focus on intensifying therapy to control blood sugar, which means that deintensifying treatment can take a completely new mindset.

In another study, we asked primary care providers what they thought would be appropriate treatment for a hypothetical patient in his late 70s who has had diabetes for 20 years and also has kidney disease. The patient takes two pills every day to manage his diabetes, but could be fine just taking one of them. We found that 39% of almost 600 respondents felt that this patient would continue to benefit from stringent diabetes control – despite current expert recommendations to the contrary.

When we looked at the reasons why, 42% of providers worried that not treating him intensively could harm the scores on their clinical report cards, which track the quality of care the doctors provide to their patients. Nearly one-quarter worried about legal liability resulting from decreasing medications.

Just as troubling, 30% wouldn’t deintensify the diabetes medications because they worried they wouldn’t have enough time to discuss these changes with the patient.

Finding the treatment sweet spot

This isn’t just an issue for people with diabetes – it’s an issue for anyone living with a chronic condition.

So how do we encourage appropriate deintensification in order to get to the sweet spot for treatment? There are many changes that could help.

First, health care systems should institute programs that systematically engage providers and patients to consider stopping medications that are no longer necessary.

For example, the VA has instituted a national “Hypoglycemia Safety Initiative” to encourage appropriate deintensification of diabetes medications in order to decrease the harm of intensive treatment among those at risk for hypoglycemia (low blood sugar).

Second, patients should ask their providers if their medications are still necessary. Some could possibly be stopped or the dose decreased. Providers should regularly reexamine their patients’ medication lists and discuss the options.

As a person’s health changes, so should the medicine they take. Prescription pad via www.shutterstock.com.

Third, while existing clinical practice guidelines already say that treatment should be based on risk and benefits for an individual patient, these guidelines should go a step further and include explicit recommendations for deintensification to help providers and patients decide when stopping a medication might be wise.

Fourth, the way we assess whether doctors are providing high-quality care should look not just at whether high-intensity treatment is provided for a patient, but also if doctors are deintensifying treatment when possible and beneficial.

Finally, we must get out the message that more is not always better. Campaigns such as Choosing Wisely®, in conjunction with Consumer Reports, educate the public about care that might not be needed, but only 21% of US doctors surveyed were aware of the campaign.

Changing the “more is better” mindset among both patients and providers will not be easy, but it will be essential if we want to ensure that patients get the treatments they need but not those that are unnecessary and potentially harmful.

Estrogen Protective Against Flu Virus in Women But Not Men, Study Suggests

I have often spoken about the benefits of Oestrogen to women, and here is another major benefit which I did not know.

January 19, 2016

Estrogen Protective Against Flu Virus in Women But Not Men, Study Suggests

Finding could offer additional benefits to hormone replacement therapy or infertility treatment

Estrogen dramatically reduced the amount of flu virus that replicated in infected cells from women but not from men, a new study by researchers at the Johns Hopkins Bloomberg School of Public Health shows.

The findings, reported online last week in the American Journal of Physiology – Lung Cellular and Molecular Physiology, suggest a protective advantage to the quintessential female hormone that naturally circulates in women’s bodies, as well as artificial forms given for hormone replacement therapy and estrogen-like chemicals found in the environment.

Recent studies have shown that estrogen can hamper replication of viruses including HIV, Ebola and hepatitis, which can lessen an infection’s severity and make an infection less likely to spread to other people. But the new study’s leader, Sabra L. Klein, Ph.D., an associate professor in the Departments of Molecular Microbiology and Immunology and Biochemistry and Molecular Biology at the Bloomberg School, says it was unknown whether estrogen might have the same effect on the flu virus.

To investigate, she and her colleagues collected cells from the nasal passage — typically the first cells in the body to get infected with the flu — from female and male volunteers. The researchers exposed batches of these cells to different types of estrogens, including normal levels of naturally occurring estrogen, different types of selective estrogen receptor modulators (SERMs, synthetic estrogen-like chemicals used for hormone replacement therapy and infertility treatment, among other uses) or bisphenol A, an estrogen-like chemical found in many plastics. They then exposed cells to the influenza A virus, a variant of the flu virus that circulates each year during the flu season.

Tests showed that female cells that received estrogens, including some types of SERMs and bisphenol A, had marked reductions in viral replication – nearly 1,000-fold less than in cells that hadn’t been exposed to these hormones. Further investigations showed that the hormones causing this striking effect act on estrogen receptor beta, one of two types of estrogen receptor inside cells.

Klein explains that even though men produce estrogen, their cells have far fewer receptors for the hormone. That might be why estrogen didn’t have the same protective effects against flu virus replication in cells from men.

When Klein and her colleagues looked for a mechanism behind estrogen’s protective effect, they found that binding to estrogen receptor beta decreased the activity of more than 30 genes involved in cell metabolism, slowing the metabolic activity of these cells and potentially preventing them from manufacturing viral particles.

Klein notes that because hormone levels cycle in pre-menopausal women, it’s unlikely that there’s a population-wide effect in protecting this group against the flu. But, she says, the new findings suggest that hormones that women already take for contraception, hormone replacement therapy, infertility treatments or other medical uses could play a powerful role in reducing infection. She doesn’t suggest starting hormone therapies for this reason, however, because estrogens can have other biological effects ranging from strengthening bones to increasing risks for some types of cancer.

“If women are taking estrogen-like hormones for other reasons, an added benefit might be less susceptibility to influenza during the flu season,” Klein says. The findings could be particularly important for elderly women, she adds, since this population is most susceptible to severe influenza.

“Being on hormone replacement therapy could be one way to mitigate the severity of this disease, which is exciting, simple and cheap,” she says. “While the decision to take hormone therapy should always depend on a patient’s history and include discussion with their care providers, our study shows another potential benefit to this hormone.”

“Estrogenic compounds reduce influenza A virus replication in primary human nasal epithelial cells derived from female, but not male, donors” was written by Jackye Peretz, Andrew Pekosz, Andrew P. Lane and Sabra L. Klein.

This research was supported by grants from the Center for Alternatives to Animal Testing at the Johns Hopkins Bloomberg School of Public Health, the National Institutes of Health’s National Institute on Aging (R01AI097417, R01AI72502 and T32 AI007417) and the U.S. Department of Health and Human Services (HHSN272201400007C).

Why do my muscles ache the day after exercise?

Health Check: why do my muscles ache the day after exercise?

September 14, 2015 2.06pm AEST

Think of it as a useful signal from your body. lzf/Shutterstock

It’s normal to experience muscle pain after exercising if it’s been a while since you were active or performed a certain movement. This type of pain – called delayed onset muscle soreness or DOMS – generally develops several hours later and worsens over the next few days.

Exercise that induces DOMS involves eccentric (lengthening) muscle contractions, in which a muscle is lengthened while it contracts. Walking down a set of stairs or a slope, where the front thigh muscles lengthen while supporting the body weight, is one example of eccentric exercise.

Another is using weights, such as dumbbells. When lowering a heavy object slowly from a flexed to an extended elbow position, the muscles that flex the elbow joint perform eccentric exercise, since the external load (the dumbbell) is greater than the force generated by the muscle.

Exercise consisting of mainly concentric (shortening) contractions, where muscles contract and are shortened, such as walking up stairs and lifting a dumbbell, does not induce DOMS at all.

DOMS is technically considered an indicator of “muscle damage”, as muscle function decreases and, in some cases, muscle-specific proteins increase in the blood, indicating plasma membrane damage. But it appears that very few muscle fibres are actually injured or destroyed (less than 1% of total muscle fibres).

Interestingly, other structures such as fascia (the sheath of tissue surrounding the muscle) and connective tissue within the muscle appear to be more affected by eccentric contractions.

Structure of skeletal muscle. Designua/Shutterstock

A study my colleagues and I recently published tested the hypothesis that fascia would become more sensitive than muscle when DOMS is induced. We probed the muscles of volunteer eccentric exercisers with an acupuncture needle designed to introduce a steadily increasing electrical current from its tip, until they reported muscle pain.

The results showed that DOMS was associated with the increased sensitivity of muscle fascia to the stimulus, suggesting the source of pain is fascia (connective tissue) rather than the muscle fibres themselves.

We still don’t know how eccentric contractions affect the connective tissue surrounding muscle fibres. It’s possible that muscle fibres and connective tissue have different levels of elasticity. So, as the contracting muscle is stretched, a shear force may develop between muscle fibres and their surrounding connective tissue. This may damage the structure and cause inflammation.

It’s still a mystery why there’s a delay between the exercise and muscle soreness. Researchers speculate that it’s due to the time it takes for inflammation to develop after the micro-injury.

It doesn’t appear that DOMS is a warning sign not to move the affected muscles, since moving them eases the pain and does not hamper recovery. DOMS may simply be a message from the body that the muscle had lacked a good stimulus for a while, and has now received one.

But is it necessary for developing bigger and stronger muscles?

There’s no scientific evidence to support the theory of “no pain no gain”. Research shows eccentric exercise training produces greater increases in muscle strength and size when compared with concentric exercise training, but this is not necessarily associated with “muscle damage.”

Don’t be afraid of DOMS, although it can bother you for several days after exercise. DOMS lessens when the same eccentric exercise is repeated, and if the intensity and volume of eccentric exercise are increased gradually, you can minimise it.

In the meantime, think of DOMS as a useful signal from your body.

Menopause and exercise

Surprise, surprise: exercise is actually good for you. Make your New Year’s resolution to be more active – in any way you can.
Menopause: Clinical Corner: Invited Review

Menopause and exercise

Grindler, Natalia M. MD; Santoro, Nanette F. MD

Abstract

Objective: Accumulating data suggest that regular physical exercise reduces mortality and extends the functional life span of men and women. This review seeks to describe the current state of the medical literature on this topic.

Methods: A narrative review of the current medical literature including randomized clinical trials and clinical guidelines that address the benefits of physical fitness and regular exercise on the health of midlife and postmenopausal women.

Results: Reduction and avoidance of obesity and its related comorbidities (hypertension, glucose intolerance, dyslipidemia, and heart disease) are one major benefit of exercise. However, long-term physical exercise is also associated with reduced rates of cancer, dementia and cognitive decline, adverse mood and anxiety symptoms, and reduction of osteoporosis, osteopenia, falls, and fractures. Beneficial physical activity includes exercise that will promote cardiovascular fitness (aerobic), muscle strength (resistance), flexibility (stretching), and balance (many of the preceding, and additional activities such as yoga).

Conclusions: Given that it is unambiguously beneficial, inexpensive, and of minimal risk, maintaining a healthy exercise regimen should be a goal for every participant to enhance lifelong wellness. Clinicians should use a number of behavioral strategies to support the physical fitness goals of their participants.

Estrogen and Memory

This study repeats something I have been on about over the last few years: the benefits of oestrogen to women’s mental state in menopause, specifically memory and avoiding dementia and Alzheimer’s. However, it is important to know that the benefits only accrue if the oestrogen is started soon after the onset of menopause (the “window of opportunity”).
Endocrinology. 2013 Aug; 154(8): 2570–2572.
PMCID: PMC3713207

Is Timing Everything? New Insights into Why the Effect of Estrogen Therapy on Memory Might be Age Dependent

Despite controversy concerning relative risks and benefits, hormone therapy (HT) remains the most effective treatment for vasomotor symptoms associated with menopause (1). Understanding the factors that determine the health outcomes associated with HT is therefore a key priority. One critical factor is the age at which treatment is initiated. Perhaps the most compelling evidence in support of that view comes from the very study that raised concerns about the risks of HT, the Women’s Health Initiative (WHI). After an average of 7.1 years of follow-up, the WHI Estrogen-Alone Trial was stopped early because of an increased risk of stroke, equivalent to an absolute excess risk of 12 additional strokes per 10,000 person-years (2). After a mean of 10.7 years of follow-up in that trial, the impact of conjugated equine estrogens (CEE) on health outcomes, including coronary heart disease, myocardial infarction, colorectal cancer, total mortality, and a global index of chronic disease, was more favorable for women aged 50–59 at treatment initiation compared with women who initiated treatment at older ages (3).

Within the 50- to 59-year-old group, the risk of coronary heart disease and total myocardial infarction was significantly reduced with CEE (3). (The risk of stroke was no longer elevated during the postintervention follow-up period). Thus early age at initiation was a critical determinant of the long-term impact of estrogen treatment. Such data support the hypothesis that estrogen treatment may confer benefits when initiated early, a hypothesis that is referred to in the literature as the “timing hypothesis,” the “window of opportunity hypothesis,” or the “critical window hypothesis.”

The timing hypothesis has been proposed to explain the impact of HT on memory and risk for Alzheimer’s disease (4). Meta-analyses of observational studies revealed a significant 29% decreased risk of Alzheimer’s disease in women who used HT (5). In contrast, the Women’s Health Initiative Memory Study (WHIMS) revealed a doubling of the risk of all-cause dementia in older women (mean age 72 years) randomized to receive CEE plus medroxyprogesterone acetate (CEE/MPA) and no increased or decreased risk of all-cause dementia in women randomized to receive CEE alone (6, 7). The discrepancy between the observational studies and WHIMS has been attributed to differences in the timing of treatment initiation, with younger ages in women in the observational studies compared with the older women in WHIMS.

To date, 3 of 3 observational cohort studies that specifically examined the timing of HT initiation on the risk for Alzheimer’s disease have provided support for the timing hypothesis (8–10). An ancillary study, the WHIMS of Younger Women, found no impact of either CEE or CEE/MPA on a measure of global cognitive function in women who were randomized to HT between the ages of 50 and 55 and who completed the assessment of global cognitive function on average 7.2 years after the trials ended, suggesting that treatments may be safer for younger women (11). It is unknown what percentage of women in the CEE-alone arm of WHIMS of Younger Women might have initiated CEE treatment during the critical window, because the average number of years since final menstrual period in the CEE group at the time of randomization was 9 years, and about 40% of women in that study arm had bilateral oophorectomy (12). Neuroimaging studies in humans indicate that verbal memory and hippocampal function are enhanced with early initiation of HT (13). Such human data are consistent with evidence from animal studies that the impact of 17β-estradiol (E2) on hippocampal function varies with age and is enhanced with early initiation (14–16).

In this issue of Endocrinology, Rao and colleagues (17) provide new insights into mechanisms that might explain why the impact of E2 on cognition and brain function might depend on age at E2 treatment. They focused on microRNAs (miRNAs), which are small noncoding RNAs that regulate gene expression by binding to complementary sequences in the 3′-untranslated region of target mRNAs, usually resulting in their silencing. A subset of miRNAs are known to mediate estrogen-regulated activities, and/or are regulated by estradiol (18). The hypothesis was that E2 might differentially regulate miRNA expression in the ventral hippocampus of female rats, leading to altered gene expression important for neuronal function. The ventral hippocampus was chosen as the primary brain region of interest because of its critical role in memory function. To test this hypothesis, they treated ovariectomized 3-month-old (young adult) and 18-month-old (late middle age) female rats with sc E2 (2.5 μg) or vehicle control for 3 days. The E2 dose resulted in physiological E2 levels comparable to those observed during late diestrus/early proestrus, a phase of the rat estrous cycle shortly before the preovulatory increase in estradiol. The main question was how E2 treatment, age, and the interaction of E2 and age influenced miRNA expression.

In the first step of identifying candidate miRNAs in the left ventral hippocampus, the investigators used a microarray platform to probe 723 miRNAs and identify those with moderate-to-very high expression levels and high signal intensity. Of the 723 miRNAs, 34 were significantly altered by E2 treatment regardless of age, 21 were significantly altered by age regardless of treatment, and 9 were altered by a combination of the 2 factors. In a second validation step, real-time quantitative PCR of RNA taken from the right ventral hippocampus confirmed that 7 of those 9 miRNAs were altered. Specifically, E2 treatment significantly impacted 6 of the 7 miRNAs tested, age affected 2 of the 7, and the 2 factors combined affected 2 of the 7. Target prediction programs were then used to understand the physiologic impact of the E2-regulated miRNAs. Based on those predictions, levels of sirtuin 1 (SIRT1), glucocorticoid receptor, γ-aminobutyric acid receptor A1, and brain-derived neurotrophic factor (BDNF) mRNA expression were measured and compared across treatment groups. Because degradation of mRNA transcripts is one mechanism whereby miRNAs decrease the protein expression of their target genes, it was expected that miRNA expression would relate inversely to target mRNA expression. Indeed, that was the case for SIRT1, which was down-regulated in young, E2-treated animals. SIRT1 expression has been linked to memory and neuronal plasticity (19), as well as anxiety through effects on monoamine oxidase (20). There was also a strong trend for an E2-mediated increase in BDNF in the young animals. Notably, E2 enhances hippocampal expression of BDNF, a growth factor that increases dendritic spines and enhances memory function (21, 22). Overall, these results indicate that E2 is a critical regulator of miRNA expression in the ventral hippocampus generally, affecting gene targets that influence hippocampal function and memory.

Lastly, to determine whether these effects were specific to the ventral hippocampus, the authors examined miRNA expression levels in 3 other brain regions that are functionally connected to the ventral hippocampus: dorsal hippocampus, central amygdala, and paraventricular nucleus. Results demonstrated that magnitude of E2 action on miRNA expression differed by brain region, with most brain regions having unique miRNAs that were significantly affected by both age and treatment. The exception was that E2 altered expression of miR-7a, which targets SIRT1, in multiple brain regions. Notably, treatment with SIRT1-activating compounds has been shown to delay neurodegeneration in a transgenic mouse model and is thought to be a key mechanism involved in the beneficial effects of caloric restriction on memory and neuronal function (23) as well as the neuroprotective effects of resveratrol (24).

The demonstration that E2 alters miRNA expression in an age-dependent manner opens up several avenues for future studies. First, as noted by the authors, a 3-day treatment exposure was sufficient to demonstrate proof of concept that E2 can impact miRNA expression, but longer-term treatment is needed to better model how E2 is used in clinical practice. Second, whether early and sustained E2 treatment influences miRNA expression long after treatment is discontinued is an important area of future inquiry. Women who have an oophorectomy before the normal age at menopause show an increased risk for cognitive impairment or dementia later in life unless they are treated with estrogen until the normal age at menopause (25). Third, the study provides justification for investigating SIRT1 as a potential mediator of the impact of E2 on memory and hippocampal function. SIRT1 has been implicated in the disruption of mitochondrial bioenergetics in Alzheimer’s disease and mild cognitive impairment (26). Disruption of mitochondrial bioenergetics, in turn, has been implicated as a potential mechanism explaining the negative impact of estrogen on cognition when initiated in older women (27). Fourth, the increase in dementia observed with CEE/MPA rather than CEE alone suggests potential deleterious effects of MPA on brain function in older women (6, 7). Understanding how MPA and other progestins might alter the impact of E2 on miRNA levels can help to guide future clinical trials to identify estrogen/progestin combinations that might confer cognitive benefits. Lastly, in light of evidence that estrogen effects on cardiovascular disease and mortality are age dependent (3), it will be important to expand the evaluation of miRNAs in tissues other than brain.

In conclusion, there is compelling evidence that the impact of estrogen treatment on health outcomes varies with age and that health benefits may be most evident with early treatment. The precise mechanisms underlying such effects remain poorly understood. However, findings on miRNA expression presented by Rao and colleagues (17) provide new evidence that age-dependent effects on memory may be due to E2 modulation of neuronal target genes that alter gene expression and influence hippocampal function.

So you think you have IBS, coeliac disease or Crohn’s? Here’s what it might mean for you

August 20, 2015 2.48pm AEST

See your doctor if you suffer from gastrointestinal symptoms, particularly if you’ve had them for weeks or months. Holly Lay/Flickr, CC BY

Disclosure statement

Dr Rebecca Reynolds is a registered nutritionist and the owner of The Real Bok Choy www.therealbokchoy (formerly Wholesomely), a nutrition and lifestyle company that provides evidence-based information and advice.

Partners

UNSW Australia provide funding as a member of The Conversation AU.

Conditions affecting the gastrointestinal tract are common in modern humans and many are on the rise. The gastrointestinal tract extends from the mouth to the anus, via the stomach and the bowels, which include the small intestine and the large intestine (colon).

Around one in five Australians suffers symptoms of irritable bowel syndrome (IBS) at some point in their life. Around one in 70 have coeliac disease (though many don’t know they have it). Inflammatory bowel disease (IBD), which usually manifests as Crohn’s disease or ulcerative colitis, is less common, affecting three in 10,000 Australians.

The gastrointestinal tract. Blamb/Shutterstock

Irritable bowel syndrome is also called irritable colon. People with IBS have sensitive large intestines that are easily aggravated.

Coeliac disease is an autoimmune condition in which the body reacts abnormally to gluten, which is found in wheat, oats, rye and barley. (An easy way to remember this is the acronym WORB.) This abnormal reaction to gluten causes damage and inflammation to the small intestine.

Coeliac disease is not a food allergy or intolerance. Some people can be sensitive to gluten, but not have coeliac disease. This is called non-coeliac gluten sensitivity.

In inflammatory bowel disease, the gastrointestinal tract becomes swollen and red from inflammation. Abscesses and cracks can develop in any part of the tract in Crohn’s, while open sores called ulcers usually affect the large intestine in ulcerative colitis.

The causes of these gastrointestinal conditions are not well-understood, but may include a combination of genetic and environmental factors, such as infection, psychological stress and diet. Researchers have reported associations, for instance, between higher intakes of total fat and meat and an increased risk of developing Crohn’s disease and ulcerative colitis.

Symptoms

Symptoms of these gastrointestinal tract conditions include bloating, cramps, abdominal pain, excessive wind, diarrhoea, constipation, nausea, fatigue, mucus or blood in stools, body aches, weight loss and nutrient deficiencies.

Suffering from a gastrointestinal condition can be very stressful. Imagine being in constant abdominal pain and your toilet habits alternating between diarrhoea and constipation. Or your gut becoming so inflamed you have to go to hospital.

See your doctor if you suffer from these symptoms, particularly if you’ve had them for weeks or months; don’t wait years. Due to similarities in the symptoms of these gastrointestinal conditions, diagnosis often takes some time. It may also be necessary to investigate bowel cancer as a possibility.

It’s best not to self-diagnose or self-treat. If you remove gluten from your diet and feel better, for instance, that doesn’t automatically mean that you have coeliac disease and need to carefully avoid gluten for life.

Treatment

Many gastrointestinal afflictions cannot be cured, but can be managed with combinations of medication, diet and psychological treatment.

Thankfully, irritable bowel syndrome can sometimes resolve over time and leave no long-term damage in the gastrointestinal tract. A 2015 review found that for the management of IBS symptoms, an individual’s diet, lifestyle and medical and behavioural factors must be taken into account. Because of a link between IBS and stress, psychological therapy has also been shown to reduce symptom severity and improve quality of life.

FODMAP stands for fermentable oligosaccharides, disaccharides, monosaccharides and polyols. These nutrients are poorly digested and absorbed in the small intestine, and therefore reach the large intestine, where they are fermented by bacteria. A low FODMAP diet and certain probiotics may also help ease IBS symptoms, although the long-term benefits of a low FODMAP diet are unclear.

In comparison, coeliac disease cannot be cured and must be managed with a strict, lifelong gluten-free diet to prevent small intestinal damage. And I mean strict. Even the gluten in a wheat bread crumb can cause bowel injury.

Conversely, symptoms associated with non-coeliac gluten sensitivity may indeed be due to gluten, or may be associated with other dietary components. Recent research has implicated FODMAPs in non-coeliac gluten sensitivity. As with the dietary management of IBS, a diet low in or free from gluten and/or FODMAPs may improve non-coeliac gluten sensitivity.

Inflammatory bowel disease cannot be cured and is often managed with medications such as steroids and immunomodulators that control the high levels of gut inflammation. There is currently insufficient evidence to suggest dietary changes can treat inflammatory bowel disease, but future directions may involve manipulation of gut bacteria using combinations of antibiotics, prebiotics, probiotics and diet.

So, what can I eat?

The first step is to find out if you actually have a clinical gastrointestinal problem, which you can only do by consulting with a medical professional and having appropriate tests. Any dietary advice will depend on this diagnosis, as well as your individual situation.

Avoiding FODMAP-containing foods if you have non-coeliac gluten sensitivity or IBS may help ease symptoms and improve quality of life. This means cutting out otherwise healthy, fibre- and nutrient-rich foods, such as apples, onions and lentils.

The total removal of gluten from a coeliac’s diet means a rigid avoidance of not just bread and pasta but also many processed foods, including sauces, stocks, processed meats, ice cream, mayonnaise, vinegar and other products.

It also means shunning foods that are supposedly “gluten-free” but may have been contaminated with gluten by the use of shared apparatus, such as tongs to serve both gluten-free and other cookies in a cafe.

Avoiding gluten if you have non-coeliac gluten sensitivity may not need to be so strict.

For inflammatory bowel disease, the evidence may not yet be clear enough to prescribe nutrition therapy, but eating a healthy and balanced diet can’t hurt.

People with gastrointestinal problems may benefit from personalised dietary advice from a health professional, such as a dietitian. There are also several national organisations that provide essential advice and support, such as Coeliac Australia.

Your Brain, Your Disease, Your Self

Credit: Gérard DuBois

Neurodegenerative diseases that target the motor system, like amyotrophic lateral sclerosis, can lead to equally devastating consequences: difficulty moving, walking, speaking and eventually, swallowing and breathing. Yet they do not seem to threaten the fabric of selfhood in quite the same way.

Memory, it seems, is central to identity. And indeed, many philosophers and psychologists have supposed as much. This idea is intuitive enough, for what captures our personal trajectory through life better than the vault of our recollections?

The challenge in trying to determine what parts of the mind contribute to personal identity is that each neurodegenerative disease can affect many cognitive systems, with the exact constellation of symptoms manifesting differently from one patient to the next. For instance, some Alzheimer’s patients experience only memory loss, whereas others also experience personality change or impaired visual recognition.

The only way to tease apart which changes render someone unrecognizable is to compare all such symptoms, across multiple diseases. And that’s just what we did, in a study published this month in Psychological Science.

What we found runs counter to what many people might expect, and certainly what most psychologists would have guessed: The single most powerful predictor of identity change was not disruption to memory — but rather disruption to the moral faculty.

We surveyed 248 family members of people who had one of three types of neurodegenerative disease: Alzheimer’s, A.L.S. or frontotemporal dementia.

Frontotemporal dementia is the second most common form of dementia after Alzheimer’s. It obliterates executive function in the brain, impairing self-control and scrambling the moral compass. People with the disease are prone to antisocial outbursts, apathy, pathological lying, stealing and sexual infidelity.

In one part of the survey, we asked the family members questions designed to evaluate identity persistence. For instance, did they feel like they still knew who the patient was? Did the patient ever seem like a stranger?

We found that people with frontotemporal dementia exhibited the highest degree of identity change, and that people with A.L.S. exhibited the least. People with Alzheimer’s were somewhere between these two extremes.

While this result was suggestive, it still didn’t tell us which specific symptoms were causing the patients to no longer seem like themselves. For this, we would need to collect a detailed history of the scope and extent of the symptoms that each patient had experienced.

So in another part of the survey, we asked about basic cognitive faculties, like executing voluntary movements and object recognition; about the patient’s memory for words and facts and autobiographical details; about emotional changes like agitation and depression; about nonmoral personality change, like extroversion, sense of humor, creativity and intelligence; and about moral character and moral behavior changes, such as empathy, honesty and compassion.

We found that disruptions to the moral faculty created a powerful sense that the patient’s identity had been compromised. Virtually no other mental impairment, including amnesia, personality change, loss of intelligence, emotional disturbances and the loss of the ability to perform basic daily tasks, led people to stop seeming like themselves.

For those with Alzheimer’s, neither degree nor type of memory impairment impacted perceived identity. All that mattered was whether their moral capacities remained intact.

As monstrous as neurodegenerative disease is, its powers of identity theft have been greatly exaggerated. Remarkably, a person can undergo significant cognitive change and still come across as fundamentally the same person.

What makes us recognizable to others resides almost entirely within a relatively narrow band of cognitive functioning. It is only when our grip on the moral universe loosens that our identity slips away with it.

Testing Testosterone: Trial Finds No Link to Hardening of the Arteries

This study should reassure men that testosterone does not increase the risk of heart disease, as many believe. Another finding is that testosterone is not a panacea for older men’s energy and sexual problems. It may be only one part of treatment, alongside the complete lifestyle package: weight loss, diet, exercise, judicious use of supplements, lower stress and other aspects of life that I cover from time to time.

Aug 11, 2015

Testosterone sales have grown rapidly over the last decade, but few studies have examined the long-term effects of taking testosterone on cardiovascular health and other important outcomes. This week, investigators from Brigham and Women’s Hospital (BWH) report the results of the Testosterone’s Effects on Atherosclerosis Progression in Aging Men (TEAAM) trial in The Journal of the American Medical Association (JAMA). The three-year study finds that testosterone administration had no effect on the progression of hardening of the arteries in older men with low to low normal testosterone levels and did not significantly improve sexual function or health-related quality of life.

“The results of this trial suggest that testosterone should not be used indiscriminately by men,” said corresponding author Shalender Bhasin, MD, director of BWH’s Research Program in Men’s Health: Aging and Metabolism and director of the Boston Claude D. Pepper Older Americans Independence Center at BWH. “We find that men with low and low normal testosterone are unlikely to derive benefits in terms of sexual function or quality of life, two reasons why men may seek testosterone therapy. And although we find that testosterone did not affect the rate of hardening of the arteries, we need long-term data from large trials to determine testosterone’s effects on other major cardiovascular events.”

Atherosclerosis: The TEAAM trial measured two indicators of atherosclerosis, or the buildup of plaque in the arteries, in more than 300 men.

Testosterone, a hormone primarily secreted by the testicles, plays a key role not only in male reproductive tissues but also in muscle growth, bone mass and body hair. As men get older, their testosterone levels naturally decline – on average by 1 percent a year after age 40. Previous studies that have aimed to examine rates of cardiovascular events in men taking testosterone have reported conflicting results but have raised concerns that testosterone therapy might increase a person’s risk of a heart attack or stroke. Atherosclerosis, or the buildup of plaque in the arteries, is a critical risk factor for such cardiovascular events.

In the three-year, double-blind TEAAM trial, the research team enrolled more than 300 men over the age of 60 with total testosterone levels between 100 and 400 ng/dL (low to low normal range) and measured two indicators of atherosclerosis: calcium deposits in the arteries of the heart (coronary artery calcification) and the thickness of the inner lining of the carotid arteries that supply blood to the brain (common carotid artery intima-media thickness). To measure the secondary outcomes of sexual function and health-related quality of life, participants also completed a 15-item questionnaire. Participants applied a testosterone or placebo gel daily for three years.

“Our study has important implications for clinical practice, and for older men who are seeking testosterone therapy,” said Bhasin. “Many men, as they get older, experience a decline in testosterone and in sexual function and vitality. But our study finds that taking testosterone, when levels are in the low to low normal range, may not improve sexual function or quality of life.”

The TEAAM trial was designed to examine atherosclerosis progression and not cardiovascular events — further studies will be needed to determine the cardiovascular safety of testosterone use in older men. The research team also notes that comparing patients using statins to those who are not could be another important direction for future studies.

This work was supported by Solvay Pharmaceuticals, Inc., AbbVie Pharmaceuticals, Inc., the Aurora Foundation, the Boston Claude D. Pepper Older Americans Independence Center grant 5P30AG031679, and Boston University’s Clinical and Translational Science Institute grant 1UL1RR025771. Testosterone and placebo gel for the study were provided by Solvay Pharmaceuticals, Inc., and later by AbbVie Pharmaceuticals.

Why some people get grumpy when they’re hungry

Health Check: the science of ‘hangry’, or why some people get grumpy when they’re hungry

July 20, 2015 1.14pm AEST

There are many reasons why some people get very grumpy when they haven’t eaten for a while. Katie Inglis/Flickr, CC BY-NC-ND

Have you ever snapped angrily at someone when you were hungry? Or has someone snapped angrily at you when they were hungry? If so, you’ve experienced “hangry” (an amalgam of hungry and angry) – the phenomenon whereby some people get grumpy and short-tempered when they’re overdue for a feed.

But where does hanger come from? And why is it that only some people seem to get hangry? The answer lies in some of the processes that happen inside your body when it needs food.

The physiology of hanger

The carbohydrates, proteins and fats in everything you eat are digested into simple sugars (such as glucose), amino acids and free fatty acids. These nutrients pass into your bloodstream from where they are distributed to your organs and tissues and used for energy.

As time passes after your last meal, the amount of these nutrients circulating in your bloodstream starts to drop. If your blood-glucose levels fall far enough, your brain will perceive it as a life-threatening situation. You see, unlike most other organs and tissues in your body which can use a variety of nutrients to keep functioning, your brain is critically dependent on glucose to do its job.

You’ve probably already noticed this dependence your brain has on glucose; simple things can become difficult when you’re hungry and your blood glucose levels drop. You may find it hard to concentrate, for instance, or you may make silly mistakes. Or you might have noticed that your words become muddled or slurred.

Another thing that can become more difficult when you’re hungry is behaving within socially acceptable norms, such as not snapping at people. So while you may be able to conjure up enough brain power to avoid being grumpy with important colleagues, you may let your guard down and inadvertently snap at the people you are most relaxed with or care most about, such as partners and friends. Sound familiar?

Another bodily response

Besides a drop in blood-glucose concentrations, another reason people can become hangry is the glucose counter-regulatory response. Let me explain.

When blood-glucose levels drop to a certain threshold, your brain sends instructions to several organs in your body to synthesise and release hormones that increase the amount of glucose in your bloodstream.

The four main glucose counter-regulatory hormones are: growth hormone from the pituitary gland situated deep in the brain; glucagon from the pancreas; and adrenaline, which is sometimes called epinephrine, and cortisol, which are both from the adrenal glands. These latter two glucose counter-regulatory hormones are stress hormones that are released into your bloodstream in all sorts of stressful situations, not just when you experience the physical stress of low blood-glucose levels.

In fact, adrenaline is one of the major hormones released into your bloodstream with the “fight or flight” response to a sudden scare, such as when you see, hear or even think something that threatens your safety. Just as you might easily shout out in anger at someone during the “fight or flight” response, the flood of adrenaline you get during the glucose counter-regulatory response can promote a similar response.

Nature and nurture

Another reason hunger is linked to anger is that both are controlled by common genes. The product of one such gene is neuropeptide Y, a natural brain chemical released into the brain when you are hungry. It stimulates voracious feeding behaviours by acting on a variety of receptors in the brain, including one called the Y1 receptor.

Besides acting in the brain to control hunger, neuropeptide Y and the Y1 receptor also regulate anger or aggression. In keeping with this, people with high levels of neuropeptide Y in their cerebrospinal fluid also tend to show high levels of impulse aggression.

As you can see, there are several pathways that can make you prone to anger when you’re hungry. Hanger is undoubtedly a survival mechanism that has served humans and other animals well. Think about it like this: if hungry organisms stood back and graciously let others eat before them, their species could die out.

While many physical factors contribute to hanger, psychosocial factors also have a role. Culture influences whether you express verbal aggression directly or indirectly, for instance.

And as we are all different across all of these factors, it’s little wonder there are differences in how angry people seem to get when they’re hungry.

Dealing with hanger

The easiest way to handle hanger is to eat something before you get too hungry. While you may hanker for quick-fix foods, such as chocolate and potato chips, when you’re in the throes of hanger, junk foods generally induce large rises in blood-glucose levels that come crashing down fast.

Ultimately, they may leave you feeling hangrier. So think nutrient-rich, natural foods that help satisfy hunger for as long as possible, without excess kilojoules.

Eating as soon as you are hungry may not always be possible. This may be the case during long shifts at work, for instance, or through religious fasts such as Ramadan, or during weight-loss diets that involve severe energy restriction (such as intermittent fasting diets). All of these should only be done if your doctor has given you the all-clear.

In these cases, it can help to remember that, with time, your glucose counter-regulatory response will kick in and your blood-glucose levels will stabilise. Also, when you go without food, your body starts breaking down its own fat stores for energy, some of which are converted by your body into ketones, a product of fat metabolism. Ketones are thought to help keep your hunger under control because your brain can use ketones in place of glucose for fuel.

A final – and very civilised – way of handling hanger is to suggest that difficult situations be dealt with after food, not before!