Monthly Archives: April 2015

Cooking oils to eat and avoid

13 October 2014, 2.59pm AEDT

Health Check: cooking oils to eat and avoid

Try to avoid too many saturated fatty acids by choosing oils that are liquid at room temperature. Marjan Lazarevski/Flickr, CC BY-ND

Health conscious consumers are increasingly ditching old favourites vegetable and canola oil for trendy alternatives like coconut and peanut oil. But are they any healthier? And how do they compare with other options such as heated olive oil and butter?

The short answer is, it depends. The long version requires a quick lesson in food science.

Fats and oils = lipids

Edible fats and oils are all part of the lipid family. Fats, such as butter, are solid at room temperature and oils are liquid; their solidity depends on their chemical makeup.

Both are composed of triglycerides, which have a glycerol backbone and three fatty acids attached. The fatty acids are the important part of the molecule and can be of different lengths and have different numbers of double bonds. They are organised into three groups:

  • Saturated fats are mainly derived from animal products such as meat and dairy. They have no double bonds.
  • Monounsaturated fats are found in olive oil, avocados and macadamia nuts. They have one double bond.
  • Polyunsaturated fats come from corn, seed and nut oils (omega 6) and seafood (omega 3). They have two or more double bonds.

Fats and oils always include a range of saturated and unsaturated fatty acids. The greater the proportion of saturated fatty acids, the more solid the fat or oil will be at room temperature.

Why we need fats and oils

Once digested, we use the fatty acids to maintain the function of our body’s cells and cell membranes, in hormones and in neurotransmitters. They’re also important for our skin, hair and nails, and they keep us warm and cushion our internal organs.

Fats and oils are used for energy production and provide the most energy per gram of all the macronutrients (fat 37kJ/g; carbohydrate 16kJ/g; and protein 17kJ/g).

Use sparingly: butter is a saturated fat. Taryn/Flickr, CC BY-SA

But it’s important to balance the amount of fat we eat and the energy we expend in order to prevent obesity.

The type of fat eaten is also important: saturated fatty acids tend to raise LDL (bad) cholesterol and the risk of heart disease, while mono and polyunsaturated fatty acids tend to lower it.

Therefore, the greater the proportion of mono and polyunsaturated fatty acids contained in a particular fat or oil, the healthier it is.

Cooking with fats and oils

The longer the fatty acid chain and the fewer the double bonds, the hotter the fat can be heated without it starting to break down – or, in scientific terms, be oxidised. The temperature at which this oxidation occurs is called the smoke point, and it is associated with unpleasant odours and flavours.

To fry food you need a high temperature and therefore a high smoke point, so saturated fats with long chain lengths work best.

Tallow (rendered beef fat) or ghee (rendered butter) are often used for frying, but these are saturated fats, so they’re among the least healthy.

Butter is also a saturated fat, but has other components as well. The dairy proteins, relatively high water content and short chain fatty acids mean butter is great for browning food, but not for frying as it starts to splatter when heated to a high temperature.

Vegetable oil is a term used for any oil of vegetable or seed origin (that is, not animal-based), or a blend of such oils. It is made up mainly of polyunsaturated fats of different chain lengths, so it’s one of the healthier options.

Canola oil, which was bred from rapeseed, was developed specifically for frying as it contains predominantly longer chain monounsaturated fatty acids and has a relatively high smoke point.

Canola oil is harvested from rapeseed. rytc/Flickr, CC BY-NC-ND

Peanut oil is made up mainly of long chain omega 6 (polyunsaturated) fatty acids. It has a high smoke point and is also good for frying.

Olive oil is predominantly monounsaturated fatty acids, but has the added benefit of polyphenolic compounds which act as antioxidants and contribute to the health qualities of this oil. Olive oil does not have a high smoke point, so should only be used for low-temperature cooking.

Cold-pressed olive oil is the best choice, as the oil is extracted from the olive without heat or chemical processing. Olive oil is easily oxidised, so it should be stored in a dark place in a coloured bottle.

Fats and oils are also used in baking, adding taste and texture to cakes, biscuits and pastry. The best fat for this role is saturated fat.

Currently there are no mono or polyunsaturated fats that can completely replace the role of saturated fats in baking, and this is one of the reasons we suggest eating these foods only occasionally.

New kids on the block

Coconut oil has 85-90% saturated fatty acids. It has traditionally been used in curries, but its recent popularity with health conscious consumers has seen it added to all kinds of foods from muesli to smoothies.

The predominant fatty acid in coconut oil is lauric acid. Its shorter chain length is thought to be why coconut oil may not have the same effects on LDL (bad) cholesterol as other saturated fatty acids.

But there is little evidence for any health benefits of lauric acid at this point. To be prudent, it is best to use coconut oil occasionally as part of a healthy eating plan.

Adding coconut oil to everything won’t make it healthier. Meal Makeover Moms/Flickr, CC BY-ND

Flaxseed oil (also known as linseed oil) is claimed to have similar health benefits to marine omega 3 fatty acids. Flaxseed oil contains a high proportion of alpha-linolenic acid (ALA), which the body can, in theory, convert to the long-chain omega 3 fatty acids found in fish oil.

But we are not efficient at doing this and there is little evidence that flaxseed oil has the same protective effects as the omega 3 fatty acids from fish oil.

Flaxseed oil degrades rapidly when heated or exposed to sunlight and should be stored in a dark place in coloured glass. It is mostly used on salads and as an addition to cold dishes and drinks, but should not replace the two fish meals a week suggested by the National Heart Foundation.

Fats and oils are important in our diet, but should be used carefully and as part of a healthy eating plan. Try to avoid too many saturated fatty acids by choosing oils that are liquid at room temperature.

Is It Really Dementia?

Maybe it’s something else.

That’s what you tell yourself, isn’t it, when an older person begins to lose her memory, repeat herself, see things that aren’t there, lose her way on streets she’s traveled for decades? Maybe it’s not dementia.

And sometimes, thankfully, it is indeed some other problem, something that mimics the cognitive destruction of Alzheimer’s disease or another dementia — but, unlike them, is fixable.

“It probably happens more often than people realize,” said Dr. P. Murali Doraiswamy, a neuroscientist at Duke University Medical Center. But, he added, it doesn’t happen nearly as often as family members hope.

Several confounding cases have appeared at Duke: A woman who appeared to have Alzheimer’s actually was suffering the effects of alcoholism. Another patient’s symptoms resulted not from dementia but from chronic depression.

Dr. Doraiswamy estimates that when doctors suspect Alzheimer’s, they’re right 50 to 60 percent of the time. (The accuracy of Alzheimer’s diagnoses, even in specialized medical centers, is more haphazard than you would hope.)

Perhaps another 25 percent of patients actually have other types of dementia, like Lewy body or frontotemporal — scarcely happy news, but because these diseases have different trajectories and can be exacerbated by the wrong drugs, the distinction matters.

The remaining 15 to 25 percent “usually have conditions that can be reversed or at least improved,” Dr. Doraiswamy said.

In trying to tell the difference — not a job for amateurs — one key consideration is age, said Dr. Ronald C. Petersen, director of the Mayo Clinic’s Alzheimer’s center.

Dementia is highly age-related, he pointed out. In a 50-year-old, memory problems might very well have some other cause. But “the likelihood that a 75-year-old’s becoming forgetful over six to 12 to 18 months is due to something treatable and fixable is low,” Dr. Petersen said. “But not zero.”

Which points to another key question: speed of onset. Dementia tends to develop slowly; family members often realize, in retrospect, that an older person has shown subtle cognitive decline for years.

When a person’s mental state changes suddenly over a few days or weeks, however, “that’s not the usual picture of a degenerative disease,” Dr. Petersen said. “That means looking for something else.”

The list of other causes for dementia-like symptoms runs surprisingly long. Among the leading culprits, Dr. Doraiswamy said, are depression and anxiety. Like dementia, they can interfere with the ability to concentrate and remember.

He looks next for thyroid deficiency. “Thyroid problems are very prevalent, and thyroid has a huge effect on the brain at every age,” he said. Usually, “this can be relatively easily tested for and relatively easily fixed” with daily medication.

Vitamin deficiencies probably qualify as the most hoped-for scenario. Cognitive problems caused by lack of vitamin B1 (thiamine) or B12 are reversible with pills or injections.

Heavy drinking also causes memory loss. “If you stop drinking, if it’s not too late, the brain can repair itself,” Dr. Doraiswamy said. After years of alcoholism, “you may not be able to repair the damage, but you can keep it from getting worse.”

Sleep disorders, and in particular sleep apnea, can take a cognitive toll on older people. “Their cognitive function may become slower, with poor attention and concentration,” Dr. Petersen said. When patients with apnea use a C.P.A.P. machine, “they come back the next year markedly improved.”

Sleeping pills — you knew this was coming — and a variety of other drugs, especially in combination, frequently cause dementia-like symptoms, too.

“There’s a long list, several hundred drugs, both prescription and over the counter, that can impair memory,” Dr. Doraiswamy said. He rattled off a bunch: medications for nausea and urinary incontinence, older antihistamines like Benadryl, cardiac drugs, painkillers, certain antidepressants and anti-anxiety medications — yes, including benzodiazepines. Selectively deprescribing may help clear a patient’s head.

There’s more. Head injuries that lead to the blood clots called subdural hematomas. High blood pressure. Diabetes. Infections. A condition called normal pressure hydrocephalus. Delirium that develops during hospitalization.

Plus, older people can have any of these problems along with actual dementia. Treating the other causes may at least slow, though not stop, cognitive decline.

So it makes sense, Dr. Petersen said, to tell patients and families — many already terrified of dementia — that other causes exist. “We shouldn’t just dismiss them,” he said. “We scan the brain, do blood tests. We look for these other conditions. That’s common and not inappropriate.”

On the other hand, “I want to be realistic,” he said. “I do it softly at first, but I introduce the notion that we might not find something else.”

Because even though the list of other possibilities is long, so are the odds against restoring a patient to normal functioning. When it looks like dementia, sadly, most of the time it is.

Another nod to the hygiene hypothesis

It is becoming increasingly clear that being “too clean” is a probable cause of the alarming rise in allergies seen in the Western world, as well as the big increase in autoimmune diseases. Exposure to dirt and germs early in life seems to reduce our risk of these diseases.

Effects of early-life exposure to allergens and bacteria on recurrent wheeze and atopy in urban children

Susan V. Lynch, PhD, et al.

Background

Wheezing illnesses cause major morbidity in infants and are frequent precursors to asthma.

Objective

We sought to examine environmental factors associated with recurrent wheezing in inner-city environments.

Methods

The Urban Environment and Childhood Asthma study examined a birth cohort at high risk for asthma (n = 560) in Baltimore, Boston, New York, and St Louis. Environmental assessments included allergen exposure and, in a nested case-control study of 104 children, the bacterial content of house dust collected in the first year of life. Associations were determined among environmental factors, aeroallergen sensitization, and recurrent wheezing at age 3 years.

Results

Cumulative allergen exposure over the first 3 years was associated with allergic sensitization, and sensitization at age 3 years was related to recurrent wheeze. In contrast, first-year exposure to cockroach, mouse, and cat allergens was negatively associated with recurrent wheeze (odds ratio, 0.60, 0.65, and 0.75, respectively; P ≤ .01). Differences in house dust bacterial content in the first year, especially reduced exposure to specific Firmicutes and Bacteroidetes, were associated with atopy and atopic wheeze. Exposure to high levels of both allergens and this subset of bacteria in the first year of life was most common among children without atopy or wheeze.

Conclusions

In inner-city environments children with the highest exposure to specific allergens and bacteria during their first year were least likely to have recurrent wheeze and allergic sensitization. These findings suggest that concomitant exposure to high levels of certain allergens and bacteria in early life might be beneficial and suggest new preventive strategies for wheezing and allergic diseases.

Why do we have blood types?

Blood illustration © Elena Boils

More than a century after their discovery, we still don’t really know what blood types are for. Do they really matter? Carl Zimmer investigates.

15 July 2014

When my parents informed me that my blood type was A+, I felt a strange sense of pride. If A+ was the top grade in school, then surely A+ was also the most excellent of blood types – a biological mark of distinction.

It didn’t take long for me to recognise just how silly that feeling was and tamp it down. But I didn’t learn much more about what it really meant to have type A+ blood. By the time I was an adult, all I really knew was that if I should end up in a hospital in need of blood, the doctors there would need to make sure they transfused me with a suitable type.

And yet there remained some nagging questions. Why do 40 per cent of Caucasians have type A blood, while only 27 per cent of Asians do? Where do different blood types come from, and what do they do? To get some answers, I went to the experts – to haematologists, geneticists, evolutionary biologists, virologists and nutrition scientists.

In 1900 the Austrian physician Karl Landsteiner first discovered blood types, winning the Nobel Prize in Physiology or Medicine for his research in 1930. Since then scientists have developed ever more powerful tools for probing the biology of blood types. They’ve found some intriguing clues about them – tracing their deep ancestry, for example, and detecting influences of blood types on our health. And yet I found that in many ways blood types remain strangely mysterious. Scientists have yet to come up with a good explanation for their very existence.

“Isn’t it amazing?” says Ajit Varki, a biologist at the University of California, San Diego. “Almost a hundred years after the Nobel Prize was awarded for this discovery, we still don’t know exactly what they’re for.”

§

My knowledge that I’m type A comes to me thanks to one of the greatest discoveries in the history of medicine. Because doctors are aware of blood types, they can save lives by transfusing blood into patients. But for most of history, the notion of putting blood from one person into another was a feverish dream.

Renaissance doctors mused about what would happen if they put blood into the veins of their patients. Some thought that it could be a treatment for all manner of ailments, even insanity. Finally, in the 1600s, a few doctors tested out the idea, with disastrous results. A French doctor injected calf’s blood into a madman, who promptly started to sweat and vomit and produce urine the colour of chimney soot. After another transfusion the man died.

Such calamities gave transfusions a bad reputation for 150 years. Even in the 19th century only a few doctors dared try out the procedure. One of them was a British physician named James Blundell. Like other physicians of his day, he watched many of his female patients die from bleeding during childbirth. After the death of one patient in 1817, he found he couldn’t resign himself to the way things were.

“I could not forbear considering, that the patient might very probably have been saved by transfusion,” he later wrote.

Blundell became convinced that the earlier disasters with blood transfusions had come about thanks to one fundamental error: transfusing “the blood of the brute”, as he put it. Doctors shouldn’t transfer blood between species, he concluded, because “the different kinds of blood differ very importantly from each other”.

Human patients should only get human blood, Blundell decided. But no one had ever tried to perform such a transfusion. Blundell set about doing so by designing a system of funnels and syringes and tubes that could channel blood from a donor to an ailing patient. After testing the apparatus out on dogs, Blundell was summoned to the bed of a man who was bleeding to death. “Transfusion alone could give him a chance of life,” he wrote.

Several donors provided Blundell with 14 ounces of blood, which he injected into the man’s arm. After the procedure the patient told Blundell that he felt better – “less fainty” – but two days later he died.

Still, the experience convinced Blundell that blood transfusion would be a huge benefit to mankind, and he continued to pour blood into desperate patients in the following years. All told, he performed ten blood transfusions. Only four patients survived.

While some other doctors experimented with blood transfusion as well, their success rates were also dismal. Various approaches were tried, including attempts in the 1870s to use milk in transfusions (which were, unsurprisingly, fruitless and dangerous).

§

Blundell was correct in believing that humans should only get human blood. But he didn’t know another crucial fact about blood: that humans should only get blood from certain other humans. It’s likely that Blundell’s ignorance of this simple fact led to the death of some of his patients. What makes those deaths all the more tragic is that the discovery of blood types, a few decades later, was the result of a fairly simple procedure.

The first clues as to why the transfusions of the early 19th century had failed were clumps of blood. When scientists in the late 1800s mixed blood from different people in test tubes, they noticed that sometimes the red blood cells stuck together. But because the blood generally came from sick patients, scientists dismissed the clumping as some sort of pathology not worth investigating. Nobody bothered to see if the blood of healthy people clumped, until Karl Landsteiner wondered what would happen. Immediately, he could see that mixtures of healthy blood sometimes clumped too.

Landsteiner set out to map the clumping pattern, collecting blood from members of his lab, including himself. He separated each sample into red blood cells and plasma, and then he combined plasma from one person with cells from another.

Landsteiner found that the clumping occurred only if he mixed certain people’s blood together. By working through all the combinations, he sorted his subjects into three groups. He gave them the entirely arbitrary names of A, B and C. (Later on C was renamed O, and a few years later other researchers discovered the AB group. By the middle of the 20th century the American researcher Philip Levine had discovered another way to categorise blood, based on whether it had the Rh blood factor. A plus or minus sign at the end of Landsteiner’s letters indicates whether a person has the factor or not.)

When Landsteiner mixed the blood from different people together, he discovered it followed certain rules. If he mixed the plasma from group A with red blood cells from someone else in group A, the plasma and cells remained a liquid. The same rule applied to the plasma and red blood cells from group B. But if Landsteiner mixed plasma from group A with red blood cells from B, the cells clumped (and vice versa).

The blood from people in group O was different. When Landsteiner mixed either A or B red blood cells with O plasma, the cells clumped. But he could add A or B plasma to O red blood cells without any clumping.

It’s this clumping that makes blood transfusions so potentially dangerous. If a doctor accidentally injected type B blood into my arm, my body would become loaded with tiny clots. They would disrupt my circulation and cause me to start bleeding massively, struggle for breath and potentially die. But if I received either type A or type O blood, I would be fine.

Landsteiner didn’t know what precisely distinguished one blood type from another. Later generations of scientists discovered that the red blood cells in each type are decorated with different molecules on their surface. In my type A blood, for example, the cells build these molecules in two stages, like two floors of a house. The first floor is called an H antigen. On top of the first floor the cells build a second, called the A antigen.

People with type B blood, on the other hand, build the second floor of the house in a different shape. And people with type O build a single-storey ranch house: they only build the H antigen and go no further.

Each person’s immune system becomes familiar with his or her own blood type. If people receive a transfusion of the wrong type of blood, however, their immune system responds with a furious attack, as if the blood were an invader. The exception to this rule is type O blood. It only has H antigens, which are present in the other blood types too. To a person with type A or type B, it seems familiar. That familiarity makes people with type O blood universal donors, and their blood especially valuable to blood centres.

Landsteiner reported his experiment in a short, terse paper in 1900. “It might be mentioned that the reported observations may assist in the explanation of various consequences of therapeutic blood transfusions,” he concluded with exquisite understatement. Landsteiner’s discovery opened the way to safe, large-scale blood transfusions, and even today blood banks use his basic method of clumping blood cells as a quick, reliable test for blood types.

But as Landsteiner answered an old question, he raised new ones. What, if anything, were blood types for? Why should red blood cells bother with building their molecular houses? And why do people have different houses?

Solid scientific answers to these questions have been hard to come by. And in the meantime, some unscientific explanations have gained huge popularity. “It’s just been ridiculous,” sighs Connie Westhoff, the Director of Immunohematology, Genomics, and Rare Blood at the New York Blood Center.

§

In 1996 a naturopath named Peter D’Adamo published a book called Eat Right 4 Your Type. D’Adamo argued that we must eat according to our blood type, in order to harmonise with our evolutionary heritage.

Blood types, he claimed, “appear to have arrived at critical junctures of human development.” According to D’Adamo, type O blood arose in our hunter-gatherer ancestors in Africa, type A at the dawn of agriculture, and type B developed between 10,000 and 15,000 years ago in the Himalayan highlands. Type AB, he argued, is a modern blending of A and B.

From these suppositions D’Adamo then claimed that our blood type determines what food we should eat. With my agriculture-based type A blood, for example, I should be a vegetarian. People with the ancient hunter type O should have a meat-rich diet and avoid grains and dairy. According to the book, foods that aren’t suited to our blood type contain antigens that can cause all sorts of illness. D’Adamo recommended his diet as a way to reduce infections, lose weight, fight cancer and diabetes, and slow the ageing process.

D’Adamo’s book has sold 7 million copies and has been translated into 60 languages. It’s been followed by a string of other blood type diet books; D’Adamo also sells a line of blood-type-tailored diet supplements on his website. As a result, doctors often get asked by their patients if blood type diets actually work.

The best way to answer that question is to run an experiment. In Eat Right 4 Your Type D’Adamo wrote that he was in the eighth year of a decade-long trial of blood type diets on women with cancer. Eighteen years later, however, the data from this trial have not yet been published.

Recently, researchers at the Red Cross in Belgium decided to see if there was any other evidence in the diet’s favour. They hunted through the scientific literature for experiments that measured the benefits of diets based on blood types. Although they examined over 1,000 studies, their efforts were futile. “There is no direct evidence supporting the health effects of the ABO blood type diet,” says Emmy De Buck of the Belgian Red Cross-Flanders.

After De Buck and her colleagues published their review in the American Journal of Clinical Nutrition, D’Adamo responded on his blog. In spite of the lack of published evidence supporting his Blood Type Diet, he claimed that the science behind it is right. “There is good science behind the blood type diets, just like there was good science behind Einstein’s mathmatical [sic] calculations that led to the Theory of Relativity,” he wrote.

Comparisons to Einstein notwithstanding, the scientists who actually do research on blood types categorically reject such a claim. “The promotion of these diets is wrong,” a group of researchers flatly declared in Transfusion Medicine Reviews.

Nevertheless, some people who follow the Blood Type Diet see positive results. According to Ahmed El-Sohemy, a nutritional scientist at the University of Toronto, that’s no reason to think that blood types have anything to do with the diet’s success.

El-Sohemy is an expert in the emerging field of nutrigenomics. He and his colleagues have brought together 1,500 volunteers to study, tracking the foods they eat and their health. They are analysing the DNA of their subjects to see how their genes may influence how food affects them. Two people may respond very differently to the same diet based on their genes.

“Almost every time I give talks about this, someone at the end asks me, ‘Oh, is this like the Blood Type Diet?’” says El-Sohemy. As a scientist, he found Eat Right 4 Your Type lacking. “None of the stuff in the book is backed by science,” he says. But El-Sohemy realised that since he knew the blood types of his 1,500 volunteers, he could see if the Blood Type Diet actually did people any good.

El-Sohemy and his colleagues divided up their subjects by their diets. Some ate the meat-based diets D’Adamo recommended for type O, some ate a mostly vegetarian diet as recommended for type A, and so on. The scientists gave each person in the study a score for how well they adhered to each blood type diet.

The researchers did find, in fact, that some of the diets could do people some good. People who stuck to the type A diet, for example, had lower body mass index scores, smaller waists and lower blood pressure. People on the type O diet had lower triglycerides. The type B diet – rich in dairy products – provided no benefits.

“The catch,” says El-Sohemy, “is that it has nothing to do with people’s blood type.” In other words, if you have type O blood, you can still benefit from a so-called type A diet just as much as someone with type A blood – probably because the benefits of a mostly vegetarian diet can be enjoyed by anyone. Anyone on a type O diet cuts out lots of carbohydrates, with the attendant benefits available to virtually everyone. Likewise, a diet rich in dairy products isn’t healthy for anyone – no matter their blood type.

§

One of the appeals of the Blood Type Diet is its story of the origins of how we got our different blood types. But that story bears little resemblance to the evidence that scientists have gathered about their evolution.

After Landsteiner’s discovery of human blood types in 1900, other scientists wondered if the blood of other animals came in different types too. It turned out that some primate species had blood that mixed nicely with certain human blood types. But for a long time it was hard to know what to make of the findings. The fact that a monkey’s blood doesn’t clump with my type A blood doesn’t necessarily mean that the monkey inherited the same type A gene that I carry from a common ancestor we share. Type A blood might have evolved more than once.

The uncertainty slowly began to dissolve, starting in the 1990s with scientists deciphering the molecular biology of blood types. They found that a single gene, called ABO, is responsible for building the second floor of the blood type house. The A version of the gene differs by a few key mutations from B. People with type O blood have mutations in the ABO gene that prevent them from making the enzyme that builds either the A or B antigen.

Scientists could then begin comparing the ABO gene from humans to other species. Laure Ségurel and her colleagues at the National Center for Scientific Research in Paris have led the most ambitious survey of ABO genes in primates to date. And they’ve found that our blood types are profoundly old. Gibbons and humans both have variants for both A and B blood types, and those variants come from a common ancestor that lived 20 million years ago.

Our blood types might be even older, but it’s hard to know how old. Scientists have yet to analyse the genes of all primates, so they can’t see how widespread our own versions are among other species. But the evidence that scientists have gathered so far already reveals a turbulent history to blood types. In some lineages mutations have shut down one blood type or another. Chimpanzees, our closest living relatives, have only type A and type O blood. Gorillas, on the other hand, have only B. In some cases mutations have altered the ABO gene, turning type A blood into type B. And even in humans, scientists are finding, mutations have repeatedly arisen that prevent the ABO protein from building a second storey on the blood type house. These mutations have turned blood types from A or B to O. “There are hundreds of ways of being type O,” says Westhoff.

§

Being type A is not a legacy of my proto-farmer ancestors, in other words. It’s a legacy of my monkey-like ancestors. Surely, if my blood type has endured for millions of years, it must be providing me with some obvious biological benefit. Otherwise, why do my blood cells bother building such complicated molecular structures?

Yet scientists have struggled to identify what benefit the ABO gene provides. “There is no good and definite explanation for ABO,” says Antoine Blancher of the University of Toulouse, “although many answers have been given.”

The most striking demonstration of our ignorance about the benefit of blood types came to light in Bombay in 1952. Doctors discovered that a handful of patients had no ABO blood type at all – not A, not B, not AB, not O. If A and B are two-storey buildings, and O is a one-storey ranch house, then these Bombay patients had only an empty lot.

Since its discovery this condition – called the Bombay phenotype – has turned up in other people, although it remains exceedingly rare. And as far as scientists can tell, there’s no harm that comes from it. The only known medical risk it presents comes when it’s time for a blood transfusion. Those with the Bombay phenotype can only accept blood from other people with the same condition. Even blood type O, supposedly the universal blood type, can kill them.

The Bombay phenotype proves that there’s no immediate life-or-death advantage to having ABO blood types. Some scientists think that the explanation for blood types may lie in their variation. That’s because different blood types may protect us from different diseases.

Doctors first began to notice a link between blood types and different diseases in the middle of the 20th century, and the list has continued to grow. “There are still many associations being found between blood groups and infections, cancers and a range of diseases,” Pamela Greenwell of the University of Westminster tells me.

From Greenwell I learn to my displeasure that blood type A puts me at a higher risk of several types of cancer, such as some forms of pancreatic cancer and leukaemia. I’m also more prone to smallpox infections, heart disease and severe malaria. On the other hand, people with other blood types have to face increased risks of other disorders. People with type O, for example, are more likely to get ulcers and ruptured Achilles tendons.

These links between blood types and diseases have a mysterious arbitrariness about them, and scientists have only begun to work out the reasons behind some of them. For example, Kevin Kain of the University of Toronto and his colleagues have been investigating why people with type O are better protected against severe malaria than people with other blood types. His studies indicate that immune cells have an easier job of recognising infected blood cells if they’re type O rather than other blood types.

More puzzling are the links between blood types and diseases that have nothing to do with the blood. Take norovirus. This nasty pathogen is the bane of cruise ships, as it can rage through hundreds of passengers, causing violent vomiting and diarrhoea. It does so by invading cells lining the intestines, leaving blood cells untouched. Nevertheless, people’s blood type influences the risk that they will be infected by a particular strain of norovirus.

The solution to this particular mystery can be found in the fact that blood cells are not the only cells to produce blood type antigens. They are also produced by cells in blood vessel walls, the airway, skin and hair. Many people even secrete blood type antigens in their saliva. Noroviruses make us sick by grabbing onto the blood type antigens produced by cells in the gut.

Yet a norovirus can only grab firmly onto a cell if its proteins fit snugly onto the cell’s blood type antigen. So it’s possible that each strain of norovirus has proteins that are adapted to attach tightly to certain blood type antigens, but not others. That would explain why our blood type can influence which norovirus strains can make us sick.

It may also be a clue as to why a variety of blood types have endured for millions of years. Our primate ancestors were locked in a never-ending cage match with countless pathogens, including viruses, bacteria and other enemies. Some of those pathogens may have adapted to exploit different kinds of blood type antigens. The pathogens that were best suited to the most common blood type would have fared best, because they had the most hosts to infect. But, gradually, they may have destroyed that advantage by killing off their hosts. Meanwhile, primates with rarer blood types would have thrived, thanks to their protection against some of their enemies.

As I contemplate this possibility, my type A blood remains as puzzling to me as when I was a boy. But it’s a deeper state of puzzlement that brings me some pleasure. I realise that the reason for my blood type may, ultimately, have nothing to do with blood at all.

Five foods to always avoid at the supermarket

8 September 2014, 11.06am AEST

Health Check: five foods to always avoid at the supermarket

Ideally lollies, biscuits, sugar-sweetened drinks, potato crisps and processed meats will never appear in your shopping trolley. Matt/Flickr, CC BY-NC-SA

Want to stack the nutrition odds in your favour? The key is good food so here are five things to never let into your shopping trolley: lollies, biscuits, sugar-sweetened drinks, potato crisps and processed meats.

Known as discretionary foods, all five are high in either added sugars, saturated fat or salt. Discretionary foods provide kilojoules but not many nutrients.

Consuming a lot of discretionary foods and drinks increases your risks of weight gain, obesity, heart disease, type 2 diabetes and certain cancers. Unless you’re extremely active, it is unlikely that you can eat a lot of these foods and also be a healthy weight.

Lollies

Dental caries or cavities (holes in your teeth) are the most common and expensive preventable diet-related problem. It’s bad enough that one in five adults rate their oral health as fair or poor; the prevalence of dental caries in children is also increasing. If you or your kids are lolly addicts, the best way to avoid dental disease is to give up grazing on confectionery.

Dump the lolly bag and swap to sugar-free chewing gum to save the kilojoules and your teeth. Susan/Flickr, CC BY

Sugar and other fermentable carbohydrates from highly processed foods are major risk factors for both the start and progress of dental disease. The more lollies you eat, and the more often you eat them, the bigger the risk.

What’s more, they’ll make you fat. Just 100 grams of jelly babies has over 1,400 kilojoules and over 50 grams of sugar, which is about ten teaspoons. Dump the lolly bag and swap to sugar-free chewing gum to save the kilojoules and your teeth.

Sugar-sweetened drinks

Sugar-sweetened beverages include sweetened soft drinks, sports drinks, energy drinks, fruit juice drinks and cordial.

Drink soda water if you want the fizz of soft drinks without the kilojoules. arbyreed/Flickr, CC BY-NC-ND

In a trial of over 15,000 adults who were followed up for 15 years, researchers found drinking one or more cups of soft drink a day increased the risk of developing type 2 diabetes by 29%, compared to drinking less than one glass a month. And a US study estimated drinking one can of soft drink a day could contribute to over six kilograms of weight gain in just a year, if the kilojoules were not offset by increasing physical activity or by cutting back on food intake.

Since we know these extra kilojoules are usually not offset, you can see why drinking sweet drinks regularly increases the risk of weight gain.
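To see how that yearly weight-gain figure stacks up, here is a rough back-of-the-envelope sketch. It assumes a 375ml can of soft drink provides about 640 kilojoules (an assumption, not a figure from the study) and uses the 37kJ per gram energy value for fat quoted earlier in this archive:

$$
640 \text{ kJ/day} \times 365 \text{ days} \approx 233{,}600 \text{ kJ per year}
$$
$$
233{,}600 \text{ kJ} \div 37 \text{ kJ/g} \approx 6{,}300 \text{ g} \approx 6.3 \text{ kg}
$$

That lands in the same ballpark as the “over six kilograms” estimate, if none of those kilojoules are burned off.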

Swap to diet drinks, or soda water if you want the fizz without the kilojoules. Better still, drink water. Unless you’re an elite athlete, plain water is all you need during sport.

Crisps of all kinds

Crisps, including potato chips, Burger Rings, Twisties and corn chips, are some of the most popular snack foods in the developed world. And the bigger the bag, the more we eat.

Crisps are one of the most popular snack foods in the developed world. Alan/Flickr, CC BY-NC-SA

A healthy low-kilojoule alternative is air-popped popcorn. So put the multi-bag of crisps back on the shelf, grab a bag of popcorn kernels, and pop them yourself at home.

Popcorn is wholegrain, more satisfying and cheap. One cup of air-popped popcorn has 150 kilojoules, compared to 550 kilojoules in a 25-gram individual packet of potato crisps. This approximately 400-kilojoule saving is the equivalent of the energy burned up in about a 25-minute walk.
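As a rough check on that walking comparison (assuming brisk walking burns somewhere around 16 kilojoules a minute for an average adult, which is an assumption rather than a figure from the article):

$$
550 \text{ kJ} - 150 \text{ kJ} = 400 \text{ kJ}
$$
$$
400 \text{ kJ} \div 16 \text{ kJ/min} = 25 \text{ minutes}
$$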

Biscuits

Most biscuits are consumed with a cup of tea or coffee. But the problem is that biscuits provide more than crunchiness. They contain large amounts of kilojoules, unhealthy fats and highly processed carbohydrates. What’s more, they’re mostly low in fibre and whole grains.

Biscuits are mostly low in fibre and whole grains. Rob Faulkner/Flickr, CC BY

Look at it this way: two cream-filled biscuits contain around 860 kilojoules. You’d need to push your shopping trolley for about an hour to burn that up.

Instead, load up on fruit and save on kilojoules. One cup of strawberries has 150 kilojoules, a small bunch of grapes 350 kilojoules and a medium banana 365 kilojoules.

Processed meat

Processed meats are meat products preserved by smoking, curing, salting or the addition of preservatives including nitrites, nitrates, phosphates, glutamate or ascorbic acid. They include bacon, ham, pastrami, salami, corned beef, chorizo, devon, fritz, luncheon meats, some sausages, hot dogs, cabanossi and kabana.

There’s no completely safe level of intake for these foods. The more processed meats you eat, the greater your risk of developing bowel cancer over a lifetime.

Processed meats are preserved by smoking, curing, salting or the addition of preservatives. Alpha/Flickr, CC BY-NC-SA

Keep processed meat for when there are no other choices available. Whenever possible, load up with tomatoes and mushrooms, and swap the breakfast bacon for an egg with baked beans and a mixed vegetable grill.

Grab a pack of fresh chicken breast and cook it for use on sandwiches, or buy reduced-fat cheese, canned tuna and salmon, or small cans of four-bean mix.

If you have a recipe that calls for chopped bacon, replace it with diced browned onion and garlic, mixed with a couple of tablespoons of sunflower seeds, pumpkin seeds or pine nuts.

Avoiding the five foods discussed here and replacing them with the suggestions I have outlined will put you on track to a long, healthy life. Ideally these foods will never appear in your shopping trolley, unlike the five foods that should be there every time you shop.

Take a stand on sitting.

This is an important message for most of us. I have already started implementing the suggestions in this post, and I suggest you do too.

18 August 2014, 1.51pm AEST

Health Check: sitting versus standing

Alternating between sitting and standing is best. ramsey beyer/Flickr, CC BY

It seems the world is finally coming to terms with the fact that humans evolved to stand, not to sit – well, health researchers, savvy office workers and many commuters, at least.

The evidence is mounting to show that spending too much time sitting at work, during your commute and for leisure increases your risk of diabetes, certain cancers, heart disease and early death.

This isn’t a new revelation. Bernardino Ramazzini first described the ill effects of too much sitting at work in the 1700s and advised people to break up sitting and stimulate blood flow.

But technological advances and ergonomic experts have made sitting more comfortable and more enticing. Australian adults now sit for an average of nearly nine hours a day. This is longer than the time that most people spend sleeping.

So, is it time to buy a standing desk? Let’s examine the evidence.

Many people know when they’ve been sitting too long because their back or neck gets sore. These are effects many can relate to because we can actually feel them.

But it’s what you can’t feel or see that you may need to be concerned about. Canadian researcher Dr Peter Katzmarzyk, for instance, found that those who sat almost all of the time had nearly a one-third higher risk of early death than those who stood almost all of the time.

University College London researcher Dr Emmanuel Stamatakis found similar results among women in the United Kingdom: those whose work involved mostly standing/walking about had a 32% lower risk of early death than those who worked in sitting jobs.

For the average adult, standing burns more calories and involves more muscular contraction than sitting. One study reported 2.5 times higher average muscular activity of the thigh when standing compared to sitting. This is important for improving blood sugar profiles and vascular health, reducing the risk of early death.

But it’s important to note prolonged standing can also have adverse health effects. Compared to sitting, when we stand, our hearts and circulatory systems work harder to maintain blood flow to the brain, because they are countering the effects of gravity. Standing still for long periods of time can lead to swelling, heaviness or cramping of the legs.

Enforced standing has actually been used as an interrogation technique (though former US secretary of defense Donald Rumsfeld couldn’t understand why it was only for four hours – he stood for eight to ten hours a day).

If standing still for too long is potentially risky, what should you do?

To obtain the health benefits of standing and reduce the potential adverse effects, the best option is to alternate between sitting and standing. Our message is to stand up, sit less and move more.

Alternating between sitting and standing will increase muscular contractions, stimulating blood flow and resulting in more calories burnt and healthier blood sugar levels. Recent findings from our lab show that alternating between 30 minutes of sitting and standing can improve blood sugar levels after a meal.

Now, if you’re leaning towards getting a standing desk but are concerned about your concentration and productivity, there’s some good news. Research shows task performance such as typing, reading and performing cognitive tests is largely unaffected by standing desks.

Thomas Jefferson, Winston Churchill, Virginia Woolf, and Ernest Hemingway fought off the urge to sit with the aid of standing desks. It might be time for you to do the same, and alternate between sitting and standing.

If you’re still not ready for a stand-up desk, these tips might help get you moving:

  • take regular breaks during long drives in the car
  • stand up on public transport
  • choose more active ways to hang out with friends (swap the cafe for a walk)
  • stand at the bar instead of sitting on the comfy couches
  • have standing meetings (they usually end faster)
  • stand up while on the phone.

The Wrong Approach to Breast Cancer

BERKELEY, Calif. — ONE of the nastier aspects of breast cancer is that it doesn’t have the five-year sell-by date of some other malignancies: you’re not considered “cured” until you die of something else. Although it becomes less likely, the disease can come back eight, 10, even 20 years after treatment. I fell on the wrong side of those odds.

I had a tiny, low-grade tumor in 1997; 15 years later, in the summer of 2012, while I was simultaneously watching “Breaking Bad,” chatting with my husband and changing into my pajamas, my finger grazed a hard knot beneath my lumpectomy scar. Just as before, time seemed to stop.

The recurrence appears to have been confined to my breast and was, like the original tumor, a slow-moving form of the disease. Since the lumpectomy and radiation I had in 1997 failed, however, this time the whole breast had to go. My first question to my oncologist (after “Am I going to die?” Answer: yes, someday, but probably not of this) was whether I should have the other breast removed, just to be safe.

It turns out, I’m not alone in that concern. After a decades-long trend toward less invasive surgery, patients’ interest in removing the unaffected breast through a procedure called contralateral prophylactic mastectomy (or C.P.M., as it’s known in the trade) is skyrocketing, and not just among women like me who have been through treatment before.

According to a study published in the Journal of Clinical Oncology in 2009, among those with ductal carcinoma in situ — a non-life-threatening, “stage 0” cancer — the rates of mastectomy with C.P.M. jumped 188 percent between 1998 and 2005. Among those with early-stage invasive disease, the rates went up 150 percent between 1998 and 2003. Most of these women did not carry a genetic mutation, like the actress Angelina Jolie, that predisposes them to the disease.

Researchers I’ve spoken with have called the spike an “epidemic” and “alarming,” driven by patients’ overestimation of their actual chances of contracting a second cancer. In a 2013 study conducted by the Dana-Farber Cancer Institute in Boston, for instance, women under 40 with no increased genetic risk and disease in one breast believed that within five years, 10 out of 100 of them would develop it in the other; the actual risk is about 2 to 4 percent.

Many of those same young women underestimated the potential complications and side effects of C.P.M. Breasts don’t just screw off, like jar lids: Infections can occur, implants can break through the skin or rupture, tissue relocated from elsewhere in the body can fail. Even if all goes well, a reconstructed breast has little sensation. Mine looks swell, and is a remarkably close match to its natural counterpart, but from the inside it feels pretty much like a glued-on tennis ball.

Of course, as any cancer patient will tell you, our fear is not simply of getting cancer, it’s of dying from it. What’s a mere mammary gland when, as Amy Robach, a journalist at ABC News, told People magazine last year after her own C.P.M., “I want to be at my daughters’ graduations. I want to be at their weddings. I want to hold my grandchildren.”

Unfortunately, for most women, C.P.M. is irrelevant to making those milestones. The most comprehensive study yet, published earlier this month in the Journal of the National Cancer Institute, showed virtually no survival benefit from the procedure — less than 1 percent over 20 years.

Researchers used the Surveillance, Epidemiology, and End Results registry and other databases to model survival chances for women who opted for C.P.M. and those who did not. They took into account a woman’s age at diagnosis, the stage and biology of her original tumor, the likelihood of dying from that cancer, the risk of developing cancer in the healthy breast, and the potential of dying from a new cancer. They even tweaked the numbers, nearly doubling the risk of contracting a second cancer and exaggerating the aggressiveness of a new tumor and the effectiveness of C.P.M.

“The story didn’t change,” Todd M. Tuttle, chief of surgical oncology at the University of Minnesota and the study’s senior author, told me. “Even if we used unrealistic figures, the conclusions were still the same. There was no group with a survival benefit of even 1 percent.”

How can that be? Well, first of all, it is extremely rare for a tumor on one side to spread to the other. Cancer doesn’t just leap from breast to breast. In any case, cancer confined to the breast is not deadly. The disease becomes lethal only if it metastasizes, spreading to the bones or other organs. Cutting off the healthy breast won’t prevent the original tumor from doing that. As for developing another cancer, Dr. Tuttle said, when that does happen (and remember, it’s far less common than patients believe), 91 percent will be early-stage lesions, so more readily treatable.

There’s some indication that patients understand that, yet choose C.P.M. anyway. The majority of the young women in the Dana-Farber survey knew the procedure wouldn’t prolong life; even so, they cited enhanced survival as the reason they had undergone it.

Such contradictions aren’t unusual, according to Steven J. Katz, a professor of medicine and health management and policy at the University of Michigan, who studies medical decision making. In exam rooms, all of us — men, women, cardiovascular patients, diabetics, cancer patients — tend to react from the gut rather than the head. “The general response to any diagnosis is, we want to flee it,” Dr. Katz explained. “It’s the kind of fast-flow decision making that we’re wired to perform. And it’s very difficult at that point to put data before a patient.”

I get that. When my cancer was first diagnosed, I felt as if a humongous cockroach had been dropped onto my chest. I could barely contain the urge to bat frantically at my breast screaming, “Get it off! Get it off!” Physicians, according to Dr. Katz, need to better understand how that visceral reaction affects treatment choices. They also need to recognize the power of “anticipated regret”: how people imagine they’d feel if their illness returned and they had not done “everything” to fight it when they’d had the chance. Patients will go to extremes to restore peace of mind, even undergoing surgery that, paradoxically, won’t change the medical basis for their fear.

Mothers inevitably cite their children as motivation for radical treatment; self-sacrifice has, after all, long defined good motherhood. It seems almost primal to offer up a healthy breast — with its connotations of maternal nurturance — to fate, as a symbol of our willingness to give all we have to and for our families. It’s hard to imagine, by contrast, that someone with a basal cell carcinoma on one ear would needlessly remove the other one “just in case” or for the sake of “symmetry.”

Treatment decisions are ultimately up to the individual. But physicians can frame options and educate patients in a way that incorporates psychology as well as statistics. Beyond that, doctors are not obliged to provide treatment that is not truly necessary.

The good news is that treatment to reduce the risk of metastasis has improved over the years. Not enough, but significantly. So those of us who dream of dancing at our children’s weddings? We may yet get there. But if — when — we do, it won’t be because of C.P.M.

When Gluten Sensitivity Isn’t Celiac Disease

Fodmap sensitivity is becoming an important diagnosis, but it is still missed by many doctors. If you have IBS, ask your doctor about being assessed for it. Read below.

Personal Health: Jane Brody on health and aging.

My nephew, sister-in-law and several others I know are on gluten-free diets, helping to support a market for these foods that is expected to reach $15 billion in annual sales by 2016.

Supermarket shelves are now packed with foods labeled gluten-free (including some, like peanut and almond butter, that naturally lack gluten). Chefs, too, have joined the cause: Many high-end restaurants and even pizza parlors now offer gluten-free dishes.

Those who say they react to gluten, a protein in wheat and other grains, report symptoms like abdominal pain; bloating; gas; diarrhea; headache; fatigue; joint pain; foggy mind; numbness in the legs, arms or fingers; and balance problems after eating a gluten-rich food.

I suspected at first that the gluten-free craze was an attempt by some to find a physical explanation for emotional problems, similar to the “epidemic” of hypoglycemia in decades past. But a growing body of research indicates that many may be suffering a real condition called non-celiac gluten sensitivity, or NCGS.

It is not celiac disease, a far less common autoimmune condition that can destroy the small intestine. Indeed, no one has conclusively identified a physical explanation for gluten sensitivity and its array of symptoms.

Recent studies have strongly suggested that many, and possibly most, people who react badly to gluten may have a more challenging problem: sensitivity to a long list of foods containing certain carbohydrates.

In 2011, Dr. Peter Gibson, a gastroenterologist at Monash University in Victoria, Australia, and his colleagues studied 34 people with irritable bowel syndrome who did not have celiac disease but reacted badly to wheat, a gluten-rich grain. The researchers concluded that non-celiac gluten sensitivity “may exist.”

Many of their subjects still had symptoms on a gluten-free diet, however, which prompted a second study of 37 patients with irritable bowel syndrome and non-celiac gluten sensitivity who were randomly assigned to a two-week diet low in certain carbohydrates, collectively called Fodmaps.

All patients on the special diet improved, but got significantly worse when fed gluten or whey protein. Only 8 percent of the participants reacted specifically to gluten, prompting the researchers to conclude that Fodmaps, not gluten, accounted for most of the distress.

Fodmaps is an acronym for fermentable oligosaccharides, disaccharides, monosaccharides and polyols, sugars that draw water into the intestinal tract. They may be poorly digested or absorbed, and become fodder for colonic bacteria that produce gas and can cause abdominal distress. They are:

■ Fructose: A sugar prominent in apples, pears, watermelon, mangoes, grapes, blueberries, tomatoes and tomato concentrate, and all dried fruits; vegetables like sugar-snap peas, sweet peppers and pickles; honey; agave; and jams, dressings and drinks made with high-fructose corn syrup.

■ Lactose: The sugar in milk from cows, goats and sheep, present in ice cream, soft cheeses, sour cream and custard.

■ Fructans: Soluble fiber found in bananas, garlic, onions, leeks, artichokes, asparagus, beets, wheat and rye.

■ Galactans: Complex sugars prominent in dried peas and beans, soybeans, soy milk, broccoli, cabbage and brussels sprouts.

■ Polyols: The sugar alcohols (sweeteners) isomalt, mannitol, sorbitol and xylitol, present in stone fruits like avocado, cherries, peaches, plums and apricots.

People with irritable bowel syndrome often find that their symptoms lessen or disappear when they avoid foods rich in Fodmaps; however, it can take six to eight weeks on a low-Fodmap diet to see a significant improvement.

Experts advise those patients to eliminate all foods rich in Fodmaps at the start. (You can find a list of foods low in these carbohydrates at stanfordhealthcare.org.) Once symptoms resolve, individual foods are returned to the diet one by one to identify those to which patients react.

So what about patients who think they are sensitive only to gluten?

Dr. Joseph A. Murray, a gastroenterologist at the Mayo Clinic and an expert on celiac disease, urges that they first be tested for celiac disease, a condition that has become dramatically more prevalent in recent decades. The symptoms of gluten sensitivity often mimic those of celiac disease, as well as those of irritable bowel syndrome.

Tests for celiac disease are less accurate if the diet does not currently include gluten. “Test first, test right,” Dr. Murray said in an interview. “We’re seeing people with symptoms who go on a gluten-free diet, and then we can’t make a correct diagnosis.”

With non-celiac gluten sensitivity, there is no damage to the small intestine, meaning many people may consume small amounts of gluten without incident. A forthcoming book edited by Dr. Murray, “Mayo Clinic Going Gluten Free,” lists the essential requirements for diagnosis of non-celiac gluten sensitivity:

■ Negative blood tests for celiac disease and no sign of damage on an intestinal biopsy.

■ Symptom improvement when gluten is removed from the diet.

■ Recurrence of symptoms when gluten is reintroduced.

■ No other explanation for the symptoms.

It is not yet known if the condition results from an immunological reaction similar to that seen in celiac disease, or whether gluten exerts a chemical or other negative effect on digestion.

Gluten sensitivity is not the same as a wheat allergy, a far less common problem with symptoms like swelling, itching, skin rash, tingling or burning of the mouth, and nasal congestion.

The best way to test for non-celiac gluten sensitivity (after ruling out celiac disease) is to remove all sources of gluten from one’s diet for several weeks. If the symptoms disappear, reintroduce gluten to see if they recur. Another option is to keep a food diary for a few weeks, recording everything you eat and drink and any symptoms that follow.

In addition to the inconvenience and added expense, a gluten-free diet can result in a poor intake of fiber and certain essential nutrients. It may be wise to consult a registered dietitian if you plan to go gluten-free.

The good and bad of Easter eggs, chocolate and hot cross buns

30 March 2015, 3.17pm AEDT

Health Check: the good and bad of Easter eggs, chocolate and hot cross buns

Australians love Easter but it seems we love Easter eggs more, spending more than A$185 million on chocolate over the holiday break.

The average 80-gram hot cross bun contains 1,070 kilojoules. Emilie Hardman/Flickr, CC BY-NC-SA

Australians love Easter but it seems we love Easter eggs more, spending more than A$185 million on chocolate over the holiday break.

Painted or dyed eggs were given traditionally at Easter to symbolise new life. Chocolate Easter eggs first appeared early in the 19th century, followed by hollow Easter eggs in 1875, when manufacturing advances allowed chocolate to flow into moulds.

These days we don’t have much restraint when it comes to eggs made out of chocolate, but how many regular hen eggs are okay to eat? And what about the other Easter favourite: hot cross buns?

Hen eggs

Two recently published reviews examined research on the relationship between egg consumption and risk of heart disease and diabetes. They found that people who consumed the most eggs (six or more per week) had a greater risk of developing type 2 diabetes than people who consumed the least (one egg or less per week).

While the reviews disagree on whether egg consumption increases the risk of heart disease in the general population, they both found that people with diabetes who consumed a lot of eggs had a greater risk of developing heart disease.

Ditch the bacon and go for poached, not fried. Alpha/Flickr, CC BY-NC-SA

However, recent research shows that it is what you eat with your eggs that matters most. In a study of 19,000 adults in the United States, eating eggs was associated with eating more fast foods (think egg and bacon breakfast muffins) and having a bigger waist circumference.

A recent systematic review of breakfast patterns found people who consumed bacon and eggs for breakfast had higher total daily energy intakes, while in another study those who ate poached eggs for breakfast had lower total daily energy intakes. So how you cook your eggs matters too.

Chocolate

Although the jury is still out on whether chocolate causes acne (better studies are needed), it appears there are some health benefits from eating chocolate.

One team of Australian researchers looked at whether it would be cost effective for those at risk of metabolic syndrome to eat dark chocolate in order to combat heart disease. Wishful thinking, you might say.

They found some benefit, but GPs shouldn’t start writing scripts for chocolate just yet. They estimated that you would have to have 10,000 people eat 100 grams of dark chocolate a day (which is a lot) for over ten years in order to lower blood pressure and blood cholesterol enough to prevent 85 heart attacks and strokes. And that’s a best-case scenario.
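
To put that best-case estimate in perspective, here is a rough back-of-the-envelope calculation using only the figures quoted above (10,000 people, ten years, 85 events prevented). The “people per event prevented” framing and the arithmetic are mine, not the researchers’.

```python
# Back-of-the-envelope arithmetic on the figures quoted above.
# Assumption: the "85 heart attacks and strokes prevented" applies to the
# whole group of 10,000 people over the full ten years (best-case scenario).

people = 10_000          # people eating 100 g of dark chocolate a day
years = 10               # duration of daily chocolate eating
events_prevented = 85    # heart attacks and strokes avoided (best case)

# Roughly how many people must eat chocolate daily for ten years
# to prevent a single heart attack or stroke.
people_per_event = people / events_prevented
person_years_per_event = people * years / events_prevented

print(f"~{people_per_event:.0f} people for {years} years per event prevented")
print(f"~{person_years_per_event:,.0f} person-years of daily chocolate per event prevented")
```

That works out to roughly 118 people eating 100 grams of dark chocolate every day for a decade, or about 1,200 person-years of chocolate, for each event prevented — which is why GPs shouldn’t start writing those scripts just yet.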

In a large review of 42 studies that lasted up to four months, researchers found some health benefits, such as reduced blood pressure, in studies where people drank cocoa drinks (21 trials), ate dark or milk chocolate (15 trials) or had other cocoa products.

Australians spend more than A$185 million on chocolate over the holiday break. Adam Wyles/Flickr, CC BY-ND

When trials were added together in a meta-analysis there were significant but small improvements in blood flow (measured by flow-mediated dilatation, which indicates how flexible your blood vessels are), both two hours after consuming chocolate or cocoa and in the studies that lasted up to four months.

A Cochrane review also found that consuming chocolate and cocoa reduced blood pressure by two to three millimetres of mercury (2-3 mmHg), but only in short-term studies lasting around two weeks.

But they concluded that longer-term evaluations were needed to assess the impact of chocolate on outcomes such as the incidence of heart attacks and stroke.

Hot cross buns

Hot cross buns used to be baked only on Good Friday, as a symbol of good luck or to ward off evil. These days they begin to appear in supermarkets before you have packed away the Christmas tree.

Hot cross buns are made from refined white flour, so there is no good news there. The protective qualities of grains in terms of reducing your risk of type 2 diabetes, heart disease and colon cancer have only been found for the regular consumption of whole grains.

If you love hot cross buns, you could justify eating a few extra ones by citing the anti-inflammatory, antimicrobial, antioxidant, anti-tumour and anti-diabetes properties of spices such as cinnamon.

But consider this: the average 80-gram hot cross bun contains 1,070 kilojoules. Add one teaspoon of margarine (135 kilojoules) and two teaspoons of jam (160 kilojoules), and the total comes to about 1,365 kilojoules.

To walk off the kilojoules in that tasty bun, you will need to take about 8,200 steps. That will give you plenty of time to get out in the fresh air and enjoy your time off over Easter.
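
For anyone who wants to check the sums, here is a small sketch of the arithmetic. The kilojoule figures come from the paragraphs above; the energy cost per step is an assumption (roughly 0.16-0.17 kJ per step for an average adult at a moderate pace), so the step count is only indicative.

```python
# Kilojoule arithmetic for the hot cross bun example above.

bun_kj = 1070        # average 80 g hot cross bun
margarine_kj = 135   # one teaspoon of margarine
jam_kj = 160         # two teaspoons of jam

total_kj = bun_kj + margarine_kj + jam_kj
print(f"Total energy: {total_kj} kJ")                 # 1365 kJ

# Assumed energy cost of walking, per step (varies with body weight and pace).
kj_per_step = 0.165

steps_needed = total_kj / kj_per_step
print(f"Steps to walk it off: {steps_needed:,.0f}")   # roughly 8,300 steps
```

With a slightly higher per-step cost the answer lands near the 8,200 steps quoted above; a heavier or brisker walker would need fewer steps, a lighter or slower one more.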