No, eating chocolate won’t cure depression



If you’re depressed, the headlines might tempt you to reach for a chocolate bar. But don’t believe the hype.

Ben Desbrow, Griffith University

A recent study published in the journal Depression and Anxiety has attracted widespread media attention. Media reports said eating chocolate, in particular dark chocolate, was linked to reduced symptoms of depression.

Unfortunately, we cannot use this type of evidence to promote eating chocolate as a safeguard against depression, a serious, common and sometimes debilitating mental health condition.

This is because this study looked at an association between diet and depression in the general population. It did not gauge causation. In other words, it was not designed to say whether eating dark chocolate caused a reduction in depressive symptoms.






What did the researchers do?

The authors explored data from the United States National Health and Nutrition Examination Survey, which measures health, nutrition and other factors in a representative sample of the population.

People in the study reported what they had eaten in the previous 24 hours in two ways. First, they recalled it in person to a trained dietary interviewer using a standard questionnaire. Second, they recalled what they had eaten over the phone, several days after the first interview.

The researchers then calculated how much chocolate participants had eaten using the average of these two recalls.

Dark chocolate needed to contain at least 45% cocoa solids for it to count as “dark”.






The researchers excluded from their analysis people who ate an implausibly large amount of chocolate, as well as people who were underweight and/or had diabetes.

The remaining data (from 13,626 people) was then divided in two ways. One was by categories of chocolate consumption (no chocolate, chocolate but no dark chocolate, and any dark chocolate). The other way was by the amount of chocolate (no chocolate, and then in groups, from the lowest to highest chocolate consumption).






The researchers assessed people’s depressive symptoms by having participants complete a short questionnaire asking about the frequency of these symptoms over the past two weeks.

The researchers controlled for other factors that might influence any relationship between chocolate and depression, such as weight, gender, socioeconomic factors, smoking, sugar intake and exercise.

What did the researchers find?

Of the entire sample, 1,332 people (11%) said they had eaten chocolate across their two 24-hour dietary recalls, with only 148 (1.1%) reporting eating dark chocolate.

A total of 1,009 (7.4%) people reported depressive symptoms. But after adjusting for other factors, the researchers found no association between any chocolate consumption and depressive symptoms.

Few people said they’d eaten any chocolate in the past 24 hours. Were they telling the truth?

However, people who ate dark chocolate had a 70% lower chance of reporting clinically relevant depressive symptoms than those who did not report eating chocolate.

When investigating the amount of chocolate consumed, people who ate the most chocolate were less likely to report depressive symptoms.

What are the study’s limitations?

While the size of the dataset is impressive, there are major limitations to the investigation and its conclusions.

First, assessing chocolate intake is challenging. People may eat different amounts (and types) depending on the day. And asking what people ate over the past 24 hours (twice) is not the most accurate way of telling what people usually eat.

Then there’s whether people report what they actually eat. For instance, if you ate a whole block of chocolate yesterday, would you tell an interviewer? What about if you were also depressed?

This could be why so few people reported eating chocolate in this study, compared with what retail figures tell us people eat.






Next, the authors’ results are mathematically accurate, but misleading.

Only 1.1% of people in the analysis ate dark chocolate. And when they did, the amount was very small (about 12g a day). And only two people reported clinical symptoms of depression and ate any dark chocolate.

The authors conclude the small numbers and low consumption “attests to the strength of this finding”. I would suggest the opposite.

Finally, people who ate the most chocolate (104-454g a day) had an almost 60% lower chance of having depressive symptoms. But those who ate 100g a day had about a 30% lower chance. Who’d have thought four or so more grams of chocolate could be so important?
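Figures like "70% lower chance" are adjusted odds ratios re-expressed as percentage reductions. A minimal sketch of that conversion, using illustrative odds ratios that match the article's rounded figures rather than the paper's exact estimates:

```python
def pct_lower_odds(odds_ratio: float) -> int:
    """Re-express an adjusted odds ratio as the rounded
    '% lower chance' figure quoted in headlines."""
    return round((1 - odds_ratio) * 100)

# Illustrative odds ratios chosen to match the article's rounded
# figures; the paper's exact estimates may differ:
print(pct_lower_odds(0.30))  # -> 70  ("70% lower chance", dark chocolate)
print(pct_lower_odds(0.43))  # -> 57  ("almost 60% lower", highest intake)
```

Note the conversion says nothing about how many people the estimate rests on; with only two dark-chocolate eaters reporting depressive symptoms, the confidence interval around such a figure is very wide.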

This study and the media coverage that followed are perfect examples of the pitfalls of translating population-based nutrition research to public recommendations for health.

My general advice is, if you enjoy chocolate, go for darker varieties, with fruit or nuts added, and eat it mindfully. — Ben Desbrow


Blind peer review

Chocolate manufacturers have been a good source of funding for much of the research into chocolate products.

While the authors of this new study declare no conflict of interest, any whisper of good news about chocolate attracts publicity. I agree with the author’s scepticism of the study.

Just 1.1% of people in the study ate dark chocolate (at least 45% cocoa solids) at an average 11.7g a day. There was a wide variation in reported clinically relevant depressive symptoms in this group. So, it is not valid to draw any real conclusion from the data collected.

For total chocolate consumption, the authors accurately report no statistically significant association with clinically relevant depressive symptoms.

However, they then claim eating more chocolate is of benefit, based on fewer symptoms among those who ate the most.

In fact, depressive symptoms were most common in the third quartile (who ate about 100g chocolate a day), followed by the first (4-35g a day), then the second (37-95g a day), and were least common in the highest quartile (104-454g a day). Risks in sub-sets of data such as quartiles are only valid if they lie on the same slope.

The basic problems come from measurements and the many confounding factors. This study can’t validly be used to justify eating more chocolate of any kind. — Rosemary Stanton


Research Checks interrogate newly published studies and how they’re reported in the media. The analysis is undertaken by one or more academics not involved with the study, and reviewed by another, to make sure it’s accurate.

Ben Desbrow, Associate Professor, Nutrition and Dietetics, Griffith University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Are there certain foods you can eat to reduce your risk of Alzheimer’s disease?



Eating healthy foods doesn’t just improve our physical health. It can benefit our mental health, too.

Ralph Martins, Macquarie University

With the rise of fad diets, “superfoods”, and a growing range of dietary supplement choices, it’s sometimes hard to know what to eat.

This can be particularly relevant as we grow older, and are trying to make the best choices to minimise the risk of health problems such as high blood pressure, obesity, type 2 diabetes, and heart (cardiovascular) problems.

We now have evidence these health problems all affect brain function, too: they increase nerve degeneration in the brain, leading to a higher risk of Alzheimer’s disease and other brain conditions including vascular dementia and Parkinson’s disease.

We know a healthy diet can protect against conditions like type 2 diabetes, obesity and heart disease. Fortunately, evidence shows that what’s good for the body is generally also good for the brain.






Oxidative stress

As we age, our metabolism becomes less efficient, and is less able to get rid of compounds generated from what’s called “oxidative stress”.

The body’s normal chemical reactions can sometimes cause chemical damage, or generate side-products known as free radicals – which in turn cause damage to other chemicals in the body.

To neutralise these free radicals, our bodies draw on protective mechanisms, in the form of antioxidants or specific proteins. But as we get older, these systems become less efficient. When your body can no longer neutralise the free radical damage, it’s under oxidative stress.

The toxic compounds generated by oxidative stress steadily build up, slowly damaging the brain and eventually leading to symptoms of Alzheimer’s disease.






To reduce your risk, you need to reduce oxidative stress and the long-term inflammation it can cause.

Increasing physical activity is important. But here we are focusing on diet, which is our major source of antioxidants.

Foods to add

There are plenty of foods you can include in your diet that will positively influence brain health. These include fresh fruits, seafood, green leafy vegetables, pulses (including beans, lentils and peas), as well as nuts and healthy oils.

Fish

Fish is a good source of complete protein. Importantly, oily fish in particular is rich in omega-3 fatty acids.

Laboratory studies have shown omega-3 fatty acids protect against oxidative stress, and they’ve been found to be lacking in the brains of people with Alzheimer’s disease.

They are essential for memory, learning and cognitive processes, and improve the gut microbiota and function.

Oily fish, like salmon, is high in omega-3 fatty acids, which research shows can benefit our brain health.

Low dietary intake of omega-3 fatty acids, meanwhile, is linked to faster cognitive decline, and the development of preclinical Alzheimer’s disease (changes in the brain that can be seen several years before the onset of symptoms such as memory loss).

Omega-3 fatty acids are generally lacking in western diets, and this has been linked to reduced brain cell health and function.

Fish also provides vitamin D. This is important because a lack of vitamin D has been linked to Alzheimer’s disease, Parkinson’s disease, and vascular dementia (a common form of dementia caused by reduced blood supply to the brain as a result of a series of small strokes).

Berries

Berries are especially high in the antioxidants vitamin C (strawberries), anthocyanins (blueberries, raspberries and blackberries) and resveratrol (blueberries).

In research conducted on mouse brain cells, anthocyanins have been associated with lower toxic Alzheimer’s disease-related protein changes, and reduced signs of oxidative stress and inflammation specifically related to brain cell (neuron) damage. Human studies have shown improvements in brain function and blood flow, and signs of reduced brain inflammation.






Red and purple sweet potato

Longevity has been associated with a small number of traditional diets, and one of these is the diet of the Okinawan people of Japan. The starchy staple of their diet is the purple sweet potato – rich in anthocyanin antioxidants.

Studies in mice have shown this potato’s anthocyanins protect against the effects of obesity on blood sugar regulation and cognitive function, and can reduce obesity-induced brain inflammation.

Green vegetables and herbs

The traditional Mediterranean diet has also been studied for its links to longevity and lower risk of Alzheimer’s disease.

Green vegetables and herbs feature prominently in this diet. They are rich sources of antioxidants including vitamins A and C, folate, polyphenols such as apigenin, and carotenoids such as xanthophylls (especially when eaten raw). A carotenoid is an orange or red pigment commonly found in carrots.

Green vegetables and herbs provide us with several types of antioxidants.

The antioxidants and anti-inflammatory chemicals in these vegetables are believed to slow the development of Alzheimer’s pathology: the build-up of specific proteins which are toxic to brain cells.

Parsley is rich in apigenin, a powerful antioxidant. It readily crosses the barrier between the blood and the brain (unlike many drugs), where it reduces inflammation and oxidative stress, and helps brain tissue recovery after injury.






Beetroot

Beetroot is a rich source of folate and polyphenol antioxidants, as well as copper and manganese. In particular, beetroot is rich in betalain pigments, which reduce oxidative stress and have anti-inflammatory properties.

Due to its nitrate content, beetroot can also boost the body’s nitric oxide levels. Nitric oxide relaxes blood vessels resulting in lowered blood pressure, a benefit which has been associated with drinking beetroot juice.

A recent review of clinical studies in older adults also indicated clear benefits of nitrate-rich beetroot juice on the health of our hearts and blood vessels.

Foods to reduce

Equally as important as adding good sources of antioxidants to your diet is minimising foods that are unhealthy: some foods contain damaged fats and proteins, which are major sources of oxidative stress and inflammation.

A high intake of “junk foods” including sweets, soft drinks, refined carbohydrates, processed meats and deep fried foods has been linked to obesity, type 2 diabetes and cardiovascular disease.

As these conditions are all risk factors for cognitive decline and Alzheimer’s disease, these foods should be kept to a minimum to reduce health risks and improve longevity.








Ralph Martins, Professor, Department of Biomedical Sciences, Macquarie University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Why is nursing home food so bad? Some spend just $6.08 per person a day – that’s lower than prison



If residents are given poor quality foods that don’t meet their needs or preferences, they’re less likely to eat them.

Cherie Hugo, Bond University

The Royal Commission into Aged Care Quality and Safety this week turned its attention to food and nutrition. The testimony of maggots in bins and rotting food in refrigerators was horrific.

When so much of a resident’s waking hours is spent either at a meal or thinking about the next one, mealtimes can make or break an elderly person’s day.

So why are some aged care providers still offering residents meals they can’t stomach?

It comes down to three key factors: cost-cutting, aged care funding structures that don’t reward good food and mealtime experiences, and residents not being given a voice. And it has a devastating impact on nutrition.






How much are we spending on residents’ food?

Our research from 2017 found the average food spend in Australian aged care homes was A$6.08 per resident per day. This is the raw food cost for meals and drinks over breakfast, morning tea, lunch, afternoon tea, dinner and supper.

This A$6.08 is roughly one-third of the average for older coupled adults living in the community (A$17.25), and less than the average in Australian prisons (A$8.25 per prisoner per day).
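As a quick arithmetic check of those comparisons, using only the figures quoted above:

```python
# Daily food spend figures quoted in the study (A$ per person per day)
aged_care = 6.08   # average aged care home, per resident
community = 17.25  # older coupled adults living in the community
prison = 8.25      # Australian prisons, per prisoner

# Aged care spend as a share of the community spend, and the gap
# below the prison spend
share = aged_care / community
gap = prison - aged_care
print(f"{share:.0%}")       # -> 35%, i.e. roughly one-third
print(f"A${gap:.2f}")       # -> A$2.17 a day below the prison figure
```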

Over the time of the study, food spend reduced by A$0.31 per resident per day.

Meanwhile the expenditure on commercial nutrition supplements increased by A$0.50 per resident per day.

Commercial nutrition supplements may be in the form of a powder or liquid to offer additional nutrients. But they can never replace the value of a good meal and mealtime experience.






Cutting food budgets, poor staff training and insufficient staff time preparing food on-site inevitably impact the quality of food provided.

At the royal commission, chefs spoke about using more frozen and processed meals, choosing poorer quality of meats and serving leftover meals in response to budget cuts.

Malnutrition is common, but we can address it

One in two aged care residents are malnourished and this figure has remained largely the same for the last 20 years.

Malnutrition has many causes – many of which are preventable or can be ameliorated. These include:

  • dental issues or ill-fitting dentures
  • dementia (because of difficulty swallowing and sensory sensitivities)
  • a poorly designed dining environment (such as poor acoustics, uncomfortable furniture, inappropriate crockery and table settings)
  • having too few staff members to help residents eat and drink and/or poor staff training
  • not supplying modified cutlery and crockery for those who need extra help
  • not offering residents food they want to eat or offering inadequate food choices.
Residents often need help at mealtimes.

My soon-to-be-published research shows dissatisfaction with the food service significantly influences how much and what residents eat, and therefore contributes to the risk of malnutrition.

Malnutrition impacts all aspects of care and quality of life. It directly contributes to muscle wasting, reduced strength, heart and lung problems, pressure ulcers, delayed wound healing, increased falls risk and poor response to medications, to name a few.

Food supplements, funding and quality control

Reduced food budgets increase the risk of malnutrition, but they’re not the only aged care funding issue related to mealtimes.

Aged care providers are increasingly giving oral nutrition supplements to residents with unplanned weight loss. This is a substandard solution that neglects fundamental aspects of malnutrition and quality of life. For instance, if a resident has lost weight as a result of ill-fitting dentures, offering a supplement will not identify and address the initial cause. And it ends up costing more than improving the quality of food and the residents’ mealtime experience.

Our other soon-to-be-published research shows the benefits of replacing supplements with staff training and offering high-quality food in the right mealtime environment. This approach significantly reduced malnutrition (44% over three months), saved money and improved the overall quality of life of residents.






However, aged care funding does not reward quality in food, nutrition and mealtime experience. If a provider does well in these areas, they don’t attract more government funding.

It’s not surprising that organisations under financial pressure naturally focus on aspects that attract funding and, in turn, often reduce investment in food.

A research team commissioned by the health department has been investigating how best to change aged care funding. So hopefully we’ll see changes in the future.

It’s not just about the food. Residents’ mealtime experiences affect their quality of life.

Aged care residents are unlikely to voice their opinions – they either won’t or can’t speak out. Unhappy residents often fear retribution for complaining, choosing to accept their current care despite being unhappy with it.






We lived in an aged care home. This is what we learned

New Aged Care Quality Standards came into effect on July 1 (I was involved in developing the guidelines to help aged care providers meet these standards).

However, they provide limited guidance for organisations to interpret and make meaningful change when it comes to food, nutrition and mealtime experience. Aged care providers will need extra support to make this happen.

We’ve developed an evidence-based solution, designed with the aged care industry, to address key areas currently holding aged care back. The solution offers tools and identifies the key areas essential for a happier and more nourishing mealtime.

At the end of 2018, our team lived as residents in an aged care home on and off for three months. As a result of this, and earlier work, we developed three key solutions as part of the Lantern Project:

  • a food, nutrition and mealtime experience guide for industry with a feedback mechanism for facilities to improve their performance

  • free monthly meetings for aged care providers and staff to discuss areas affecting food provision

  • an app that gives staff, residents and providers the chance to share their food experiences. This can be everything from residents rating a meal to staff talking about the dining room or menu. For residents, in particular, this allows them to freely share their experience.

We have built, refined and researched these aspects over the past seven years and are ready to roll them out nationally to help all homes improve aged care food, nutrition and mealtime experience.

Cherie Hugo, Teaching Fellow, Nutrition & Dietetics, Bond University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Research Check: can drinking coffee help you lose weight?



As keen as we may be to hear about any health benefits of drinking coffee, the headlines aren’t always what they seem.

Andrew Carey, Baker Heart and Diabetes Institute

Researchers from the University of Nottingham in the UK recently published a study in the journal Scientific Reports suggesting caffeine increases brown fat.

This caught people’s attention because brown fat activity burns energy, which may help with weight loss. Headlines claimed drinking coffee can help you lose weight, and that coffee is possibly even the “secret to fighting obesity”.

Unfortunately, it’s a little more complicated than that. The researchers did find caffeine stimulated brown fat, but this was mainly in cells in a lab.

For a human to reap the benefits seen in the cells, we estimate they’d need to drink at least 100 cups of coffee.

Although part of this research did look at people, the methods used don’t support coffee or caffeine as weight-loss options.






What is brown fat?

Brown adipose (fat) tissue is found deep within the torso and neck. It contains fat cell types which differ from the “white” fat we find around our waistlines.

Brown fat cells adapt to our environment by increasing or decreasing the amount of energy they can burn when “activated”, to produce heat to warm us up.

When people are cold for days or weeks, their brown fat gets better at burning energy.

We understand caffeine may be able to indirectly accentuate and prolong some of these processes, mimicking the effects of cold exposure in stimulating brown fat.

Brown fat – and anything thought to increase its activity – has generated significant research interest, in the hope it might assist in the treatment of obesity.

What did the researchers do in this latest study?

The research team first conducted experiments where cells taken from mice were grown into fat cells in petri dishes. They added caffeine to some samples, but not others, to see whether the caffeinated cells acquired more brown fat attributes (we call this “browning”).

The dose of caffeine (one millimolar) was determined based on what would be the highest concentration that browned the cells but didn’t kill them.

The fat cell culture experiment showed adding caffeine did “brown” the cells.






The researchers then recruited a group of nine people who drank a cup of instant coffee, or water as a control.

Before and after the participants drank coffee, the researchers measured their brown fat activity by assessing the temperature of the skin near the neck, under which a major region of brown fat is known to lie.

Skin temperature increased over the shoulder area after drinking coffee, whereas it didn’t after drinking only water.

How should we interpret the results?

Some people will criticise the low number of human participants (nine). We shouldn’t make broad recommendations on human behaviour or medicine based on small studies like this, but we can use them to identify new and interesting aspects of how our bodies work – and that’s what these researchers sought to do.

But whether the increased skin temperature after drinking coffee is significant cannot be determined for a few important reasons.

First, although the study showed an increase in skin temperature after drinking coffee, the statistical analysis of the human experiment doesn’t include enough data to properly compare the coffee and water groups, which prevents meaningful conclusions. That is, it doesn’t use the methods we apply in science to decide whether something really changed or merely happened by chance.

Enjoy coffee for the taste, or the buzz. But don’t expect it to affect your waistline.

Second, measuring skin temperature is not necessarily the most accurate indicator for brown fat in this context. Skin temperature has been validated as a way to measure brown fat after cold exposure, but not after taking drugs which mimic the effects of cold exposure – which caffeine is in the context of this study.

Other researchers and I have shown these “mimic” drugs have diverse effects, including increased blood flow to the skin. Since we don’t know whether changes in skin temperature are due to brown fat or unrelated factors, relying on this measure is problematic.

Although it suffers its own limitations, PET (positron emission tomography) imaging is currently our best option for directly measuring active brown fat.

It’s the dose that matters most

The instant coffee used in the study contained 65mg of caffeine, which is standard for a regular cup of instant coffee. Brewed coffees vary and might be double this.

Regardless, it’s difficult to imagine this dose could increase brown fat energy burning when studies using large doses of more potent “cold-mimicking” drugs (such as ephedrine) cause no, or at best modest, increases in brown fat activity.






But let’s look at the caffeine dose used in the cell experiments. The one millimolar concentration of caffeine is a 20-fold larger dose than the 300-600mg of caffeine used by elite athletes as a performance-boosting strategy. And that athlete dose is itself five to ten times higher than the amount of caffeine you’d get from drinking a cup of instant coffee.

Rough calculations therefore suggest we’d need to drink somewhere between 100 and 200 cups of coffee to engage the “browning” effects of caffeine.
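A back-of-envelope version of that calculation can be sketched as follows. Only the 65mg-per-cup figure comes from the study; the distribution volume (total body water of roughly 42L for a 70kg adult) is an assumption for illustration, caffeine's molar mass is about 194 g/mol, and the sketch ignores how the body actually absorbs and clears caffeine:

```python
# Back-of-envelope estimate: how many cups of instant coffee would it
# take to reach the 1 millimolar caffeine concentration used on the
# cells? Assumptions (not from the study): caffeine distributes evenly
# through total body water, with no absorption losses or clearance.
CAFFEINE_MG_PER_MMOL = 194.19   # caffeine molar mass (mg per mmol)
BODY_WATER_L = 42.0             # total body water, ~70 kg adult (assumed)
MG_PER_CUP = 65.0               # instant coffee dose quoted in the study

target_mmol_per_L = 1.0         # concentration used in the cell experiments
total_mg = target_mmol_per_L * CAFFEINE_MG_PER_MMOL * BODY_WATER_L
cups = total_mg / MG_PER_CUP
print(f"~{cups:.0f} cups")      # lands in the 100-200 cup ballpark
```

Even this flatters the claim: in reality caffeine is metabolised within hours, so a steady 1 millimolar level could never be reached by drinking coffee.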

So people should continue to drink and enjoy their coffee. But current evidence suggests we shouldn’t start thinking about it as a weight loss tool, nor that it has anything meaningful to do with brown fat in humans. – Andrew Carey


Blind peer review

This Research Check is a fair and balanced discussion of the study. The limitations it identifies apply equally to the study’s findings on diabetes, which didn’t get picked up as much in the headlines.

Coffee contains more than caffeine, and while there is some evidence that modest coffee consumption may reduce diabetes risk, decaffeinated coffee seems to be as effective as caffeinated coffee. This is consistent with the point made by the Research Check that you would need to drink an implausible number of cups of coffee to produce the effect seen with caffeine in the cultured fat cells. – Ian Musgrave



Andrew Carey, Group Leader: Metabolic and Vascular Physiology, Baker Heart and Diabetes Institute

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Health Check: why do we crave comfort food in winter?



You’re not imagining it. Our bodies really do crave macaroni cheese and other comfort foods as the temperature drops. Here’s why.

Megan Lee, Southern Cross University and Jacqui Yoxall, Southern Cross University

It’s winter and many of us find ourselves drawn to bowls of cheesy pasta, oozing puddings, warming soups, and hot chocolate with marshmallows.

These and other comfort foods can make us feel good. But why? And why do we crave them in winter and not in summer?

Research tells us there are three good reasons.






1. The gut ‘speaks’ to the brain

We know from the relatively new field of nutritional psychiatry that our stomachs produce the “happiness chemicals” dopamine and serotonin. When we eat, a complex process involving the brain means these neurochemicals trigger feelings of happiness and well-being.

These happiness chemicals are also produced when we exercise and when we are exposed to sunlight, both of which decline in winter.

This decline alters the fine balance between the good and bad bacteria that live in our gut, and consequently the relationship between the gut and the brain.

So, in winter when we eat our favourite comfort foods, we get a rush of happiness chemicals sent from the gut to the brain, and this makes us feel happy and content.






2. Evolution may have a hand

The second reason we crave more comfort foods during the winter months could be evolutionary. Before we enjoyed technological advances such as housing, heating, supermarkets and clothing, humans who increased their body weight during winter to keep warm were more likely to survive their environmental conditions. Craving carbohydrate- and sugar-rich foods was therefore a protective mechanism.

Although we no longer live in shelters or forage for food, winter food cravings may still be programmed into our biology.






3. Psychology, craving and mood

Social learning theory says people learn from each other through observing, imitating and modelling. In the context of food cravings this suggests that what our caregivers gave to us in winter as children has a striking impact on what we choose to eat in winter as adults.

A review of studies on the psychological reasons behind eating comfort food says this food may play a role in alleviating loneliness and boosting positive thoughts of childhood social interaction.

We may also naturally experience lower mood in winter and low mood has been linked to emotional eating.

In winter, because it’s darker and colder, we tend to stay indoors longer and self-medicate with carbohydrate- and sugar-rich foods. These types of foods deliver glucose straight to the brain, which gives us an instant feeling of happiness when we are feeling cold, sad, tired or bored.






Comfort food can be healthy

For all the comfort they provide, comfort foods generally receive a bad rap because they are usually energy and calorie dense; they can be high in sugar, fat and refined carbohydrates.

These types of foods are usually linked to weight gain in winter and if you eat too much over the longer term, can increase the risk of heart disease and diabetes.

However, not all comfort foods are created equal, nor are they all bad for our health.

You still get a comforting feeling with a hearty bowl of soup, but without the extra calories.
from www.shutterstock.com

You can get the same comforting feelings from winter foods containing ingredients that are good for you. For example, a hearty bowl of soup with a slice of wholegrain bread can give you all the components you need for optimal physical and psychological health. Steaming bowls of chilli and curries can provide immunity-boosting properties through their warming spices. The same goes for the wonderful citrus fruits that become available in winter.

If you are craving something carbohydrate-rich, try swapping white varieties for wholegrain versions, which will dampen carbohydrate cravings. If you crave a hot chocolate, try swapping the cocoa powder for cacao, which has a higher concentration of vitamins and minerals.

More good news

The good news for all of us who crave comfort foods in winter is that studies assessing intuitive eating (eating when you are hungry, stopping when you are full and listening to what your body is telling you to eat) suggest people who eat this way are happier with their body image, feel better psychologically and are less likely to have disordered eating.

So, embrace this wonderful chilly weather. Rug up in your favourite woolly jumper, sit by the fire, cuddle up with a loved one, make some healthier swaps to your classic comfort foods, remove the food guilt and listen to what your body is telling you it needs during these cold winter months.The Conversation

Megan Lee, Academic Tutor and PhD Candidate, Southern Cross University and Jacqui Yoxall, Senior Lecturer in Allied Health, Southern Cross University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Why the Australasian Health Star Rating needs major changes to make it work



Most consumers are unaware that the Health Star Rating system is compensatory, and that one negative nutritional attribute, such as high sugar, can be cancelled out by a positive attribute like fibre.
from http://www.shutterstock.com, CC BY-ND

Jessica C Lai, Victoria University of Wellington; Alana Harrison, Victoria University of Wellington; Hongzhi Gao, Victoria University of Wellington, and Samuel Becher, Victoria University of Wellington

Unhealthy diets cause multiple physical and mental health problems. To help consumers make healthier choices, Australia and New Zealand introduced the voluntary Health Star Rating (HSR) system in 2014.

The system is supposedly designed to provide consumers with an overall signal about a food’s healthiness. Presumably, this should nudge consumers to make more informed and healthier decisions.

Five years on, the Australian and New Zealand governments are conducting a system review. Our research shows that, while the initiative is noble, the devil is in the details. There is a need, and hopefully an opportunity, to improve the system and reconsider some of its key aspects.




Read more:
Labor’s election pledge to improve Australian diets is a first – now we need action, not just ‘consideration’


Loopholes and consumer misconception

Under the HSR system, products are labelled from 0.5 stars (the least healthy score) to 5 stars (the healthiest products). The rating is determined by evaluating the overall nutritional value of the product. It compares the content of “good” ingredients (i.e. fibre, protein, fruit, vegetables, nuts and legumes) with the “bad” ones (i.e. saturated fat, energy, total sugar and sodium).

But we believe most consumers are unaware that the HSR system is compensatory. This means one negative nutritional attribute can be cancelled out, or balanced, by a positive attribute. A manufacturer can receive a high HSR score for a product rich in sugar by adding a healthy ingredient such as fibre.



It is also likely that most consumers are unaware that the HSR rating is calculated on an “as prepared” basis. This means a product can enjoy a high rating based on the nutritional value of preparatory ingredients.

Milo found itself embroiled in controversy for displaying 4.5 stars on its chocolate powder, though the powder itself clearly does not merit this rating. The 4.5-star rating was based on consuming merely three teaspoons of powder combined with skim milk. But who actually consumes Milo this way?

Furthermore, HSR scores are intended to allow comparison only among similar products. A four-star rating for a cereal cannot be compared to a four-star rating given to milk. While the two products display the same number of stars, their healthiness may differ significantly.




Read more:
Have you gone vegan? Keep an eye on these 4 nutrients


What holds the system back

There is scepticism about the HSR’s authenticity, reliability and effectiveness. This stems in part from the system being self-regulated.

In addition, the system is non-mandatory, leaving manufacturers free to decide when and how to use it. For instance, only around 20% of packaged goods available in New Zealand and Australian supermarkets have an HSR. To add to the distortion, a disproportionate number of these show high ratings. This indicates that manufacturers only use the HSR for their healthier products.

A voluntary system does little to counter the inbuilt incentive that manufacturers have to use unhealthy components such as sugar, salt and saturated fats. These produce pleasure and create “craveable” foods and food addiction. Manufacturers likely do not use an HSR for these products. However, consumers do not interpret missing information as “the worst-case scenario”, but assume average quality.

Finally, the system does not effectively assist the vulnerable consumers who need it the most. While HSR does help some middle- to high-income consumers, it does a poor job with respect to consumers of low socio-economic status. This suggests that the label requires consumers to be educated about its meaning.

Time to move forward

A few targeted improvements could take the HSR a long way.

If the system were made mandatory, it would likely raise consumers’ awareness and, in turn, incentivise manufacturers to produce healthier foods and beverages. There should also be more education initiatives about the HSR.

At the same time, we should strive to minimise the costs involved and consider backing the system with government funding. This would allow all businesses to participate in the program, including less profitable or smaller businesses. It would also prevent costs from being passed onto consumers.

As a minimum, if the system is not made mandatory, a general “non-participation” label should be introduced. If a producer opts not to label its product, it should be required to use a conspicuous cautionary statement. Such a statement should declare, for instance, that “the manufacturer has chosen not to verify the health rating of this product” or “the healthiness of this product cannot be verified”.

Studies show the HSR would have a bigger impact if it were placed in the upper left corner of the packaging and used colour. It could use a traffic-light system, with 0.5-2.5 stars on a red background, 3-4 stars on amber and 4.5-5 stars on green. Colour-coded systems have proved more effective with marginalised groups of consumers.
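The traffic-light banding proposed above can be sketched as a simple mapping. This is purely an illustration of the proposal as described in the text, not part of the official Health Star Rating system:

```python
# Illustrative sketch of the proposed colour-coded HSR banding.
# The cut-offs follow the star ranges in the text; nothing here is
# part of the official Health Star Rating system.
def hsr_colour(stars: float) -> str:
    """Map a Health Star Rating to a proposed traffic-light colour."""
    if stars <= 2.5:
        return "red"      # 0.5-2.5 stars: least healthy products
    if stars <= 4.0:
        return "amber"    # 3-4 stars
    return "green"        # 4.5-5 stars: healthiest products
```

A mandatory, colour-coded label like this would let shoppers judge a product at a glance, without needing to be educated about what the star count means.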

All easier said than done.

Healthy diets are important for physical and psychological well-being and for strengthening our communities and economies. However, any regulation of the food industry is likely to be resisted by its strong and well-organised lobbying power. To fight this battle, the consumers’ voice is crucial to ensure we can all make good and healthy food choices.The Conversation

Jessica C Lai, Senior Lecturer in Commercial Law, Victoria University of Wellington; Alana Harrison, LLB(Hons) & BCOM Undergraduate Student, Victoria University of Wellington; Hongzhi Gao, Associate professor, Victoria University of Wellington, and Samuel Becher, Associate Professor of Business Law, Victoria University of Wellington

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Research Check: is white meat as bad for your cholesterol levels as red meat?



Whether you’re eating red meat or white meat, a lean cut is the healthier way to go.
From shutterstock.com

Clare Collins, University of Newcastle

You’ve probably heard eating too much fatty red meat is bad for your health, while lean meat and chicken are better choices. So, recent headlines claiming white meat is just as bad for your cholesterol levels as red meat might have surprised you.

The reports were triggered by a paper published in The American Journal of Clinical Nutrition earlier this month.

The study did find lean white meat had the same effect on cholesterol levels as lean red meat. While this might be construed as good news by lovers of red meat, more research on this topic is needed for a clearer picture.

How was this study conducted?

The researchers set out to compare three diets: one where the main dietary source of protein came from eating red meat (beef and pork), another where it came from poultry (chicken and turkey), and a third where it came from plant foods (legumes, nuts, grains and soy products).

They wanted to measure the impact of these diets on specific categories of blood fats, as markers of heart disease risk. They tested blood fat markers including low density lipoprotein cholesterol (or LDL, commonly known as “bad cholesterol”), apolipoprotein B (apoB), and the ratio of total cholesterol to high density lipoprotein cholesterol (or HDL, commonly known as “good cholesterol”).




Read more:
How to get the nutrients you need without eating as much red meat


The researchers also wanted to know whether blood fat levels changed more when the background dietary patterns were high in saturated fat, derived mostly from full-fat dairy products and butter, or when they were low in saturated fat.

To achieve this, 177 adults with blood cholesterol levels in the normal range were randomised to follow either a high-saturated fat diet (14% of total energy intake) or a low-saturated fat diet (7% of total energy intake).

Within these two groups they were further randomly assigned to follow three separate diets for four weeks each: red meat, white meat, and plant protein sources. The main protein sources in the meat groups came from lean cuts of red and white meat. In the plant diet, protein came from legumes, nuts, grains and soy products.

Participants met research staff weekly to collect their food products and received counselling on following their specified diet. Participants were asked to maintain their physical activity level and keep their weight as stable as possible so these factors did not bias the results.

To eliminate any carry-over effects from eating one type of protein to the next, participants were given a break of between two and seven weeks between each diet and told to return to their usual eating patterns.




Read more:
Organic, grass fed and hormone-free: does this make red meat any healthier?


What did the study find?

Some participants dropped out along the way, so in the end researchers had results from 113 participants.

Blood concentrations of LDL cholesterol and apoB were lower following the plant protein diet period, compared to both the red and white meat periods. This was independent of whether participants were on a background diet of high- or low-saturated fat.

There was no statistically significant difference in the blood fat levels of those eating red meat compared to those eating white meat.

We’re often told to limit our consumption of red meat.
From shutterstock.com

Eating a diet high in saturated fat led to significant increases in blood levels of LDL cholesterol, apoB, and large LDL particles compared with a background diet low in saturated fat.

So, all the dietary protein sources as well as the level of saturated fat intake had significant effects on total cholesterol, LDL cholesterol, non-HDL cholesterol, and apoB levels.

How should we interpret the results?

Although the test diets only lasted four weeks each, this study is important. It’s rare to see intervention studies that directly compare eating different types of meat and sources of protein and the impact on heart-disease risk factors. This is partly due to the challenge and expense of providing the food and getting people to follow specific diets.

Most studies to date have been cohort studies where people are categorised based on what they eat, then followed up for many years to see what happens to their health.

One review of cohort studies found no greater risk of stroke in those who eat more poultry compared to less poultry, while another showed a higher risk of stroke among those eating more red and processed meat relative to poultry intake.




Read more:
Should we eat red meat? The nutrition and the ethics


There are a few things to keep in mind with this study. First, the researchers used the leanest cuts of both red and white meats, and removed all visible fat and skin. If participants were eating fatty meat, we may have seen different results.

The significant variation in breaks between different diets (ranging from two to seven weeks) may have also affected the results. Participants with a longer break would have had more time for their blood cholesterol levels to change, compared to those with shorter breaks.

Finally, in reporting their results, it would have been better to include all 177 participants who began the study. People who drop out often have different health characteristics and leaving them out may have biased results.

This short-term study does not provide evidence that choosing lean white meat over red meat is either better or worse for your health.

But the findings are consistent with recommendations from the Heart Foundation to include a variety of plant-based foods in our diets, foods containing healthy types of fat and lower amounts of saturated fat, and in particular, to choose lean red meat and poultry. – Clare Collins


Blind peer review

The article presents a fair, balanced and accurate assessment of the study. In this study, they showed lean red meat and lean white meat (with all visible fat and skin removed) had the same effect on blood fat levels.

Importantly, plant protein sources (such as legumes, nuts, grains and soy products) lowered blood fat levels compared to the red and white meats, and this was independent of whether the participants had been placed on a background diet low or high in saturated fats. This study did not look at the impact of a fish-based diet on blood fats. – Evangeline Mantzioris




Read more:
Three charts on: Australia’s declining taste for beef and growing appetite for chicken


Research Checks interrogate newly published studies and how they’re reported in the media. The analysis is undertaken by one or more academics not involved with the study, and reviewed by another, to make sure it’s accurate.The Conversation

Clare Collins, Professor in Nutrition and Dietetics, University of Newcastle

This article is republished from The Conversation under a Creative Commons license. Read the original article.

How to get the nutrients you need without eating as much red meat



The average Australian eats 81 grams of red meat a day, while the planetary diet recommends just 14g.
Napocska/Shutterstock

Evangeline Mantzioris, University of South Australia

If you’re a red meat-eater, there’s a good chance you’re eating more of it than you should. At last count, Australians ate an average of 81 grams of red meat per day.

The planetary health diet was developed by researchers to meet the nutritional needs of people around the world, while reducing food production’s environmental impact. It recommends reducing our red meat intake to around 14g a day. That’s around 100g of red meat a week.




Read more:
How to feed a growing population healthy food without ruining the planet


Australia’s dietary guidelines are more conservative and recommend limiting red meat intake to a maximum of 455g a week, or 65g a day, to reduce the additional cancer risk that comes from eating large quantities of red meat.

So, what should you eat instead? And how can you ensure you’re getting enough protein, iron, zinc and vitamin B12?

Protein

Animal sources of protein provide essential amino acids, which the body uses to make muscle, tissue, hormones, neurotransmitters and the different cells and antibodies in our immune system.

The planetary health diet offers a good blueprint for gaining enough protein from a variety of other animal sources. It recommends eating, on average:

  • 25g of chicken per day
  • 28g of fish per day
  • 1.5 eggs per week
  • 200g of milk per day
  • 50g of cheese per day.

In addition to the 14g of red meat in the planetary health diet, these foods would provide a total of 45g of protein per day, which is around 80% of our daily protein needs from animal sources.

The remaining protein required (11g) is easily met with plant foods, including nuts, legumes, beans and wholegrains.
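The daily tally works by multiplying each food's quantity by its protein density and summing. Here is a hedged sketch of that arithmetic: the per-gram protein densities below are illustrative assumptions (typical food-composition values), not figures from the article, so the total under these assumptions (roughly 38g) differs from the article's 45g, which depends on the exact foods and figures used:

```python
# Approximate daily animal protein from the planetary health diet
# quantities quoted above. Protein densities are assumed, illustrative
# values (g protein per g food), not figures from the article.
daily_grams = {
    "red meat": 14,   # planetary health diet allowance
    "chicken": 25,
    "fish": 28,
    "egg": 11,        # 1.5 eggs/week at ~50g each, averaged per day
    "milk": 200,
    "cheese": 50,
}
protein_per_gram = {  # assumed densities
    "red meat": 0.25, "chicken": 0.31, "fish": 0.20,
    "egg": 0.13, "milk": 0.034, "cheese": 0.25,
}
animal_protein = sum(daily_grams[f] * protein_per_gram[f] for f in daily_grams)
```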

Nuts are a good alternative to meat.
Eakrat/Shutterstock

Iron

Iron is essential for many of the body’s functions, including transporting oxygen in the blood.

Iron deficiency can lead to anaemia, a condition in which you feel tired and lethargic.




Read more:
Why iron is such an important part of your diet


Pre-menopausal women need around 18 milligrams a day, while men only need 8mg. Pre-menopausal women need more iron because of the blood they lose during menstruation.

So, how can you get enough iron?

Beef, of course, is a rich source of iron, containing 3.3mg for every 100g.

The same amount of chicken breast contains 0.4mg, while the chicken thigh (the darker meat) contains slightly higher levels, at 0.9mg.

Pork is similarly low in iron at 0.7mg.

But kangaroo will provide you with 4.1mg of iron for every 100g. Yes, kangaroo is a red meat, but it produces lower methane emissions and has one-third the saturated fat of beef, making it a healthier and more environmentally friendly alternative.

Plant protein sources are also high in iron: cooked kidney beans have 1.7mg and brown lentils have 2.37mg per 100g.

Kidney beans and lentils are good sources of iron.
Hermes Rivera

If you wanted to cut your red meat intake from the 81g average to the recommended 14g per day while still getting the same amount of iron, you would need to consume the equivalent of either 50g of kangaroo, 100g of brown lentils or 150g of red kidney beans per day.
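This substitution arithmetic can be checked against the per-100g iron figures quoted earlier. A back-of-the-envelope sketch; the rounded results land in the same ballpark as the 50g, 100g and 150g amounts above:

```python
# Iron substitution arithmetic, using the per-100g figures quoted above.
iron_mg_per_100g = {
    "beef": 3.3, "kangaroo": 4.1, "brown lentils": 2.37, "kidney beans": 1.7,
}

# Iron lost by cutting red meat from 81g to 14g a day (treated as beef).
shortfall_mg = (81 - 14) / 100 * iron_mg_per_100g["beef"]

# Grams of each substitute needed to make up that shortfall.
grams_needed = {
    food: round(100 * shortfall_mg / mg)
    for food, mg in iron_mg_per_100g.items()
    if food != "beef"
}
```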

Zinc

Zinc is an essential mineral that helps the body function optimally. It affects everything from our ability to fight bugs, to our sense of smell and taste.

Zinc requirements are higher for men (14mg a day) than women (8mg a day) due to zinc’s role in the production and development of sperm.

Of all meat sources, beef provides the most zinc, at 8.2mg per 100g.

Chicken breast provides just 0.68mg, while the chicken thigh has 2mg.

In kangaroo meat, zinc levels are lower than in beef, at 3.05mg.

The richest source of zinc is oysters (48.3mg).

Beans such as lentils, red kidney beans and chickpeas all provide about 1.0mg per 100g.

To meet the shortfall of zinc from reducing your red meat intake, you could eat 12 oysters a day, which is unlikely. Or you could eat a combination of foods such as 150g of red kidney beans, one serve (30g) of zinc-supplemented cereals like Weet-bix, three slices of wholegrain bread, and a handful of mixed nuts (30g).

Vitamin B12

Vitamin B12 is important for healthy blood and nerve function. It’s the nutrient of most concern for people cutting out meat products as it’s only found in animal sources.

Requirements of vitamin B12 are the same for both women and men at 2.4 micrograms (mcg) a day.

Beef and kangaroo provide 2.5mcg per 100g serve, while chicken and turkey provide about 0.6mcg.

Dairy products also contain vitamin B12. One glass of milk would give you half your daily requirement (1.24mcg) and one slice of cheese (20g) would provide one-fifth (0.4mcg).

A glass of milk would provide half the vitamin B12 you need in a day.
AntGor/Shutterstock

Vitamin B12 can be found in trace amounts in spinach and fermented foods, but these levels aren’t high enough to meet your nutritional needs. Mushrooms, however, have consistently higher levels, with shiitake mushrooms containing 5mcg per 100g.

To meet the shortfall of vitamin B12 from reducing red meat intake, you would need to eat 75g kangaroo per day or have a glass of milk (200ml) plus a slice of cheese (20g). Alternatively, a handful of dried shiitake mushrooms in your salad or stir-fry would fulfil your requirements.
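The same back-of-the-envelope check works for vitamin B12, using the per-serve figures quoted above (the milk-plus-cheese combination lands within rounding of the shortfall):

```python
# B12 substitution arithmetic, using the figures quoted above.
B12_MCG_PER_100G_RED_MEAT = 2.5   # beef or kangaroo

# B12 lost by cutting red meat from 81g to 14g a day.
shortfall_mcg = (81 - 14) / 100 * B12_MCG_PER_100G_RED_MEAT

kangaroo_75g = 75 / 100 * 2.5     # 75g of kangaroo
milk_plus_cheese = 1.24 + 0.4     # glass of milk + slice of cheese
```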

Don’t forget about fibre

A recent study found fibre intakes of around 25 to 29g a day were linked to lower rates of many chronic diseases such as coronary heart disease, type 2 diabetes, stroke and bowel cancer.

Yet most Australian adults currently have low dietary fibre intakes of around 20g a day.

By making some of the changes above and increasing your intake of meat alternatives such as legumes, you’ll also be boosting your levels of dietary fibre. Substituting 100g of lentils will give you an extra 5g of fibre per day.

With some forward planning, it’s easy to swap red meat for other animal products and non-meat alternatives that are healthier and more environmentally sustainable.The Conversation

Evangeline Mantzioris, Program Director of Nutrition and Food Sciences, University of South Australia

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Health check: can eating certain foods make you smarter?



Green vegetables, nuts and berries are among the foods that could improve our brain function.
From shutterstock.com

Margaret Morris, UNSW and Michael Kendig, UNSW

Trying to keep up with what constitutes a “healthy” diet can be exhausting. With unending options at the supermarket, and diet advice coming from all directions, filling your shopping trolley with the right things can seem an overwhelming task.

For a long time we’ve known diet is key to maintaining physical health.

But emerging evidence indicates diet quality also plays a critical role in our cognitive function.

We’re learning some of the best things to eat in this regard include vegetables, nuts and berries, foods containing “good fats” and, possibly, fermented foods.

As well as potentially improving our brain function, eating these sorts of foods could improve our mental well-being – and could even help the planet, too.




Read more:
Research Check: does eating chocolate improve your brain function?


Diet and brain function

In the face of rising obesity rates over the past couple of decades, researchers have questioned whether increased weight, or poor diet, could influence cognition. They have since looked at what sorts of diets might impair or improve the function of our brains.

Long term follow-up studies show obesity is associated with mild impairments in several domains of cognitive function, including short-term memory, attention and decision-making.

Research has also shown short-term memory is poorer in people who report eating more saturated fat and sugar.

Conversely, the Mediterranean diet has been associated with better brain health and maintenance of cognitive abilities into older age. A Mediterranean diet is based on vegetables, whole grains, legumes and nuts, with healthy fats such as olive oil. Intake of red meat, saturated fats and sugar is limited.

A healthy diet has many elements, so let’s look at what particular foods might explain these benefits.

Vegetables, nuts and berries

Evidence indicates eating more vegetables slows the gradual decline in cognitive abilities that occurs naturally as we age.

While all veggies are likely to contribute, those in the cruciferous (Brassicaceae) family may confer particular benefits through their high fibre, folate, potassium and vitamin content. Vegetables in this family include broccoli, cauliflower, brussels sprouts, and fad favourites kale and rocket.

Interestingly, while there’s good evidence for the protective role of vegetables, there’s less evidence when it comes to fruit.

Research has shown a healthy diet can improve cognitive functions such as learning and memory.
From shutterstock.com

Berries, though, contain high levels of antioxidants. These compounds protect the body by scavenging harmful free radicals and reducing inflammation. Together these functions are likely to protect our cognitive ability.

Studies in rats, and in older people with mild cognitive impairment, indicate supplementing diets with berries improves performance in various memory tasks.

Nuts, meanwhile, are excellent sources of monounsaturated and polyunsaturated fats, minerals and vitamins. Studies in animals have shown the addition of nuts improves learning and memory. Emerging evidence in humans suggests consuming nuts within a Mediterranean-style diet improves measures of cognition, such as the capacity for verbal reasoning.

Healthy fats

Healthy diets such as the Mediterranean diet are also characterised by foods such as oily fish, avocados, olive oil and small amounts of animal-derived fats (such as from red meat).

One of our experiments in rats showed diets high in saturated fat from lard or high in sugar led to memory impairments, whereas an oil-based diet high in polyunsaturated fats didn’t.




Read more:
Food as medicine: your brain really does want you to eat more veggies


Importantly, rats fed these different diets did not differ in their total energy intake – only the type of fat and sugar varied.

While we can’t comment directly on the effects in humans, these findings suggest eating excess sugar, or animal-based fats, may negatively impact cognition.

Fermented foods

For thousands of years humans have prolonged the life of foods through fermentation, which increases the proportion of Lactobacillus and other healthy gut bacteria.

Kombucha and kefir are trendy right now, but other popular fermented foods include kimchi, miso, yoghurt and sauerkraut. Intake of these foods is thought to maintain the diversity of the gut microbiome.




Read more:
Health check: will eating nuts make you gain weight?


Interest in the potential cognitive effects of fermented foods stems from emerging evidence for the importance of the gut microbiota in cognition and health.

It’s well known that a poor diet can reduce the diversity of the gut microbiome. Our work in rats has shown the cognitive impairments produced by exposure to an unhealthy “cafeteria” diet – a Western-style diet high in saturated fat and sugar – are linked to changes in the gut microbiome.

Beyond cognition

It’s not possible to attribute “miracle” properties to one food group alone. We suggest a balanced, varied diet is the best approach to sustain not only brain health, but heart health too.

And there may be other reasons to seek out these foods. A newly published study showed eating fruit and vegetables improved mental well-being. Subjects tended to feel happier, less worried, and reported higher levels of overall life satisfaction.

The link between diet quality and better mental health is now well-established.

The recently published EAT-Lancet report adds a further compelling reason to eat healthily: the environment. This commission argued for a “planetary health” diet – akin to the Mediterranean diet – consisting of whole grains, vegetables, fruits, nuts and dairy, healthy fats, with low animal protein and few processed foods.

It is thought that shifting to such a diet, together with reducing food waste and adopting more sustainable food production systems, will minimise environmental damage and safeguard individual health.

The central message is the health of individuals and of the planet are inextricably linked, and this requires a rethink of global food systems.




Read more:
Want to improve your mood? It’s time to ditch the junk food


Overhauling food systems – and individual food habits – will not be simple while foods high in fat and sugar are so readily available and relatively cheap.

Nonetheless, recognising that eating well might benefit the planet, as well as the body and brain, might motivate people to change their dietary habits.The Conversation

Margaret Morris, Professor of Pharmacology, Head of Pharmacology, UNSW and Michael Kendig, Postdoctoral Research Fellow, UNSW

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Health check: is moderate drinking good for me?



We previously thought moderate drinking could be good for our health. There’s now evidence that says the opposite.
From shutterstock.com

Hassan Vally, La Trobe University

For the past three decades or so, the conventional wisdom has been that drinking alcohol at moderate levels is good for us.

The evidence for this has come from many studies that have suggested the death rate for moderate drinkers is lower than that for non-drinkers. In other words, we thought moderate drinkers lived longer than those who didn’t drink at all.

This phenomenon has been communicated with great impact by the J-shaped curve that shows death rates fall as you move from non-drinking to moderate drinking, before rising again as drinking levels increase.




Read more:
Did you look forward to last night’s bottle of wine a bit too much? Ladies, you’re not alone


Most of us embraced these studies with enthusiasm. But the findings were probably too good to be true. The problem has always been the potential mixing of many other variables – called confounding factors – with drinking.

The concern was that non-drinkers as a group in many of these previous studies were different to moderate drinkers in many ways in addition to their drinking. Non-drinkers may have been unhealthier to begin with (hence not taking up drinking in the first place) or they may have included recovering alcoholics with poor health.

These confounding factors may have made moderate drinkers look healthier than they actually were (relative to non-drinkers) and thus have led us to associate moderate drinking with better health.




Read more:
Ten reasons some of us should cut back on alcohol


More recent studies have been able to address this challenge of separating out the effect of drinking on health, independent of other confounding factors. And these newer studies tell us moderate drinking is probably not good for us at all.

Instead of the J-shaped curve described previously, the most recent evidence is showing a curve that continues on an upward trajectory.

As you increase your level of drinking beyond not drinking at all, for all levels of drinking, your health outcomes worsen. The curve starts off relatively flat, before rising dramatically, indicating much higher rates of early death as drinking levels increase.

So what is the health cost of moderate drinking?

If we look at a recent Lancet study that addressed this issue, we can start to make sense of this cost. It suggests that if you drink one alcoholic drink per day you have a 0.5% higher risk of developing one of 23 alcohol-related health conditions.

But risk expressed in this way is difficult to interpret. It’s only when we convert this to an absolute risk that we can begin to understand the actual magnitude of this risk to our health. It translates to four more illnesses* per 100,000 people due to alcohol, which is actually a pretty small risk (but an increased risk nonetheless).
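The conversion from relative to absolute risk is a single multiplication. In this sketch the baseline rate is an assumed figure chosen for illustration; the article quotes only the end result of roughly four extra illnesses per 100,000 people:

```python
# Converting a relative risk increase to an absolute one.
# The baseline rate below is an assumption for illustration only;
# the article quotes just the end result (~4 extra cases per 100,000).
baseline_per_100k = 800        # assumed annual alcohol-related cases per 100,000
relative_increase = 0.005      # 0.5% higher risk at one drink per day
extra_cases_per_100k = baseline_per_100k * relative_increase
```

Because the absolute increase scales with the baseline rate, a small relative bump over a modest baseline yields only a handful of extra cases per 100,000.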

While the health implications of moderate drinking have been a point of contention, it’s clear drinking excessively isn’t good.
From shutterstock.com

This risk estimation assumes several things, including that you drink alcohol every single day, so you would expect the risk to be smaller for those who drink every other day or only occasionally.

The latest evidence suggests the health cost of light to moderate drinking, if there is one, is quite small. What was previously thought to be a marginal benefit of moderate alcohol drinking is now considered a marginal cost to health.




Read more:
Think before you drink: alcohol’s calories end up on your waistline


So for you as an individual, what does this new evidence mean?

Maybe it means having to lose the contentedness you have felt as you drink your evening glass of wine, believing it was also improving your health.

Or maybe this new evidence will give you the motivation to reduce your drinking, even if you are only a moderate drinker.

Of course, if you get pleasure from drinking responsibly, and you have no intention of changing your drinking habits, then you will have to consider and accept this potential cost to your health.

But remember, the evidence is still incontrovertible that drinking high levels of alcohol is very bad for you. It will shorten your life and affect its quality, as well as the quality of life of those around you.

Correction: this article originally said one alcoholic drink per day equated to four more deaths – rather than illnesses – per 100,000 people due to alcohol.The Conversation

Hassan Vally, Senior Lecturer in Epidemiology, La Trobe University

This article is republished from The Conversation under a Creative Commons license. Read the original article.