How Much Smartphone Use Is Too Much?


Since the iPhone debuted in 2007, ushering in the age of the phone-as-computer, smartphone use has exploded worldwide, with an estimated 2.3 billion users last year. According to a 2016 Pew Research survey, 77 percent of Americans own a smartphone, and other recent stats have found that users are on their phones an average of more than five hours per day—almost double the 2013 rate. More people now get online with a mobile device than with a computer. This is especially true in regions where people may not be able to afford a personal computer but can buy a smartphone.

We love our smartphones perhaps a little too much, and the desire to unplug is growing among people who see 24/7 connectedness as damaging to their mental health. This week, Apple announced new iPhone features meant to curb our dependence on our devices, including a weekly "Report" app that shows your phone and app usage, as well as how many times you physically pick up your phone. (One small study by the consumer research firm Dscout found that we touch our phones more than 2600 times a day.) You can also set customized limits for overall phone usage with the "Screen Time" app.

Many of us feel anxious at the very thought of being without our phones and the access they offer to the internet. Researchers have a term for it: nomophobia ("no mobile phone phobia"). So how much smartphone use is too much?

That turns out to be a surprisingly difficult question to answer. "Smartphone addiction" isn't an official medical diagnosis. Even the experts haven't decided how much is too much—or even whether smartphone addiction is real.

DEFINING ADDICTION

To understand what's going on, we have to first step back and define what addiction is. It's different from habits, which are subconsciously performed routines, and from dependence, in which repeated use of something causes withdrawal symptoms when you stop. You can be dependent on something without it ruining your life. Addiction is a mental disorder characterized by compulsive consumption despite serious adverse consequences.

Yet our understanding of behavioral addictions—especially ones that don't involve ingesting mind-altering chemicals—is still evolving. Actions that result in psychological rewards, such as crushing a castle in Clash Royale or getting a new ping from Instagram, can turn compulsive as our brains rewire to seek that payoff (just like our smartphones, our brains run on electricity, and circuits of neurons can restructure to skew toward rewards). For a minority of people, it seems those compulsions can turn into addictions.

Psychologists have been treating internet addiction for almost as long as the internet has been around: Kimberly Young, a clinical psychologist and program director at St. Bonaventure University, founded the Center for Internet Addiction back in 1995. By 2013, addictive behavior connected to personal technology was common enough that in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the bible for mental disorder diagnoses, the American Psychiatric Association included "internet gaming disorder" as a condition "warranting further study." These days, thanks to an abundance of horror stories involving people who stayed glued to the internet until they died—and gamers so engrossed in their play that they ignored paramedics removing the bodies of fellow players who had died—internet rehabs are popping up all over the world.

But in virtually all of the medical literature published so far about internet addiction—including the WHO's forthcoming 11th edition of the International Classification of Diseases (ICD-11), whose definition of "excessive use of the internet" is built around how much gaming interferes with daily life—there's no mention of smartphones.

According to Marc Potenza, a professor of psychiatry and neuroscience at the Yale School of Medicine, there's a reason for these omissions: Despite the official definitions included in the DSM-5 and ICD-11, "there's debate regarding the use of those terms [internet addiction]. Both the ICD-11 group and the DSM-5 group chose to focus on the behavior rather than the delivery device."

So while you may feel nomophobia when you can't find your internet "delivery device," the global psychiatric community thinks it's the internet itself that's the problem—not the phone in your hand.

THE REWARDS THAT COME FROM OUR PHONES

We are getting something from our phones, though, and it's not just access to the internet. Receiving a notification gives us a small dopamine burst, and we learn to associate that dose of pleasure with the smartphone. You may pull your phone from your pocket a dozen times an hour to check for notifications—even if you know they're not there because your phone would have, well, notified you.

It's not unusual for people to become attached to an action (checking the phone) rather than its reward (getting a notification). Sometimes smokers trying to quit feel the urge to chew or bite and need to replace cigarettes with gum or sunflower seeds. According to Stephanie Borgland, a neuroscientist and associate professor at the University of Calgary, this is called Pavlovian-instrumental transfer—a reference to Ivan Pavlov's experiments, in which he reinforced behavior in dogs through signals and rewards. Borgland tells Mental Floss that we can become compulsively attached to the cues of phone use. We cling to the physical stimuli our brains have linked to the reward.

There may be an evolutionary basis to this behavior. Like other primates, humans are social mammals, but we have dramatically higher levels of dopamine than our cousins. This neurotransmitter is associated with reward-motivated behavior. So when we get a notification on an app that tells us someone has engaged us in social interaction—which we naturally crave—it triggers our natural inclinations.

HOW TO CURB YOUR ENTHUSIASM (FOR YOUR PHONE)

The global psychiatric community may not be convinced our smartphones are a problem, and no one has died from checking Snapchat too often—or at least it hasn't been reported. But most of us would say that spending five hours a day on our smartphones is too much. So are there any guidelines?

At this stage of research into smartphone use, there are no specific time-limit recommendations, though some researchers are working on a smartphone addiction scale; one was proposed in a 2013 study in the journal PLOS One. Based on what's said to be coming out in the ICD-11, here's one simple guideline: Smartphone use becomes problematic when it interferes with your life. Some research suggests Facebook, Instagram, and even online gaming make us feel more isolated and less connected. The more we try to fill that hole by tapping away at our phones, the more we crave social interaction. "There are a number of factors that have been associated with these behaviors or conditions," says Potenza, who is developing tools to screen for and assess problematic internet use and has consulted with the WHO on these issues. "And arguably one of the most consistent ones is depression."

One way to assess whether your smartphone is a problem is to note how you react when you're cut off from it, according to the PLOS One study, which proposed a "smartphone addiction scale" based on negative responses to being without a smartphone, among other criteria. What happens on a day when you accidentally leave it at home? Are you irritable or anxious? Do you feel isolated from friends or unsafe? Do you have trouble concentrating on work, school, or other important responsibilities, whether or not you have your phone?
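For the data-minded, here's a minimal sketch of how a questionnaire-style screen like this might be scored. The items, the rating range, and the cutoff below are illustrative placeholders, not the scale actually published in PLOS One; if your answers worry you, talk to a professional rather than a script.

```python
# Hypothetical illustration of scoring a Likert-style self-check.
# The items, 1-6 rating range, and cutoff are placeholders, not the
# published Smartphone Addiction Scale.

ITEMS = [
    "I feel irritable or anxious when I'm without my phone.",
    "I feel cut off from friends or unsafe without my phone.",
    "Phone use interferes with work, school, or other responsibilities.",
    "I use my phone longer than I intended to.",
]

def score_responses(ratings, max_rating=6, flag_fraction=0.6):
    """Sum the ratings (1 = never, max_rating = always) and flag the total
    if it exceeds an illustrative fraction of the maximum possible score."""
    if len(ratings) != len(ITEMS):
        raise ValueError("one rating per item is required")
    total = sum(ratings)
    cutoff = flag_fraction * max_rating * len(ITEMS)
    return total, total >= cutoff

if __name__ == "__main__":
    example = [5, 4, 3, 5]  # one rating per item above
    total, flagged = score_responses(example)
    print(f"Total score: {total}; worth a closer look: {flagged}")
```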

While smartphones may not be truly addictive in a medical sense, learning how to use them in a more mindful, healthy manner couldn't hurt. Test yourself for nomophobia [PDF]—knowing how much time you spend online is the first step toward identifying how that can be problematic. Block distracting sites or track usage with a timer or an app (beware of third-party apps' privacy settings, however). Delete the apps that keep the phone in your hand even when you're not online, like games. If you're still struggling, you could ditch smartphones altogether and downgrade to a "dumb" phone or get a Light Phone, a cellular device "designed to be used as little as possible."

A recent WIRED feature argued that using the internet five hours per day isn't a personal failing so much as a reflection of the way many apps are purposely designed to keep you salivating for more. So perhaps the best measure is to leave your phone behind once in a while. Schedule a screen-free Sunday. Go for a walk in the woods. Meditate. Socialize instead of binging The Office again. Don’t worry—you’ll be fine.

That Sugar Rush Is All In Your Head


We've all heard of the "sugar rush." It's the idea that prompts parents and even teachers to snatch candy away from kids, fearing they'll soon be bouncing off the walls, wired and hyperactive. It's a myth American culture has clung to for decades—and these days, it's not just a kid thing. Adults are wary of sugar, too. Some of this fear is warranted—diabetes, the obesity epidemic—but the truth is, sugar doesn't cause hyperactivity. Its effect on the body isn't a quick high followed by a crash. The science is clear: There is no "sugar rush."

To find out how and why the myth started, we need to go back to well before World War I—then pay a visit to the 1970s.

Our Complicated Relationship With Sugar

According to cultural historian Samira Kawash, America has had a long, complex, love-hate relationship with sugar. In Candy: A Century of Panic and Pleasure, Kawash traces the turn from candy-as-treat to candy-as-food in the early 20th century. At that time, the dietary recommendations from scientists included a mix of carbohydrates, proteins, and fats, with sugar as essential for energy.

Not everyone was on board: The temperance movement, for example, pushed the idea that sugar caused an intoxication similar to alcohol's, making candy-eaters sluggish, loopy, and overstimulated. In 1907, the chief of the Philadelphia Bureau of Health asserted that the "appetite" for candy and alcohol was "one and the same," Kawash writes. On the flip side, other scientists suggested that sugar from candy could stave off cravings for alcohol—a suggestion that candymakers then used in their advertisements.

While the debate about sugar as an energy source raged in America, militaries around the world were also exploring sugar as energy for soldiers. In 1898, the Prussian war office became the first to commission a study on the sweet stuff—with promising results: "Sugar in small doses is well-adapted to help men to perform extraordinary muscular labor," early researchers wrote. German military experiments introduced candy and chocolate cakes as fortification for the troops, and the U.S. military added sugary foods to soldiers' diets soon after. When American soldiers returned from World War I, they craved sweets, which "propelled an enormous boom" of candy sales that has lasted to this day, Kawash wrote on her blog, The Candy Professor. American advertisers framed candy as a quick, easy source of energy for busy adults during their workday.

As artificial sweeteners moved into kitchens in the 1950s, candymakers struggled to make their products appeal to women who were watching their waistlines. One industry group, Sugar Information Inc., produced a tiny "Memo to Dieters" pamphlet in 1954 designed to fit inside chocolate boxes. "Sugar before meals raises your blood sugar level and reduces your appetite," it claimed. But by the 1970s, the sugar-positivity heyday had started to wane.

The Origins of the Sugar Rush Myth

The idea that sugar causes hyperactivity gained traction in the early 1970s, when more attention was being paid to how diet might affect behavior. One of the major figures studying that possible connection was an allergist named Benjamin Feingold, who hypothesized that certain food additives, including dyes and artificial flavorings, might lead to hyperactivity. He formalized this into a popular—yet controversial—elimination diet program. Though certain sugary foods were banned from the program for containing dyes and flavorings, sugar itself was never formally prohibited. Still, thanks in part to the Feingold diet, sugar became the poster child for the supposed link between diet and hyperactivity.

It wasn't until the late 1980s that serious doubts about sugar's connection to hyperactivity began to be raised by scientists. As FDA historian Suzanne White Junod wrote in 2003 [PDF], the 1988 Surgeon General's Report on Nutrition and Health concluded that "alleged links between sugar consumption and hyperactivity/attention deficit disorders in children had not been scientifically supported." Despite "mothers' mantra of no sweets before dinner," she noted, "more serious allegations of adverse pediatric consequences … have not withstood scientific scrutiny."

A 1994 paper found that aspartame—an artificial sweetener that had also been accused of inducing hyperactivity in children—had no effect on 15 children with ADHD, even though they had consumed 10 times more than the typical amount.

A year later, the Journal of the American Medical Association published a meta-analysis of the effect of sugar on children's behavior and cognition. It examined data from 23 studies that were conducted under controlled conditions: In every study, some children were given sugar, and others were given an artificial sweetener placebo like aspartame. Neither researchers nor children knew who received the real thing. The studies recruited neurotypical children, kids with ADHD, and a group who were "sensitive" to sugar, according to their parents.

The analysis found that "sugar does not affect the behavior or cognitive performance of children." (The authors did note that “a small effect of sugar or effects on subsets of children cannot be ruled out.”)

"So far, all the well-controlled scientific studies examining the relationship between sugar and behavior in children have not been able to demonstrate it," Mark Wolraich, an emeritus professor of pediatrics at the University of Oklahoma Health Sciences Center who has worked with children with ADHD for more than 30 years and the co-author of that 1995 paper, tells Mental Floss.

Yet the myth that consuming sugar causes hyperactivity hasn’t really gone away. One major reason is the placebo effect, which can have powerful results. The idea that you or your children might feel a "sugar rush" from too much candy isn't unlike the boost you hope to feel from an energy drink or a meal replacement shake or bar (which can contain several teaspoons of sugar). The same is true for parents who claim that their kids seem hyperactive at a party. Peer pressure and excitement seem to be to blame—not sugar.

"The strong belief of parents [in sugar's effects on children's behavior] may be due to expectancy and common association," Wolraich wrote in the JAMA paper.

It works the other way, too: Some parents say they've noticed a difference in their kids' behavior once they cut most sugar from their diets. This strategy, like the Feingold diet, continues to attract interest and followers because the belief that it works influences whether it seems to work.

Correlation, Causation, and Caffeine

Which isn't to say there are absolutely no links between sugar consumption and poor health outcomes. A 2006 paper found that drinking a lot of sugary soft drinks was associated with mental health issues, including hyperactivity, but the study's design relied on self-reported questionnaires that were filled out by more than 5000 10th-graders in Oslo, Norway. The authors also noted that caffeine is common in colas, which might have a confounding effect.

In another study, conducted by University of Vermont professor of economics Sara Solnick and Harvard health policy professor David Hemenway, the researchers investigated the so-called "Twinkie defense," in which sugar is said to contribute to an "altered state of mind." (The phrase Twinkie defense comes from the 1979 trial of Dan White for killing San Francisco city supervisor Harvey Milk and Mayor George Moscone. His lawyers argued that White had "diminished capacity and was unable to premeditate his crime," as evidenced in part by his sudden adoption of a junk-food diet in the months before the murders. White was convicted of voluntary manslaughter.)

In their survey of nearly 1900 Boston public high schoolers, Solnick and Hemenway found "a significant and strong association between soft drinks and violence." Adolescents who drank more than five cans of soft drinks per week—nearly 30 percent of the group—were significantly more likely to have carried a weapon.

But Solnick tells Mental Floss the study isn't evidence of a "sugar rush."

"Even if sugar did cause aggression—which we did not prove—we have no way of knowing whether the effect is immediate (and perhaps short-lived) as the phrase 'sugar rush' implies, or whether it’s a longer-term process," she says. Sugar could, for example, increase irritability, which might sometimes flare up into aggression—but not as an immediate reaction to consuming sugar.

Harvard researchers are looking into the long-term effects of sugar using data from Project Viva, a large observational study of pregnant women, mothers, and their children. A 2018 paper in the American Journal of Preventive Medicine studied more than 1200 mother-child pairs from Project Viva, assessing mothers' self-reported diets during pregnancy as well as their children's health during early childhood.

"Sugar consumption, especially from [sugar-sweetened beverages], during pregnancy and childhood, and maternal diet soda consumption may adversely impact child cognition,” the authors concluded, though they noted that other factors could explain the association.

“This study design can look at relationships, but it cannot determine cause and effect,” says Wolraich, who was not involved in the study. "It is equally possible that parents of children with lower cognition are likely to cause a greater consumption of sugar or diet drinks, or that there is a third factor that influences cognition and consumption.”

The Science of the Sugar Crash

Though the evidence against the sugar rush is strong, a "sugar crash" is real—but typically it only affects people with diabetes.

According to the National Institute of Diabetes and Digestive and Kidney Diseases, low blood sugar—or hypoglycemia—is a serious medical condition. When a lot of sugar enters the bloodstream, it can spike the blood sugar level, causing fluctuation, instability, and eventually a crash—which is called reactive hypoglycemia. If a diabetic's blood sugar levels are too low, a number of symptoms—including shakiness, fatigue, weakness, and more—can follow. Severe hypoglycemia can lead to seizures and even coma.

For most of us, though, it's rare. Endocrinologist Dr. Natasa Janicic-Kahric told The Washington Post that "about 5 percent of Americans experience sugar crash."

You're more likely to experience it if you do a tough workout on an empty stomach. "If one exercises vigorously and doesn't have sufficient intake to supplement their use of calories, they can get lightheaded," Wolraich says. "But in most cases, the body is good at regulating a person's needs."

So what you're attributing to sugar—the highs and the lows—is probably all in your head.

Yes, There Is Such a Thing as Getting Too Much Sleep


Regularly getting a good night's rest is incredibly important. While you’re sleeping, your body is sorting memories, cleaning out your brain, boosting your immune system, and otherwise recovering from the day. But there is such a thing as too much of a good thing: According to Popular Science, it's possible to sleep too much.

It's hard to say exactly how much sleep you should be getting each night, but a new observational study of more than 116,000 people across 21 countries finds that sleeping nine or more hours a night is correlated with a higher mortality risk. The sweet spot for healthy sleep habits, according to this data, seems to be six to eight hours each night. (Even if part of that time comes from daytime naps.)

The new paper, published in the European Heart Journal, examined data from the Prospective Urban Rural Epidemiology study, which followed individuals between the ages of 35 and 70 around the world: Some lived in high-income countries like Canada and Sweden; others lived in middle-income countries like Argentina and Turkey; and others lived in low-income countries like Bangladesh and Pakistan.

Over the course of an average 7.8 years, study participants answered follow-up questions about what time they went to bed and got up, and whether they napped and for how long. They also answered general health questions about things like exercise rates, dietary patterns, and weight. The researchers then collected medical records and death certificates to track whether the subjects had major cardiac events (like heart attacks) or died during the study period.

The researchers found both sleeping too much and sleeping too little to be associated with a higher likelihood of dying before the study was through. Across the world, participants who got less than six hours a day or more than eight hours a day were more likely to experience major cardiac events than participants who slept between six and eight hours a night. When the researchers adjusted the results for age and sex, they still found sleep duration to be a significant predictor of heart issues and all-cause mortality.

While adjusting for factors like physical activity, BMI, and diet did change the results a bit, the basic pattern—a J-shaped curve showing higher risk for short sleepers, low risk for moderate sleepers, and even higher risk for very long sleepers—was the same. While previous research has suggested that naps can be good for your health, this study found that napping was associated with worse outcomes if it put someone over the eight-hours-of-sleep mark in that 24-hour period.

The results may feel like vindication to people who feel terrible whenever they stay in bed too long, but there are some caveats. Sleeping nine hours a day might be a sign that someone has an underlying health condition that raises their mortality risk, rather than a cause of the higher risk in itself. The researchers tried to account for this by analyzing the data only for people who were known to have no prevalent diseases and who weren't at risk for conditions like sleep apnea and insomnia, and later by excluding people who had a cardiac event or died during the first two years of the study.

"This suggests that sleep duration per se may be associated with increased risks," they write (emphasis in the original), "but causality cannot be definitively proven from this or other observational studies (and randomized studies of different sleep durations may be difficult to conduct)." So we may never know for sure just how much risk we take upon ourselves when we settle in for a long nap.

Considering that plenty of other research suggests that around seven hours of sleep total is an ideal target, you should probably aim for that number when setting your alarm. And if getting too much shut-eye isn't your problem, check out our tips for getting back to sleep after you've woken up in the middle of the night.

[h/t Popular Science]
