In a recent New York Times article, Malia Wollan asked us to consider how the disappearance of the classic lunch break in American work culture might be a sign of much bigger issues in our labor force. She cited the studies of an ethnographer, June Jo Lee, who has spent more than a decade traveling the U.S. and interviewing Americans about how they eat. This project has taken her into hundreds and hundreds of offices of white-collar workers. In one interview, an administrator in a Seattle architecture firm admitted, “I don’t think I ate at a table at all this week if you don’t include my desk at work.” In Chicago, Lee spoke with a technology specialist who only ate lunch at his computer and religiously avoided the break room; anyone who ate in there was a little “weird.” Another woman said that at the beginning of each week she’d bring in a crudité platter from Costco and graze from it whenever she got hungry.
Trained in anthropology, Lee now works for a consulting firm that helps clients like Kraft Foods, PepsiCo, Nestlé, and Whole Foods get a sense of how people consume food so they can repackage products and design new ones. After all her interviews, observations and analysis, Lee summarizes her findings as follows: “The way people eat at work is pretty sad.”
In the iconic 1987 movie “Wall Street,” investment mogul Gordon Gekko declares that “Lunch is for wimps.” That has proved to be a fateful summation for the modern American workplace, where taking time off for lunch has increasingly become a sign of idleness. Heading out for a midday meal may have made more sense when laborers toiled with their bodies on tasks — building, planting, harvesting, manufacturing — and needed rest and refueling. But in an economy where the standard task is sitting in front of a computer, lunch is far more optional.
In fact, eating lunch at the desk is now the norm, with about 62 percent of professionals routinely doing so. Social scientists call this phenomenon “desktop dining.” The midday meal has been eclipsed by meetings and catching up on email. And when American adults do eat, roughly half do so alone. According to studies by the Hartman Group, many said they actually preferred it that way (this is especially true among Millennials, who have their electronic devices as ever-present company).
There is one silver lining to this trend, however. Some research has indicated a possible health benefit, since solo lunches are believed to be smaller. In fact, research on animals (pigs, rats, dogs, chickens, and more) going back to the 1930s indicates a phenomenon researchers call ‘‘social facilitation,’’ where the mere company of others causes an individual to consume more. Scientists used to think that humans were different, but more data now shows that this is not the case. Simply eating with one other person increases the average amount ingested by a whopping 44 percent. In fact, the more people are present, the more we tend to eat. One study showed that with seven or more companions, subjects ate 96 percent more calories than they would have alone.
The remainder of “Failure to Lunch: The Lamentable Rise of Desktop Dining” is printed in full below. Or, check out the original story in the New York Times Magazine.
“…But with the clearly delineated lunch on the decline, workers end up snacking. In a study of 122 employees, people on average cached 476 calories’ worth of food in their desks. One person squirreled away 3,000 calories, including Cheetos, candy bars and five cans of pop-top tuna fish. In addition to the personal food stashes, there are those areas in an office where food accumulates like driftwood — the leftover sandwiches from a catered lunch; the remains of a birthday cake; banana bread someone baked at home; the bottomless candy dish. When researchers interviewed administrative staff members at the University of California, Davis, one respondent called these common stockpiles ‘‘food altars.’’
Sometimes these collective food repositories become fraught, and in the case of shared workplace refrigerators, even hazardous. In a survey of more than 2,100 full-time professionals, nearly all had access to refrigerators. When asked about cleanliness, a full 40 percent were unaware of fridge cleaning or knew it to be rare or nonexistent. Navigating around a colleague’s forgotten bag of slimy baby carrots might be gross, but the bigger danger in a fridge is the bacterium Listeria monocytogenes. Unlike other pathogens like E. coli, listeria can thrive at 40 degrees Fahrenheit, the recommended temperature for refrigerators. Even in the Seattle law office of Bill Marler, the most prominent food-safety lawyer in the country, the fridge was, until recently, a mess of expired food, rotting salad and long-abandoned deli meat. ‘‘It’s embarrassing,’’ he told me. ‘‘Like an insurance salesman not having insurance.’’
Beyond any health risks, the desk lunch detracts from our sense of the office as a collaborative, innovative, sociable space. It is hard to foster that feeling when workers eat single-serving yogurt alone, faces lit in the monochrome blue of their computer screens. Brian Wansink, a professor and the director of Cornell University’s Food and Brand Lab, points out that desktop dining isn’t even a sign of industriousness anymore; these days, a desk luncher is as likely as not to be scrolling through Facebook. Wansink and other researchers surveyed fire-department captains and lieutenants in a major American city. They found significant positive correlations between work performance and eating and cooking as a team. Firehouses where firefighters ate together reported more cooperative behavior; they were better at their jobs.
‘‘Workplace satisfaction is so much higher if you eat with your colleagues,’’ Wansink told me. ‘‘You like your job more — and you like your colleagues better.’’ MALIA WOLLAN