The Food Safety Modernization Act, enacted in 2011 but just now being implemented, would seem at first glance to be the type of federal action that many say hurts the economy and stifles economic growth.
We’ve all heard the complaint (it’s a staple of the Presidential campaign): there’s too much regulation, and business must be freed from oppressive federal rules in order to drive economic growth.
Food safety, however, might be the exception. Food is the world’s most important commodity, and although safety standards and procedures have dramatically improved from the days of Ida Tarbell and Upton Sinclair, the very size of the industry from field to table makes a 100% safe food supply an aspirational goal, but far from a reality.
The new food safety act (known by its acronym, FSMA) codifies into federal law sweeping new regulations across the multi-billion dollar food channel. Perhaps most important, FSMA unveils a new philosophy of regulation. Instead of reacting to food contamination outbreaks after-the-fact, which has been standard operating procedure for decades, FSMA seeks to prevent outbreaks by placing more responsibilities on food companies to manage their businesses in accordance with the new regulations.
Here’s a specific example. One of the new regulatory initiatives under FSMA requires food manufacturers and processors to undertake a thorough and detailed hazard analysis of any and all possible weak points in their production system, from machinery lubricants to temperature controls to safe and sanitary packaging protocols. Once these analyses are completed, they must be maintained on premises, regularly updated and documented.
Failing to implement these steps (or ignoring them outright) could result in fines and — something new — criminal liability for company CEOs and other officers.
Needless to say, FSMA represents sweeping change, and change almost always brings anxiety and stress. Before FSMA was enacted four years ago, the bill drew heavy lobbying from various food industry trade groups and firms, and the pushback continued as implementation began. Yet it is the law, and as food consumers, we should consider FSMA a good incentive for the industry to make food production and distribution as safe as possible.
Of course, there are skeptics, and the Food and Drug Administration, which oversees FSMA, is nothing if not deliberate (i.e. glacial) in moving forward with implementation. Yet to those critics who denounce regulation, two recent food-borne illness incidents sadly demonstrate the need for close supervision of food production. In one case, a Georgia peanut butter company knowingly distributed product contaminated with salmonella, resulting in numerous cases of illness and nine deaths. (Two of the company’s top officials were prosecuted and sentenced to 20+ years in prison.) In another, an ice cream producer shipped thousands of gallons of ice cream with minute amounts of the dangerous Listeria microbe. Hundreds of consumers fell ill; three died.
America’s food supply is overwhelmingly safe. But it can always be safer. With FSMA, the food industry is under the spotlight to make food safety a top management priority, or face some rather unpalatable legal and brand consequences.
For many years beginning in 1982, I worked at The Kroger Co. as corporate director of public relations. One of my top priorities was handling crisis communications during food recalls — those involving Kroger’s own manufactured items or national brand products sold in Kroger food stores. Each incident was vastly different, yet there were enough similarities that when I left Kroger to start my own business, in 1998, I took with me a treasure trove of experience in the always topical subject of food safety.
My expertise lay fallow for some time as I worked in other areas. Earlier this year, I was contacted by an old Kroger colleague with whom I worked closely on many of those recalls. Gale Prince is widely known in the food industry and government circles as a leading expert on the causes of food contamination. He’s been honored many times over the years for his dedication to promoting safe food handling techniques, food safety training, and manufacturing standards.
What he didn’t have time for was letting the world know about his work. That’s where I come in. Gale and I have created a joint venture to work together on food safety issues. His company, SAGE Food Safety Consultants, and my agency, Bernish Communications, LLC, now offer clients our combined talents in the food safety arena.
The timing of our collaboration couldn’t be better. The Food and Drug Administration (FDA) is implementing new regulations under the Food Safety Modernization Act of 2011 (better known as FSMA, pronounced “fizz-ma”). The legislation, once fully in place, has the capability of transforming food safety by placing more responsibility on food manufacturers and processors to prevent recalls from happening in the first place. The FDA, along with the Justice Department, is holding the industry’s feet to the fire; recently (as you may have read) the government meted out 29- and 20-year prison sentences to two brothers, Stewart and Michael Parnell, who knowingly shipped peanut butter contaminated with salmonella bacteria. Nine people died and dozens more were made very ill from eating the peanut butter. In years past, the government was more likely to issue fines for such gross violations of the law. This time, it appears the FDA and DOJ are not turning a bureaucratic blind eye to blatant misdeeds.
As I said, food safety is a perennial issue — it never really goes away. Our nation’s food supply is very safe, but it could be safer. Ditto for the rest of the world, whose products are increasingly available on U.S. supermarket shelves. FSMA, hopefully, will enhance food safety even more, but accidents and willful lawbreaking are inevitable in such a vast and varied industry.
When that happens, chances are Gale and I will be on the scene.
The untimely death of New York Times reporter and columnist David Carr felt like a death knell for the bygone days of tough-minded journalism.
But his passing by no means represents the end of an era.
Carr seemed to fit the stereotype of a Damon Runyon-like character who smoked, drank and typed his stories while snarling at copy boys in crowded, frantic newsrooms.
He was something like that, at least early in his career. Yet he was in every sense a modern and relevant observer. Carr wrote often and perceptively about sweeping changes in the media, from corporate ownership of media properties to the impact of social media. And he used social media himself to build his audience; he didn’t reject competition for readers, but welcomed it.
And about corporate media properties ownership, Carr took on the issue like a dog with a steak. He didn’t let his highly visible position at the Times in any way dilute his willingness to report — with frankness and acerbic wit — how such a trend threatened independent, professional journalism.
He was by all accounts a singular personality. Beset with drug and alcohol problems in his early years, Carr struggled to find his way in life, and newspaper work (and his family) became his salvation. His early struggles helped shape an attitude that didn’t suffer fools, abhorred “truthiness” and political bloviation, and relished precise language and the power of the declarative sentence. At Boston University, Carr was an inspiring lecturer. He taught aspiring journalists how to write and how to approach officialdom — whether business, political or governmental — by being tough-minded, assertive and skeptical.
His death at 58 shocked and saddened his many friends and colleagues at the Times and elsewhere. He would surely take some satisfaction in knowing that young people who read his obituary might be inspired to emulate his professional life. That’s something of a priority, if journalism is to survive.
“I love the current future of journalism we are living through and care desperately about getting my students ready to prosper in this new place,” Carr once said. It’s a fitting epitaph.
Uber, the car-on-demand transportation service, is attempting nothing less than to take over the world’s urban taxicab business. Not surprisingly, this has certain people, including cabbies, pretty upset, and the service is now embroiled in controversy.
Uber’s introduction underscores the dramatic transition underway in journalism today. Not only are many new reporters appearing on the scene without formal training in the field; it’s also increasingly obvious that content — what is being covered — is changing, as the mainstream media and new media alike pursue the new buzzword of reporting: relevance.
This trend comes into clearer focus in news coverage of technology, especially in consumer-facing applications that provide new ways of doing old things, from using a slide rule to hailing a cab. Most new apps initially receive glowing, enthusiastic reviews by the business and trade press, reflecting a mindset that tends to accept that whatever is new or different is superior to the old. It’s only when consumers and affected businesses and traditional service providers begin pushing back that the media starts raising questions about the efficacy of the new product or service.
Some background: Uber was an idea hatched by two startup entrepreneurs who thought that waiting for a cab was a ridiculous waste of time. They concluded that with not much investment beyond a new app, they could disrupt the traditional taxicab business by crowdsourcing rides in cities and towns via a mobile app that could rustle up a car and driver almost immediately, and for a much lower fare. Voila! A new business was born.
As is typically the case with all clever new apps, early adopters jumped on Uber, which started operations on the fly, making up rules and procedures as it grew and expanded. Its stated aim was to supplant traditional taxi services, which were regulated by city governments that required cabbies to be licensed and bonded. Uber’s founders, in effect, blew off those regulations. The firm became emboldened as more customers started using Uber (and other new arrivals on the scene), even as cab companies howled and city governments became wary.
Media coverage of Uber, at least initially, was notable for the obvious enthusiasm of articles and broadcasts, especially in the business media. “Hey, here’s a neat new way to get around town” was pretty much the standard tone of coverage. When problems began popping up, such as concerns about who, exactly, the drivers offering their cars as taxis were, or how consumers could be protected against being ripped off, the enthusiasm began to wane — to a degree. As problems mounted, Uber was described as going through growing pains that would be sorted out eventually, ushering in a transformation of a basic mode of getting from point A to point B.
Uber has continued to expand, but news coverage is rapidly becoming less forgiving. With critical scrutiny rising, one Uber executive, in a fit of pique, responded with kindergarten anger. The exec speculated publicly that Uber critics in the media should be trashed via orchestrated character assassination. Needless to say, the business press hasn’t taken kindly to this idea, which — in fairness — the executive has recanted. As it happens, the outburst came in the midst of an Uber PR charm offensive with the media to improve its image.
There things stand for the moment. Uber’s fate remains unsettled. Local governments have banned the service outright in some cities, and restricted it in others. Taxi businesses have lobbied strenuously to protect their franchises and, in many places, they have managed to stem Uber’s aggressive expansion plans. Interestingly, however, Uber users are rallying around the service, arguing that negative articles are the result of taxicab business propaganda and, therefore, without merit.
To me, the more fundamental question is this: has the new corps of reporters and editors, already caught up in the fervor of an app-guided lifestyle and blinded by the sizzle of new ideas, produced articles that are largely adulatory and enthusiastic rather than objective and balanced? In the case of Uber, the mainstream as well as the tech press largely gave the concept a free pass until issues were raised by disgruntled users and suddenly displaced interests.
That leads to perhaps an even deeper question, linked to the transformative changes in the current media landscape. What, exactly, is news in the current environment?
Across the mainstream media, publishers and broadcasters are aggressively choosing subjects to cover based on page views — that is, on relevance and topicality — rather than on what is important and substantial (although the two are not mutually exclusive). Additionally, journalism today has locked onto a specific target audience, the 25 to 45 age demographic, precisely those most likely to be on top of, and amenable to, the newest fad or trend. In other words, today’s journalism industry seems predisposed to cover what interests its target demographic, rather than the larger and more amorphous general public.
Here, in a newsroom memo from Detroit Free Press Editor and Publisher Paul Anger, is as good a summary of one publishing firm’s (Gannett’s) emerging editorial strategy:
“First, we’ll be doing more staff training on metrics — details to come soon — and how to plan content for different platforms and audiences. How to analyze traffic, maximize it, learn from what performs well and what doesn’t. We need to emphasize, more and more, reaching readers in the 25-45 age demographic.”
If this is the new direction of news coverage, stories about companies like Uber are going to appear more often, because they are more relevant to the targeted demographic. This out-with-the-old, in-with-the-new philosophy may indeed generate more page views, Facebook likes and Twitter messages. The danger is that it will turn off older news consumers who may feel increasingly marginalized. Older readers and viewers, it should be noted, access news media far more than younger generations. Is the news media, desperate to remain relevant to younger consumers (and advertisers), turning its back on its core customers? In the case of Uber, much of the media’s criticism may, in fact, not be reaching those most likely to use the app.
Who, then, will provide the necessary background and objectivity about all the new stuff coming along, and whether the promise of new, user-friendly technology is in fact better than whatever it is replacing?
The answer, right now, is as uncertain as Uber’s unsettled future.
The specter of contracting Ebola is a dystopian nightmare. Whether fear of the virus justified the media’s panic-fueling coverage is another matter.
Here are some incontrovertible facts: Ebola is a frightening, often deadly scourge. It is highly contagious, but only when an infected person is showing unmistakable signs of the virus (fever, vomiting, diarrhea) and droplets of body secretions enter another person through the eyes, ears, mouth or cuts. Ebola is killed by hospital-grade disinfectants (such as household bleach), according to the Centers for Disease Control (CDC). On dry surfaces, such as doorknobs and countertops, the virus can survive for several hours; in body fluids (such as blood), it can survive up to several days at room temperature.
Other facts: the United States was woefully unprepared for dealing with Ebola, despite the fact that the virus had been burgeoning out of control in west Africa for months. Because it was there, not here, no one paid much attention to Ebola. But when an infected Liberian landed in Dallas, sick with Ebola, the hospital didn’t diagnose the illness despite knowing that the man had come to the U.S. from a nation battling the virus. When the man died on Oct. 8, Ebola was suddenly on everyone’s radar screen, and public confidence in the CDC and state and local public health agencies was shaken to the core. The mainstream and social media went into overdrive, provoking the public’s growing fears about a decimating pandemic here.
But today, less than three weeks after Thomas Duncan died of Ebola in Dallas, there has been no pandemic, there has been no widespread infection, and the number of virus-infected Americans stands precisely at three. Two of them, nurses who cared for Duncan, are recovering; one was released from the hospital and was seen hugging President Obama in a carefully orchestrated Oval Office photo op (a journalistic term for a staged event) intended to demonstrate that the nurse is just fine, thank you.
Meanwhile, in news that’s increasingly buried “below the fold” (a newspaper saying denoting stories of lesser public interest), the NBC cameraman who caught the virus on location in west Africa has recovered, as has the Spanish nurse whose pet dog was euthanized for fear that it could spread the virus. The latest to be diagnosed with the virus, a New York City physician who served on the front lines battling Ebola in Africa, is now under quarantine and helping direct his own recovery. Again, all this in less than three weeks’ time.
To be perfectly clear, Ebola still poses a major world health threat as it spreads among people in Liberia, Guinea and Sierra Leone, and possibly elsewhere in the region. Health experts predict it will get worse before subsiding.
Yet in the U.S., Ebola has taken, if not a death toll, then a psychic toll, leaving a shaken public already nervous about ISIS, the sluggish economy, increasing college debt and a host of other worries. How did this happen in a nation with, arguably, the world’s most advanced health care system?
The answer, in my mind, is threefold. First, Americans are serial isolationists. We pay little attention to world developments unless or until it is perceived that we are somehow at risk. Ebola has been around since 1976, devastating swaths of Africa, yet it might as well have been on Mars for all the concern we showed for the virus.
Second, media sensationalism drove the Ebola scare, with social media playing an ever larger, and not altogether helpful, role in the narrative. This is the direct result of the new nature of 24/7 news coverage, with intense pressure on journalists and bloggers (to say nothing of social media tweeters and Facebook posters) to rush to print (another journalism saying) with no perspective, little background, and lots of opinion, rumor and misinformation baked in.
CNN, to mention one egregious example, covered the virus with a not-so-faint whiff of hysteria as anchors and reporters kept the story alive with preposterous tease-in questions like “will people in the U.S. die from Ebola,” and “are we prepared for an Ebola pandemic?” Not to be outdone, Fox News established an impossible benchmark — absolute protection against Ebola for all Americans — and then used that measure to pillory the CDC. Catch this grilling of CDC Director Tom Frieden by Fox News:
Certainly, the agency’s initial response came across as confused and bumbling, and criticism was justified. But no other player in the drama was held to the media’s standard of 100% perfection; certainly not the Dallas hospital where Duncan died (after being sent home with a 103-degree fever)! The media and social media instead pointed their angry fingers at the CDC and its mild-mannered director, a career public health physician with impeccable qualifications and firsthand knowledge of Ebola from trips to west Africa, but with a low-key TV presence only his mom could love.
The third dimension of the crisis was political. The approaching mid-term elections all but guaranteed that the political “optics” of the virus would be inserted into the coverage. Commentators and bloggers kept suggesting that the nation’s guardian at the gate — the CDC — had let the crisis fester while it dithered. This satisfied a partisan political narrative that seemingly accompanies almost every event, and led to mounting criticism of White House inaction, partly justified as the President hesitated to take any firm action. The more criticism was voiced in the media, the more people concluded that they were in imminent danger, drowning out those who counseled calm. Within days, if you were absorbing the evening news or following on Twitter, the end was near, facts to the contrary aside.
As this is written, the arc of the Ebola crisis appears to have begun its downward slope. After stumbling out of the gate following Duncan’s death, the CDC has gained control not just of the response to the virus, but of the public’s perception that things are now being managed. Coverage and commentary on the New York physician diagnosed with the virus has been largely low-key and devoid of hysteria, prompting even the Wall Street Journal to offer praise to the CDC and public health officials. There is also, if you care to look for it, exceptional coverage of the Ebola outbreak.
Perhaps this would be a good moment for some serious introspection. Did panic-stricken tweets and scary Facebook posts exacerbate the situation? Does the MSM not have some responsibility to refrain from “ripping and reading” (another old journalism term) without also providing perspective and informed expertise? Like so many recent media-driven events, the Ebola scare has been plagued from the start by unsubstantiated information and rumor-mongering in social media channels. Such paranoia might have limited impact save for the fact that traditional news coverage is increasingly dictated by what’s playing in social media. Editors measure coverage by page views and hashtag “trending,” rather than old-fashioned fact gathering, and this symbiotic relationship merely serves to throw fuel on the fire, without illuminating anything.
The Ebola crisis — in Africa — is nowhere near resolution. The virus’ relentless march of devastation continues, even as nations from all over the globe send help and medicines to the region. Research for an effective vaccine is underway in labs around the world, but no one thinks an antidote will be ready for months. With deaths mounting “over there,” that is where the Ebola story really is.