Wednesday, October 21, 2015

Truth be Told

I feel sorry for those who never got a chance to see "Truth or Consequences."

I don't mean the town in New Mexico, either.

I'm thinking of "T or C" this morning amid the news that host Bob Barker is in the hospital after a fall near his Southern California home.

"Truth" didn't give Barker, 91, his start in broadcasting, but it put him on television for the first time. And there Bob stayed for some 51 years.

It was game show---and reality TV, if you want to know the truth---pioneer Ralph Edwards who passed the torch of "Truth" to Barker, in 1956.

Edwards created "Truth" on the radio in 1940. The premise was wacky yet simple.

The show was among the first "audience participation" offerings of the day.

Regular folks would have to answer an obscure trivia question---always designed for the contestant to fail---and when the answer was wrong, there would be consequences. These usually came in the form of wild stunts that were often embarrassing.

But the people ate it up and to be a "victim" on the show became desirable.

As Edwards said, "Most of the American people are pretty darned good sports."

The mad success of "Truth" in the non-visual medium of radio is a testament to Edwards' ability to use sound effects, audience microphones and his own vivid descriptions to give the listener a ringside seat to the raucous action.

Ralph Edwards didn't just paint pictures with his radio show; he made mental movies---as any good radio program did in the medium's heyday.

Edwards moved "Truth" to TV in 1950, once he saw the potential of television and how it fit his stunt show like a glove.

Edwards stepped off camera in 1954, devoting his time to running his production company, which produced "Truth."

After a couple of years with new host Jack Bailey, Edwards turned "Truth" over to Barker, whom Edwards had heard doing an audience participation show on Los Angeles radio.

That was in 1956, and Barker continued hosting "Truth" until 1974.

I started watching "Truth" in the late-1960s and now that I think about it, the show is at the tip top of today's family tree when it comes to wackiness on television. Pretty much every show you see on television today that involves crazy physical tasks by its contestants can have its roots traced to "Truth."

"Truth" also spawned similar shows in the days of early TV such as "Beat the Clock."

Before "Truth," nothing on television really came close to capturing the notion of asking regular people to do things that they would never consider doing---even with a few drinks in them.

"Candid Camera" had its niche, but that show preyed on the unsuspecting. "Truth" made no bones about it with its participants: you're going to do something weird and embarrassing. And you're going to do it willingly, and it will be seen by millions of people across the country. Period.

And people fell all over themselves---sometimes literally---to be on "Truth." Everyone wanted Bob Barker to embarrass them on national TV.

Ralph Edwards was right---most of the American people were, indeed, pretty darned good sports.

I was drawn to "Truth" as a young boy because each episode was different. The stunts were creative and slapstick and frankly, it wasn't boring.



Then there was "Barker's Box."

Maybe this is what I liked about "Truth" the most.

At the end of every show, a box was brought down to the studio audience. It had four drawers---three had money in them and the fourth was empty, or had a booby prize in it, such as a phony snake that would pop out. If the selected audience member chose the three money drawers before choosing the empty one, he/she would win the money. That's it. Simple but fun.
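
For the mathematically curious: as described, winning boils down to the empty drawer being the last of the four picked, which works out to a one-in-four shot. Here's a quick, purely illustrative Python simulation of the rules as I remember them---obviously nothing the show ever published:

    import random

    def barkers_box_trial():
        # Model the box as remembered above: three money drawers, one dud.
        # The player wins only if the first three drawers opened are all
        # money---that is, the dud turns up last.
        drawers = ["money", "money", "money", "dud"]
        random.shuffle(drawers)
        return "dud" not in drawers[:3]

    trials = 100_000
    wins = sum(barkers_box_trial() for _ in range(trials))
    print(f"Estimated win rate: {wins / trials:.3f}")  # lands around 0.25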

"Truth" signed off for the last time in 1974. Barker didn't go hungry. He went on to host something called "The Price is Right."

An effort to revive "Truth" occurred in 1977 but it died a quick death with host Bob Hilton.

I had great fun watching "Truth" as a young lad. Liked it a helluva lot more than "The Price is Right," that's for damn sure.

Bob Barker made a living on radio and TV for over six decades by engaging with audiences. For 18 years on "Truth," those audiences would do pretty much anything Bob asked them to do.

THAT'S some power.

Get well soon, Bob. As you said after every show in your "Truth" days, "Hoping all your consequences are happy ones."

Friday, October 9, 2015

The Great Pumpkin

I do believe that this country has gone out of its gourd with pumpkin.

It's the biggest food takeover in America since the Italians introduced pizza to an unsuspecting public in the late-19th century.

Pumpkin spiced coffee. Pumpkin scented candles. Pumpkin cookies, pumpkin cakes, pumpkin pies.

OK, that last one doesn't count.

Somewhere, in some board room in corporate America, it was determined that pumpkin spice should be sprinkled, mixed, folded, encased and saturated into every possible food stuff we consume.

The ironic thing is that pumpkin, by itself, certainly must taste pretty nasty. It's only edible because of what is added to it.

If you plan on buying a pumpkin for Halloween with the intent of carving it, scrape out a portion and eat it raw, with no helpers.

I dare ya.




Pumpkin isn't invading our food supply; it's the spices added to it that are working their way into our digestive tracts with virulent speed.

Starbucks, for example, only started putting real pumpkin in its pumpkin spiced drinks in 2015---and those drinks debuted in 2003.

Pumpkin is literally the flavor of the day.

But again, the irony is that we're not hooked on pumpkin, per se; we're loving the allure of allspice, cloves, nutmeg, cinnamon and Lord knows what else is being added to pumpkin to make it palatable.

Still, it's all being served up using the p-word.

Pumpkin (spice) is in our beer. It's in our tea, in our coffee. I haven't looked, but I'm sure there's a pumpkin spiced chewing gum, too.

So how did the pumpkin craze start, anyway?

Well, it didn't start with a spike in pumpkin sales.

Every year since 2010, we've been buying fewer and fewer pumpkins---the actual fruit/gourd.

Yet we're inundated with pumpkin this, pumpkin that.

According to the market research company The NPD Group, sales of pumpkin-flavored items continue to soar, rising 11.6 percent to $361 million for the year ended July 25.

No hard data is available on how much of those items actually contains real pumpkin versus some witches' brew of spices and flavorings---natural or artificial.

Here's a non-surprising fact, thanks to Nielsen.

"While 50 percent of U.S. consumers are actively trying to lose weight, they're overlooking fresh pumpkin to satisfy their craving, instead opting for indulgent treats like baked goods, dips and sweets, where sales have steadily increased," the company said in a statement.

The key word, of course, is "fresh."


Wednesday, September 30, 2015

Statue of Limitations

In another time, in another era, against another backdrop, a statue of Orville Hubbard outside of City Hall was a monument about which the good people of Dearborn didn't bat an eye.

And not just Dearbornites.

It wasn't just the people who lived in that city that knew what Hubbard, Dearborn's mayor from 1942-78, stood for.

It was an ironic monument, really, because the statue of Hubbard, in an almost welcoming pose, belied the exclusiveness that pocked his reign over the city.

Hubbard was an unapologetic segregationist. That's not opinion.

But those ways were widely accepted by his citizenry, particularly in the first 25 years of his being mayor.

To the people of Dearborn, Orville Hubbard represented the sheriff that kept their streets safe and the town prosperous, despite sharing multiple borders with the city of Detroit.

Everyone knew what safe and prosperous was code for in Dearborn under Orville Hubbard.

No blacks allowed.

Hubbard made no bones about it. African-Americans simply weren't allowed to take up residence in Dearborn.

The statue of Hubbard, sculpted in 1989, came down from outside the old city hall on Tuesday. Dearborn opened its new city hall in 2013, but the Hubbard statue didn't make the trip with municipal employees. 

But now that the statue's original site has been sold to a private entity, it had to be moved.

The only question was, where?

It will be moved to the grounds of the Dearborn Historical Museum, a move that amounts to a compromise.

The statue certainly wasn't going to be relocated outside the new and current city hall. That was made clear by current mayor Jack O'Reilly, whose father, John, was a Hubbard protege and succeeded Hubbard as mayor.

“It was never intended the statue would come to the new (City Hall),” O’Reilly said. 



The compromise in moving the statue to the Historical Museum's grounds is that the nod to Hubbard's place in Dearborn history will remain, but with the statue nowhere near the current city hall, there won't be even a hint that the Dearborn of today stands for what Orville Hubbard's segregationist views represent.

Hubbard's extremism in barring blacks from moving into the city unfortunately overshadows the good that he did for Dearborn, and there really was a lot of that.

Under Hubbard, there was a strong parks and recreation system in the city, a Florida-based senior care facility and Camp Dearborn, to name just a few positives.

A museum is exactly where the Hubbard statue belongs.

Museums aren't always warm and fuzzy. Often they're the opposite; the curators fill them with reminders of a past that is disturbing.

The Rosa Parks bus at the Henry Ford isn't there to put smiles on people's faces.

As for the Hubbard statue, even one of his daughters, Nancy, who served for years on the Dearborn City Council, said "I think it's alright" that it will be moved, though her preference would be to "leave it where it is."

Dawud Walid, executive director of the Council on American-Islamic Relations' Michigan chapter, summed things up pretty well, capturing why the statue both needed to come down and be relocated to a museum.

"The vision that Orville Hubbard had," Walid said,  "thankfully, is not the Dearborn of today."

But it can't be swept under the rug, either. 

Friday, August 28, 2015

The Inconvenience of News

"No news is good news."

I always wondered about this oft-used phrase.

Is it saying that there is no such thing as good news, or that when you find yourself without any news at all, that's a good thing?

However you choose to decipher "No news is good news," I have one for you that is without ambiguity.

"The news isn't convenient."

There shouldn't be any confusion over that, and yet there is.

In the whirlwind of social media sharing and updates in the wake of the horrific murders of two young television journalists---one a reporter, the other a photographer---in Roanoke, VA on Wednesday during a live interview, we had ourselves a genuine "made for TV" violent crime, and there was much pontificating about what to do with it.

The alleged shooter of reporter Alison Parker and photographer Adam Ward, Vester Flanagan---who went by the on-air name Bryce Williams and was reportedly a disgruntled, frustrated TV reporter himself---crafted a highly premeditated act that was designed to be as sensational as possible.

Flanagan brought a camera with him (likely his cell phone) and carefully framed the image so that viewers would be able to see Parker being pierced with bullets. Flanagan then faxed a 23-page manifesto (a word that only seems to be associated with atrocities committed by a single person) to ABC News.

But Flanagan was far from done.

The video of the shooting was then uploaded to his Twitter and Facebook accounts, for as many people as possible to see before the accounts were suspended. Flanagan also tweeted some snippets that gave some insight into his motives.

Flanagan referenced alleged racial discrimination and blasted Ward for going to human resources against him. Flanagan also lamented the hiring of Parker, who he said had made some "racial comments."

But what got the pontificating going were the videos of the crime---both the version shot by Ward during the live interview and the killer's version.

Yes, this was some chilling stuff. Extraordinarily so.

This wasn't Lee Harvey Oswald being gunned down by Jack Ruby, which also happened on live TV, on November 24, 1963.

Parker and Ward, 24 and 27 years old, respectively, were just a couple of kids doing their jobs at 6:45 in the morning, shooting a fluff piece about tourism.

All horrific stuff, for sure.

But as stated above, news isn't convenient. It's not pretty and it doesn't always exist to make us feel good. Oftentimes, it makes us feel very bad.

So while those who strongly suggested that the videos of Parker and Ward's murders not be viewed or shared meant well, that impulse puts news gathering on a slippery slope.

Of course, Flanagan's staging and posting and faxing were all designed so that he could "go out with a bang," so to speak. He used the very same media that was once his livelihood as a means to make sure that a record of his victims' last moments would forever exist, somewhere.

Because that's what Flanagan wanted, so many Americans wanted to try to deny him that. It was the least that could be done, they figured.

The pleas to not share or view the videos were made in the name of respect for Parker and Ward.

That's very honorable and well-meaning.

It's also dangerous.


Alison Parker and Adam Ward

We ought not cherry-pick which news and which videos we pump and promote, and which we chastise people for viewing and sharing.

Like it or not, what Flanagan did was news. Horrific, disgusting, grotesque news, but news nonetheless.

The notion that those who chose to view or share the videos are somehow less feeling or less human is misplaced.

To view or share isn't tantamount to approval, nor is it tantamount to gratuitous sensationalism.

Two young TV people were killed in cold blood, live on camera.

That's news.

I also don't know what anyone hopes to gain by discouraging the viewing or sharing of the videos.

Facebook and Twitter moved swiftly, yanking down Flanagan's accounts without hesitation. They did their jobs.

But barn door being closed, meet the horses that got out.

Two questions, similar but different.

How does not viewing or sharing the videos of Parker and Ward's killings help things?

And, how does viewing and sharing the videos hurt things?

Frankly, these videos could have been worse. Much, much worse.

We could have seen a head shot. Or blood and gore. We saw neither.

We did see, in Flanagan's video, the muzzle of a gun and we heard the gunshots. But we never saw bullets striking Parker---at least not obviously.

It was kind of like the shower scene in "Psycho." We think we see the knife strike Janet Leigh in the shower, but thanks to clever editing, we don't.

We think we see Parker being riddled with bullets, but we really don't.

Obviously, Flanagan's video doesn't need blood and gore to be shocking. But oh, how much worse it could have been.

News is news. A lot of times, it just plain sucks.

The better question is, what is news?

Invasion of privacy and other matters that masquerade as news are the real bane.

The definition of what is news seems to be broadening as technology keeps advancing.

But there's no question that in the Parker and Ward killings, this was news.

As much as you'd like for it not to be.

We can't decide what others should view or share in the matter of genuine news.

That's a path we truly should not want to be sent down.

Saturday, August 1, 2015

Roses Have Thorns

My memories of Lynn Anderson are rather sardonic, but that's not her fault, necessarily.

Singer Anderson, 67, passed away the other day of a heart attack in a Nashville hospital while being treated for pneumonia.

She was best known for her song, "Rose Garden," which peaked at no. 1 on the country charts and no. 3 on the Billboard charts in early-1971.

But around the campus of Eastern Michigan University in the 1980s, Lynn Anderson became a notorious figure, forever linked to the school's outrageous efforts to keep its football program in the Mid-American Conference (MAC).

Let me explain.

By 1983, MAC officials were considering kicking EMU's football program out of the conference, because of poor performance on the field and, more importantly, poor performance at the turnstiles. The latter was a direct effect of the former.

The conference pretty much gave the university an ultimatum: lift attendance to a minimum threshold (I can't recall what that threshold was, but I think it was in the 10-15,000 per game neighborhood), or risk being booted.

Being asked to leave a Division-I conference would have cost EMU lots and lots of money in revenue, so the push was on to increase attendance, real quick.

Shuttle buses were sent to dorms to pick students up and drive them to Rynearson Stadium. Ticket prices were slashed, because the ultimatum wasn't based on revenue---it was based on the number of fannies in the seats. EMU didn't care what price folks paid to get in, or whether they paid at all. They just needed warm bodies in the stands.

But it was going to take more than the above to get students to take three hours out of their Saturday to watch a football team that was mostly miserable.

So EMU brought in halftime performers.

They brought in stand-up comics (I remember the legendary Skip Stephenson showing up one night). They brought in the Dallas Cowboys Cheerleaders, who were booed because the night air was too damn chilly for their iconic halter tops and go-go boots. The girls ran onto the field wearing blue Lycra bodysuits instead, and that didn't go over too well with the male fans.

And the university also brought in Lynn Anderson.



Anderson was well into her 30s and her career had taken a downturn by the time EMU signed her up for a halftime performance. This was circa 1984.

Things got ugly when Anderson was found to be obviously lip-synching, which by itself isn't a crime, but it's one of those things that, if it's blatant, can turn an audience against the performer.

The jig was up when the recording had technical difficulties. You can imagine the effects of that.

Anderson was booed off the stage and in the next edition of the school newspaper, The Eastern Echo, a graphic ran in the editorial section that depicted a photo of Anderson being flushed down a toilet.

Now, whether Anderson insisted on the lip-synching, or if the school decided it would be best due to the logistics of performing outdoors, is anyone's guess. Regardless, Lynn Anderson took the hit and she was mocked, panned and derided.

All told, Anderson had 18 country Top 10 hits, including five No. 1 songs. Among her other hits: "Rocky Top," the Felice and Boudleaux Bryant tune that's one of Tennessee's state songs. Anderson's version hit No. 17 on the country charts in 1970.

"I am a huge fan of Lynn's. She was always so nice to me. She did so much for the females in country music," country star Reba McEntire said in a statement.

I'm sure all of that is true. But on a chilly Saturday night on the football field at EMU in 1984, Lynn Anderson became a twisted footnote in the history of Eastern.

EMU met its attendance commitment, by the way, and stayed in the MAC.

We wore "I survived the Big MAC Attack" t-shirts on campus, a play on a McDonald's ad campaign of the time.

Fun times.

Wednesday, July 15, 2015

Christmas (weather) in July

I know this: our hot pepper plants aren't enjoying the cool summer we're having in Metro Detroit.

But fie on them.

The mercury hasn't scraped much past the mid-80s so far, and we're in mid-July.

I couldn't be happier.

I don't do well with the heat. The pepper plants do, however, and ours have been struggling to bear fruit, but like I said, fie on them. I can buy hot peppers at the market, although there is a charm to growing your own.

But if that's the trade off---store-bought hot peppers in exchange for summer days in which I can breathe without an oxygen mask---then I'll take it and run.

Normally by now, we would have suffered through oppressive heat, with temps in the high-80s and low-90s, with enough humidity to curl you from hair to toe.

But this year?

So far, so good.

Cool evenings, enabling you to sleep with the windows open, and is there anything better than breathing in fresh night air as you slumber?

Pleasant daytime temps, which don't mandate the use of air conditioning 24 hours a day. I love A/C---I think it was a great invention. But being in it too much makes me feel like I'm living in a plastic bubble and the world outside is so close yet so far.

Now, I do feel for the swimming pool owners out there.

We don't own a pool anymore, but the year we bought ours, in 1998, we were swimming in it (comfortably) in mid-May, shortly after it was installed.

My, has the climate changed.

As for the aforementioned oppressive heat, in recent years those days haven't really started until well into June, so the pool owners' swimming season has been shrinking steadily.

I saw some pools with their winter covers still on, as recently as two weeks ago!

So you own a pool nowadays and you're spending God knows how much money on electricity for the pump and chemicals for the water, and you can't even dip your toes in the stinking thing---until Independence Day.

I also haven't heard the ice cream truck very much this summer.

But that's still collateral damage in my book.

I'm enjoying the heck out of daytime temps in the mid-to-high 70s and evening lows in the upper-50s.

I'm basking in the low humidity and the ability to take in a deep breath of air without nearly passing out.

I have no idea how much longer this can last. I keep bracing myself for a heat wave.

And despite the lack of heat thus far, I'm sure I'll still grumble and bitch the first day the thermometer scrapes 90 degrees.

But until then, I'm enjoying Christmas in July.

How about you?

Wednesday, June 24, 2015

The Many Degrees of DVP

Which Dick Van Patten would you like to remember and mourn today?

Is it the actor Van Patten, who most famously seeped into our consciousness as Tom Bradford, the patriarch of the TV family on ABC's "Eight is Enough" from 1977-81?

Is it the tennis player Van Patten, whose sons got some of the old man's genes and did pretty well on the court as well?

Is it the animal activist Van Patten, who worked tirelessly for our furry and feathered friends, including founding National Guide Dog Month in 2008?

Is it the entrepreneur Van Patten, who co-founded Natural Balance Pet Foods in 1989?

Take your pick---or take them all, if you'd like.

Van Patten passed away on Tuesday at age 86. Some reports attribute his death to complications related to diabetes.

There was some juice to the Van Patten name in the entertainment industry. There was Dick, of course, and there was his younger sister Joyce, a fellow actor. There were the Van Patten boys---Vincent, Nels and Jimmy---who were all actors.

It's so fitting that Dick Van Patten made most of his pop culture hay as family man Tom Bradford on "Eight is Enough," because his own family tree is pretty interesting and runs like an artery through show business.

In addition to the aforementioned, check this out.

Van Patten's sister Joyce married actor Martin Balsam, and the couple had a child---actress Talia Balsam.

Talia Balsam's first husband was George Clooney. You may have heard of him.

Talia Balsam is now married to "Mad Men" actor John Slattery.

Van Patten's son Vince is married to soap star and current reality TV personality Eileen Davidson.

Dick's other son Nels is married to former "Baywatch" regular Nancy Valen.

For some, it may seem like "Eight is Enough" lasted longer than its five seasons, but that's a testament to the show's impact. It hit the small screen four years after "The Brady Bunch" filmed its last episode, and American TV viewers were ready for a family show featuring a large brood that was a little more grown up.

With "EiE," entire episodes weren't spent on trying to find the family dog or teaching kids lessons about humility. The show was about (mostly) grown-up kids who had more convoluted issues.

Of course, by the end of the hour, all the loose ends were tied up, but not before some laughter, some crying and some reflection.

Real-life tragedy was dealt with, as well.

Actress Diana Hyland was originally cast as Tom Bradford's wife, but she succumbed to cancer after appearing in just the first four episodes. Her untimely death wasn't ignored, as shows from the 1950s and 1960s would have done by quietly replacing the deceased actor with someone else playing the same character.

Instead, the producers of "EiE" dealt with Hyland's death head on, writing it into the show, and the cast's mourning on the screen was real.

Betty Buckley was brought in to play Tom's new love interest (and eventual second wife), Abby, from season two onward.

Leading it all was Dick Van Patten, whose character was based on real-life newspaper columnist Tom Braden, who chronicled his large family with an autobiographical book also titled Eight is Enough---a reference to Braden's (and Bradford's) eight children.



Dick Van Patten was hardly the leading man type---thin-haired, slightly paunchy and with a round face. He looked more like your neighbor---which was likely why Tom Bradford resonated on the screen. Van Patten looked like a guy who had eight kids and who worked for a newspaper.

Van Patten's Tom Bradford was also unlike other TV dads in the sense that he wasn't written as a buffoon who somehow got a pretty, smart girl to marry him. The kids didn't zing witty one-liners at dad's expense; rather, Tom Bradford was a true patriarch who had his kids' respect.

Van Patten was acting on stage and screen for some 28 years before he got the "EiE" gig, but he was treated by many viewers as a virtual unknown until 1977. Such is the power of being a lead actor on a successful TV show.

Van Patten was also a favorite of comedian/director Mel Brooks, who cast Dick in a number of films.

Such were Dick Van Patten's varied interests that he even served as a TV commentator for the World Series of Poker from 1993-95.

Trivia: Van Patten named his son Nels after the character that Dick played in his first TV job, a series called "Mama" (1949-57).

Dick Van Patten didn't light up the screen. He wasn't that type of actor. But you were always aware of his presence.

Unlike some of his brethren who felt typecast and pigeonholed by roles they played on television, Dick Van Patten embraced Tom Bradford.

"I appreciate 'Eight is Enough'," he once said. "It made me recognizable."

But he was influential in so many other ways, and for that so many are grateful.


Friday, June 19, 2015

Spock Would Be Proud

In the interest of full disclosure, I'm 51 years old.

I only tell you this because, when she was my age, Jeralean Talley was living in the year 1950.

And she continued to live, some 65 more years, until passing peacefully the other day in her home in Inkster.

Jeralean was 116 years, 25 days old when she slipped away, ending her two-month reign as the world's oldest living person.

I wonder what it would have been like to be my age now, in 1950.

Harry S. Truman was president. Television was still a relatively new thing and lots of folks didn't even own one. And if they did, it broadcast everything in beautiful, gorgeous, vivid...black and white.

The NHL had six teams. Major League Baseball had all of 16. The NFL was still finding its audience as teams were experimenting with something called the forward pass. The NBA was four years old.

The only phones we had were mounted on our kitchen walls. You had to actually read the hands of a clock or wristwatch to tell time. Shoes had laces, not Velcro.

If you wanted to know what was going on, you bought a newspaper. If you needed more, you bought a late edition on the street.

Cars were as big as tanks and the only things that weren't metal were the seats and the dashboard.

If you wanted to know how to get where you were going, you bought a map.

You didn't send e-mails, you wrote letters. If you wanted to pay a bill, you licked a stamp.

We were just five years removed from the second World War and on our way into another conflict in Korea.

That's just when Jeralean Talley was 51.

She graduated from high school during World War I. When she was old enough to vote, she couldn't.

She saw the invention of the airplane, radio, air conditioning, modern refrigeration and instant coffee.

For starters.


Jeralean Talley (1899-2015)


But Jeralean is gone now, and according to daughter Thelma Holloway, who's a youngster at age 77, her mother "was ready to go home and rest."

"She asked the Lord to take her peacefully, and he did," Holloway told the Detroit News.


According to the News story, the California-based Gerontology Research Group, which keeps track of the world’s oldest people, declared Talley in early April to be the oldest human on the planet.

The previous record-holder, Arkansas resident Gertrude Weaver, died April 6 at 116 years old, according to the group.

Mrs. Talley is succeeded as the world’s oldest person by New Yorker Susannah Mushatt Jones, who turns 116 on July 6.

Jeralean Talley moved to Detroit from Georgia in 1935, right smack in the middle of the Great Depression. Her husband, Alfred, has been gone since 1988 after 52 years of marriage to Jeralean.

Jeralean was an avid bowler, continuing to roll games until she was 104. Her last game produced an astounding score of 200.

Despite the number of people around the world who have lived well past their 100th birthday, there is still no definitive explanation for why they were able to eclipse normal life expectancy by such a wide margin.

They all had their "secrets" to longevity, and some of those secrets wouldn't necessarily lead you to believe that they would have anything to do with living past 50, let alone 100.

So maybe it's just a crapshoot.

Regardless, it won't be long before these centenarians no longer have 19th century dates on their birth certificates. To be born in 1899 and still be alive today is a marvel.

Jeralean Talley's longtime friend and fellow churchgoer, Christonna Campbell, spoke for so many of those who knew Mrs. Talley.

"We just thought she was going to live forever," Campbell said.

But didn't she, in a way?

Tuesday, May 26, 2015

Meara, Meara

Comedians/actors Jerry Stiller and Anne Meara were married for 61 years, but had they not heeded warning signs, the marriage might have ended some 44 years ago.

The comedy team of Stiller & Meara was seemingly cruising along in 1970, having just enjoyed a nice run of 36 appearances on "The Ed Sullivan Show" in the 1960s, when both members of the team/marriage sensed that something was amiss.

With an act based largely on their real-life domestic trials and tribulations, Stiller and Meara found that despite their success---or maybe because of it---the line between life at home and life on stage was getting further blurred as the years went on.

"I didn't know where the act ended and our marriage began," Meara told People magazine in 1977.

"We were like two guys," Stiller said in the same article.

With Meara questioning things and Stiller worried that he might lose his wife, the act was disbanded in 1970.

But they never stopped working together for very long at any given time; they just didn't do so as the stage act Stiller & Meara.

The couple had been teaming up on a web series in recent years before Anne Meara passed away over the weekend. She was 85.

On television, Stiller and Meara were most recently seen sharing some scenes together on "The King of Queens," with Stiller playing Carrie Heffernan's widowed father Arthur Spooner and Meara playing the part of Veronica Olchin, the widowed mother of Doug Heffernan's friend Spence Olchin.

Ironically, that series ended with Stiller and Meara's characters getting married.

Stiller and Meara's actor/producer/director son, Ben Stiller, produced the web series for Red Hour Digital, which Ben owns.



Anne Meara met Jerry Stiller in New York after a failed audition in 1953, and the couple was married a year later. But it took much prodding and several years of convincing before Meara agreed to join her husband on stage as a comedy team, whose only rival at the time in the male/female duo category was the team of Elaine May and Mike Nichols, who weren't married.

Thus, Stiller & Meara would eventually become the entertainment industry's longest-running, most successful husband and wife comedy duo, surpassing that of George Burns and Gracie Allen.

After the stage "breakup" in 1970, Stiller and Meara hardly disappeared from view or from listeners' ears.

They did radio ads for Blue Nun wine, and appeared in television commercials together. They also teamed up in 1977-78 for "Take Five with Stiller & Meara," which was a series consisting of humorous blackouts about everyday life.

Meara was no Gracie Allen, and that's hardly a knock. Where Allen was George Burns' ditzy foil, Anne Meara was Jerry Stiller's equal, and then some---both physically and in terms of material. She was a tall, Irish, Brooklyn redhead whose height caused her to loom large on stage next to her husband, literally and figuratively.

Meara was a four-time Emmy Award nominee and she was nominated for a Tony Award once.

There was so much more to Anne Meara than being Jerry Stiller's comedy partner---and Ben Stiller's mother. There was the acting and the writing and the teaching and the trailblazing aspect to her career for other female comics.

Not bad for a woman whose own mother committed suicide when she was 11 years old.

Meara once gave a glimpse into the secret to staying married to a co-worker for over six decades, practically unheard of in show business.

"Was it love at first sight? It wasn't then---but it sure is now."

Friday, May 8, 2015

Who Among Us?

The only thing that is certain in the road rage trial of Martin Zale is that it was tragic.

A wife widowed. Children growing up fatherless.

After that, it gets tricky.

Zale is the motorist who is accused of murder in the fatal shooting of Derek Flemming last September 2 in Genoa Township, at Grand River Avenue and Chilson Road.

Zale was allegedly driving recklessly and Flemming, on a beautiful afternoon after having lunch with his wife, didn't appreciate it.

The vehicles stopped at a red light---Zale's in front of Flemming's---and Flemming got out of his vehicle to confront Zale. Witnesses say that Flemming looked very angry and had both fists clenched as he approached Zale's truck.

Moments later, Flemming was dead---shot once in the face. He died instantly.

Zale didn't flee; rather, he pulled off to the side of the road and called his lawyer.

Those are the basic facts. Zale's trial is happening now, and I think it's going to be fascinating to follow.

Of course, there's a lot more to it than what I have chronicled. But that's what makes it so fascinating.

Who among us has never been enraged by another motorist?

Martin Zale at his trial


That's what enthralls me about the Zale trial. So many criminal trials are difficult to relate to, because they involve actions or circumstances in which a vast majority of us would never find ourselves.

But Martin Zale and Derek Flemming? We've all been the latter and some of us, whether we choose to admit it or not, have been the former.

It's just that in this case, Flemming took that extra step that many of us have fantasized about but have still managed to avoid actually doing---probably because of the fear of the fate that befell Flemming.

It's a trial that so many of us can relate to. And I believe that its verdict could have a ripple effect in several ways.

It's also a trial where there will be no shortage of opinion or water cooler talk at the office.

As I said, the only non-debatable aspect here is that what happened was a tragedy. It always is, when something bad happens that was avoidable.

But there's that word: avoidable.

There's a sort of chicken-and-egg thing going on here.

You can say that Flemming initiated, in essence, his own death by climbing out of his vehicle to confront Zale.

You can also say that Zale initiated everything because of his allegedly reckless driving to begin with.

Then there are the backgrounds of the two men.

Zale, according to co-workers at least, was notorious for crazy driving. He also has another documented road rage confrontation from his past in which police were called.

Flemming, for his part, also---according to those who knew him---had exhibited behavior in the past that aligns with possible anger issues.

So there we have it---two known hotheads coming together to form a perfect storm of rage and reaction.

The easy thing to do---and I am among those who have done it---is to wag a finger and hold up Flemming as the poster boy for why you should never confront, and why you should call 911 instead.

But that doesn't let Zale off the hook, of course. Flemming's actions may have been ill-advised, but did they deserve the death penalty?

Maybe something like this was bound to happen where Martin Zale was involved.

Perhaps the same could be said of Derek Flemming.

Still, tragic.

They'll be talking about this one for years.

Friday, April 24, 2015

Another Untimely, Tragic Wrap

As if suicide isn't rotten enough, it invariably raises more questions than it answers. That's because suicide often doesn't answer any questions at all.

Even a note left behind won't necessarily satisfy all the curiosity. In fact, suicide notes are likely to create more questions than they answer, as well.

A suicide note is like a press conference where a statement is issued and the issuer scrambles away, without taking any queries.

Sawyer Sweeten is dead. Apparently it's suicide.

Sawyer, on the verge of turning 20, was one-half of the identical twin actors who played Ray and Debra Barone's twin boys on "Everybody Loves Raymond" (1996-2005). Sawyer played Geoffrey and Sullivan Sweeten played Michael. Their older sister Madylin played the Barones' daughter Ally on the TV show.

According to reports, Sawyer was visiting family in Texas when he apparently shot himself on the front porch of the house where he was staying.

In the early years of "Raymond," star Ray Romano would say in the show's opening that it "is not really about the kids," and he was right. The Barone children were often not seen at all in episodes. Not making kids foils or smart alecks was one of many ways in which "Raymond" was refreshing.

The Sweeten kids weren't fed rapid fire one-liners by the writers. Their characters rarely acted out, and only on occasion was a "Raymond" storyline built around the children.

But today, it IS about the kids. One, in particular.

No word yet if Sawyer left a note. Not that it helps if he did.

Throughout entertainment history, the travails of the child actor after he/she is no longer an adolescent have been widely documented. I don't know if studies have been done, so it's anyone's guess as to whether former child stars are, statistically, more prone to big people-type problems than "normal" kids. But certainly their issues are higher in profile.

I would imagine that some of the emotional/psychological problems that child actors face start with a question that we have all asked about said stars, either to ourselves or of others.

"Whatever happened to...?"

That may be the crux of a lot of this stuff.

Whatever happened to the kid actors after they grew up and their shows ended up in syndication?

But maybe the kid actors are asking themselves, "What do I do now, now that the spotlights have been turned off and the acting jobs have dried up?"


Sawyer and Madylin Sweeten


Some of the kid stars turned to drugs. Some turned to alcohol. Some turned to both. Others followed their lives on set with a life of crime, almost immediately.

With or without a suicide note, the questions surrounding Sawyer Sweeten's apparent suicide will never truly be answered, because the only person who possesses the answers and who can expound is gone.

And it might be that Sawyer's demise had absolutely nothing to do with his having been a child actor.

Romano, who reminded us back in the day that his show wasn't about the kids, reversed that course upon learning of Sawyer's tragic death.

"I'm shocked, and terribly saddened, by the news about Sawyer," Romano said in a statement.
"(Sawyer) was a wonderful and sweet kid to be around. Just a great energy whenever he was there. My heart breaks for him, his family, and his friends during this very difficult time."

Big sister Madylin Sweeten told us to do something that shouldn't take an untimely death to get us to do.

"At this time I would like to encourage everyone to reach out to the ones you love," she wrote on her Facebook page. "Let them have no doubt of what they mean to you."


Wednesday, April 15, 2015

Ebb and Flo

They were television advertising icons who resided on the banks of our cultural consciousness.

Mr. Whipple (Charmin bathroom tissue). Madge the manicurist (Palmolive dish detergent). The Maytag Repair Man. Even the Qantas koala bear.

Those were just a few commercial characters who invaded our living rooms in the 1970s and '80s. Their ads---usually 60 seconds in length or even longer---were rarely the same. The format might have been nearly identical, and of course the tag lines were ("DON'T squeeze the Charmin!"), but each appearance by Mr. Whipple or Madge usually had them interacting with different customers.

The actors behind the characters were often nameless, as it should have been, but I'm sure their paychecks weren't nameless---or paltry.

The pitchman on TV these days is usually a local litigator or a voice-over hawking prescription meds.

There isn't really any character that is iconic---no one who, when they appear on the screen, instantly lets us know what product is being advertised.

Except for Flo, the Progressive Insurance Girl.

Played by Stephanie Courtney (we only know that because this is the Internet age), Flo first started appearing on TV in the late-2000s. Her cheery attitude, dark hair, blood red lipstick and ridiculously long eyelashes, all packaged in an all-white uniform, scream insurance the moment you see her.

To Progressive's credit, the Flo ads are kept fresher than most other TV spots, which can gag you with their repetitiveness and lack of variety (e.g., those same three Liberty Mutual Insurance ads in rotation).

Progressive has put Flo in all sorts of situations, from riding motorcycles to consoling a man in a locker room to being tied to a stake (in an ad that puts Flo in different eras in world history).

But unlike the advertising characters from days gone by, who were mostly universally liked (or, at the very least, tolerated rather easily), Flo, for whatever reason, is a polarizing sort.

My mother, for example, can't stand Flo. I, on the other hand, find Flo attractive in an odd way.

Troll the Internet and you'll find that this polarization is acute.

There are Flo-hating websites and forums, as well as those visited by men who make no bones about the fact that they would like to do some things (sexually) to Flo that are unfit to print here. Other comments on Facebook et al have been from females who like Flo just because they think she's likable.

Courtney, for her part, has never understood the sexual allure of Flo.

"The GEICO gecko puts out more sexual vibes than Flo does," Courtney has been quoted as saying.



Regardless of where you stand on the Flo issue, one thing can't be disputed: She's a throwback to a time when TV advertising was flush with identifiable characters and mascots. Back when TV hawked more than just insurance, beer, cars and drugs.

Flo's Facebook page has nearly 5 million likes, though that number has been dipping in recent years from its peak of 5.4 million.

Like them or not, the Flo spots at least are freshened up rather frequently. Her character, these days, is seen less in that all-white, fantasy Progressive Insurance "store" and more in various situations and venues.

And, no doubt, Flo has made Stephanie Courtney's wallet fatter than it likely would have been had she been forced to stick to more traditional bit parts on TV and in the movies, as she was doing prior to Flo.

You pretty much love Flo or you hate her; it's hard to be on the fence with her. She's the Howard Cosell of modern television that way.

The GEICO gecko, by the way, should get props for its popularity and the freshness of its new spots.

Who would have thought that the world of insurance would take over TV advertising?

Friday, April 3, 2015

Still Rockin', Still Rollin'?

The Rolling Stones are coming! The Rolling Stones are coming!

How much rolling they do nowadays is anyone's guess. They're all pushing 70 or beyond now.

The iconic rock group is touring this summer, and Detroit is on the travelogue, with the Stones playing Comerica Park on July 8.

This isn't ageism, but one can only wonder how strong the voices are, how powerful the guitar riffs are and how much energy is in the tank for the Mick Jagger-led group, whose members can all order off the seniors menu at every restaurant in the country.

I've been listening to a lot of 1960s-era rock lately, thanks to a nifty little mobile app called Milk Music. The tunes (sans commercials) come in handy while walking the pooch.

The Rolling Stones are part of that, of course, but sprinkled in with the bands I am listening to are performers like Jim Morrison (The Doors), Jim Croce, Jimi Hendrix, Janis Joplin, Mama Cass Elliot (the Mamas and the Papas) and others who died before their time.

So the question arises: what would have become of those artists had they lived as long as Jagger, Richards, Wyman, Watts et al?

The argument could be made that each of the aforementioned music artists, who all died in their 20s (except Elliot, who was 32 when she passed), was a trailblazer for acts who came behind them.

But would their acts have stood the test of time?

We'll never know, of course, but it's still fun to imagine what kind of music The Doors would be pumping out in 2015, or if Croce's ballads would have evolved over time or if Hendrix would still be wailing on the electric guitar some 45 years after he died.

Then again, there are many bands and individual artists from the British Invasion years that have pretty much vanished from the public eye---all while remaining alive and kicking.

The Rolling Stones are still a draw because they, like The Who, Paul McCartney and others who've been at this rock-and-roll thing for 50-plus years, pumped out so many hits in their prime that it never gets old for their fan base---many of whom are also in their senior years---to hear those hits performed live, no matter the age of the performers.



The bodies of work of Morrison, Croce, Hendrix, Joplin and Elliot, combined, averaged about four years at their peak. If it seems like it was longer, then that's both a testament to their music's influence and to the fact that they died young. James Dean only starred in three movies, believe it or not. Yet a prevailing belief is that Dean's filmography is more voluminous than that.

Elvis Presley would have turned 80 in January. But forget The King's music; how would those hips have held up?

Wednesday, February 25, 2015

Heat Index

My first experience with spicy food came when I was a youngster.

I was a latch key kid, and that included lunch. My grade school was across the street from the house, more or less. So I would let myself in and prepare my own lunch, as early as age 11.

This was circa 1974-75.

Nobody reported my mother to Child Protective Services. I managed to not burn the house down. I'd fix my lunch, eat it, and be back in class on time.

Somehow along the way I have lost that efficiency in my life, but that's another blog post entirely.

The point being, my first encounter with spicy foods came in the form of those Vlasic hot pepper rings in a jar. Again, I was 11 and I started nibbling on those tangy, vinegar-encased yellow rings, usually combining them with a sandwich of some sort.

That was some 40 years ago, and it was way before I discovered Szechuan Chinese food, Indian cuisine and Thai delights.

It was also way before fast food joints and snack manufacturers discovered anything remotely on the warm side, spicy food-wise.

Today everyone is pushing spicy food.

Jalapenos are all the rage now.

Everyone from Frito Lay to Applebee's to Burger King is putting jalapenos in their offerings.

Spicy food is everywhere. Buffalo style (fill in the blank); "bold" menu items; Cajun everything; Thai this and Thai that.

Not that I'm complaining.

My yen for bold, spicy and tangy foods clearly started with those latch key lunches in the mid-1970s. Vlasic hot pepper rings were my first experience. I remember it like a woman remembers her first kiss.

But I eventually had to eat something other than hot pepper rings to satisfy my growing craving.

My mom and I used to eat Chinese food a lot but it wasn't until I went off to college and started working in Ann Arbor that I realized not all Chinese cuisine was of the Cantonese variety.

Spicy Chinese food? Really?

Some co-workers were getting take-out at a Chinese place down the street and it served something called Szechuan, they said. Never heard of it, I replied.

Oh, it's good, they said. Very spicy and hot.

I probably cocked my head, like a bemused dog does.

But I for sure said that I was in on that!


Part of nature's nectar


The food arrived and I'm surprised my taste buds didn't all drop dead of a heart attack.

Never before had they seen anything like Szechuan Chinese food come down my gullet.

What a taste sensation!

So that's when I got hooked on spicy Chinese food (circa 1982). That would change from Chinese to Asian when I discovered Thai cuisine, some five years later.

If I thought Szechuan (and Mandarin) was hot, I had no idea when it came to Thai food.

Thai food was invented for people like me. Intense heat, but still adjustable for individual taste.

Siam Spicy, on Woodward in Royal Oak, gave me my introduction to Thai food. I foolishly ordered it "extra hot" on my first visit. I dismissed the sweet waitress's warning.

I should have listened to her.

But that painful (literally) experience didn't dissuade me. I had discovered a treasure trove.

In the early-1990s I found out about Indian food. More delightful salivating ensued.

So here we are today, 40 years after I lost my spicy food virginity, and only now is the food industry catching up.

It's a generational thing, I'm sure.

I was born in 1963. Today's target demographic was born some 20 years after that, and they, as a whole, are more in tune with hot and spicy food.

They are less afraid and more adventurous eaters than the generation preceding them.

The products and menu items today reflect that shift in taste bud stamina. Although when the so-called spicy offerings first started to appear, they weren't nearly hot enough for my liking. Now the heat level is increasing as the demographic is getting younger.

The easiest bet I ever won came some 30 years ago, when a friend wagered that I couldn't eat an entire bag of extra hot potato chips without drinking anything.

I won a case of Molson Brador beer. Like taking candy from a baby.

I still eat hot pepper rings, by the way. Today I call it comfort food.

Friday, February 13, 2015

The Justified Bully

In the 1980s, HBO presented a comedy series called "Not Necessarily the News." In it, pretend anchors used real news clips but altered them for laughs.

Shots that the HBO show produced, cleverly interspersed with the actual clips, were used for gags.

Of course, the notion of fake news on TV was hardly new at that time. "Saturday Night Live" began the trend in earnest with its signature Weekend Update segment not long after "SNL" debuted in 1975.

While "NNTN" was playful and Weekend Update was very sarcastic, always delivered with a wink and a smirk, there was still further to go in the fake news genre.

Enter Comedy Central's "The Daily Show."

Where "NNTN" was produced sporadically and Weekend Update was weekly (during the "SNL" season), "The Daily Show" was exactly that---daily.

But that's hardly where the delineation ended.

"TDS"'s Jon Stewart was not part of a host rotation, like Weekend Update's, which helped make stars out of everyone from Bill Murray to Dennis Miller to Seth Myers.

Weekend Update has always been presented in a breezy five minutes or so, while "TDS" has always been 30 minutes in length.

Stewart is one of only two hosts that "TDS" has ever known (Craig Kilborn hosted from the show's 1996 debut until Stewart took over in 1999), and he stunned his audience with the announcement this week that this will be the year he steps down.

Kudos should continue to go to Kilborn, the ESPN grad whose smarmy delivery would forever brand "TDS," but it was Stewart's intellectually sharp, biting humor and longevity that cemented "TDS"'s perpetual place in television comedy history.

"TDS" has been guested by a gaggle of political figures and other celebrities over the years, many of whom have been eager to share the stage with Stewart and engage in the ensuing repartee.

Such was the popularity of Stewart's show that it spawned spin-offs, like Stephen Colbert's "The Colbert Report" and "The Nightly Show with Larry Wilmore."

Stewart never hesitated to point out the absurdity and hypocrisy of politics, social issues and celebrity. He used his host's chair as a bully pulpit, but it always seemed that those he bullied deserved it. Stewart possessed the incredibly difficult knack of being biting but not mean-spirited. He never tweaked anyone just for cheap laughs.



I believe that the ability to jab someone in a pointed way but sans brutality added to the humor of "TDS." Stewart was no insult comic---he wasn't Don Rickles sitting behind a desk.

Stewart was so entrenched as "TDS" host that it was easy to forget that he wasn't one of the mainstream news anchors, but instead a gifted comedian and an actor/director whose career on the big screen is nothing to sneeze at either.

Comedians will tell you that the beauty of their craft turns up when their material practically writes itself.

Stewart didn't have to try very hard to pull laughs from the daily headlines; so much of what goes on is good fodder. But that doesn't minimize his contribution to television comedy.

Jon Stewart's "TDS" not only poked fun at the news and newsmakers, it illuminated the injustices, ridiculousness and shamelessness bubbling just below the surface of them both.

Stewart pulled no punches, but at least those he tattooed had it coming.

Friday, January 30, 2015

Death in the Slow Lane

Traditions are terrific things. Whether they run in families, bring together communities or even entire nations, there is no mistaking the notion that honoring tradition is a noble and cozy thing to do, when not misguided.

But let's do away with the funeral procession, shall we?

In simpler, less crowded, less rude times, the funeral procession, particularly when done using the horse and carriage, was a fine way of respecting the newly-deceased.

Today, it's more along the lines of a nuisance and, frankly, it can be dangerous.

The journey from church (or other gathering place) to the cemetery or mausoleum can certainly be a somber one. There isn't a limousine leading the way with cans and string attached, with a hand-painted sign that says "Just Died."

So I get it that commuting during an occasion of burial isn't the most pleasant thing in the world. And I have nothing against respecting and honoring the dead.

But the funeral procession has worn out its welcome.

Today, with roads packed more than ever with vehicles, the idea of stringing together dozens of motorists and allowing them to pass through intersections and run red lights with impunity simply isn't very bright.

It's nothing against the processioners, per se, although there does always seem to be one car that lags behind the rest, creating a potentially dangerous gap. It's more about the rude, disrespectful motorists who aren't part of the procession.

I just don't think we need to drive en masse to a burial.

I think you can give folks the target address and driving instructions and say "We'll see you there."

An exception would be for something more stately, such as the funeral of a police officer or political figure.

If one of the purposes of a funeral procession is to show, in a very visual way, how beloved someone was, I am reminded of some sage words uttered by a wise person.

"The only thing that is going to determine how many people show up to your funeral is the weather."

My inspiration here isn't because I was recently inconvenienced by a funeral procession, though Lord knows that I have been. Nor is it because I have encountered strange and exasperating moments whilst driving in a funeral procession, though I once drove the entire way behind a car with no functioning brake lights (that was fun).

In fact, this really has nothing to do with inconvenience. It has everything to do with practicality and safety.

I don't have the numbers, and maybe they don't bear me out anyway, but I still think that you increase the chances of an accident anytime a funeral procession rolls on by.

Besides, they're depressing.


Enough.


What's a more in-your-face reminder of mortality than watching 30 cars drive slowly by, following a hearse?

I see enough images of death and destruction on TV and the Internet to last me a lifetime, thank you very much.

Would death be any less significant and the occasion of a funeral be any less morose or somber if we stopped traveling to burials in herds?

I recall a stand-up comedian once remarking that as a show of life's cruel irony, the only time you get to drive through red lights and stop signs is when you're dead and can't enjoy the gratification.

Besides, in my non-funeral procession fantasy world, if I really want to drive miles and miles in a tight-knit pack while pumping my brakes, I have that opportunity, twice a day: my commute to and from work.

Tuesday, January 13, 2015

Kept in the Dark

I think one of the most depressing parts of winter is that we spend it cloaked in darkness.

It's dark when you wake up to get ready for work. The afternoons are often overcast and everyone has to drive with their headlights on. It's dark when you drive home from work. You can go days without seeing any serious sunlight.

In Michigan, you can pretty much put your sunglasses in the drawer in October and not pull them out again until April---if you're lucky.

It's like in wintertime, we've all forgotten to pay the light bill.

That's why, when you get a day of sunshine in the winter, your eyes hurt. You spend the day squinting. Everyone looks like Robert De Niro in every movie in which he's ever appeared.

But there's something called the Winter Solstice, and we actually passed it a few weeks ago---December 21 to be exact. And when you pass the solstice, you're in for longer days, slowly but surely.

When I was a kid, I remember folks talking about December 21 as being "the longest night of the year."

Kids, as we know, tend to take phrases literally. I was no exception. One year, I heard all the blather about December 21's "longest night" and when that night actually came, I thought it would be dark for the whole next day.

The "longest night" aspect, of course, is an astronomical phenomenon rooted in minutes, not hours.

But that's not what kids hear.

So here we are, 23 days past the Winter Solstice and while it's still mostly dark out, the commute home from the office isn't quite as depressing anymore. I take heart in the fact that from this point forward, nightfall stays away a tad longer, day by day.

But it's still dark a lot.


This photo was likely taken at 1:00 in the afternoon during a Michigan winter


I like December 21 in the same vein that I dread June 21, the Summer Solstice.

Because after June 21, the days start to get shorter.

I love it that in the summer, the clock will read 9:25 p.m. and you could still mow the lawn if you want. There's that much sunlight still available.

But after June 21, sunset creeps closer and closer. It's like a slow water torture.

By August, 8:00 becomes the point where you need flashlights outside. A couple months later, with the leaves on the ground and with more chill in the air, sunlight becomes a precious commodity.

Then we start the whole depressing winter thing all over again.

This blog post may seem like an exercise in futility, because no amount of complaining in the world is going to change the Earth's axis. We can't rally and join hands to make our winter days filled with more sunshine.

But I write this because today it hit me---I made it home after work with a sliver of sunshine left in the sky. It was gone a few minutes later, but this is improvement.

Plus, in Michigan, the longer the days get in the winter, the more we get to see all the snow that needs to be shoveled.

Give and take, you see.