
The Fascinating History Behind Hunting Decoys, An American Folk Art Form

Decoys originate from man’s efforts to lure waterfowl. Whether hunting with nets, traps, or firearms, hunters came to value decoys as highly as boats, blinds, and shotguns. As weaponry improved and populations increased in the latter part of the 19th century, more and more people hunted waterfowl for food and sport, and the demand for decoys grew. The art of the decoy demanded that the carvings appear lifelike from afar—the more realistic the decoys, the more successful the hunt.

The waterfowl decoy is now a treasured form of folk art. Often highly sought after as collectibles, many are quite valuable. From old working decoys to the modernized, stylized, and finely carved, they reflect the impact of technology, environment, society, and economy on an American way of life.

(Courtesy of John V. Quarstein)

The Magic of Migration

When the crisp winds of fall break across the Chesapeake, we hear again the glorious music of the migrant Canada goose drifting through the air. Look up into the sky or out above the cut cornfields, and you can see their wavering lines passing into the distance. One wonders what compels these birds to travel thousands of miles each year from their northern breeding grounds to winter destinations along the Atlantic Coast, and back again. How do they find their way? How do they know when to go and when to return? The answers to these questions lie in the mystery of migration.

The movement north and south of migratory waterfowl is probably triggered by meteorological conditions, including temperature and barometric pressure. The birds travel certain routes to particular places based on food and water sources, and waterfowl flocks return each year to the same wintering areas because of imprinting.

The Atlantic Flyway welcomes birds from the eastern Arctic, the coast of Greenland, Labrador, Newfoundland, Hudson Bay, the Yukon, and the prairies of Canada and the United States. Millions of ducks, swans, and geese move along the coast and overwinter on the Chesapeake and North Carolina sounds.

The Chesapeake Bay region is a great magnet for migrating waterfowl. These protected waters provide food and a safe haven. Aquatic grasses fill the waterways, and the harvested fields are sprinkled with corn. The tremendous number of birds flocking to the region has driven decoy demand for centuries.

Humans have been hunting waterfowl for food and sport for thousands of years. (RubberBall Productions/Brand X Pictures/Getty Images)

Waterfowl in America

The word decoy is derived from the Dutch words for the (de), cage (kooi), and duck or fowl (eend). The Dutch brought to New Amsterdam—today’s New York—an ancient method of using cages and tame ducks to lure and trap wild fowl. The tame birds were called the cage ducks, or “de kooi eend.” By the mid-19th century, the word decoy became commonplace in America as “an image of a bird used to entice within gunshot.”

While the earliest known decoys were used by pre-Columbian North Americans, a combination of factors expanded the demand for waterfowl during the post-Civil War era in America. Migratory birds, including canvasback ducks and whistling swans, were abundant, but access to and distribution of this seemingly endless food source was problematic. Rapid population growth motivated Americans to find ways to harvest the crop, and the expansion of railroads provided routes for refrigerated cars to transport the delicious waterfowl meat to eager markets in major cities otherwise disconnected from rivers, bays, and marshes.

At the same time, firearm improvements brought increased efficiency for hunters. From the paper shotgun shell to lever-action, pump-action, and eventually automatic shotguns, firing speed rose and weapons became so effective that waterfowl were quickly endangered, forcing the passage of the Migratory Bird Treaty Act of 1918.

Prior to that, natural abundance coupled with technological innovation enabled the harvesting of thousands of ducks each year. On the flat tidelands of the Susquehanna River, sinkbox blinds were favored—typically by market hunters—and required 300 to 700 decoys per layout. An estimated 75 sinkboxes were in use during the late 19th and early 20th centuries. With 50 to 100 decoys per sneakbox boat, and countless shore-blind rigs, approximately 20,000 decoys or more were needed every year to support hunting activities.

A shorebird decoy stands on driftwood. Made around 1900, it has tacks for eyes and a nail for its bill. It is part of the author’s personal collection. (Courtesy of John V. Quarstein)

The rapid expansion of market and sport hunting after the Civil War prompted many guides to begin making decoys, and the craft became an important trade. Influenced by regional differences in water, weather, paint, and stylistic traditions, decoy designs were handed down through generations. Every maker had an opinion about how various waterfowl should look.

Decoy-making practices were well established by the mid-19th century. Decoys were hand-chopped using simple woodworking tools such as axes, chopping blocks, spokeshaves, and various kinds of knives. With the rise in popularity of waterfowling in the 20th century, decoy needs increased, and the influence of industrialization set in just as the market expanded beyond the capacity of traditional makers.

Enterprising businessmen, hunters, and woodworkers endeavored to promote and mass-produce decoys. Traditional carvers turned to power tools to increase output. While some operations employed only a few workers and continued to use traditional carving methods, other manufacturers used assembly line processes. The common ground for these early producers was in advertising their products throughout the nation and shipping good-looking, high-quality decoys.

A Legacy Carved in Wood

Makers had long experimented with crafting decoys from materials other than wood. The post-World War I era witnessed the first shift from wooden working decoys to decoys mass-produced from other materials. This transition changed the decoy industry. Following World War II, decoys made from cork, canvas, papier-mâché, and plastic appeared. Many of these new styles were patented, and each promised to bring in the most ducks. As the cost of wooden birds increased, other types of decoys became more popular and dominated gunning rigs. Wood-carvers could not compete economically against plastic birds, and their work changed from crafting hunting tools to creating artworks.

People had already recognized the folk art qualities of decoys. Traditional makers strove for realism, carving decoys with raised wings or turned heads, for example. Others crafted miniatures as samples of their work. These “fancy ducks,” as Lem Ward called the early decoratives, began selling at premium prices. In time, carvers expanded their techniques, using wood-burning tools to detail feathers or adopting implements such as dental tools to render decoys so lifelike that they could fool any duck.

(Courtesy of John V. Quarstein)

The art of the decoy is ever-changing. Today’s decoys blend the forms of traditional working birds with minutely detailed realism. Many decoys aren’t intended as hunting tools, yet plenty still are. The craft continues as a connection between man and nature, form and function.

Waterfowl decoys existed for thousands of years before collectors came to appreciate the decoy as a historic art form—one of the oldest forms of American folk art—with a potential for aesthetic value exceeding its functional worth. While many decoys served as simple tools of the bayman’s trade, others became expressions of the birds themselves. In the end, material and style aren’t as important as process and overall effect. When a decoy truly captures a bird in body and spirit, then we call it art.
The Drifter

I’m just an old has-been decoy
No ribbons I have won.
My sides and head are full of shot
From many a blazing gun.
My home has been by the river,
Just drifting with the tide.
No roof have I had for shelter,
No one place where I could abide.
I’ve rocked to winter’s wild fury,
I’ve scorched in the heat of the sun,
I’ve drifted and drifted and drifted,
For tides never cease to run.
I was picked up by some fool collector
Who put me up here on a shelf.
But my place is out on the river,
Where I can drift all by myself.
I want to go back to the shoreline
Where flying clouds hang thick and low,
And get the touch of the rain drops
And the velvety soft touch of the snow.

—Lem Ward, Crisfield, Maryland
From May Issue, Volume II

The Heroic Legacy of Women Who Heeded the Call for Nurses During World War II

Mention “Rosie the Riveter” to anyone who’s familiar with America’s entry into World War II and you’ll likely get a smile.

Bandana-wearing “Rosie” was the star of a ubiquitous poster in a national campaign aimed at recruiting women to fill jobs in America’s industrial plants after male laborers enlisted in the military. The campaign paid off, with hundreds of thousands of women going to work in America’s factories.

But while “Rosie” remains a fictional character that gained widespread publicity, another group of real-life heroines supported the war effort and remains far less known even today.

Before the United States entered World War II, the United States Public Health Service had forecast a shortage of nurses stateside, raising important questions as war loomed: Who would fill the void left by civilian nurses who had joined the military? Who would care for the injured soldiers returning home from battle? And who would tend to sick civilians hospitalized across the country?

The answer was the more than 120,000 women who served in the U.S. Cadet Nurse Corps.

Nurses to the Rescue

The U.S. Cadet Nurse Corps program was passed by Congress and signed into law in 1943 with two goals: improving the quality of training at existing nursing schools, and attracting women aged 17 to 35 who would receive tuition scholarships in exchange for completing 30 months of training and agreeing to work as nurses for as long as the war lasted.

From 1943 to 1948, when the program concluded, nursing schools were transformed as federal funds paid for modernized facilities and newer laboratory equipment. At the same time, program graduates gained valuable training that eased the nursing deficit and carried into the postwar setting, widening the pool of dedicated nurses for years to come.

Armed with newfound classroom training and given opportunities to learn the ropes at hospitals affiliated with the corps, cadets served in military hospitals, VA (Veterans Administration) hospitals, private hospitals, public hospitals, and public health agencies.

In an interview with American Essence, Alexandra Lord, chair of the Division of Medicine and Science at the Smithsonian National Museum of American History, called the program “brilliant in its approach.”

“Nursing schools were not consistent in how they were teaching nurses and providing an education,” Ms. Lord said. By transforming and strengthening existing nursing schools while providing women with free tuition tied to a pledge of future service, a steady source of cadet corps members would be available to work in hospitals across the country.

Lucile Petry Leone, the founding director of the U.S. Cadet Nurse Corps. (Public Domain)

“The U.S. Cadet Nurse Corps has been highly successful,” then-Surgeon General Thomas Parran Jr. testified before the House Committee on Military Affairs back in 1945. “Our best estimates are that students are giving 80 percent of the nursing in their associated hospitals. … The U.S. Cadet Nurse Corps has prevented a collapse of the nursing care in hospitals.”

Many of the cadets hailed from preexisting nursing programs that served Navajo, black, and mixed student bodies, in keeping with the Bolton Act’s mandate that there be no racial or religious discrimination in the program. The edict was a milestone at a time when even the military had not fully embraced integration. Yet discrimination did rear its head, as when hundreds of black nurse corps members were assigned to work in less desirable conditions, at stateside camps holding German prisoners of war.

Tough Conditions

Shirley Wilson, now 99 years old and living in Connecticut, applied to join the Cadet Nurse Corps in 1944. Shortly after her interview, her mother became critically ill with blood clots. A brother had also sustained third-degree burns after falling into a fire. While those incidents delayed her start date with the corps, both deepened her sensitivity to the need for quality medical care. But long hours and pressing demands took their toll. While delivering a glass of water to a hospitalized patient, “he looked at me and he died,” Ms. Wilson told American Essence. Performing hospital duties on the same day as cadet corps classes, Ms. Wilson asked herself, “Am I in the right place?” After completing her service as a cadet, Ms. Wilson joined the U.S. Air Force as a uniformed military nurse, served stateside during the Korean War, and went on to teach cardiac nursing at rural hospitals.

Nurses around the country faced challenges beyond the long hours (workdays could stretch to 12 hours), such as scarce supplies and, in the case of cadets at the University of Washington School of Nursing, a polio epidemic that hit Seattle in the 1950s.

Industrial production in support of the war effort added to the burden: with more factory workers getting injured, the workload of nurse cadets increased. Andrew Kersten, Dean of the College of Arts and Sciences at Cleveland State University, cites Bureau of Labor Statistics data detailing 2 million disabling industrial injuries annually from 1942 to 1945.

Postwar Service

Beatrice Strauss joined the Cadet Nurse Corps in 1947 and worked at the Jewish Hospital of Brooklyn in New York. Like Shirley Wilson, she wasted little time in deciding what to do after the program ended. She attended graduate school, earned a master’s in rehabilitation nursing, and later served as a nursing supervisor for a foster care program with more than 1,200 children in need of help.

She served during the peak of the HIV/AIDS epidemic. “We would get infants into care who had been born to drug-infected women who had ‘some kind of infection,’ and after about three months in care, many of our babies would become seriously ill and die.” Eventually, experimental medications were developed. “It was so gratifying to see that after a while some of our little babies were growing up to be toddlers!” Ms. Strauss wrote in one of many personal profiles posted at www.uscadetnurse.org.

Ms. Strauss’s move into pediatric infectious care reflected one of the program’s many benefits: nurse cadets were exposed to specialty programs as part of their rotations at affiliated hospitals.

“Even in training, we spent three months on every [type of] service in the hospital,” said another corps graduate, Emily Schacht, age 97, who lives with her daughter in Connecticut.

Emily Schacht, who served as a cadet nurse in the Connecticut area. (Courtesy of Eileen DeGaetano)

In recognition of the contributions that nurse cadets made to the war effort, a bipartisan bill honoring women who served in the corps was introduced in Congress in May 2023. The U.S. Cadet Nurse Corps Service Recognition Act, if enacted, would recognize former cadets with “honorary veterans” status, a service medal, burial plaque, and other privileges. It would not provide still-living nurses with pensions, healthcare benefits, or burial at Arlington National Cemetery.

“There was no official discharge. The war ended and they [cadets] just went on. … They are the only uniformed members of the war effort that has yet to be recognized,” said Eileen DeGaetano, Emily Schacht’s daughter and herself a retired nurse.

Rep. Mike Lawler (R-New York), one of the eight original sponsors of the bill that is working its way through both chambers of Congress, stated that the bill would “honor the vital work of cadet nurses during World War II, provide them the honors they are due, and forever enshrine their legacy in the collective memory of our nation.” The bill has been referred to the House and Senate committees on veterans’ affairs.

A Mother–Daughter Bond

Ms. DeGaetano said the influence of her mother’s time in the Cadet Nurse Corps is “woven throughout the fabric” of her own nursing career.

“I learned to work hard and never to compromise the care I was providing by cutting corners. I learned to solve problems and create solutions without optimal conditions or resources. … I found the courage to stand up to the status quo and advocate for my patients,” she said.

“And perhaps, most importantly, I learned to measure the success of my career through the knowledge that my efforts made a difference in someone else’s life.”

From March Issue, Volume IV


Mark Twain’s Dream Abode

The Mark Twain House & Museum resides befittingly in Hartford, Connecticut, in the charming, historic neighborhood of Nook Farm, once a thriving artistic community. The lovingly preserved home of America’s humorist was built in an American Gothic Revival style in 1874 and was home to Twain and his family until 1891. The mansion was intended to make a statement about its owner and his burgeoning literary career, and its whimsy, elegance, and extravagance—from exterior painted bricks, exuberant gables, and tiled roof, to the elaborate interior decorations—did just that in the 19th century. Indeed, the Gilded Age look of Twain’s home, with its layered, maximalist design of furnishings, textiles, and patterns, is once again in vogue.

Mark Twain, born Samuel Clemens (1835–1910), was a man of many talents and many jobs. In his life, he worked as a riverboat pilot, silver prospector, newspaper reporter, adventurer, satirist, lecturer, and author of iconic American books. His years spent in this Hartford home were the happiest and most productive of his life, and he called it “the loveliest home that ever was.” While living here, he wrote his classic novels “The Adventures of Huckleberry Finn,” “The Adventures of Tom Sawyer,” “The Prince and the Pauper,” and “A Connecticut Yankee in King Arthur’s Court.”

Twain by the riverside, photographed by Benjamin J. Falk, circa late 19th century. (Public domain)

Twain and his wife Olivia commissioned the New York architect Edward Tuckerman Potter, a specialist in ecclesiastical design and High Victorian Gothic style, to build their dream home. Twain was then primarily known for his travel writings and a novel that lampooned high society, yet the 25-room house intentionally announced his entrée into that very society. Measuring 11,500 square feet distributed over three floors, it epitomized a modern home of the time, with central heating, gas lighting fixtures, and hot and cold running water. As the building costs were substantial, with the couple spending between $40,000 and $45,000 on the construction, the interiors were initially kept simple.

The house was used for delightful dinner parties, billiard games, charity events, and the raising of three daughters. In 1881, Twain’s growing international fame and success enabled the couple to renovate the home’s interiors in a grand and artistic manner. They engaged the fashionable design firm Louis C. Tiffany & Co., Associated Artists, known for its globally inspired interiors. Like Twain, Louis C. Tiffany was a creative genius and extensive world traveler, and he explored nearly every artistic and decorative medium. He was highly skilled in designing and overseeing his studios’ production of leaded-glass windows and lighting fixtures—for which he is best known today—as well as mosaics, pottery, enamels, jewelry, metalwork, painting, drawing, and interiors. In the same year that the firm embarked on the Twain house project, it was also hired to redecorate the state rooms of the White House. Interestingly, today, it is the Mark Twain House that is considered the most important existing and publicly accessible example of the design firm’s work.

The billiard room served as Twain’s office and study, where he wrote some of his most famous works, including “The Adventures of Tom Sawyer” and “Adventures of Huckleberry Finn.” (Mark Twain House)

The couple signed a $5,000 agreement giving Tiffany and his associate designers carte blanche in implementing a decorating scheme. The design work included the walls and ceilings for the newly expanded front hall, the library, the dining room, the drawing room, and the first-floor guest room, along with the second- and third-floor walls and ceilings visible from the front hall.

Louis C. Tiffany & Co., Associated Artists’ cohesive decoration of the first-floor rooms is inspired by evocative motifs from Morocco, India, Japan, China, and Turkey. The entrance hall, carved with ornamental detail when the house was first built, had its wainscoting stenciled in silver with a starburst pattern and its walls and ceiling painted red with black and silver patterns. In the house’s gaslight, the silver paint would have flickered and given the exotic illusion of mother-of-pearl. The drawing room was given a base color of salmon pink, and Indian-inspired bells and paisley swirls were stenciled in silver. Today, one can still view a large pier glass mirror, a wedding gift to the Twains, as well as the family’s tufted furniture.

Decorated by Louis C. Tiffany & Co., Associated Artists, the house has an entrance hall that is kept dim to mimic gas lighting. The iridescent stenciling, accented by wooden moldings, gives the room the feeling of a Persian palace. (Mark Twain House)

The dining room, used by the family for almost all of its meals whether informal or formal, was covered in a deep burgundy and gold-colored wallpaper in a pattern of Japanese-style flowers. The paper’s pattern was embossed to give the impression of luxurious tooled leather. Its subject is typical of the work of Candace Wheeler, a partner in Associated Artists renowned for her textile and interior designs. Her honeycomb wallpaper enlivens the home’s best guest suite, known as the Mahogany Room.

The dining room wallpaper features heavily embossed paper, simulating the texture and color of tooled leather. (Mark Twain House)

Green and blue were frequent colors used in libraries at the time, and this house’s library is in a peacock blue. Its mantel, a large oak piece purchased from Scotland’s Ayton Castle, is the focal point of the room. Twain used the space to orate excerpts from his latest works, recite poetry, and tell stories to friends and family. In addition, Twain would entertain his daughters with fanciful tales using the decorative items on the mantelpiece as inspiration.

The family’s private rooms were beautifully decorated, too, and have been restored to their former glory by the museum. The nursery has delightful Walter Crane wallpaper that recounts the nursery rhyme “Ye Frog He Would A-Wooing Go” in words and pictures. Crane was an English artist considered to be one of the most influential children’s book illustrators of his generation; he created decorative arts in his distinctive detailed and colorful style. The bedroom of Twain and his wife was dominated by a large bed with elaborately carved angels that they had purchased in Venice. Twain’s only surviving daughter donated the piece to the museum, where it continues to be displayed. The third-floor billiard room is perhaps the most meaningful to fans of Twain’s writings, for it served as his writing office and study. When editing, he would lay out the pages of his manuscript on the billiard table.

The library, which opens up to a conservatory, was the main attraction for visitors. The statue of Eve was sculpted by Karl Gerhardt, a family friend and protégé of Twain. (Mark Twain House)
This Fischer upright piano was given to Mark Twain’s daughters as a Christmas gift in 1880. Known as the school room, this room was the primary space where the three girls were homeschooled from 1880 to 1891. (Mark Twain House)

Financial difficulties resulted in Twain and his family moving to Europe in 1891, and they never again lived in the home or even Hartford. They sold the property in 1903. Tiffany stained glass windows made for the home were sold separately, and their current whereabouts are unknown. The house passed through different owners and was, for a time, a school for boys before being sold to a developer who planned to demolish it and build an apartment building in its place. A campaign was mounted to save the home, and, eventually, it was purchased by a group devoted to preserving its legacy.

The Mark Twain House & Museum, a National Historic Landmark, is considered one of the best historic house museums. Twain wrote in a letter that “our house was not unsentient matter—it had a heart, and a soul, and eyes to see us with. … We were in its confidence, and lived in its grace and in the peace of its benediction.” Fortunately, the home’s meticulous restoration and vibrant educational programming provide a window into its unique history and an opportunity to admire its timeless beauty.

The drawing room was the place for formal hospitality, where Twain’s daughters performed for guests. (Mark Twain House)
Although the Steinway piano and the “Holy Family” painting by Andrea del Sarto were not original to the house, the Clemenses were known to have purchased objects like these on their European trips. (Mark Twain House)

Fun Facts

Mark Twain incorporated autobiographical events in his novels “The Adventures of Tom Sawyer” and “Adventures of Huckleberry Finn.” The character of Finn, the beloved vagrant sidekick to Sawyer, was modeled on a boy he knew from childhood.

In his autobiography, Twain wrote, “In Huckleberry Finn I have drawn Tom Blankenship exactly as he was. He was ignorant, unwashed, insufficiently fed; but he had as good a heart as ever any boy had. His liberties were totally unrestricted. He was the only really independent person—boy or man—in the community, and by consequence he was tranquilly and continuously happy and envied by the rest of us.”

From March Issue, Volume IV


Thomas Jefferson’s Passion for Music Began a Civic Tradition Still Celebrated Today

On March 4, 1801, the 32-member U.S. Marine Band gathered at the Capitol in Washington, D.C., for Thomas Jefferson’s inauguration. Consisting of drummers and fifers, the fledgling organization had been established only three years prior in 1798 to serve as entertainment for governmental functions.

President John Adams was the first to invite the band to the President’s House, then called the Executive Mansion, where it made its debut on January 1, 1801, a couple of months before Jefferson’s inauguration. The federal government’s move from Philadelphia to Washington, D.C., was so fresh that the finishing touches were still being completed on the mansion. The name “White House” was coined in 1811, three years before the British burned the building down; it was subsequently rebuilt.

While President Adams was in office during the U.S. Marine Band’s inception, its biggest cheerleader would come later when Jefferson stepped in as Commander in Chief.

Jefferson’s Favorite Passion

A window silhouette of U.S. President Thomas Jefferson playing the violin for his family in 1805. (Chip Somodevilla/Getty Images)

Jefferson remains one of America’s most successful and historic diplomats. While he had an undying love for his country, he stated that music was the “favorite passion of my soul” in a letter to the Italian economist and natural historian Giovanni Fabbroni.

Jefferson was an accomplished violinist and dedicated much of his childhood to the study of music. In his 20s, his courtship with his future wife Martha Wayles Skelton was spent bonding over music. According to historians at Monticello (Jefferson’s private residence), their shared love of music vividly deepened their affection. The two could often be seen playing music together: as they sang to each other, Martha played her harpsichord and Thomas played his violin. They continued to foster their shared love of music throughout their marriage, which helped keep it strong—even through Jefferson’s most trying times.

Jefferson’s musical tastes were broad. Mozart and Haydn were considered to be two of his favorite composers. But he also had a penchant for Scotch songs, old-time tunes evoking life in the rural expanses of Scotland. And he learned as many Italian works as he could.

‘Godfather’ of the Marine Band

Uniforms for the President’s Own U.S. Marine Band from “U.S. Marine Corps Uniforms 1983,” by Donna J. Neary. (Public Domain)

His love of music influenced the cultural landscape of early 1800s America. Italy and France were considered to be places of musical renaissance, and he wanted to create that type of musical flourishing in the United States. He vowed to bring a renewed sense of life to Washington, D.C., through the expansion of music’s role—particularly the role of classical music and traditional works—for the district’s official events.

Months after the Marine Band’s first performance for President Jefferson at his inauguration, he invited the group to perform at the White House’s first official Independence Day celebration. Set up in a room near the party’s festivities in early July, the band played a variety of classical music and entertaining pieces. The party’s attendance soon grew to 100, and the attendees danced and marveled at the band’s prowess. One guest, Samuel Harrison Smith, later wrote to his sister saying the music the band played would have inspired her “patriotic heart” with “delight.”

The United States Marine Band soon became a regular at Jefferson’s events. He was so attached to the group of talented young musicians that he nicknamed them “The President’s Own,” after an English tradition of styling artists who worked in an official capacity as the King’s or Queen’s “Own.” The sign of respect was returned when Jefferson was later nicknamed the “godfather” of the United States Marine Band for his eager support of the unique military endeavor.

A Fighting Spirit

President Lincoln was especially fond of the Marine Band’s performances in the White House and its weekly concerts on the grounds. An illustrated plate of the U.S. Marine Band playing on July 4, from an 1861 issue of Harper’s Weekly. Internet Archive. (Public Domain)

Since that first performance at President Jefferson’s inauguration, the U.S. Marine Band has performed at every president’s inaugural ceremony, one of America’s longest-standing civic traditions.

“The President’s Own” began with 32 drummers and fifers, but with Jefferson’s support, the band grew along with its role as official entertainer of the White House. Today, the band boasts over 160 members and performs at around 700 events annually.

With one of the most scrupulous audition processes and a legacy built around the “fighting spirit” of the earliest Marine Band members, the premier group is made up of the nation’s most talented and respected musicians. The long-standing organization’s dedication lies in upholding the country’s founding ideals and principles through the performance of traditional works. The United States Marine Band remains the oldest professional music organization in America.

John Williams conducting “The President’s Own” U.S. Marine Band for its 225th anniversary concert at The Kennedy Center for the Performing Arts on July 16, 2023. (U.S. Marine Band)

From March Issue, Volume IV

Categories
History

The Surprising Origins of the American Christmas Tree

It took some doing, but the Christmas holiday finally became an American tradition. Long before the 13 Colonies and the War for Independence, our forefathers brought forth upon this continent a rather strict view of the celebration. It was not to be celebrated. Despite Christmas being adopted by Christianity 1,200 years before the Separatists landed at Plymouth Rock, the sect believed the tradition was too intertwined with pagan rituals. After three decades in the New World, the court of the Massachusetts Bay Colony declared that “whosoever shall be found observing any such day as Christmas or the like, either by forbearing of labor, feasting, or any other way, upon such accounts as aforesaid, every such person so offending shall pay for every such offense five shillings, as a fine to the country.”

The law was instituted in 1659, and though it eventually went by the wayside, even after the War for Independence ended in 1783 Americans remained rather dismissive of the holiday and its traditions for religious reasons. The traditions were typically traced back to ancient Rome’s Saturnalia festival, which marked the end of the planting season and the approach of the winter solstice on December 25 (the darkest evening of the year on the Julian calendar). The festival, originally held for a single day, grew to last a week (December 17–23). Gift-giving, feasting, and general merry-making were part and parcel of the holiday. Another staple was the use of evergreen plants, including wreaths and trees. The evergreen was seen as a symbol that the sun, and therefore spring, would return.

William Bradford, second governor of Plymouth Plantation, considered it all “pagan mockery,” but the early settlers’ puritanical influence slowly began to dissipate as America moved into the 18th and 19th centuries.

A colored lithograph titled “Under the Christmas Tree,” by Max Seeger after the watercolor painting by R. Beyschlag, circa 1892. (Grafissimo/DigitalVision Vectors/Getty Images)

German Influence

Just as the Separatists and the Puritans came to the New World to escape religious persecution and find religious freedom, so did many Germans. Before the Peace of Westphalia was signed in 1648, ending the religious conflict between Catholics and Protestants known as the Thirty Years’ War, many Germans had already fled across the Atlantic Ocean. Early in the 18th century, thousands more Germans came to America, where they settled in what is now Albany, New York. When tens of thousands of Germans flooded onto America’s eastern shores during what became known as the Rhine Exodus of 1816–17, they brought with them their German traditions, one of which was the Christmas tree.

Several thousand Germans immigrated to America’s eastern shore during the Rhine Exodus of 1816–1817, many of whom settled in Albany, N.Y. “Albany, New York” by Pavel Petrovich Svinin, 1811–circa 1813. Watercolor on off-white wove paper. Rogers Fund, 1942; The Metropolitan Museum of Art, New York. (Public Domain)

The display of Christmas trees in Germany stretches back to the eighth century—and arguably the nation’s most famous citizen, Martin Luther, was the first to place lights (candles) in his tree during the sacred holiday. His inspiration came from standing in the thick German forest and peering into the twinkling night sky.

In America, Charles Follen, a German exile who became a Harvard professor and then a minister, introduced the Christmas tree to his New England peers in 1832, a moment recounted by the prominent British author Harriet Martineau. Four years later, Hermann Bokum, a German immigrant who would become an author and a chaplain in the Union Army, wrote “The Stranger’s Gift: A Christmas and New Year’s Present,” which provided an illustration in its opening pages of a Christmas tree. This tradition of cutting down, housing, and decorating evergreen trees was continued by German Americans, but it had hardly caught on among the rest of the populace.

Martin Luther, German priest and theologian, was the first to place candles in his tree during Christmas. A portrait of Martin Luther from the workshop of Lucas Cranach the Elder, circa 1532. Oil on wood. Gift of Robert Lehman, 1955; The Metropolitan Museum of Art, New York. (Public Domain)

Christmas Illustrations

It was not until December 1848, and from a rather unlikely source, that the Christmas tree began its meteoric rise to becoming an American tradition. It happened when the editor of the influential magazine Godey’s Lady’s Book, Sarah Josepha Hale, came across an illustration in the Christmas edition of the Illustrated London News. The illustration was of the Royal Family―Queen Victoria, German-born Prince Albert, the royal children, and their grandmother―standing around a decorated Christmas tree. It was entitled “Christmas Tree at Windsor Castle.”

Hale decided to have the illustration recreated for Godey’s Lady’s Book, a women’s magazine, in 1850―but with a few visual edits, such as the removal of Victoria’s tiara and Albert’s sash and mustache, to make it appear more American. This editorial decision caused the Christmas tree to go mainstream.

The country’s first National Christmas Tree was erected in 1923 near the White House. Photograph titled “Community Christmas Tree” on Dec. 24, 1923. Library of Congress. (Public Domain)

In the following holiday seasons, other prominent magazines began illustrating this new American tradition. On December 25, 1858, Harper’s Weekly published a Christmas-themed piece that included illustrations by Winslow Homer entitled “Christmas―Gathering Evergreens” and “The Christmas Tree.” The article recalled, “Time was when it was unlawful to keep Christmas in New England. A penal enactment, we are told, actually forbade the pilgrims and their children from keeping Christmas.” The article trumpeted, “Nowhere, perhaps, in the world is Christmas so heartily enjoyed as in New York.” (Harper’s Weekly was a New York-based publication.)

In 1923, exactly a century ago, the country’s first National Christmas Tree was erected on the Ellipse, a park near the White House. The lighting ceremony, led by President Calvin Coolidge, has been conducted every year since. Tree lighting ceremonies are hardly confined to the nation’s capital. Every year, large cities and small towns, along with approximately 100 million households, conduct what has now become an American holiday tradition.

From Dec. Issue, Volume 3

Categories
Arts & Letters American Artists Features History

Gary Cooper’s Daughter Shares Uplifting Lessons From Her Dad

Gary Cooper is synonymous with the Golden Age of Hollywood. He was one of its most successful box office draws. He was nominated five times for the Best Actor Oscar and won twice for “Sergeant York” and “High Noon.” Handsome, strong, and with an honest stare, Cooper became the country’s model of masculinity, integrity, and courage.

His roles were varied. They ranged from military heroes, like Alvin York, the most decorated U.S. soldier in World War I, and Billy Mitchell, considered the Father of the U.S. Air Force; to a Quaker father in “Friendly Persuasion”; the tragic baseball player Lou Gehrig in “The Pride of the Yankees”; and a tamer of the Old West, none better known than the fictional Marshal Will Kane in “High Noon.”

Maria Cooper Janis, the daughter and only child of Cooper and Veronica Balfe, recalled her father saying that he wanted to try to portray the best an American man could be. These dignified and masculine roles surely captured the ideal, but they also captured something else. Janis said the man that millions of moviegoers saw, and still see today, was, in so many ways, playing himself.

Gary Cooper waits on set. (Courtesy of Maria Cooper Janis)
(L to R) Actors Clark Gable, Van Heflin, Gary Cooper, and James Stewart enjoy a laugh during a New Year’s party held at Romanoff’s restaurant in Beverly Hills, Calif. (Slim Aarons Estate/Getty Images)

Rugged and Sophisticated

From the rough-and-tumble Western stereotypes to the sophisticated man-about-town, he was “as comfortable in blue jeans as he was in white ties and tails,” she said.

There is a famous photo called “The Kings of Hollywood” of Cooper standing alongside Jimmy Stewart, Clark Gable, and Van Heflin in their white ties and tails, cocktails in hand, having a laugh. It is the elegant and sophisticated version of Cooper—the quintessential image of Hollywood’s leading man. Indeed, Cooper was one of the kings for several decades.

But he was also an everyman. Cooper grew up in early 1900s Montana. He was born in Helena just a few years after it was named the state’s capital. It was a rich town despite being part of the recently settled West. It was an environment―both rugged and luxurious―that Cooper would go on to personify.

The Cooper family enjoying a romp in the snow. (Courtesy of Maria Cooper Janis)

Janis said her father’s first friends were the local Native Americans. They taught him how to stalk and hunt animals and perform his own taxidermy. His friendships helped him understand the plight of the Indians. His father, Charles Cooper, a justice on the Montana Supreme Court, had long been concerned about the Native Americans.

“My grandfather was always working for the underdog,” she said. “My father must have heard a lot of those stories. [My father] always felt he should defend those who needed defending, especially those who didn’t have the clout or standing to win.”

Cooper and the cast on the set of “High Noon.” (Courtesy of Maria Cooper Janis)

The Defender

Cooper found himself defending others on film and in real life, and sometimes those two mixed. Although he stated before Congress that he was “not very sympathetic to communism,” he was sympathetic to those in Hollywood―actors, writers, and directors―who were targeted by the Hollywood blacklist movement. One of those with whom he was sympathetic was Carl Foreman, who had written the script for “High Noon” and had refused to cooperate with the House Un-American Activities Committee. After “High Noon,” Foreman left for England, where he would write “The Bridge on the River Kwai.”

“My father was actually very close to Carl Foreman,” Janis said. “My father told Stanley Kramer [the producer], ‘If Foreman’s off the picture, then Cooper is off the picture.’” Foreman remained, and Cooper performed one of his most definitive roles as a marshal who stands against a criminal gang in a town where everyone is too afraid to help. “High Noon” is believed to be a representation of the Hollywood blacklist era―a belief that Janis holds as well.

“My father passionately believed you were free to believe what you wanted to believe,” she said. “He was threatened that he would never work in Hollywood again. But he knew what he believed and he lived his life.”

Cooper in the ring with a bull in Pamplona, Spain. (Courtesy of Maria Cooper Janis)

Lessons From Cooper

Cooper kept working in Hollywood for nearly a decade more until his tragic death from cancer. But Janis wants people to know that there was so much more to her father than his time on the big screen. It is one of the reasons she wrote her book “Gary Cooper Off Camera: A Daughter Remembers,” which focuses on his family life.

“We had a very close family bond,” she said. “If you have loving parents who show you discipline, that’s a leg up in life. I think the importance of a loving, strong father figure for a girl is excruciatingly important.”

Her mother and father were both a source of encouragement. Despite growing up the daughter of Gary Cooper, she never felt pressured to go into acting.

“He basically left it up to me. He and my mother were very realistic. I came to my own conclusions about what I wanted in my life,” she said.

She studied art at the prestigious Chouinard Art Institute in Los Angeles and began a successful career as a painter. She said that being an artist was apparently in her DNA, as her father, her grandmother Veronica Gibbons, and her great-uncle Cedric Gibbons, who designed the Oscar statuette, were gifted artists.

Family time at Cooper’s Brentwood, Calif., residence. (Courtesy of Maria Cooper Janis)

Cooper―at home and on-screen―had given his daughter the proper perspective of what she should look for in a husband. He had thoroughly educated her on the fact that there were some men who “don’t act very gentlemanly.” So he taught her boxing and self-defense.

“He told me, ‘Don’t let any man intimidate you. You are going to be a beautiful woman. Stand up for yourself,’” she recalled. “It was enough to give me a sense of confidence.”

When her father died in 1961, she continued her career in art and retained that confidence. In 1966, she married another artist, Byron Janis, one of the world’s greatest classical pianists. She said marrying Janis was “the greatest fortune that could have ever happened to me.” The two celebrated their 57th wedding anniversary this April.

Cooper and little Maria at the Grand Canyon. (Courtesy of Maria Cooper Janis)

In Cooper’s Memory

Although Cooper has been dead for more than 60 years, his legacy remains. That legacy has been entrusted to his daughter’s care. She has worked to champion her father’s causes as well as his name.

Janis established a scholarship at the University of Southern California in Cooper’s name for Native American students who wish to pursue an education in film and television. She also advocates for continuing research into the terminal illness of amyotrophic lateral sclerosis (ALS), famously known as Lou Gehrig’s disease.

Along with her book, she collaborated with Bruce Boyer on his book “Gary Cooper: Enduring Style” and contributed to the documentary “The True Gen,” about Cooper’s friendship with Ernest Hemingway. She also established the official Gary Cooper website dedicated to his memory.

Janis said she has come to understand her past, and that of her father, better over the years, quoting the Danish philosopher and theologian Søren Kierkegaard: “Life can only be understood backwards; but it must be lived forwards.” In a broader sense, her efforts are to ensure Gary Cooper will be better understood by all as the years go by.

The family loved making music together. (Courtesy of Maria Cooper Janis)

From Aug. Issue, Volume 3

Categories
Arts & Letters American Artists Features History

How John Wayne Became the Face of America—On-Screen and Off

Rarely is a man remembered for who he was when he was so overshadowed by what he did. In the case of John Wayne, however, who he was and what he did were one and the same.

John Wayne, born Marion Robert Morrison on May 26, 1907, in the very small Iowa town of Winterset, became one of the, if not the, most iconic actors of the 20th century. At 13 pounds, he was born to become a large man, destined for grand entrances and memorable exits. He was the eldest child of the Morrisons, a marriage marked by struggles, insults, and uncertainties. The family was poor and moved a lot, eventually landing in California in 1914.

Out in the farmlands and small towns of his upbringing, he learned how to handle guns, having to protect his father from rattlesnakes while working untamed land. He learned to ride horses. He perfected his reading by going through the Sears catalogs cover to cover, noting each item he wished he could afford. He learned the value of hard work even when it wasn’t profitable, something his father consistently experienced and was reminded of just as often by his mother. He honored both his parents, but he loved his father.

Studio portrait of American actor John Wayne wearing his signature cowboy hat and neckerchief, circa 1955. (Hulton Archive/Stringer/Archive Photos/Getty Images)

Wayne grew up strong and tall, suitable for an athletic career. His athleticism landed him a football scholarship to the University of Southern California in the fall of 1925. While attending school, he worked on movie sets as a prop man and was at times a film extra, typically a football player. During this time, he met the already famous and successful film director John Ford. While bodysurfing on the California coast, Wayne injured his shoulder and lost his scholarship. His football playing days were over, but he was still tall, dark, and handsome, and he decided to join the “swing gang” at Fox Film Corporation moving props.

His relationship with Ford blossomed. The two were opposite in nearly every way, but they attracted, as opposites sometimes do. Ford and Wayne developed a kind of father-son relationship, as Wayne would often call Ford “Coach” and “Pappy.” Ford would be credited with giving Wayne his big break—twice.

A Break and a Name

Ford introduced the young actor to director Raoul Walsh, who decided to have him star in his 1930 epic Western “The Big Trail.” The film was a flop at the box office, though in defense of the film, the Great Depression had just begun. During the filming, however, the studio executives decided that “Marion” was not much of a name for a leading man. Anthony Wayne, after the Revolutionary War general, was considered. Anthony didn’t work either. One of the executives suggested John. When the film was released, his new name was on the posters. Much like his nickname “Duke” was given him by local firemen, his new name, bestowed upon him by others, stuck throughout his life.

A new name and a starring role, however, would hardly change his film career. Throughout the 1930s, Wayne was relegated to B Westerns. As he ascended from his 20s into his 30s, he used his time wisely to perfect his on-screen persona―a persona that he assimilated off-screen as well. His choice of wardrobe, his walk, his fighting style were all tailored for himself by himself. The Duke was an icon in the making, and the making was all his creation. He just needed a true opportunity to showcase it.

American actor John Wayne as a young boy, sitting against a fence on the prairies with his younger brother Robert. (Photo by Hulton Archive/Getty Images)

A Memorable Entrance

That opportunity arrived in 1939 when Ford chose Wayne to star in his Western film “Stagecoach.” The director had always been a believer in Wayne. The young actor had proven to be a hard worker, receptive to directorial guidance, and willing to do many of his own stunts. Along with that, he was 6 feet, 3 inches tall, with a broad-shouldered frame, blue eyes that showed gray on the silver screen, and a strong nose and jawline. His acting also came across as honest, as if he were speaking directly to each person in the audience. There was a magnetic pull with Wayne, and Ford decided to do all he could in his film to draw viewers to him.

Wayne was a familiar name and face for moviegoers, having already appeared in 80 films by this time. Familiar, yes. A star, no.

The 1939 film revolves around seven passengers traveling from one town to the next while trying to avoid the inevitable Indian attack. Nearly 85 years removed, “Stagecoach” remains one of the great Westerns. The movie did more than tell a great story. It did something more important. It introduced the world to John Wayne. Eighteen minutes go by before Wayne makes his entrance in the film, an entrance crafted specifically to introduce a soon-to-be American icon.

In a wide shot, the stagecoach rides up a slight incline when suddenly there is a gunshot. The stagecoach comes to an abrupt halt. Starting with what is known as a cowboy shot (pioneered by Ford and also known as the American shot), the camera moves in for a close-up of Wayne, who twirls his Winchester rifle. The shot starts in focus, slightly goes out of focus as it moves toward the actor, and then finishes in focus. The actor stands majestically wearing a cowboy hat and neckerchief, which would soon become synonymous with Wayne. The shot was out of place not just for the film, but also for Ford. But it was intentional for reasons explained by Scott Eyman in his biography “John Wayne: The Life and Legend.”

“This is less an expertly choreographed entrance for an actor than it is the annunciation of a star.”

“Stagecoach” was Wayne’s big break in Hollywood, making him one of America’s leading actors and, soon, a star. Theatrical poster for the 1939 American release of “Stagecoach.” (Public domain)

America’s Leading Man

From this point on, Wayne would embrace his role as America’s leading man. There were other actors, of course, during his rise. Some on the decline, like Clark Gable and Gary Cooper. Some on the rise, like Cary Grant and Jimmy Stewart. Their greatness in their own ways cannot be diminished. Gable with his force of nature persona. Cooper as an embodiment of honesty and kindness. Grant as the romantic symbol of the 20th century. And Stewart, a personified symbol of truth. But Wayne embodied something else, and yet he was all of these things. He became the face of the country.

Authors Randy Roberts and James Olson both noted that Wayne became America’s “alter-ego.” Wayne hoisted that alter-ego upon his cinematic shoulders, which proved more than capable of bearing the load. The Duke chose films that promoted and often propagandized America’s greatness. His primary film genres were war films and Westerns.

When America entered World War II after the Pearl Harbor attack, Wayne was closing in on 35 years of age and already had four children. Film stars, like Stewart and Gable, along with directors, like Ford, joined the war effort overseas. In 1943, Wayne applied to join the Office of Strategic Services, the precursor to the Central Intelligence Agency, but the spots were filled. In 1944, when there was a fear of a shortage of men, Wayne’s status was changed to 1A (draft eligible), but Republic Pictures filed for a 3A deferment for Wayne, which kept him in front of the camera. Ultimately, Wayne believed, arguably correctly, that his impact as an actor (or more cynically a propagandist) would be far greater than as a soldier.

“I felt it would be a waste of time to spend two years picking up cigarette butts. I thought I could do more for the war effort by staying in Hollywood,” he told John Ford’s son, Dan.

Cinematographer Bert Glennon (L) and director John Ford on the set of “Stagecoach” in 1939. (Public domain)

For all intents and purposes, Wayne, who would have been classified as a private, would have most likely remained behind the scenes doing busy work or promotional bits for the military. Though he would never be a military hero, Wayne proved more than patriotic. As Eyman wrote regarding the type of roles Wayne chose to perform, “His characters’ taste for the fulfillment of an American imperative was usually based on patriotic conviction, rarely for economic opportunity.”

Between the span of America’s entry into the war and the end of its occupation in Japan (1952), Wayne starred in eight World War II films. He would also join the United Service Organizations (USO) overseas, where he entertained the troops and helped boost morale.

A Conservative Stalwart

Throughout his career as America’s leading man, he never shied away from making his conservative views known, and he never wavered from opposing liberal viewpoints. He and Paul Newman, a known Hollywood liberal, regularly talked politics and shared books that discussed their differing political perspectives. Wayne’s 1974 visit to Harvard University, where he risked being disparaged by the student body, resulted in both sides walking away with mutual respect.

Wayne knew what was to be expected, especially with the anti-war movement on campuses. He took verbal barbs and responded in his typical fun-loving yet pointed manner. At one point, he told the young audience: “Good thing you weren’t here 200 years ago or the tea would’ve never made the harbor.” The comment was greeted with cheers rather than boos.

As the New York Post columnist Phil Mushnick wrote, concerning the outcome of the Harvard visit, “There were many who found themselves actually—and incredibly—liking John Wayne. They still disliked his politics, of course, but was he any different from many of their parents?”

Wayne reads a “Prince Valiant” comic with his four children, 1942. (Hulton Archive/Stringer/Archive Photos/Getty Images)

Eyman pointed out the actor’s quasi-familial influence on the American homeland. Wayne’s growth on-screen and off-screen proved to be near equal in its cultural weight. He reminded “people of their brother or son, he gradually assumed a role as everyone’s father, then, inevitably, as age and weight congealed, everyone’s grandfather.”

On June 11, 1979, America’s grandfather passed away from stomach cancer. He had beaten cancer once before, at the cost of a lung and some ribs. His final film, “The Shootist,” is about an old gunfighter dying of cancer. Though he had another film lined up, his death so soon after that final role is, tragically, more fitting than ironic.

Wayne was America’s cowboy. He was the war effort on film. He worked to root out communists in Hollywood. He was a man who believed in patriotism when many Americans tried to give that a bad name. John Wayne was, and, according to polls, still is, part of the American family. When Wayne was being considered for the Congressional Gold Medal in May 1979, the stars came out in support. Elizabeth Taylor told Congress, “He has given much to America. And he has given to the whole world what an American is supposed to be like.”

He was awarded the medal a month after his passing. In 1980, President Jimmy Carter posthumously awarded him the Presidential Medal of Freedom. Of all the attributes one could give John Wayne, the one recommended to Congress by his five-time co-star Maureen O’Hara seems to be the most appropriate.

“I feel the medal should say just one thing,” O’Hara tearfully said. “John Wayne: American.”

Wayne stars as Robert Marmaduke Hightower in the 1948 Western “3 Godfathers.” (Hulton Archive/Stringer/Archive Photos/Getty Images)

From April Issue, Volume 3

Categories
Features History

The Cherished Inheritance of the Adams Family Lineage: Education

If you ask what education means to people, most will think “school.” If they are jaded, “debt.” But for the first great American family, it was much more than this.

In his autobiography, “The Education of Henry Adams,” the author describes growing up within a celebrated lineage that, by his lifetime, had become a cultural institution. During his childhood, Henry wrote, he would often move between the Boston home of his father Charles Francis Adams, Lincoln’s future ambassador to England during the Civil War, and the home of his grandfather John Quincy Adams, where he played in the former president’s library. Sitting at his writing table as “a boy of ten or twelve,” he proofread the collected works of his great-grandfather John Adams that his father was preparing for publication. While practicing Latin grammar, he would listen to distinguished gentlemen, who represented “types of the past,” discuss politics. His education, he reflected, was “an eighteenth-century inheritance” that was “colonial” in atmosphere. While he always revered his forebears and felt they were right about everything, he observed that this learning style did not sufficiently prepare him “for his own time”—a modern age that was increasingly defined by technology, commerce, and empire.

Henry Adams is today considered one of America’s greatest historians. Given this, one would probably conclude that his education served him exceedingly well, even if he hoped to produce history rather than merely record it. The substance of his educational ideals was, when stripped of their luxurious trappings, very similar to that of our second and sixth presidents. Although this was precisely the problem for a young man growing up in a new industrial epoch, there is much to admire about this cultivated reverence for tradition. Values, unlike skill sets, do not become obsolete.

Graduation photo of Henry B. Adams from the Harvard College Class of 1858. Massachusetts Historical Society, Boston. (public domain)

A Father Teaches His Son

The wealth and privilege Henry Adams experienced was far removed from the boyhood circumstances of his most famous forefather three generations previously. John Adams was born in a simple farmhouse where the family’s only valuable possessions were three silver spoons. The key to his rise was education. Not only of the formal kind, but of character. John took inspiration from his forebears, “a line of virtuous, independent New England farmers.” When he complained of losing interest in his studies due to a churlish teacher at his schoolhouse, his father, a deacon, enrolled him in a private school. Later, the deacon sold 10 acres of land to pay for his son’s college fund.

John admired his father, striving to embody the qualities of sincerity and patriotism he instilled. He called the deacon “the honestest man” he ever knew and passed on these ideals to his own son, John Quincy Adams. While John was in Philadelphia attending the Continental Congress, he instructed young “Johnny” through letters. Writing to Abigail on June 29, 1777, he said, “Let him be sure that he possesses the great virtue of temperance, justice, magnanimity, honor, and generosity, and with these added to his parts, he cannot fail to become a wise and great man.”

In letters to John Quincy during this same year, John advised his son to acquire “a taste for literature and a turn for business” that would allow for both subsistence and entertainment. Reading the historian Thucydides, preferably in the original Greek, would provide him with “the most solid instruction … to act on the stage of life,” whether that part be orator, statesman, or general. While John was away, Abigail constantly upheld her husband to John Quincy as an example of professional achievement and courage. She encouraged him to study the books in his father’s library and forbade him from being in “the company of rude children.”

For the Adamses, books were not just the means to a career, but a key to unlocking the sum of a person’s life. Education encompassed experience, conduct, and social ties. Like his grandson Henry, young John Quincy was sometimes unsure whether he would be able to live up to his ancestors’ example.

John Adams, second president of the United States from 1797–1801. Official presidential portrait of John Adams by John Trumbull, circa 1792–1793. Oil on canvas. White House Collection, Washington, D.C. (public domain)

A Family Heritage

John instructed John Quincy more directly when taking him along on diplomatic missions in Europe. In Paris, John Quincy began keeping a daily journal at his father’s request, recording “objects that I see and characters that I converse with.” John Quincy observed his father staying up at all hours to assemble diplomatic reports and would later emulate this diligent work ethic.

He then accompanied John to Holland. At the age of 13, he “scored his first diplomatic triumph,” according to biographer Harlow Unger. The precocious young student, dazzling professors with his erudition at the University of Leiden, caught the eye of an important scholar and lawyer named Jean Luzac. John Quincy introduced Luzac to his father, then struggling to convince the Dutch government to give America financial assistance in its costly war with Britain. Luzac was impressed with the Adams family, advocated their cause of independence, and succeeded in securing crucial loans for the desperate young nation.

John Quincy Adams, sixth president of the United States from 1825–1829. Official presidential portrait by George Peter Alexander Healy, 1858. Oil on canvas. White House Collection, Washington, D.C. (public domain)

During this time, John Adams encouraged his son to continue studying the great historians of antiquity: “In Company with Sallust, Cicero, Tacitus and Livy, you will learn Wisdom and Virtue.” He closed his letter by emphasizing the importance of the heart’s authority over the mind: “The end of study is to make you a good man and a useful citizen. This will ever be the sum total of the advice of your affectionate Father.”

John Quincy, ever the obedient son, attended to both the wisdom of the distant past and his family heritage that enshrined it. While following in John’s footsteps as a diplomat, and later president, he would pass these values on to his own children.

The success, achievement, and public legacy of the Adams family have everything to do with this conception of education as a living inheritance. Writing over a century later, Henry Adams saw learning as a lifelong endeavor that was difficult to justify by any specific practical or monetary measurement. But, he added, “the practical value of the universe has never been stated in dollars.”


From March Issue, Volume 3

Categories
Features Founding Fathers History

James Madison’s Essays Became the Foundation for Separating Church and State

Among the constitutional amendments, the First is the most sacred. Its guarantees of the freedoms of religion, speech, press, assembly, and the right to petition have made American shores a beacon for the world. The quiet and bookish man who first proposed it spent many years reflecting on its related issues in solitude—an uncommon pastime for a politician. The First Amendment has become so fundamental to the way Americans think about themselves as social creatures that it is easy to forget the skepticism, and even outrage, that it caused in its day.

A Scholar Enters Politics

James Madison was a shrewd student of political history. Of his many thoughts on government, though, one concern was foremost. In “The Three Lives of James Madison: Genius, Partisan, President,” Noah Feldman observes: “The subject that most animated James Madison was the freedom of religion and the question of its official establishment.” He developed an academic interest in the topic at Princeton under the Rev. John Witherspoon, who filled him with ideas of religious liberty inspired by the Scottish Enlightenment.

After graduating, Madison witnessed the persecution of religious dissenters in his native Virginia, where the Anglican Church was the established religion. In a letter to a friend dated January 24, 1774, Madison described traveling to a nearby county and encountering imprisoned Baptist ministers, “5 or 6 well-meaning men” who did nothing more than publish their orthodox views. That April, he wrote: “Religious bondage shackles and debilitates the mind and unfits it for every noble enterprise, every expanded prospect.”

“James Madison” by John Vanderlyn, 1816. Oil on canvas. The White House, Washington, D.C. (Public domain)

Madison entered local politics and attended Virginia’s constitutional convention in 1776. There, George Mason submitted his draft for a Declaration of Rights, which included a clause stating that “all Men should enjoy the fullest Toleration in the Exercise of Religion, according to the Dictates of Conscience.” Madison was not satisfied. He understood that a majority, in granting a minority permission to practice religion, could take it away as well. Going beyond John Locke’s idea of toleration, Madison successfully proposed changing the wording to reflect the right of “free exercise of religion.”

This guarantee ended the Anglican Church’s spiritual monopoly in Virginia. Eight years later, though, Patrick Henry spearheaded legislation to levy religious taxes. Madison opposed Henry but knew that he himself was too soft-spoken to match the eloquent orator. He responded by writing a petition, “Memorial and Remonstrance against Religious Assessments.” Religious belief, he argued, “must be left to the conviction and conscience of every man; and it is the right of every man to exercise it as these may dictate.” Belief could not be coerced and must exist in a separate sphere from civil government. Even a small tax could become oppressive.

Madison’s essay became foundational for the idea of the separation of church and state. The petition garnered enough signatures to defeat the proposed bill, and in 1786 the Virginia Statute for Religious Freedom was passed.

The Bill of Rights

By 1789, Madison had designed the Constitution and convinced most of the states to ratify it by authoring 29 of the essays that make up “The Federalist Papers.” But the groundbreaking document was not safe. North Carolina and Rhode Island still had not ratified it. Opponents who favored states’ rights over federal power wanted to hold a second constitutional convention to undo the new government.

To prevent this, Madison drafted amendments that would address the Constitution’s flaws. He submitted his draft to Congress on June 8, proposing protections of individual liberties without changing the government’s structure. He sought to encapsulate, among these, his years of religious reflection. The religion clause—originally the fourth proposed amendment rather than the first—was more descriptive than its final version:

“The civil rights of none shall be abridged on account of religious belief or worship, nor shall any national religion be established, nor shall the full and equal rights of conscience be in any manner, or on any pretext infringed.”

This proposal had three aspects: guaranteeing equal treatment of minority views, barring Congress from establishing a national church, and establishing conscience as a right free from coercion.

The Bill of Rights includes the first 10 amendments to the U.S. Constitution. (Jack R Perry Photography)

Madison struggled to get his amendments passed. Federalists ridiculed them as useless “milk and water.” Anti-Federalists unanimously opposed him. His old nemesis Patrick Henry called for a total revision of the Constitution, claiming that a national bill of rights would not sufficiently guard the rights of individuals or states. An anonymous author, writing under the pen name “Pacificus,” asserted in a New York newspaper that Madison’s “paper declarations” were “trifling things and no real security to liberty.”

Madison defended his bill, arguing it would limit the tyranny of the majority and “establish the public opinion” in favor of rights. Federalist support began to trickle in. Madison wanted to fold the amendments into the Constitution itself, but he settled for appending them at the end. Representatives eliminated some of his proposals and altered others.

The final version of the First Amendment’s clause on religious liberty came to read: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.” This slightly more restrictive version omitted Madison’s phrasing on the “rights of conscience,” but it is otherwise consistent with his intentions. Madison’s achievement made him the world’s foremost champion of religious liberty. His recognition of free exercise, rather than mere toleration, has been a model for other governments around the globe.

From February Issue, Volume 3

Categories
Features History

Henry David Thoreau, a Man Who Took Simplicity to Heart

“The earth is not a mere fragment of dead history, stratum upon stratum like the leaves of a book, to be studied by geologists and antiquaries chiefly, but living poetry like the leaves of a tree.” This living poetry was what led to Henry David Thoreau’s philosophy for life.

Thoreau is considered by most to be one of America’s great 19th-century writers, but it would be nearly impossible to read his work without also thinking of him as one of its great 19th-century philosophers. Of his many works, none captures his philosophy as well as his “Walden; or, Life in the Woods.”

Thoreau didn’t simply espouse his philosophy. He lived it.

‘I Wished to Live Deliberately’

Thoreau lived in the northeast part of the country during the early to middle part of the 1800s. His home was in Concord, Massachusetts, but his abode was of his own making—at least, for a period of time. The writer and naturalist explained: “I went to the woods because I wished to live deliberately, to front only the essential facts of life, and see if I could not learn what it had to teach, and not, when I came to die, discover that I had not lived. … I wanted to live deep and suck out all the marrow of life.”

He believed there was only one way to accomplish this, and that was to venture far from the people of his town and live at Walden, among the woodland hills that surrounded a large pond. Though beautiful, the scenery, he noted, was of “a humble scale.” Interestingly, Walden, with its simultaneous beauty and humility, was a reflection of Thoreau’s philosophy. His idea was to live humbly within the beauty that nature presented: the wildlife, the change of seasons, the hardships, the solitude.

His philosophy was an exercise in self-reliance that focused on the three essential elements of food, water, and shelter. Fresh water was readily available from the deep well of the pond—except, of course, in winter, when the frozen surface required him to take up axe and pail and go in search of water. His food was provided either by nature or by his own efforts—gardening, fishing, picking wild berries, hunting small animals—supplemented by infrequent visits to town.

Interior of the replica of Thoreau’s cabin at Walden Pond. (Tom Stohlman (CC BY 2.0, CreativeCommons.org/licenses/by/2.0))

His shelter took more time, but the woods supplied him with lumber. He began chopping down trees at the end of March 1845, and by July 4 he had occupied a “palace” of his own—furnished with table and chairs, flooring, and a fireplace, all of his own making—surrounded by what he viewed as ever-changing, transcendent artwork.

“When I see on the one side the inert bank—for the sun acts on one side first—and on the other this luxuriant foliage, the creation of an hour, I am affected as if in a peculiar sense I stood in the laboratory of the Artist who made the world and me—had come to where he was still at work, sporting on this bank, and with excess of energy strewing his fresh designs about.”

Poverty as Wealth

There was little doubt that the naturalist writer was different. His transcendentalism, a movement that originated in New England, caused him to stand out. More than that, however, Thoreau desired to be a man apart. Despite his beliefs about humanity and nature, he, much like most anyone, had questions that only a trial could answer. What could he endure? What were the necessities of life? What was poverty and what was wealth? What was true philosophy and what was true economy?

Thoreau wanted to live well, although not in an economic sense. His view of wealth was concentrated on necessity and simplicity, and even morality.

“Give me the poverty that enjoys true wealth,” he boldly stated. “No man ever stood the lower in my estimation for having a patch in his clothes; yet I am sure that there is greater anxiety, commonly, to have fashionable, or at least clean and unpatched clothes, than to have a sound conscience.”

He witnessed anxiety in his neighbor (neighbor being a relative term, as most people lived at least a mile from him) whose desire for luxury items, like butter, coffee, and tea, caused discontentment. To Thoreau, his neighbor’s cycle of work-spend-work was so labor-intensive that the results hardly seemed worth the effort.

“Why should we live with such hurry and waste of life?” he asked. “We are determined to be starved before we are hungry.”

“Why should we be in such desperate haste to succeed and in such desperate enterprises? If a man does not keep pace with his companions, perhaps it is because he hears a different drummer.”

The Beat of a Different Drum

Thoreau believed that every man should be able to choose his own path, which was what he viewed as the makings of the “true America.” It bothered Thoreau that his neighbor had so little to show for his labor, while he himself labored little and showed nearly the same result. But it did not matter enough to him to force the issue. “Let everyone mind his own business, and endeavor to be what he was made,” he wrote.

Thoreau heard a drummer that many, if not most, could not hear. He noted that the problem was that people were listening to the same drummer, and it was the drummer of “public opinion.” And public opinion was often tied to what is new rather than what is valuable.

Thoreau’s reconstructed cabin in Walden Woods in Concord, Mass. (Alizada Studios/ Shutterstock)

“One generation abandons the enterprises of another like stranded vessels,” he wrote, and then added, “Every generation laughs at the old fashions, but follows religiously the new.”

For Thoreau, espousing a philosophy meant more than empty words; it should be a guide for living. He was a philosopher who lived his philosophy, while some, he believed, simply philosophized. He felt there was something tragic about that, as it not only caused the philosopher to not truly live, but also caused harm to those who were taught such philosophies.

Leaving Walden, and a Challenge

On September 6, 1847, Thoreau left Walden, having spent more than two years living out his philosophy “to live deliberately.” He endured New England’s harsh winters and enjoyed its beautiful springs. He discovered what wealth truly was, at least for him. He realized what he could endure, and he embraced the peace of solitude. “Walden” was the transcendentalist’s magnum opus, connecting his spiritual beliefs, his love of nature, and his personal philosophy. It is hard to miss the connection when he recollects his walks through the “laboratory of the Artist.”

Sometimes I rambled to pine groves, standing like temples … so soft and green and shady that the Druids would have forsaken their oaks to worship in them; or to the cedar wood … where the trees … spiring higher and higher, are fit to stand before Valhalla, … and make the beholder forget his home with their beauty, and he is dazzled and tempted by nameless other wild forbidden fruits, too fair for mortal taste.

The work of “Walden; or, Life in the Woods” calls into question our own philosophies. What are they, and do we believe them? And if we believe them, do we live them? The Roman poet Horace famously wrote, “Carpe diem” (“seize the day”); Thoreau’s call is an echo of that. It is a call “to live deep and suck out all the marrow of life,” no matter the situation.

“However mean your life is, meet it and live it; do not shun it and call it hard names,” Thoreau wrote in the final pages of his great work. “Love your life, poor as it is. You may perhaps have some pleasant, thrilling, glorious hours.”

From January Issue, Volume 3

Categories
Features Founding Fathers History

Dining with Thomas Jefferson: Travel Back in Time for a Lively Evening of Wisdom and Whimsy

In 1962, our young, charismatic president John F. Kennedy was entertaining the year’s Nobel Prize winners at the White House. He said of the group, “I think this is the most extraordinary collection of talent, of human knowledge, that has ever been gathered together at the White House, with the possible exception of when Thomas Jefferson dined alone.” It is a great statement, to be sure.

Feasts of Wisdom

The journals of Margaret Bayard Smith tell us some interesting details about her visits to the “President’s House,” where she and her husband actually dined with Thomas Jefferson, our country’s third president. Margaret Smith came to Washington as a young bride in 1800. Her husband was a newspaperman and a strong supporter of Jefferson’s bid for the presidency. The Smiths and Jefferson frequently entertained each other. Unfortunately, Jefferson’s wife, Martha, had died years earlier in 1782.

The President’s House, far from being the stately edifice we know today, was a work in progress. Jefferson’s personal quarters were furnished as befit a man of his many interests. Smith writes: “The apartment in which he took most interest was his cabinet; this he had arranged according to his own taste and convenience. It was a spacious room. In the centre was a long table, with drawers on each side, in which were deposited not only articles appropriate to the place, but a set of carpenter’s tools in one and small garden implements in another from the use of which he derived much amusement. Around the walls were maps, globes, charts, books, etc.” This collection is reminiscent of Jefferson’s personal effects at Monticello. He placed such importance on reading that he would often greet his guests while putting down a book, both at the President’s House and back home at Monticello.

Jefferson had no long, rectangular tables where guests would sit in long rows, awkwardly conversing with those assigned within earshot. Instead, Jefferson introduced a round table and limited the number of guests to around 14. He preferred to be addressed as “Mr. Jefferson,” not “Mr. President.” The man truly enjoyed lively discussion, and this arrangement assured that no one was left out of it.

Far from reveling in his own words, Jefferson surrounded himself with a rich feast of wisdom, made all the more enjoyable by an atmosphere of intimacy and courtesy. His guests tended to be interesting people such as Alexander von Humboldt, the great Prussian naturalist and baron. Jefferson loved to mix such intellectuals with the important people of government whom he might have felt compelled to entertain. Smith certainly gives the impression that these were rich events to be savored rather than social obligations to be endured. She notes:

“Guests were generally selected in reference to their tastes, habits and suitability in all respects, which attention had a wonderful effect in making his parties more agreeable, than dinner parties usually are; this limited number prevented the company’s forming little knots and carrying on in undertones separate conversations, a custom so common and almost unavoidable in a large party. At Mr. Jefferson’s table the conversation was general; every guest was entertained and interested in whatever topic was discussed.”

Smith describes the fare as a mixture of “republican simplicity … united to Epicurean delicacy.” His guests loved it. Southern staples such as black-eyed peas and turnip greens shared the stage with delicacies prepared by Honoré Julien, the president’s French chef. We know that Jefferson loved and served fine wine, as well as macaroni and cheese created from his own recipe.

Lively Affairs

Jefferson was a man who never stopped learning, and these dinners were certainly an extension of that fact. His favorite parties were those limited to four. To keep the conversation flowing, Mr. Jefferson brought his inventiveness to the room’s design: He installed dumbwaiters and placed revolving shelves in the walls so that the distraction of serving dishes and clearing the table, typically accompanied by servants and the opening and closing of doors, was minimized.

That’s not to say there were no distractions, however. There was Jefferson’s pet bird, which “would alight on his table and regale him with its sweetest notes, or perch on his shoulder and take its food from his lips. … How he loved this bird!”

After dinner, guests might stretch their legs with a visit to the house gardens. Since Congress refused to appropriate money for improving the grounds, Jefferson did so at his own expense. It, too, was a work in progress. Jefferson, who planted European grapes at Monticello, did not do so at the President’s House. He chose instead to display flora and fauna native to America. For several months, guests could see two live grizzly bear cubs that Captain Zebulon Pike had acquired during his expedition along the Arkansas River.

Egalitarian dining, surrounded by the wonders that Jefferson collected, inevitably led to a convivial discussion. Here, ideas that shaped the course of a young nation would find lively expression.

From January Issue, Volume 3

Categories
History Features

How American Pilots Formed the Kosciuszko Squadron During World War I to Help the Polish Fight the Soviet Invasion

Near the end of World War I, the Russian Empire, its army rent by one costly defeat after another and with morale ebbing, collapsed. Bread riots, strikes, and a mutinous army forced Czar Nicholas II Romanov to abdicate on March 15, 1917. After three centuries of Romanov control, the Russian Empire dissolved into a provisional government, which was itself overthrown in an October coup that installed Vladimir Lenin and his Bolsheviks in command of vast swathes of the country.

The Bolsheviks were not content with Russia alone. Although now embroiled in a bitter civil war, they sought to spread the communist revolution all the way to the Atlantic Ocean. The newly independent former subjects of the Russian Empire—Poland, Finland, Ukraine, Estonia, Latvia, and Lithuania, in particular—now found the Bolshevik hordes on their doorsteps. The Red Army swept out from the steppe, snatching up territory and installing puppet governments, despite the resistance of brave peoples along its line of march. It seemed more and more likely that all of Europe would turn red.

U.S. Army Capt. Merian Cooper watched the spread of Bolshevism and feared for the future of Europe. He had a personal stake in the continent, having flown a British de Havilland DH-4 over France during World War I. He’d demonstrated both daring and tremendous piloting skill when, hit by a flurry of rounds from a group of German Fokker aircraft that ignited his gas tank, he threw the plane into a steep dive, choking the flames and managing to crash-land next to a German infantry outpost. He spent the remainder of the war in a prisoner of war camp. It was during his incarceration that he heard Russians plotting to spread communism throughout the world. His biographer considered this the moment that he began his “lifelong crusade against the Communists.”

A photograph of Cooper on Feb. 1, 1920, during the time he served with the Kosciuszko Squadron. (Public domain)

Fighting for Poland

After his release, Cooper joined a humanitarian mission to Poland, then embroiled in what history would call the Polish–Soviet War. The sight of innocent young Poles defending their freedom tore at his heart. He wrote to his father, “It grieves me every day that I am doing so little for the cause of Polish liberty, when Pulaski did so much for us.” The story of Casimir Pulaski—the Polish noble who fought and died for American independence as the “father of the American cavalry” and was comforted at his death by Cooper’s own great-great-grandfather—deeply affected Cooper. He vowed to set aside his peaceful work and “get into the fight” against the Bolsheviks.

By early 1920, the colorful captain, joined by the equally eccentric Maj. Cedric Fauntleroy as commander and a total of 21 American pilots, entered the fray as the Polish 7th Air Escadrille. The squadron soon bore the name of Tadeusz Kosciuszko, the famed Polish army officer who aided the Americans in the Revolution. The American flyers saw themselves as repaying a debt owed to Poland, which had loaned her son to the infant American nation when it most needed aid.

Initially tasked with logistical and reconnaissance duties, the Americans gladly aided the Polish army’s May 1920 Kyiv Offensive, aimed at freeing the Ukrainians from the grasp of the communists. Largely unopposed in the air, the Kosciuszko Squadron nonetheless found the advancing Bolsheviks striking at its airfields and had to flee and regroup. Although the Polish and Ukrainian forces captured Kyiv on May 7, 1920, the tide soon turned under the wave of red, and the American flyers fought a bitter retreating action across Ukraine and back to Polish soil. Here the Americans did their namesake proud.

Photograph of the Kosciuszko Squadron of the Polish Air Force on Jan. 9, 1920. (Public domain)

Maintaining air superiority over the Bolshevik forces, the Americans bravely flew sortie after sortie, diving out of the sky to strafe the Bolshevik forces at horse-head height. This daring tactic and the roar of the biplane engines terrified the enemy horses and men alike, and to sow additional confusion, the Americans hand-dropped bombs with devastating accuracy, breaking up the enemy advance. Meanwhile, in what came to be known as the Miracle on the Vistula, the tattered body of the Polish army, bolstered by students, nurses, professors, and priests, smashed the encroaching Bolshevik army along the Vistula River and saved Warsaw. Polish cavalry regrouped along the line and, aided by the exhausted pilots of the Kosciuszko Squadron, obliterated the Red cavalry at the Battle of Komarów at the end of August 1920. The salvation of Poland was at hand.

One of the Polish commanders, Gen. Antoni Listowski, is said to have remarked that “without [the] assistance [of the Kosciuszko Squadron], we would have gone to the devil a long time ago.” Many of the men in the squadron received Poland’s highest military decoration, the Virtuti Militari, including Merian Cooper and Cedric Fauntleroy. The brave airmen of the Kosciuszko Squadron never shied from the thick of the fight: Three of the 21 pilots were killed, and the group as a whole flew more than 400 sorties in defense of their fellow man. Just like the many foreign noblemen who came to fight for America during her own revolution—Baron von Steuben, the Marquis de Lafayette, and, of course, Tadeusz Kosciuszko, to name a few—the men of the Kosciuszko Squadron stood up against oppression and risked their lives for others. Whether drawn by a sense of adventure, of camaraderie, or of duty, they lived out the core values of what it means to be an American.

This article was originally published in American Essence magazine.