
Grumman Engineering Had to Get 30,200 Pounds of Apollo Spacecraft to Moon and Back

In the fall of 1962, a little airplane manufacturer on Long Island, Grumman Aircraft Engineering Corporation, beat out seven competitors for the lunar module contract. How did this happen?

The story begins when Leroy Grumman, the company’s founder, struck out on his own in 1929. Working out of a rented garage, he began developing some of his own experimental airplane designs. In 1932, he presented the U.S. Navy with the FF-1, his first production fighter aircraft. The plane’s design continued to be improved, leading eventually to the creation of the F4F Wildcat, Grumman’s first fighter with folding wings.

Diagram of the Apollo Lunar Module cockpit. (Jasmina Zhang for American Essence)

Grumman built tough planes. The “cat” series, built for the U.S. Navy, had a reputation for getting their crews home. The sturdy aircraft, designed and built for carrier deployment, earned the company the nickname “Grumman Iron Works.” Aluminum, however, was the material Grumman engineers had real mastery over, forming it into beautiful aerodynamic shapes to build their planes.

Enter Aeronautic Engineer Tom Kelly

Grumman engineer Tom Kelly spoke of his involvement in the early development of the moon lander: “I guess I’ve been involved in Apollo-related work as long as anybody in Grumman, actually. I started on the thing in 1960—April 1960.” Kelly and his team competed for NASA-funded studies. Though they didn’t win any of them, Kelly said, “we went down and gave our own study conclusions to the NASA people right along with everybody else—we had a very active interest in-house, and we just wouldn’t let it die; whether it was funded, or not, we kept going with it.” Kelly’s work ushered in a whole new era for the company.

Buzz Aldrin removes the passive seismometer from a compartment in the SEQ bay of the Lunar Lander (Apollo 11 “Eagle”), July 21, 1969. (Public domain)

Grumman was not one of the larger competitors for NASA contracts. They initially offered to be a subcontractor in General Electric’s bid to build the command module and service module. North American Aviation beat them out. NASA had originally intended for the command module and service module to land on the moon and take off directly from the lunar surface to return to Earth. That particular spacecraft configuration proved to be prohibitively massive. It would require a rocket larger than anything already developed just to get it into space. But an engineer at Langley Research Center, John Houbolt, suggested taking along a smaller spacecraft, just to land on the moon. It would then launch from the lunar surface and rejoin the command module, which would now remain in lunar orbit.

The lander would be discarded after the astronauts transferred back inside the command module, which alone would return to Earth. Rendezvous in lunar orbit seemed risky, but it saved so much weight that it allowed the program to go forward at a pace that would meet President John Kennedy’s challenge to land on the moon within the decade. When NASA decided that they would develop the program around the lunar-orbit rendezvous approach, Tom Kelly and his team were well prepared to offer their proposal. Grumman wrote up the proposal, and General Electric became the subcontractor for the lander’s electronics.

When they won the contract in 1962, Kelly and his engineering team realized that they faced the same challenge that had confronted Leonardo da Vinci, the Wright brothers, and Charles Lindbergh: weight! Every step forward in human flight had involved overcoming the limitations imposed by gravity. NASA gave them an initial estimate of 30,200 pounds for the spacecraft. The craft that landed on the moon and then launched from the lunar surface to rendezvous with the command module had to fit within this prescribed limit. They had seven years.

Armstrong trains in the Lunar Module simulator at the Kennedy Space Center on June 19, 1969. (Public domain)

Overcoming Challenges

Kelly’s team worked tirelessly to conserve weight in unusual ways—in particular, the engineering of the astronauts’ seats. Grumman built 15 landers, 6 of which actually went to the moon. Some of the others are on display in museums, and visitors often ask where the astronauts’ seats are. In 1964, the design team eliminated them. The astronauts flew the lander standing up. In gravity one-sixth that of Earth, the astronauts could fly, land, and take off standing in the craft. Their legs were all the shock absorbers they needed. With no seats, the astronauts also had more room for donning their space suits for the walk on the lunar surface. They could also hang their sleeping hammocks for the rest they needed while on the moon. Removing the seats saved weight in itself, but the move also allowed the astronauts to stand closer to the craft’s windows, which could therefore be made significantly smaller. This saved hundreds of pounds of glass as well.

Astronaut Pete Conrad would refer to the cabin design as a “trolley car configuration.” Bethpage, New York, where the landers were built, is just 30 miles east of Brooklyn, where trolley car motormen actually stood up while operating a throttle with the left hand and a brake with the right. According to Kelly, those trolley cars had already inspired the name of a baseball team. Manhattan residents, who had more subways, sometimes referred to Brooklyn’s inhabitants as “trolley dodgers”; hence, the team’s name came to be the Brooklyn Dodgers. Did the trolleys of Brooklyn also influence the design of the lunar lander? Conrad’s reference suggests it might have.

The Apollo 9 Lunar Module (Spider) photographed from the Command Module on March 7, 1969, the fifth day of the Apollo 9 Earth-orbital mission. (Public domain)
Armstrong after the completion of the Lunar Extravehicular Activity on the Apollo 11 flight; photographed by Aldrin on July 20, 1969. (Public domain)

The lunar module (LM) had to operate in extreme temperatures. The team came up with the Kapton sheeting (a kind of Mylar foil covered with gold leaf) that gives the lower part of the craft its “tinfoil” appearance. It simply reflected the solar heat away from the spacecraft, much like a windshield reflector does for a parked automobile. Because the lander never had to fly in atmosphere, it needed no aerodynamic design—no smooth, rounded surfaces to resist airflow. It could just be a long-legged, boxy shape. The first manned LM, flown in Earth orbit by Apollo 9, would be called “Spider.” After one more dress rehearsal in lunar orbit by Apollo 10, the “Eagle,” flown by Neil Armstrong and Buzz Aldrin on Apollo 11, would land on the moon. The date was July 20, 1969—eight years after John F. Kennedy laid down the challenge.

Tom Kelly and the Grumman team did some thinking beyond the task at hand that proved invaluable just two missions later. They recommended designing “lifeboat” capabilities into the LM. These capabilities would save the lives of the Apollo 13 astronauts when their command-service module was crippled by an explosion. The crew fired up the LM and used it to provide life support and navigation right up to the time that they jettisoned it. The command module was the only part of the spacecraft that could reenter the atmosphere. Though the LM “Aquarius” was consumed in a fiery reentry itself, the “Grumman Iron Works” team had successfully delivered one more crew safely home. In 1994, Grumman merged with the Northrop Corporation to become Northrop Grumman, one of the country’s largest aircraft manufacturers.

This article was originally published in American Essence magazine. 


The Story of Nellie Bly, the Brave 19th-Century Journalist Who Went Undercover to Expose Abuses at an Insane Asylum

In 1887, Nellie Bly boarded the boat with the other patients bound for Blackwell’s Island, now known as Roosevelt Island. Their stay in the filthy cabin was mercifully short, and soon they crossed the East River and disembarked. After an ambulance ride, Bly and the others found themselves ushered into the stone buildings of the insane asylum. Unlike the others interned at the asylum, however, Bly came by choice. As an undercover reporter, she planned to witness the rumored abuses at the asylum firsthand and expose them.

“I had some faith in my own ability as an actress,” Bly later wrote. “Could I pass a week in the insane ward at Blackwell’s Island? I said I could and I would. And I did.”

The Reporter’s Beginnings

Nellie Bly was the pen name of Elizabeth Jane Cochran, born May 5, 1864, in Cochran’s Mills, Pennsylvania. When her father died young and his estate was split among his many children and second wife (Bly’s mother), the family fell on hard times. From a young age, Bly worked many jobs to help support her mother and family but struggled to find work that paid well.

In 1880, the family moved to Allegheny City, Pennsylvania (the city was annexed by Pittsburgh in 1907). One day, Bly read an article in the Pittsburgh Dispatch opposing women in the workplace. She wrote a letter to the editor offering an opposing view on the subject. Managing editor George Madden was impressed, and in the next edition of the paper, he asked the author of the letter to come forward.

Photograph of Nellie Bly in 1890 from the Museum of the City of New York. (Public domain)

“She isn’t much for style,” Madden said, “but what she has to say she says right out.”

Bly went to the Dispatch’s office and soon had a job and a pen name—Nellie Bly, after the popular song “Nelly Bly” by Stephen Foster. One of her first series for the Dispatch covered the conditions of poor working girls in Pittsburgh. At 21 years of age, she went to Mexico and wrote articles for the Dispatch until her criticism of the country’s censorship almost resulted in her arrest.

The majority of the articles the Dispatch assigned her, however, were simple women’s-interest pieces on entertainment, arts, or fashion. Bly wasn’t satisfied writing these pieces, so in 1887 she packed her bags and headed to New York.

A Secret Assignment

Bly tried to find a job at a New York newspaper for a few months to no avail, but she wasn’t about to return to Pittsburgh in defeat. Giving up wasn’t an option. “Indeed, I cannot say the thought ever presented itself to me, for I never in my life turned back from a course I had started upon,” she wrote.

One night, Bly realized her purse was missing and with it the rest of her money. She went to the offices of The New York World and demanded to see the editor in chief. When Bly finally spoke to managing editor John Cockerill, she pitched the idea of riding in the steerage of a ship to Europe and back, reporting on the conditions that passengers, primarily immigrants, endured. The World wasn’t interested in that idea, but Cockerill countered with another: The 23-year-old Bly would get herself sent to Blackwell’s Island and experience the rumored abuses firsthand. Bly agreed to take the assignment.

“How will you get me out?” she asked.
“I do not know,” Cockerill replied. “Only get in.”

Illustration of Bly practicing feigning insanity from her 1887 book “Ten Days in a Madhouse.” (Public domain)
Illustrative plate of an insanity expert at work, from Bly’s 1887 book. (Public domain)

The Asylum on Blackwell’s Island

Bly rented a room at a boardinghouse called the Temporary Home for Females. Her theatrics there and at Bellevue Hospital soon earned her a place at the asylum. Once there, Bly quickly found the rumored abuses to be true. The food and overall conditions were horrendous. Many people at the asylum were wrongly interned, including some immigrants who didn’t get a chance to plead their cases because they couldn’t speak English.

The nurses and caretakers at the asylum treated all patients with contempt and cruelty. Bly gathered testimony from patients in addition to the experiences she herself endured. After arriving, she acted completely normally and explained to the doctors that she should be examined and let go. She quickly learned that her only way of escape would be when someone from the World came to get her.

“The insane asylum on Blackwell’s Island is a human rat-trap. It is easy to get in, but once there it is impossible to get out,” she wrote.

After 10 days in the asylum, an attorney from the New York World came and obtained her release. Bly found herself strangely conflicted upon her departure. “I had looked forward so eagerly to leaving that horrible place yet when my release came … there was a certain pain in leaving,” she wrote. “For ten days I had been one of them. Foolishly enough, it seemed intensely selfish to leave them to their sufferings.”

A journalist on the go, Bly poses with her carpetbag, 1890. (Public domain)

Bly wrote a series of articles exposing the asylum, which were then compiled into a book, “Ten Days in a Madhouse.” Later, she testified to a grand jury about her experiences. This led to an increase in funding for Blackwell’s and institutions like it to provide adequate care for patients. “I have one consolation for my work—on the strength of my story the committee of appropriation provides $1,000,000 more than was ever before given, for the benefit of the insane,” she wrote.

Honor and Truth

Bly’s career was never smooth sailing, but she continued to write the rest of her life. “Energy rightly applied and directed will accomplish anything,” she said.

Though often saddled with filler pieces, she also wrote articles that exposed an employment agency, a company supposedly “selling” unwanted babies, a factory where girls worked in horrible conditions, a corrupt lobbyist, and more. She was in Europe when World War I broke out, so she served as a war correspondent, braving the front lines. All in all, Bly worked to report what she saw regardless of what subject she was assigned.

“Write up things as you find them, good or bad,” she said. “Give praise or blame as you think best, and the truth all the time.”

This article was originally published in American Essence magazine. 


The Breakers in Newport, Rhode Island: A Grand Tour of the Vanderbilts’ Italianate Summer Home

In the autumn of 1885, Cornelius Vanderbilt II paid a little over $400,000 for a summer cottage in Newport, Rhode Island. The Queen Anne-style house, built in 1878, was considered the “crown jewel” of Newport. It had been designed by the architectural firm of Peabody and Stearns for Pierre Lorillard IV, whose fortune came from the Lorillard Tobacco Company. He bred thoroughbred racehorses and financed archaeological expeditions to South and Central America. He helped to make Rhode Island a yachting center as well. The house was situated along Cliff Walk in Newport, with an amazing view of the ocean.

When Cornelius Vanderbilt II acquired the “cottage,” he hired Peabody and Stearns to oversee $500,000 in renovations to it, but in 1892 a fire that started in the kitchen largely destroyed the house. Vanderbilt decided to demolish the ruined house, right down to its foundations, and build anew. He brought in architect Richard Morris Hunt, who had worked for the Vanderbilt family in New York, and expressed to him his great concern about the new house being fireproof. Hunt responded by creating a design that would cost $7 million to build—even in 1893.

The entrance gates, manufactured by the William H. Jackson Company of New York, rise 30 feet above the driveway and feature a monogram of Cornelius Vanderbilt’s initials as well as acorns and oak leaves—symbolic of the Vanderbilt family. (Courtesy of The Preservation Society of Newport County)
Designed by Richard Morris Hunt in the style of ancient Rome, the Billiard Room has walls constructed from slabs of Italian cipollino marble with rose alabaster arches. Mosaics are worked in semi-precious stones. The Billiard Room was featured in the second episode of “The Gilded Age” series on HBO. (Courtesy of The Preservation Society of Newport County)

The bones of the estate would be steel, brick, and Indiana limestone. Rather than using wood framing, the architect created masonry arches on steel beams. The boiler room was in a detached building and connected to the main house by an underground steam tunnel. What rose from the original foundations was not simply a reconstruction of the old house, but a grand edifice in the style of the Italian Renaissance. It would be the grandest Gilded Age mansion of Newport. In fact, the new Breakers is much larger than the original house, of which the remaining foundations made up only part of the base of Hunt’s grand masterpiece. Hunt took his inspiration for The Breakers from Peter Paul Rubens’s book “Palazzi di Genova,” written in 1622. He acquired the book on a trip to Genoa and referred to its detailed illustrations as he created a Renaissance villa for the Vanderbilts.

Approaching the mansion from the street, it appears to be three stories high (it is actually five). As you enter the foyer, there is a gentleman’s reception room to the right and a ladies’ reception room to the left. Continuing straight, you step into the immense Great Hall. Rising 50 feet above with its great balconies, the Great Hall creates the illusion of an Italian open courtyard, or cortile. Hunt organized the rooms of the mansion around this central space, in the manner of the villas depicted in “Palazzi di Genova.” The firm of Allard and Sons of Paris created the interiors, importing the finest materials for its work. Austrian sculptor Karl Bitter created the relief sculpture in the estate. Ogden Codman, a Boston architect, oversaw the design of the family quarters.

Portrait hanging inside the Morning Room at The Breakers of Countess Laszlo Szechenyi (Gladys Moore Vanderbilt), the youngest child and daughter of Mr. and Mrs. Cornelius Vanderbilt II, by Philip de Laszlo, 1921. (Public Domain)
The Music Room showcases a gilt-coffered ceiling lined with silver and gold. This room was featured in the season finale of the HBO series “The Gilded Age.” (Courtesy of The Preservation Society of Newport County)

For the grand view of the ocean, Hunt created the double loggia (covered exterior galleries, one above the other, created primarily as a place for sitting). The lower loggia has a vaulted ceiling covered in mosaic, and the upper loggia is painted to resemble canopies against the sky. The spandrels (panels) of the loggia arches feature figures representing the four seasons of the year. The materials and the artisans were imported from overseas. Inspired by the palaces and villas of 16th-century Genoa, Hunt drew from classical Greek and Roman motifs to create the splendor of The Breakers. While the exterior is constructed of Indiana limestone, the walls of the Great Hall are made of carved Caen limestone imported from the coast of France. The walls are inset with plaques of rare marbles such as pink marble from Africa and green marble from Italy.

The Great Hall’s pilasters (embedded columns) and medallions (circular decorations) are decorated with acorns and oak leaves, representing strength and longevity, symbols of the Vanderbilt family. On top sits a massive cornice that frames a ceiling mural of a windswept sky. Hunt enclosed the space in consideration of Rhode Island’s New England climate, but he quite successfully retained the illusion of an open courtyard. The contrast of the elaborately detailed cornice against the painted sky reinforces that feeling, as does the large glass wall between the hall and the loggias.

Portrait of Mrs. Cornelius Vanderbilt II by Raimundo de Madrazo y Garreta, 1880. (Public Domain)
The Dining Room is the most lavish room inside The Breakers, featuring 12 rose alabaster Corinthian columns, a ceiling mural of the goddess Aurora bringing in the dawn on a four-horse chariot, and two Baccarat crystal chandeliers. (Courtesy of The Preservation Society of Newport County)

Projecting from the estate’s south wing is the oval Music Room. Richard Van der Boyen designed the intricate woodwork and furnishings. Jules Allard and Sons built all the woodwork in their shops in Paris and shipped it to America for installation. Used originally for recitals and dances, the Music Room was featured in an episode of Julian Fellowes’s HBO series “The Gilded Age.”

The gardens of the 70-room estate were designed by Boston engineer Ernest W. Bowditch, who was a student of Frederick Law Olmsted. Trees were carefully placed to increase the sense of distance between The Breakers and the neighboring houses. The enormous gate of the property and the wrought iron fence are flanked with rhododendron, mountain laurel, and other flowering shrubs to create a secluded place. Footpaths wind around the tree-shaded grounds, all of which provide a very natural backdrop for the more formal terrace gardens.

Facing east to welcome the rising sun, the Morning Room is a communal sitting room designed by Allard & Sons in France, featuring platinum-leaf wall panels adorned with muses from Greek mythology. (Courtesy of The Preservation Society of Newport County)

In homage to the original Breakers, Robert Swain Peabody and John Goddard Stearns, who designed the original house, were commissioned to create The Playhouse in the garden: a small, Queen Anne Revival-style cottage, reminiscent of their original design, which was used as a children’s playhouse.

Cornelius Vanderbilt II died in 1899. He was 56. Alice, his wife, outlived him by 35 years. Not unlike the fictional Crawley family of “Downton Abbey,” the Vanderbilts faced the reality that such an estate, with its army of servants, was becoming increasingly difficult to maintain. Alice gave the mansion to her youngest daughter Gladys (Countess Széchenyi), who was an active supporter of the Preservation Society of Newport County. She opened the house for visitors in 1948, leasing it to the society for a dollar a year. The society eventually purchased The Breakers in 1972 for $365,000—slightly less than what Mr. Vanderbilt paid for the property almost a century before.

This article was originally published in American Essence magazine. 


Not Just Paul Revere: The Unknown Story of the Night Rider in Virginia Who Warned the British Were Coming

It was the spring of 1781, and war had come to Virginia.

Many Virginians were fighting elsewhere with George Washington’s forces, weakening the state’s ability to resist British advances. King George’s troops, some of them commanded by defector Benedict Arnold, had earlier that winter conducted raids and fought skirmishes with Americans along the James River. In May, these soldiers joined the forces of Lord Cornwallis, who had marched his men up from North Carolina. In less than six months, this army would surrender to the Americans and French at Yorktown, but for now, they faced only light resistance and moved handily throughout the eastern part of Virginia.

Driven that winter out of the state’s new capital, Richmond, the Virginia legislature had opted in the spring to meet in Charlottesville, believing themselves secure from the British in that western hamlet. Among these lawmakers were Gov. Thomas Jefferson, now in the last days of his term of office, as well as famous patriots like Patrick Henry and signers of the Declaration of Independence Richard Henry Lee and Benjamin Harrison. Among their number was also Daniel Boone of Kentucky, then considered a part of Virginia.

When Lord Cornwallis learned that the legislature had gathered in Charlottesville, he dispatched Lt. Col. Banastre Tarleton and 200 mounted troopers to ride west and capture these lawmakers. Though despised by colonial patriots for his harsh treatment of militia and civilians in the Carolinas—he was nicknamed “Bloody Ban”—Tarleton was a fine horseman and an aggressive commander. He pushed his men toward Charlottesville, riding much of the time at night to conceal their objective. On June 3, he paused for a few hours at the Louisa County Courthouse to give his men and horses a well-earned rest before advancing into Charlottesville the following day.

And it was on this night that one American would upend this British raid.

“Thomas Jefferson” by Mather Brown, 1786. National Portrait Gallery, Washington, D.C. (Public domain)

Virginia’s Paul Revere

Born in 1754 to John and Mourning Harris Jouett, Jack Jouett had grown up in Charlottesville, where his father operated the Swan Tavern. On this evening of June 3, he was almost 40 miles away in Louisa County at the Cuckoo Tavern, so named because of the clock in that establishment. Jouett had seen the arrival of the British dragoons, overheard talk in the tavern of their plans to proceed to Charlottesville, and decided on his own initiative to race through the hills to that town and alert the threatened legislators.

Mounted on his bay mare Sally, Jouett set out through the dark countryside. Fearing British troops, he took the back roads and trails with which he was well familiar. Just around dawn, his fast-paced horse brought him to Monticello, Thomas Jefferson’s estate. There, he roused the household and explained the dire situation to Jefferson, who, as legend has it, offered Jouett a glass of Madeira to help revive the weary rider before he set out for nearby Charlottesville.

In Charlottesville, Jouett spread the word, which included a visit to his father’s popular tavern. The legislators agreed to move south to Staunton, about 40 miles away. Though Daniel Boone and several other members of this body were captured by the British, most of the representatives packed in haste, fled the town, and escaped safely to Staunton.

A Near-Run Disaster

Jefferson himself came close to being taken prisoner as well.

Aided by his body servant, Jefferson slowly packed up important papers and personal items, reluctant to leave the home he’d designed and built for fear the British would burn it. Only when a neighbor who was an officer in the Virginia militia, Christopher Hudson, found him still on the premises and urged him to flee did Jefferson mount his horse, Caractacus, and ride into the forest. Like Tarleton, he was an excellent horseman, knew the terrain, and was confident of his ability to escape Tarleton’s raiders.

The British arrived at Monticello within minutes of his departure, with Jefferson still close enough to hear them and to observe through his telescope. He rode away, but his fears regarding the destruction of his home proved unjustified. Perhaps the British remembered the story of Jefferson’s kind treatment of several captured officers earlier in the war. The troops did threaten to shoot a slave, Martin Hemings, unless he informed them of his master’s whereabouts, at which point the servant demonstrated his loyalty to Jefferson by replying, “Fire away, then.” Hemings was left unharmed, and after a thorough search of the house and grounds, the British headed to Charlottesville.

Lt. Col. Tarleton was determined to capture colonial lawmakers. “Portrait of Sir Banastre Tarleton” by Joshua Reynolds, 1782. National Gallery, London. (Public domain)

As for Jack Jouett, he moved to Kentucky the year after his ride, where he married Sallie Robards, became a father to 12 children, established himself as a successful farmer, and served in the Kentucky legislature. He was a stout advocate for statehood and was undoubtedly pleased when in 1792 Kentucky became the 15th state, the second admitted to the Union after the original 13.

Though honors for his heroism on that night-long ride were belated, Jouett eventually received official recognition from the Virginia government for his exploit and was awarded a brace of fine pistols and a sword for his service.

The Power of One

Jack Jouett isn’t as famous as Paul Revere, in large part because of Longfellow’s poem “Paul Revere’s Ride” with its well-known opening lines “Listen, my children, and you shall hear / Of the midnight ride of Paul Revere.” Yet Jouett’s bravery and boldness that June night and the following day may have helped save the American Revolution. At the time, the Americans had no sure hope of victory—far from it—and the capture of patriots like Jefferson and Richard Henry Lee might have brought disastrous consequences. At the least, such a triumph would have severely damaged American morale.

Jouett also deserves our esteem for demonstrating a particularly American trait: individual initiative. Unlike Paul Revere, who worked with a network of others to discern and thwart British intentions, Jouett acted alone and spontaneously. No one commanded him to deliver his warning; he asked no one for advice as to what he should do. At great risk to himself, he saddled up that bay mare and set out on his self-imposed mission.

To put aside our fears, doubts, and self-interests in the pursuit of liberty and a righteous cause: That is Jack Jouett’s greatest lesson for us all.

This article was originally published in American Essence magazine. 


Gibson Guitars: Fascinating Stories Behind an American Icon Serving a Century of Musicians

It was Ray Whitley who started the excitement. Throughout the 1930s, Whitley traveled with the World’s Championship Rodeo, providing musical entertainment with his band, the Six Bar Cowboys. In 1937, he prodded the Gibson Mandolin-Guitar Manufacturing Co. to develop a “super jumbo” instrument, one that could go lick for lick with the nearly 16-inch-wide, rosewood-and-mahogany Dreadnought guitar issued by C.F. Martin & Co.

A Gibson L-4 CES, fit for jazz players. (Heath Brandon CC BY 2.0, CreativeCommons.org/licenses/by/2.0)

“Of course, what somebody at Martin saw, and what no one at Gibson apparently did, was that players were forsaking banjos for guitars and demanding louder instruments,” wrote Walter Carter in his history of the Gibson company. A bigger, louder guitar was the only way to hold its own alongside a singer at the microphone. Playing live dates, Gene Autry, a star at Chicago’s WLS radio station, was already strumming an elaborately ornamented Martin D-45, which replaced the smaller Martin that had been stolen along with his Buick the year before. Whitley’s demands of Gibson resulted in a 17-inch-wide body with a mosaic pickguard and the slogan “Custom Built for Ray Whitley” inscribed on the headstock. The Super Jumbo 200 took its name from its generous size and steep list price: $200 (a 1938 Ford could be purchased for just over three times that amount). After World War II, the model would be known simply as the SJ-200, and Elvis Presley cradled one when he appeared on The Ed Sullivan Show in 1957.

“The Gibson-made instruments were louder and more durable than the competitive, contemporary fretted instruments, and were the go-to instruments demanded by players of the day,” explained ZZ Top’s Billy Gibbons in an email.

Elvis Presley’s Gibson J200 on display at his home, the Graceland mansion in Memphis, Tenn. (Mr. Littlehand CC BY 2.0, CreativeCommons.org/licenses/by/2.0)

Whitley carried his SJ-200 to Hollywood, where he wrote “Back in the Saddle Again” for the cinematic mystery-romance “Border G-Man.” Meanwhile, Gibson made a dozen more SJs for key influencers. Autry bought two at the discounted price of $150 each, and his biographer, Holly George-Warren, wrote of one guitar that it was “embellished with a two-tone mother of pearl border; horses and bucking broncos inlaid with pearl; and his name writ large alongside horseshoes inset on the fingerboard.” It was a spectacular instrument and showpiece, indeed, making a lasting impact.

In 1939, Autry recorded his version of Whitley’s tune and adopted “Back in the Saddle Again” as his enduring theme song. Heard today, the lyrics still evoke feelings of truth and triumph, but cowboy singers would soon fall out of fashion. The music made by electric guitars took over radio airwaves. Autry finished his career with landmark recordings of holiday songs, namely “Here Comes Santa Claus,” “Rudolph, the Red-Nosed Reindeer,” and “Peter Cottontail.”

The integration of Gibson guitars into the upper echelons of popular music deserves some explanation. The company’s founder, Orville Gibson, had migrated from his native New York state to Kalamazoo, Michigan, by 1881, when he was 25 years old. After more than a dozen years as a clerk in a shoe store and a restaurant, he started manufacturing musical instruments. In his small workshop, he made mandolins from a patented design. The patent application of 1895 said existing instruments were made of too many parts, “to the extent that they have not possessed that degree of sensitive resonance and vibratory action necessary to produce the power and quality of tone and melody.” He boasted of having achieved “a sound entirely new to this class of musical instruments.” The first Gibson catalog offered a family of mandolins for the popular mandolin orchestras, as well as round- or oval-hole guitars and harp guitars with 12 or 18 strings. Five stages of ornamentation, from plain to fancy, were available.

A 1964 Gibson Country Western acoustic guitar (L) and a 1963 Southern Jumbo SJ. (Tony 1212 CC BY-SA 4.0, CreativeCommons.org/ licenses/by-sa/4.0)
A Gibson magazine advertisement from around 1939 to 1940. (Public Domain)

By 1902, an investor group had taken over Gibson’s enterprise, and the next year the founder—who had become a consultant for the company—quit in order to teach music and collect royalties. Eventually, Gibson returned to New York; he died in 1918. His namesake company adopted an innovative marketing approach, turning music teachers into salesmen and letting customers pay in small monthly installments. The Gibson banjo was introduced, but the 1911 L-4 and 1923 L-5 guitars were better fits with Jazz Age outfits like Duke Ellington’s Washingtonians, at a time when people were losing their heads dancing the Charleston. With the finest materials and craftsmanship, the 1934 Super 400 extended the run of successful rhythm instruments. Gibson’s first electric, the hollow-body ES-150, made its debut in 1936 and was popularized by the ill-fated jazz player Charlie Christian. Extolling “electrical amplification,” Christian showed the world how to perform a proper solo before he died—too young, at 25—of tuberculosis.

Singers Ray Whitley and Redd Harper, and actor Frank Seeley (far R), with fellow musicians at the Armed Forces Radio Service studio. (Public Domain)
An Orville by Gibson guitar, a line of instruments made for the Japanese market. (Public Domain)

While worthy competition came from the 1950 Fender Telecaster and 1954 Stratocaster—solid-body electrics made in Southern California—Gibson made a wily move in advance of the era of rock ’n’ roll and electric blues: In order to avoid the disdainful label of “plank” guitar, the solid-body 1952 Gibson Les Paul was developed with collaboration from Les Paul (Lester Polsfuss), who was a master player and something of a mad scientist. The guitar that bore his name had a carved maple top with no sound holes, and the gold color was intended to disguise a trade secret: the mahogany back. Like Orville Gibson’s mandolins, the new guitar was an innovative departure and an instant classic. The challenge was to figure out what to do with it, but players stepped to the fore. Bluesman John Lee Hooker, to name one, extracted grit and passion from his Les Paul. Billy Gibbons dubbed his own 1959 example “Pearly Gates,” explaining that the guitar “possesses those rare qualities found in a precise combination of elements which miraculously came together on that fateful day of fabrication.”

Renamed Gibson Guitar Corp., and now Gibson Brands, Inc., the company moved operations from Kalamazoo to Nashville by 1985, with acoustic guitars produced in Bozeman, Montana, since 1989. The company has experienced ups and downs in conjunction with fickleness in the national economy and the guitar industry—even restructuring in Chapter 11 bankruptcy in 2018. The pandemic, however, brought a surge in guitar sales—“Did Everyone Buy a Guitar in Quarantine or What?” asked Rolling Stone—leaving the company well positioned to capitalize on the upswing. Gibson guitars continue to lend their great sound and seriousness of intent to new musical acts. And it all started with Orville Gibson and his carving tools in Kalamazoo.

This article was originally published in American Essence magazine.


Raising a Forest by Hand

“The hills bear all manner of fantastic shapes,” Charles Bessey observed, noting that they sometimes featured open pockets of bare sand in blowouts and were “provokingly steep and high.” Bessey was describing the Sandhills, the area of post-glacial dunes wrought by mighty winds in north-central and northwestern Nebraska. Aided by his botany students from the University of Nebraska (today’s University of Nebraska–Lincoln), he cataloged a trove of plant species in 1892. Yet besides spurges and gooseberries, herbaceous plants such as smooth beardtongue, and grasses such as Eatonia obtusata, he found the potential for forestation.

“He was convinced that the moist soil of the Sandhills would support forest growth,” the historian Thomas R. Walsh wrote. Nebraska had gained statehood in 1867 but still had enough untouched areas to be “a virgin natural laboratory,” as Walsh described it. And there were so few trees for wood, shelter, or shade. Bessey had been pushing the state legislature to reserve Sandhills tracts for tree planting. In 1891, urged by the top forestry official in Washington, D.C., he started a test plot at the eastern edge of the Sandhills, which encompassed an area about the size of New Jersey. Ponderosa pines were a big component of the experiment’s 13,500 conifers. With the initial indication that they would do fine, he started a campaign to convince people that forestation was practical. After all, as Walsh noted, “the area was once covered by a pine forest that was destroyed by prairie fires.”

The pre-dawn fog rises above the Niobrara River, near Valentine, Nebraska. (Pocket Macro/Shutterstock)

Bessey had come to the University of Nebraska in 1884, lured from Iowa Agricultural College (today, Iowa State University) by an offer of $2,500 per year. He was already the author of “Botany for High Schools and Colleges,” the nation’s first textbook on the subject. His motto of “Science with Practice” indicated a teaching philosophy that mixed laboratory and field work with classroom instruction. He was one of a small group of professors at the prairie university, attended by just 373 students in the year he arrived, but he had an outsized and enduring influence through his popular botany seminar. A top student in the 1892 cataloging project was Roscoe Pound, who claimed the university’s first Ph.D. in botany, then distinguished himself as a legal scholar and served two decades as dean of Harvard University’s law school.

Throughout the latter years of the Gilded Age, Bessey kept hammering away at the idea of national forests. To Gifford Pinchot, head of the national Division of Forestry, he wrote, “In the Sandhills, we have a region which has been shown to be adapted to the growth of coniferous forest trees, and here we can now secure large tracts which are not yet owned by private parties.” Pinchot had the ear of President Theodore Roosevelt, who in 1902 set aside 206,028 acres in two reserves in the Sandhills. “This was the first and only instance in which the federal government removed non-forested public domain from settlement to create a man-made forest reserve,” Walsh explained.

The two reserves are 75 miles apart. The northern Samuel R. McKelvie National Forest is on the Niobrara River near the city of Valentine. The southern one, first called Dismal River Forest Reserve, is now the Nebraska National Forest at Halsey and is managed by the Bessey Ranger District. (Nebraskans refer to it as “Halsey Forest.”) Within it are the Bessey Recreation Area and the crucially important Charles E. Bessey Tree Nursery, which yearly produces 1.5 million bare-root seedlings and up to 850,000 container seedlings for distribution in the Great Plains and Rocky Mountain states. Additionally, the nursery acts as the seed bank for Rocky Mountain Region 2, storing about 14,000 pounds of conifer seeds in case of wildfire or insect infestation.

Carson Vaughan, author of “Zoo Nebraska: The Dismantling of an American Dream,” grew up in Broken Bow, about 50 miles from Halsey Forest. It was only after he started writing articles about Bessey and the forest that he comprehended the magnitude of the original undertaking: creating the largest man-made forest in the United States. “Nothing like this has ever happened anywhere else on the planet,” he said. “And it all started because this pioneering botanist, Charles Bessey, had this wild idea and the patience, the dogged persistence, to stick with it over a couple decades and see it come to fruition.”

Vaughan remembered climbing Scott Lookout Tower, near Halsey, and feeling the impact upon viewing a forest amid treeless grasslands. “You get the rolling, billowing Sandhills right next to this very clear, dark, dense forest,” he said. The experience reinforced the concept that “it took human beings planting all of these trees to make this national forest grow out of this sandy, arid region.”

The sun rises over the Dismal River, which runs through the Nebraska Sandhills. (marekuliasz/Shutterstock)

After succeeding in the Sandhills, Bessey turned to other important challenges. In 1903, he was contacted about the effort to save the giant sequoias in certain groves of California’s Sierra Nevada. He tried to interest President Roosevelt in the cause, then introduced the matter into the proceedings of scientific societies, sending their resolutions on the matter to congressional representatives. Although he helped to set the conservation process in motion, Bessey would pass away in 1915 without seeing his efforts bear fruit. Sixteen years later, the state of California acquired the Mammoth Tree Grove, now a principal element of Calaveras Big Trees State Park.

On the other side of the country, Bessey became involved in the effort to create a national forest reserve in the southern Appalachians. “The cutting away and total destruction of the forests is a crime against the community as a whole,” he wrote. In 1908, a bill to authorize the reserves came before the House of Representatives, but soon died. It particularly galled Bessey that one of his former students, Representative Ernest M. Pollard, was on the agricultural committee, which had deferred action. “It does seem as though we had the most stupid and blinded lot of men in charge of our affairs that has ever cursed any country,” Bessey wrote to House Speaker Joseph G. Cannon. Bessey and others kept working, and ultimately, the Weeks Act of 1911 was passed, providing for acquisition and preservation of forested lands nationwide.

Today, visitors to the University of Nebraska–Lincoln can see an image of Bessey in bas-relief on a bronze tablet at—where else?—Bessey Hall. There’s also a Bessey Hall at Iowa State. And at Michigan State University, Ernst Bessey Hall is named for Charles’s son, who became a professor of botany and dean at MSU’s graduate school from 1930 to 1944. The apple didn’t fall far from the tree.


The First Selfie

As of this writing, about 700 billion photographs have been uploaded to the internet. Billions and billions more exist in physical form. Many of these photos fall into the category now referred to as “selfies,” a genre often assumed to be no older than Generation Y. However, the roots of the selfie actually go back almost 200 years.

The son of a Dutch immigrant, Robert Cornelius was born in Philadelphia in 1809. As a child, Cornelius was fascinated by chemistry. This interest was surely fanned by the boy’s father, a silversmith, who taught Robert the business of metal polishing and silver plating.

In 1839, the world was taken by storm when French artist Louis Daguerre invented the daguerreotype, a complex process—involving silver-plated copper, mercury vapor, and liquid chemical treatment—that could produce a photographic likeness. An account of Daguerre’s process was published in Philadelphia on October 15, 1839. The next day, Cornelius was approached by a local watchmaker and inventor named Joseph Saxton, at that time an employee of the Philadelphia Mint. Saxton wanted Cornelius to help him produce a daguerreotype image. Cornelius agreed.

Cornelius created the silver plating for Saxton’s photographic image; and that image, as far as we know, was the first photograph ever taken in the United States. In dark hues of gold and brown, the image was taken from Saxton’s own Philadelphia Mint office window, and it portrays part of the State Arsenal and a section of a neighboring high school. A late-19th-century description of the “camera” reveals Saxton’s quick ingenuity: “A seidlitz powder [a laxative] box with a few flakes of iodine answered for a coating box, while a cigar box and burning glass were improvised for a camera.” One Philadelphia photographer later wrote that the Saxton daguerreotype “created no small excitement among the curious in such matters; and from this date, many of our Philadelphia savants began cultivating the art.”

The experience ignited in Cornelius an abiding interest in photography, and he was determined to improve upon the makeshift daguerreotype he’d helped Saxton throw together. In this effort, he enlisted a physician named Paul Beck Goddard, and later that same month of October, they produced a daguerreotype image of Cornelius himself—the first photographic portrait (picture of a human being) ever taken in the United States. It was probably the first ever in history, although one earlier daguerreotype, taken a year before in Paris, happened to include a couple of people in the background. But that image wasn’t meant as a portrait.

From the metallic, spotted image, Cornelius, his head slightly tilted to the left, stares back at us with determined eyes set beneath a prominent forehead partly covered by his thick, disheveled hair. He wears a dark coat with a cravat. His right arm is held upright across his chest, and his right hand is tucked beneath the left side of his coat. The limitations of the technology dictated that for this first-ever selfie, Cornelius had to sit still for up to 15 minutes.

The first photographic portrait made in the United States (and probably the world), by Paul Beck Goddard and Robert Cornelius. The subject is Cornelius himself, allowing him to claim the achievement of first-ever “selfie.”

Cornelius went on to establish several photo studios, manufacturing his own cameras, plates, and mats to produce portraits for the prominent people (among others) of his time. Many of those photographs survive to this day.

One of his innovations was in harnessing additional light via the use of reflectors. Writing several decades later, one observer of Cornelius described his process: “For coating the plates, he used dry iodine exclusively; and by several large reflectors, set at different angles, both within doors and without, he was enabled, in strong sunshine, to concentrate upon his sitter light enough to obtain through a side-window facing south, an impression within from one to five minutes.”

Cornelius was later able to improve his process to the point where he could produce “fair impressions, even without reflectors, in from 10 to 60 seconds—and this too within doors.” But success invited imitation, and, as one mid-19th century historian informs us: “Together with the improvements made by [Cornelius] and others in the heliographic apparatus [light reflectors] and manipulative methods, and the great advance consequent thereon in the mode of obtaining portraits from life, quite a number of persons directed their attention to the art from the hope of making it a source of profit.”

As the demand for and interest in photography spread, and as more studios opened, Cornelius opted to move on to other things: specifically, the invention of a solar lamp that proved highly popular across the United States and Europe—but that’s another story. Incidentally, few people knew or understood at the time that Cornelius had taken the first photographic portrait in American history. He wasn’t one to trumpet his own accomplishments.

Luckily for us, however, Cornelius mentored others at his studio. One of them was a young man named Marcus Root. A quarter-century after that first selfie was taken, in 1864, Root published “The Camera and the Pencil, or the Heliographic Art,” which included, along with the theory and practice of photography, a history of the field. That book explicitly, and rightly, credited Cornelius with the first photographic portrait.

Twelve years later, “The Camera and the Pencil” was exhibited at the Centennial Exhibition, where the book was noticed by a photographer named Julius Sachse. Sachse went on to interview Cornelius, and later became editor of the “American Journal of Photography.” In this way, Cornelius’ legacy was secure—and just in the nick of time. He died the following year, in 1877.

So, the next time you take a selfie, take a moment to remember Cornelius—and be glad you don’t have to sit still for 15 minutes.


Washington’s Presidency, the Glorious and the Mundane

George Washington, universally acclaimed nowadays as one of our best presidents, encountered a little bad press in his own day. Even before his inauguration, he knew that facing impossibly high expectations would be a challenge during his time as president.

“My movements to the chair of Government will be accompanied with feelings not unlike those of a culprit who is going to the place of his execution.” These were the unenthusiastic words of George Washington, written to fellow Revolutionary War veteran Henry Knox on April 1, 1789, not long before his nearly inevitable election as president.

For eight years (between 1775 and 1783) and without pay, Washington had led the Continental Army against the British. The aristocratic Virginian might have gone on to leverage his impressive victory to become a “conquering general” and establish a personal dictatorship—an end conceivably within his grasp and even suggested by some in his circle.

Instead, George Washington very emphatically retired. Lest anyone should miss the point, Washington even delivered a public resignation address. His days of service were over, and beloved Mount Vernon was calling.

But now, he was being summoned into service once more. Two weeks after Washington had compared his feelings to those of a culprit on his way to execution, a dispatch arrived at Mount Vernon notifying the retired general of his presidential election. Two days after that, 57-year-old George Washington left Mount Vernon, penning the following in his diary:

About ten o’clock I bade adieu to Mount Vernon, to private life, and to domestic felicity; and with a mind oppressed with more anxious and painful sensations than I have words to express, set out for New York … with the best dispositions to render service to my country in obedience to its call, but with less hope of answering its expectations.

Perhaps no one in America was more familiar with the challenges of directing the new union than George Washington, who had played such a central role in its inception and evolution. As such, he was clearly under no illusion as to the challenges that awaited him. His acquiescence (for so it was) to the presidency was informed less by political ambition and more by solemn duty. There was no relishing of the prospect, no celebration on his part, no reveling in his political achievement. Being the sort of president people wanted—by unanimous vote of the Electoral College, no less!—seemed at the very least a daunting task, and probably an impossible one. He seems to have known this.

Bad Roads and White Robes

New York was to serve as the first temporary capital of the new United States of America, but great distance and bad roads meant that it was quite a journey to get there from Virginia. And if Washington was really weighed down by “expectations” at the moment of his departure, he was certainly more so as the journey progressed. Everywhere he went, crowds cheered his arrival, casting roses and wreaths along his path, or erecting triumphal arches for him to pass through. At Trenton, 13 girls—representing the 13 states—in white robes hailed him as “mighty Chief” in song, while Washington was made to ride beneath a 13-columned arch.

Finally reaching Elizabethtown, New Jersey, across the harbor from New York City itself, Washington was greeted by an ostentatious barge manned by 13 white-uniformed captains. Upon this gaudy vessel, the president-elect was ferried across to where Wall Street met the water. New York Governor George Clinton awaited him there—atop a set of specially prepared steps with their sides draped in lavish cloth.

Engraving depicting George Washington en route to Federal Hall for the first Presidential Inauguration, April 30, 1789. (Archive Photos/Getty Images)

George Washington was sworn in on April 30, his oath of office administered on the balcony of Federal Hall, in front of a massive crowd gathered along Broad and Wall Streets and on balconies and housetops in every direction. All was hushed during the swearing-in, after which the officiator exclaimed, “Long live George Washington, President of the United States!”

Thunderous applause echoed throughout the city as a 13-gun salute rang out from the harbor. As the ovation continued, an American flag was hoisted above George Washington himself.

Expectations, indeed.

Complainers

Of course, the hoped-for utopia to be ushered in by America’s greatest Founding Father never materialized. Even Washington himself had hoped that the new federation would, at the very least, avoid political factions. Instead, the real world offered its usual share of complication and contention—including a highly combative two-party system. By the time Washington left office, his once-invulnerable image had taken a hit among some of his contemporaries. Complainers picked at flaws, real or imagined. American newspapers attacked his perceived disloyalty to republicanism and his personal integrity. They attacked the lavish receptions (or “levees”) he hosted with his wife, his “aristocratic” airs, his alleged “monarchical” pretensions, his cold and aloof manner. Critics accused him of being unintelligent and susceptible to bad advice from his cabinet, of treacherously betraying France by proclaiming neutrality—and of betraying the American Revolution by not eagerly supporting the French one.

A whole series of letters (called the “Belisarius” letters, after their author’s pen name), addressed personally to Washington and published in opposition newspapers, lambasted the president on a wide range of counts: for cultivating “a distinction between the people and their Executive servants”; failing to stand up to (post-war) Britain; overseeing a costly war with the American Indians; maintaining a standing army in peacetime; and supporting internal taxation (then “denouncing” the people most affected by it), among other allegations.

Tempering Expectations

Women laying flowers at George Washington’s feet as he rides over a bridge at Trenton, New Jersey, on the way to his inauguration as first president of the United States on April 30, 1789. (MPI/Getty Images)

It may be that the aspersions cast in his direction were a primary reason George Washington decided to retire after just two terms. Indeed, an earlier draft of his Farewell Address actually included these words:

As some of the Gazettes of the United States have teemed with all the Invective that disappointment, ignorance of facts, and malicious falsehoods could invent, to misrepresent my politics and affections; to wound my reputation and feelings; and to weaken, if not entirely destroy the confidence you had been pleased to repose in me; it might be expected at the parting scene of my public life that I should take some notice of such virulent abuse. But, as heretofore, I shall pass them over in utter silence.

The impossible expectations placed upon the first president demonstrate, perhaps, the futility of investing in one individual the utopian hopes and dreams of an entire people. One of the original American lessons, at least as they pertain to the state, is that political saviors don’t exist; not even the vaunted George Washington could be one! He’d felt the weight of such expectations right from the beginning. That weight probably helped drive him out of the spotlight in the end.

When election cycles come around, perhaps our expectations should be tempered by Washington’s experience.

And when politicians talk like saviors, remember George Washington, too.

Dr. W. Kesler Jackson is a university professor of history. Known on YouTube as “The Nomadic Professor,” he offers online history courses featuring his signature on-location videos, filmed the world over, at NomadicProfessor.com


The Real Johnny Appleseed

Walt Disney’s 1948 animated short, “The Legend of Johnny Appleseed,” famously depicts its main character as a Pennsylvania farmer yearning to join the pioneers heading west to the frontier. All those settlers would need something to eat—and so Disney’s Johnny is determined to plant apple trees across the land in order to feed them.

The real Johnny Appleseed—a “small, wiry man, full of restless activity” named John Chapman, born in Massachusetts in 1774—was indeed a trained orchardist, and he really did show up in Ohio Territory with “a horse-load of apple seeds” to plant. It’s no myth, either, that John Chapman introduced apple trees to large swathes of frontier territory, from modern Pennsylvania and Ohio to Indiana, Illinois, and even Canadian Ontario.

Orchards for Cider

But Chapman’s apples weren’t meant for eating at all (they would have been far too sour, anyway). No—the apples of the real Johnny Appleseed were meant for making hard cider. As a not-so-subtle Smithsonian Magazine headline framed it in 2014, “The Real Johnny Appleseed Brought Apples—and Booze—to the American Frontier.” Journalist Michael Pollan agreed, explaining in a 2015 interview, “Really, what Johnny Appleseed was doing and the reason he was welcome in every cabin in Ohio and Indiana was he was bringing the gift of alcohol to the frontier. He was our American Dionysus.”

Perhaps as early as the 1810s, Chapman was already being referred to as “Johnny Appleseed,” recognized as such “in every log-cabin from the Ohio River to the Northern lakes, and westward to the prairies of […] Indiana,” according to an 1871 Harper’s New Monthly Magazine piece. Unfortunately, along with apple trees, Chapman also planted countless acres of dogfennel, which he believed to have medicinal qualities; dogfennel is now regarded as a pernicious weed.

And the real Appleseed’s motivations were also far more complicated than those of the happy planter portrayed in the Disney musical. Far from being an altruistic scatterer of apple seeds, tossing them hither and thither wherever he happened to roam, Chapman was probably a shrewd entrepreneur. The nurseries he planted weren’t open to all, but rather fenced off, each tree to be sold to settlers for six and a quarter cents. In the late 18th and early 19th centuries, some land speculation companies required prospective colonists to plant fifty apple trees on their land; since it took the better part of a decade for these trees to bear fruit, such planting would prove the colonists’ commitment to develop the land over many years. The real Appleseed saw in this a business opportunity. He would plant a nursery, enter into a partnership with someone in the area to manage the sale of the trees to arriving settlers eager to quickly fulfill the companies’ requirements, and then move on to repeat the process elsewhere. Two or three years later, he might return to tend the nursery.

Traversing the Frontier and In Touch With Indians

This was a business model that demanded he labor “on the farthest verge of white settlements,” and it wasn’t an easy line of work. Chapman essentially lived life as a nomad—and a barefoot one at that. One newspaper described him as “barefooted and almost naked,” and for a time he apparently wore a coffee sack, having cut out holes for his head and appendages. (There was, too, his signature hat: a tin vessel, with a visor-like peak in the front, that doubled as a pot for cooking.) The work itself could be hazardous. Once, while working in a tree, he fell and got his neck stuck between forking branches. If one of his helpers that day, a mere lad of eight years, hadn’t discovered him and run for help, the real Johnny Appleseed might have died in 1819. And as far away as they might have seemed from America’s economic centers, Chapman’s enterprises weren’t immune from the vicissitudes of the larger economy. When recession racked the United States in 1837, the price of his trees plummeted to just two cents apiece.

The frontier along which he worked, skirting American Indian country, could also be dangerous. But the natives, whom Chapman always admired and whose wilderness trails he often traversed, left the real Appleseed alone, considering him something of a medicine man; how else could one explain the privation and exposure which he so easily endured? Even during the War of 1812, when many natives along the frontier allied with Britain to devastate white frontier communities, Chapman never ceased his wanderings, and he was never harmed. One frontier settler, reporting his experience during this period, remembered with a “thrill” the peripatetic Chapman’s timely warning to his community: “The Spirit of the Lord is upon me, and he hath anointed me to blow the trumpet in the wilderness, and sound an alarm in the forest; for behold, the tribes of the heathen are round about your doors, and a devouring flame followed after them.” The real Appleseed’s warnings may have saved hundreds of lives.

Disney’s cartoon Johnny makes friends with a skunk and is beloved by all animals. In truth, the land he traversed teemed with potentially menacing wild animals, including wolves, rattlesnakes, wild hogs, and bears. One account of Chapman, however, does claim he had partially tamed a pet wolf, which followed him around everywhere he went, and his religious fervor (he followed Swedenborgianism) did eventually cultivate in him an almost Jain-like respect for all living creatures. By the time of his death, he had become a full-fledged vegetarian.

An Orator and a Gift-giver

Illustration of Johnny Appleseed delivering a speech, circa 1820. (Fotosearch/Getty Images).

As an itinerant orchardist-nurseryman, the real Appleseed, with his “long dark hair, a scanty beard that was never shaved, and keen black eyes,” was well-known up and down the frontier. This meant Chapman was often “passing through” the towns and settlements and American Indian villages of the region (in the words of one 19th-century newspaper, he “sauntered through town eating his dry rusk and cold meat”). Frequently, he would stop to entertain groups of children—apparently he was a master storyteller—or preach “on the mysteries of his religious faith” to any adult who might listen. To little girls he gifted bits of ribbon and calico; “Many a grandmother in Ohio and Indiana,” reported an article published a few decades after his death, “can remember the presents she received when a child from poor homeless Johnny Appleseed.” Once, after being gifted shoes for his bare feet by a particularly assertive settler, he was discovered a few days later again walking barefoot in the cold; the settler confronted him “with some degree of anger,” only to find out that Chapman had almost immediately re-gifted the shoes to a poor family, some of them barefoot, traveling west.

Days after strolling through the streets of Fort Wayne, Indiana, the real Appleseed died suddenly. The year was 1845, and John Chapman was around 70 years old. He was hailed in one obituary for “his eccentricity,” his “strange garb,” his material self-denial (apparently his faith had worn away at the material entrepreneurship of his youth), his “cheerfulness,” and his “shrewdness and penetration.” Johnny Appleseed was buried at Fort Wayne.

Within a quarter-century, the life of Johnny Appleseed was featured in the aforementioned Harper’s New Monthly Magazine piece, subtitled “A Pioneer Hero.” Even then, it was admitted that, as the frontier disappeared, “the pioneer character is rapidly becoming mythical.” The story of the nomadic nurseryman “whose whole [life] was devoted to the work of planting apple seeds in remote places” had begun to take on a life of its own—a myth which, by the mid-20th century, had become musical Disney legend.

Dr. W. Kesler Jackson is a university professor of history. Known on YouTube as “The Nomadic Professor,” he offers online history courses featuring his signature on-location videos, filmed the world over, at NomadicProfessor.com


Roger Sherman, Low-Key Founding Father

Among the Founding Fathers, Roger Sherman is one of the best-kept secrets. But he shouldn’t be, especially in light of the cumulative and lasting effect he has had on this nation, including the present-day debates on the meaning and legal effect of the Ninth Amendment.

Most notable is the fact that he is the only Founding Father to have signed all of these prominent founding documents: the Declaration and Resolves (1774), which contain many of the rights that are enumerated in the First Amendment; the Articles of Association (1774), which was a trade boycott with Great Britain; the Declaration of Independence (1776); the Articles of Confederation (1777); and the U.S. Constitution (1787).

Sherman’s influence on the Constitution was greater than most realize. Historian Richard Werther wrote in 2017 in the Journal of the American Revolution that, at the Constitutional Convention debates, “of 39 issues cited, Sherman prevailed on 19, Madison on 10, and 7 resulted in compromises (the other 3 were interpretational issues for which no clear-cut winner is determinable).” Werther adds, “While no one is arguing that Sherman, not Madison, assumes the mantle as ‘Father of the Constitution,’ clearly Sherman had a bigger role than may have been previously understood.”

As a boy in Connecticut, Roger Sherman was self-educated in his father’s library and later attended a newly built grammar school. He managed two general stores. Although he had no formal education in law, he passed the bar exam and was admitted to the bar in 1754. He wrote and published an almanac each year from 1750 to 1761. He served as a mayor, a justice of the peace, a county judge, a Connecticut Superior Court judge, and as a delegate to both the First and Second Continental Congresses. After ratification of the Constitution, he served in the U.S. House of Representatives from 1789 to 1791 and in the U.S. Senate from 1791 until his death in 1793.

Sherman’s reputation was stellar. He was described as honest, cunning, a staunch opponent of slavery, a devout Christian who was outspoken about his faith, and a protector of states’ rights. William Pierce, a delegate to the Constitutional Convention who took extensive notes, said of Sherman, “He deserves infinite praise, no man has a better heart nor a clearer head. If he cannot embellish he can furnish thoughts that are wise and useful. He is an able politician, and extremely artful in accomplishing any particular object; it is remarked that he seldom fails.”

Role in the Bill of Rights and the Ninth Amendment

Originally, Sherman was opposed to adding a bill of rights to the Constitution due to its being “unnecessary” and “dangerous.” He, like other Federalists, stated that it was unnecessary as the powers enumerated in the Constitution granted limited authority; if certain powers were not enumerated and delegated, then the federal government wouldn’t have the authority to infringe upon the rights in question. Plus, the states had their own constitutions protecting their citizens’ rights, and the Constitution is concerned only with federal guarantees, not states’ guarantees. The Federalists considered it dangerous to list certain rights as it could be construed that other rights not singled out were surrendered to the government; in other words, if they were not written down, then those rights would not be considered protected.

The original Constitution was signed by 39 delegates on September 17, 1787. It was during the First Congress on June 8, 1789, that James Madison proposed to “incorporate such amendments in the Constitution as will secure those rights, which they consider as not sufficiently guarded […] to satisfy the public that we do not disregard their wishes.” After Madison persuaded Congress to create a Bill of Rights, the proposals were referred to a House select committee, the Committee of Eleven, which took up the debates. In 1987, the National Archives discovered among Madison’s papers the only known copy of the deliberations of that House Committee, and they are in Sherman’s handwriting, most likely reflecting the thoughts of the committee as opposed to his personal views.

This discovery has created a vigorous debate among legal scholars as to the meaning and legal effect of the Ninth Amendment, the text of which reads, “The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people”: namely, what are the rights “retained by the people” referring to, and what legal effect do they have? To give context, it is essential to go back to Madison’s original draft regarding retained rights:

The exceptions here or elsewhere in the Constitution, made in favor of particular rights, shall not be so construed as to diminish the just importance of other rights retained by the people, or as to enlarge the powers delegated by the Constitution; but either as actual limitations of such powers, or as inserted merely for greater caution.

After the House committee’s debates and revisions, Sherman’s notes read:

The people have certain natural rights which are retained by them when they enter into society, such as the rights of conscience in matters of religion; of acquiring property; and of pursuing happiness and safety; of speaking, writing and publishing their sentiments with decency and freedom; of peaceably assembling to consult their common good, and of applying to government by petition or remonstrance for redress of grievances. Of these rights therefore they shall not be deprived by the government of the United States.

According to the Bill of Rights Institute, once the Bill of Rights was drafted, Sherman supported it, just as the people of Connecticut supported it.

Deborah Hommer is a history and philosophy enthusiast who gravitates toward natural law and natural rights. She founded the nonprofit ConstitutionalReflections (website under construction) with the purpose of educating others in the rich history of Western civilization.


The Righteous Revolutionary Thanksgiving ‘Oration’

In mid-1772, a British customs schooner, the HMS Gaspee, attempted to catch an American packet ship off the coast of Rhode Island. The Gaspee was led by one Lieutenant William Dudingston, hated among Rhode Islanders for his strict enforcement of the Navigation Acts. In the case of Dudingston, rather than “enforcement,” the locals might have used the word “harassment.”

During the chase, the Gaspee ran aground. Stuck on a sandbar, and hence vulnerable, the Gaspee became the target of a group of Providence men, many of them Sons of Liberty. Assembled via the town crier, the patriots rushed toward the ship. The crew attempted to resist, but it was no use. Dudingston himself was shot and wounded, and his ship was burned down to the waterline.

British authorities tried to get the colonial perpetrators of the “Gaspee Affair” extradited to England to stand trial for treason, but the government couldn’t figure out who they were. Even a large reward failed to produce the names. But the Affair had only just begun to run its course.

A Baptist minister named John Allen, a recent arrival from Britain, while preaching in Boston at the end of that year, invoked this incident in his December 3 sermon, entitled “An Oration on the Beauties of Liberty.” Subsequently printed as a pamphlet, this sermon became a best-seller throughout the colonies. The question Allen posed was: Do the Rhode Islanders who destroyed the Gaspee receive their Laws from England?

“O! Amazing!” Allen reflected. “I would be glad to know what right the King of England has to America. It cannot be an hereditary right…; it cannot be a parliamentary right that lies in Britain, not a victorious right, for the King of England never conquered America. Then he can have no more right to America than what the people have, by compact, invested him with, which is only a power to protect them and defend their rights, civil and religious; and to sign, seal, and confirm as their steward such laws as the people of America shall consent to.”

And if this be the case, Allen thundered, “then judge whether the King of England and his ministry are not the transgressors in this affair in sending armed Schooners to America to steal by power and sword the people’s property.”

The message was clear: The British king hadn’t inherited America as some personal property; he’d never conquered it, and the power of Britain’s Parliament lay in Britain, not in the colonies. Rhode Island’s people were free, their rights were enshrined in their charter, and their laws originated in their own assembly. Who, then, was the true aggressor?

Five ‘Observations’

Allen’s “Oration” was built around five observations.

First: that “a craving, absolute Prince, is a great distress to a people.”

Second: that when the three branches of government, “king, judges, and senates unite to destroy the rights of the people by a despotic power… the destruction of the people’s rights is near at hand.”

Third: that “an arbitrary despotic power in a prince, is the ruin of a nation, of the King, of the crown, and of the subjects,” and that neither the King of England nor the Parliament of England can “justly make any laws to oppress or defend the Americans” because “they are not the representatives of America.”

Fourth, Allen channeled his inner John Locke:

“THAT it is not rebellion, I declare it before GOD, the congregation, and all the world, and I would be glad if it reached the ears of every Briton, and every American; That it is no rebellion to oppose any king, ministry, or governor, that destroys by any violence or authority whatever, the rights of the people. Shall a man be deem’d a rebel that supports his own rights? It is the first law of nature, and he must be a rebel to GOD, to the laws of nature, and his own conscience, who will not do it.”

And fifth: “That when the rights and liberties of the people are destroyed, it is commonly by the mischievous design of some great man,” whom Allen wisely did not mention by name.

These were radical sentiments—and tens of thousands of Americans read them enthusiastically. According to American Founder John Adams, by mid-1773, patriots like James Otis Jr. were regularly reading Allen’s “Oration” to “large Circles of the common People.”

In his “Oration,” Allen insisted on something Americans must remember today:

“A right to the blessing of freedom, we do not receive from Kings, but from Heaven, as the breath of life and essence of our existence, and shall we not preserve it, as the beauty of our being? Do not the birds of the air expand their wings? the fish of the sea their fins? and the worm of the earth turn again when it is trod upon? And shall it be deem’d rebellion? Heaven forbid it! … It is no more rebellion, than it is to breathe.”

Dr. W. Kesler Jackson is a university professor of history. Known on YouTube as “The Nomadic Professor,” he offers online history courses featuring his signature on-location videos, filmed the world over, at NomadicProfessor.com


The Capitol’s Statue of Freedom

As I step outside the House chamber on the second floor of the Capitol, I guide my visitors halfway down the stairs outside, offering them a sweeping view of the Supreme Court building and the Library of Congress. That’s when I call their attention to something else altogether—the crowning achievement, literally, of the Capitol: the Statue of Freedom, perched atop the dome, solitary, magisterial.

It is perhaps the most recognizable feature of the Capitol, an iconic world image of liberty and government by the people. Peering into the distance nearly 300 feet above the East Front Plaza, the bronze statue is of epic dimensions, soaring almost 20 feet high and weighing about 15,000 pounds. Freedom is decked out in an elaborate headdress topped by an eagle head and feathers. Her flowing dress is cinched with a large brooch emblazoned with two letters: U.S. In her right hand, she clasps a sheathed sword, while the other clutches a laurel wreath of victory and a shield.

The Statue of Freedom perched atop the Capitol is something to behold and serves as a symbol of my stewardship as a member of Congress, which is why I selected that image of the Capitol dome to adorn my letterhead. This is what I want my constituents to see, to be reminded of, when I write to them.

The Statue of Freedom also symbolizes the personal quest for freedom of one man, Philip Reid, born into slavery in Charleston, South Carolina, in 1820. In one of the great ironies of American history, Reid, as a slave, was assigned the complex project of creating and placing one of the world’s most powerful symbols of freedom on the most visible building in our nation.

The Statue of Freedom atop the U.S. Capitol dome in Washington, D.C., on July 1, 2010. (Paul J. Richards/AFP via Getty Images)

I’ll admit, I didn’t know the story of Reid until after I became a member of Congress and got the lowdown on the history of the Capitol from those in the know. But once I heard about Reid’s remarkable story, I delved deeper, reading more about it online. I mention all this about Reid during my tours, and though not one of my visitors has ever known the story beforehand, they are surely glad to hear it. Slavery is a terrible stain on our history, but my guests are palpably proud of how far America has come since then.

The statue was commissioned in 1855. Thomas Crawford, an American sculptor, created the plaster model of the statue in Rome, Italy. After his death in 1857, his widow shipped the statue in six crates, and the model was assembled and placed in what is now Statuary Hall. The following year, Clark Mills, a self-taught sculptor, was given the task of casting Freedom. Mills started his business in South Carolina, where he purchased Reid for $1,200. Reid dismantled the model in the Capitol, cast the individual sections, and finally assembled and mounted the bronze sections atop the dome.

On April 16, 1862, as Reid supervised the creation of the statue’s massive bronze sections, Congress passed the District of Columbia Emancipation Act, freeing thousands of slaves living within the district. That included Reid. As a free man, he kept working for Clark Mills. At noon on December 2, 1863, under Reid’s supervision, the top section of the Statue of Freedom was raised and bolted on top of the Capitol dome.

Many of the experts with whom I have toured the Capitol offered various explanations for the direction the Statue of Freedom faces. Some say Freedom faces east because every morning she watches the sun rise on America with a new day of liberty for all. Others say she faces east because the primary entrance to the Capitol is on the east side, or because most residents of Washington, D.C., at the time lived on the east side. Yet others suggest she faces east because European settlers came from that direction.

The Statue of Freedom on top of the U.S. Capitol dome, silhouetted against the super moon on Jan. 20, 2019. (Brendan Smialowski/AFP via Getty Images)

As the foreman in the casting of the Statue of Freedom, Philip Reid stepped in when an Italian sculptor hired to assemble the five sections refused to do so unless granted a pay raise. It was Reid who figured out how the pieces were separated and put together. He was paid $1.25 a day, though his owner received those payments; only the Sunday wages were Reid’s own. Mills, the man who bought Reid, described him as “short in stature, in good health, not prepossessing in appearance, but smart in mind.”

Reid was a free man by the time the last piece of the Statue of Freedom was assembled in December 1863. He went on to become a respected businessman, identified in census records as a “plasterer.” While a plaque to Reid resides not at the Capitol but at National Harmony Memorial Park in Landover, Maryland, where his remains lie, his place in history—and on my tour—remains resolute.

Excerpted from the 2020 book, “Capitol of Freedom: Restoring American Greatness,” by Colorado Rep. Ken Buck.