The History of Baby Formula

We should count ourselves lucky to live in a time when infant formula is safe and nutritious enough to rely on if we need to give it to our babies.

While formula cannot replicate many of the components in breast milk, it has saved countless infant lives.

We are fortunate to live in a time when we have a breast milk substitute that, while not the same as breast milk, is an adequate alternative. Certainly, less than a century ago this was not the case.

The Birth of Formula

While breastfeeding was at one time the primary way to feed infants, breast milk substitutes were not a new concept in the 20th century, when formula's acceptance increased so drastically.

1858-1912

Babies who were unable to breastfeed in the 19th century primarily received straight cow milk as a substitute, with cereal and other solids introduced to older infants. Although evaporated milk was first marketed in 1858, its use was short-lived because of fears that it led to scurvy. Powdered formula was commercially available by the late 1800s, but it was financially out of reach for many families.


However, cow milk was not an ideal substitute: Major improvements in sanitation, dairying practices, and milk handling didn’t occur until 1900 — not to mention that sterile rubber nipples and the home icebox, precursor to the refrigerator, weren’t widely available until 1912.

1920s

Beginning in the 1920s, parents were directed to supplement their formula-fed infants with fruit juices to combat scurvy and cod liver oil to protect against rickets, two medical conditions caused by deficiencies of vitamins C and D, respectively. By the mid-1920s, formula prepared with evaporated milk dramatically improved upon the low digestibility and bacterial contamination of earlier breast milk substitutes.

1930s

In the 1930s, the common breast milk substitute was prepared at home using either evaporated milk or fresh cow milk mixed with water and corn syrup as a carbohydrate. The advantage of evaporated milk was its vitamin D fortification. In either instance, the infant was supplemented with orange juice.

1940s

Breastfeeding rates dropped quickly during the 1940s as women entered the workforce during World War II. After the war, aided by improved mass communication through print, radio, and television, formula was so successfully marketed as superior to breastfeeding that breastfeeding rates declined to near nonexistence. This continued for several decades.

1970s

Even as late as 1970, fewer than 25% of mothers in the United States were initiating breastfeeding, and of those who started, only 14% were still breastfeeding at 2 months. Nearly all infants were weaned, either to formula or cow milk, by 4-6 months. The loss of breastfeeding's benefits aside, formula use had its limitations.

The Development of Formula

In the 1950s, the medical community began to address several issues with infant formula use:

  • The tendency for infants to become dehydrated during illness;
  • Low content of iron resulting in high rates of iron deficiency;
  • Intestinal blood loss associated with fresh cow milk;
  • Low content of essential fatty acids and the continued problems with scurvy.

It was during this decade that concentrated liquid formula was developed. More affordable than powdered formula, it reigned until 1970, when use of powdered and ready-to-feed formulas increased alongside the renewed popularity of breastfeeding, since many breastfeeding mothers chose, or were advised by their healthcare providers, to supplement with formula.


Formula quality improved dramatically in the 1950s when a vitamin B6 deficiency among formula-fed infants was corrected. Iron-fortified formula was introduced in 1959, and in 1962, the formula base was changed to better match the protein ratio in breast milk.

By the late 1960s, fewer than 10% of infants were fed home-prepared formula, greatly improving nutrition and health among formula-fed infants.

In the 1970s, formula was further improved in response to chloride deficiency in formula-fed infants.

Still, iron deficiency among formula-fed infants remained common through 1980. It was in the 1980s that mothers began delaying the introduction of cow milk beyond 6 months and that cereal's iron fortification process was improved to increase absorption, both of which reduced iron deficiency among formula-fed infants.

That infants can develop an allergy to cow milk was known early on; in fact, a soy formula was first developed in 1929, though it regularly caused malodorous diarrhea and ulceration in the diaper area. A meat-based formula and a formula made with casein hydrolysate, a cow milk derivative, were also offered to sensitive infants. These early special formulas were not fortified with vitamins, because it was believed at the time that vitamin preparations could introduce allergens.

Not surprisingly, the medical community in the 1950s and 1960s reported several deficiencies in infants fed these special formulas, particularly of vitamin K and of iodine, the latter causing goiters. An improved, fortified soy formula was developed in the 1960s, and by the 1990s more than 20% of formula-fed infants were receiving soy formula.

Early Introduction to Solids

From at least the 1940s through the 1970s, most infants were weaned from formula to cow milk at 4-6 months of age, even though it was increasingly recognized that cow milk predisposed infants to dehydration during illness and led to iron deficiency. The thought was that cereal would mitigate these effects of cow milk consumption and that water was a suitable supplement during times of illness, with cow milk remaining the nutrition base.

The driving motivator for transitioning infants to cow milk so quickly was that cow milk was considerably less expensive than formula and required no preparation. It was also widely believed by parents that infants who could tolerate solids sooner were developing better — the so-called “early independence” factor.

In the 1960s, there was also a notion that whole cow milk would lead to obesity in infants, so some pediatricians recommended feeding skim milk instead. Infants subjected to this practice consumed enormous amounts of milk yet lost their skin folds, a sign that they were drawing on their own body fat stores to stave off starvation.

Whereas solid foods were introduced at 4-6 months in the 1930s, pediatricians in the 1950s were recommending solids by 2 months, with some advising cereal by 2-3 days and strained vegetables by 10 days! In the 1960s, the guideline was to feed cereals by 1 month old. The first baby foods developed at this time also had salt, monosodium glutamate (MSG), sugar, and modified food starches added to improve taste and texture to suit adult palates.

The 1970s brought great change in infant-feeding views when the practices of feeding skim milk and adding additives to baby foods were discontinued. In addition, cereal introduction was recommended to be delayed until 4 months, because earlier consumption was found to contribute to overeating habits. There was also research showing that iron deficiency during the first few years of life led to delayed cognitive development.

Comeback of a Breastfeeding Culture

The 1970s marked a turning point for infant feeding in other ways. This decade is when recommendations were first issued to defer introduction of cow milk to infants. This immediately resulted in infants being fed formula longer. In 1971, 20% of 6-month-old infants were formula-fed, whereas more than 50% of 6-month-old infants were on formula by 1980. The current recommendation that all infants are to either be breastfed or formula-fed until 12 months, when cow milk can be introduced, was issued in 1992.

At the same time, the U.S. Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) was providing free formula to low-income families, those most at risk of weaning infants to cow milk before 6 months of age. What started out as a very small anti-hunger program in the 1970s grew to be a major influence on infant feeding by the late 1980s. By the mid-1990s, 1.99 million infants were being served by WIC, representing about 47% of live births.

For years, WIC was considered only a program through which to receive free formula. Over time, however, WIC has become a major influence in improving breastfeeding rates among at-risk populations as well.

The 1970s saw an explosion of information about the components of breast milk as well as the non-nutritive benefits of breastfeeding. This spurred research as well as medical interest, leading to the creation of a new profession: lactation services in the form of lactation consultants, breastfeeding educators, and peer counselors. Support of breastfeeding, and the accompanying Attachment Parenting movement, has expanded ever since.

Today, the general recommendation is to breastfeed, or formula-feed if breast milk is not available, exclusively for at least 6 months before introducing solids and to continue breastfeeding or feeding formula until 12 months before introducing cow milk. Breastfeeding may continue beyond the first birthday for as long as is mutually desired by mother and baby, and the World Health Organization recommends breastfeeding for at least 2 years. Research on the benefits of breastfeeding infants is broad, while research on breastfeeding toddlers is growing.

According to the latest U.S. Breastfeeding Report Card by the CDC, American breastfeeding rates are notable:

  • 81% of all infants begin life breastfeeding;
  • Skin-to-skin contact and rooming-in are both on the rise;
  • Support is increasing in the workplace as well as at child care centers; and
  • 51% of babies are still breastfed at 6 months and 30% at 12 months.

Formula is still a significant part of the picture, with 55% of breastfed infants receiving some formula at 3 months and 77% of breastfed infants getting some formula at 6 months.

Our knowledge of infant nutrition has improved greatly during the past century. Today, the medical community supports breastfeeding as well as healthy infant development, and the formulas available give mothers a safe, adequate breast milk substitute when one is needed.

While very few mothers are entirely unable to breastfeed, many still rely on formula to some extent, a trend that is slowly changing as breastfeeding support expands. There may always be babies who need a breast milk substitute. Fortunately, today's formula is a substitute worth falling back on.
