If Our Brains Are So Active During Infancy, Why Don’t We Remember Anything From That Time?

iStock

If our brains are so active and developing during infancy, why don’t we remember anything from that time?

Fabian van den Berg:

Ah, infantile amnesia, as it’s better known. Weird, isn’t it? It’s a pretty universal phenomenon where people tend to have no memories from before the age of four-ish and very few memories from the ages of five to seven. What you say in the question is true: our brains are indeed developing very actively during that time, but they are still developing after age five as well.

The specifics aren’t known just yet. It’s tricky because memory itself is very complicated and there are swaths of unknowns that make it difficult to say for certain why we forget these early memories. This will be mostly about consensus and what can be supported with experiments.

(Image based on data from Rubin & Schulkind, 1997 [1])

I’ll skip the whole introduction to memory bit and state that we focus on the episodic/autobiographical memories only—events that happened to us in a certain place at a certain time. And we have two forgetting phases, the early one until about four years old, and a later one from about five to seven years old, where we have very few memories.

The first notion to go is that this is “just normal forgetting,” where it’s simply difficult to remember something from that long ago. This has been tested, and it was found that forgetting happens quite predictably, and that the early years show fewer memories than they should if it were just regular old forgetting.

This leaves us with infantile amnesia, where there are probably two large camps of explanations: One says that children simply lack the ability to remember and that we don’t have these memories because the ability to make them doesn’t develop until later. This is the late emergence of autobiographical memory category.

The second big camp is the disappearance of early memory category, which says that the memories are still there, but cannot be accessed. This is also where the language aspect plays a part, where language changes the way memories are encoded, making the more visual memories incompatible with the adult system.

Both of them are sort of right and sort of wrong; the reality likely lies somewhere in between. Children do have memories, we know they do, so it’s not like they cannot form new memories. It’s also not likely that the memories are still there, just inaccessible.

Children do remember differently. When adults recall, there is a who, what, where, when, why, and how. Kids can remember all of these too, but not as well as adults can. Some memories might only contain a who and a when (M1), some might have a how, where, and when (M3), but very few, if any, memories have all the elements. These elements are also not as tightly connected and elaborated.

Kids need to learn this; they need to learn what is important [and] how to build a narrative. Try talking to a child about their day: It will be very scripted [and] filled with meaningless details. They tell you about waking up, eating breakfast, going to school, coming home from school, etc. Almost instinctively an adult will start guiding the story, asking things like, “Who was there?” or “What did we do?”

It also helps quite a bit to be aware of your own self, something that doesn’t develop until about 18 months (give or take a few). Making an autobiographical memory is a bit easier if you can center it around yourself.

(Image from Bauer (2015) based on the Complementary Process Account [2])

This method of forming memories makes for weak memories, random spots of memories that are barely linked and sort of incomplete (lacking all the elements). Language acquisition can’t account for all that. Ever met a three-year-old? They can talk your ears off! So they definitely have language. Children make weak memories, but that alone doesn’t explain why those memories disappear; I’ll get there.

The brain is still growing, very plastic, and things are going on that would amaze you. Large structures in the brain are still specializing and changing, and the memory systems are part of that change. There’s a lot of biology involved, and I’ll spare you all the science-y sounding brain structures. The best way to see a memory is as a skeleton of elements, stored in a sort of web.

When you remember something, one of the elements is activated (by seeing something, smelling something, or any other kind of stimulus), and that activation travels through the web, activating all the other elements. Once they are all activated, the memory can be built, the blanks are filled in, and we “remember.”

This is all well and good in adults, but as you can imagine this requires an intact web. The weak childhood memories barely hung together as they were, and time is not generous to them. Biological changes can break the weak memories apart, leaving only small isolated elements that can no longer form a memory. New neurons are formed in the hippocampus, squeezing in between existing memories, breaking the pattern. New strategies, new knowledge, new skills—they all interfere with what and how we remember things. And all of that is happening very fast in the first years of our lives.

We forget because inefficient memories are created by inefficient cognitive systems and stored by inefficient structures. Early memories are weak, but strong enough to survive for some time. This is why children can still remember. Ask a four-year-old about something important that happened last year and chances are they will have a memory of it. Eventually those memories decay, much faster than normal forgetting would predict, resulting in infantile amnesia by the time the brain matures.

It’s not that children cannot make memories, and it’s not that the memories are inaccessible. It’s a little bit of both, where the brain grows and changes the way it stores and retrieves memories, and where old memories decay faster due to biological changes.

All that plasticity, all that development, is part of why you forget. Which makes you wonder what might happen if we reactivate neurogenesis and allow the brain to be that plastic in adults, huh? Might heal brain damage, with permanent amnesia as a side-effect ... who knows!

Footnotes

[1] Rubin, D. C., & Schulkind, M. D. (1997). Distribution of important and word-cued autobiographical memories in 20-, 35-, and 70-year-old adults. Psychology and Aging.

[2] Bauer, P. J. (2015). A complementary processes account of the development of childhood amnesia and a personal past. Psychological review, 122(2), 204.

This post originally appeared on Quora.

Why Do We Eat Candy on Halloween?

Jupiterimages/iStock via Getty Images

On October 31, hordes of children armed with jack-o'-lantern-shaped buckets and pillowcases will take to the streets in search of sugar. Trick-or-treating for candy is synonymous with Halloween, but the tradition had to go through a centuries-long evolution to arrive where it is today. So how did the holiday become an opportunity for kids to get free sweets? You can blame pagans, Catholics, and candy companies.

Historians agree that a Celtic autumn festival called Samhain was the precursor to modern Halloween. Samhain was a time to celebrate the last harvest of the year and the approach of the winter season. It was also a festival for honoring the dead. One way the Celts may have appeased the spirits they believed still walked the Earth was by leaving treats on their doorsteps.

When Catholicism spread to Ireland around the 5th century CE, many pagan holidays were rebranded to fit the new religion. November 1 became the feasts of All Saints and All Souls, and the day before it was dubbed “All-Hallows'-Eve.” The new holidays looked a lot different from the original Celtic festival, but many traditions stuck around, including the practice of honoring the dead with food. The food of choice for Christians became “soul cakes,” small pastries usually baked with expensive ingredients and spices like currants and saffron.

Instead of leaving them outside for passing ghosts, soul cakes were distributed to beggars who went door-to-door promising to pray for the souls of the deceased in exchange for something to eat. Sometimes they wore costumes to honor the saints—something pagans originally did to avoid being harassed by evil spirits. The ritual, known as souling, is believed to have planted the seeds for modern-day trick-or-treating.

Souling didn't survive the holiday's migration from Europe to the United States. In America, the first Halloween celebrations were a way to mark the end-of-year harvest season, and the food that was served mainly consisted of homemade seasonal treats like caramel apples and mixed nuts. There were no soul cakes—or candies, for that matter—to be found.

It wasn't until the 1950s that trick-or-treating gained popularity in the U.S. Following the Great Depression and World War II, the suburbs were booming, and people were looking for excuses to have fun and get to know their neighbors. The old practice of souling was resurrected and made into an excuse for kids to dress up in costumes and roam their neighborhoods. Common trick-or-treat offerings included nuts, coins, and homemade baked goods ("treats" that most kids would turn their noses up at today).

That changed when the candy companies got their hands on the holiday. They had already convinced consumers that they needed candy on Christmas and Easter, and they were looking for an equally lucrative opportunity to market candy in the fall. The new practice of trick-or-treating was almost too good to be true. Manufacturers downsized candies into smaller, bite-sized packages and began marketing them as treats for Halloween. Adults were grateful to have a convenient alternative to baking, kids loved the sweet treats, and the candy companies made billions.

Today, it's hard to imagine Halloween without Skittles, chocolate bars, and the perennial candy corn debates. But when you're digging through a bag or bowl of Halloween candy this October, remember that you could have been eating soul cakes instead.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.

What's the Difference Between Cement and Concrete?

Vladimir Kokorin/iStock via Getty Images

Picture yourself walking down a city block. The sidewalk you follow may be obscured by shuffling feet and discarded gum, but it’s clearly made from something hard, smooth, and gray. What may be less clear is the proper name for that material: Is it concrete or cement? Is there even a real difference between the two words?

Though they’re often used interchangeably, concrete and cement describe different yet related elements of the blocks, flooring, and walls that make up many everyday structures. In simple terms, concrete is the name of the gray, gritty building material used in construction, and cement is an ingredient used in concrete.

Cement is a dry powder mixture that looks much different from the wet stuff poured out of so-called cement trucks. It’s made from minerals that have been crushed up and mixed together. Exactly which minerals it’s made from varies: Limestone and clay are commonly used today, but anything from seashells to volcanic ash is suitable. After the ingredients are mixed together the first time, they’re fired in a kiln at 2642°F (about 1450°C) to form strong new compounds, then cooled, crushed, and combined again.

Cement
lior2/iStock via Getty Images

This mixture is useless on its own. Before it’s ready to be used in construction projects, the cement must be mixed with water and an aggregate, such as sand, to form a moldable paste. This substance is known as concrete. It fills whatever mold it’s poured into and quickly hardens into a solid, rock-like form, which is partly why it’s become the most widely used building material on Earth.

So whether you’re etching your initials into a wet sidewalk slab, power-hosing your back patio, or admiring some Brutalist architecture, you’re dealing with concrete. But if you ever happen to be handling a chalky gray powder that hasn’t been mixed with water, cement is the correct label to use.

Have you got a Big Question you'd like us to answer? If so, let us know by emailing us at bigquestions@mentalfloss.com.
