Disengaged Students, Part 16: Too Much Parental Concern, But Not About Education

In this 20-part series, I explore the root causes and effects of academic disengagement in K-12 learners and examine the factors driving American society ever closer to being a nation that lacks intellectualism, or the pursuit of knowledge for knowledge’s sake.

Today’s parents are hyper-aware that parental actions in the formative years can lead to problems in later life, and widespread disagreement about which actions are harmful creates great stress and confusion. Raising children safely to adulthood is no longer sufficient to earn the label of a “good” parent. A glance at any handful of parenting blogs shows the self-conscious nature of the job. Moms and dads debate the repercussions of breastfeeding versus formula feeding, or of working full-time versus staying home with children (or some hybrid of the two). They speculate about how many extracurricular activities are healthy and how many will ruin kids forever. Instead of parents defending their own views on parenting, there is a lot of hand-wringing and self-doubt that screams for reassurance in the comment section.

Parents today worry about their actions and daily choices far more than new parents did just ten years ago. While it is admirable that parents take so much interest in how they interact with their kids, much of this worry is not wisely focused. It seems that in the debate over the long-term life effects of wearing organic clothing or the impact of allowing children to buy school lunches, a more important issue is being neglected: the involvement of parents in early childhood education and in the years that follow.

The Irrationality of the Vaccination Debate

Perhaps the hottest of hot-button issues in contemporary parenting is vaccination. The average child will have 30 immunizations or booster shots by the time she enters Kindergarten, not counting the recommended yearly influenza shot. The dramatic rise in vaccinations since 1980, when immunizations were offered for only seven known diseases, is a direct result of improvements in modern medicine that make it possible to ward off other illnesses. Chicken pox, once considered an accepted childhood rite of passage, is no longer a concern because children are now routinely vaccinated against it as toddlers. In fact, the Centers for Disease Control and Prevention recommends that children today be vaccinated against 16 preventable diseases throughout childhood. Instead of praising these marvels of modern medicine, though, many parents are questioning them.

On its website, the CDC addresses the most common questions asked by parents who are concerned about the adverse effects of vaccination on the littlest family members:

Why are vaccines given at such a young age? Wouldn’t it be safer to wait? Can so many vaccines, given so early in life, overwhelm a child’s immune system… so it does not function correctly?

Each question is answered with scientific data that validates the CDC’s recommendations and warns against delaying or skipping vaccines, noting that the only measurable effect of such actions is an increase in cases of the very childhood diseases the vaccines were created to prevent. Though childhood diseases, some life-threatening or at the very least life-altering, have been largely vanquished, vaccinations fuel the contemporary parenting paranoia.

But Vaccines Cause Autism, Right?

The most widely held concern about vaccinations is that they may cause autism and autism-spectrum disorders. The parents who believe this cite the rise in autism cases, up 78 percent since 2000. It’s a convenient argument, really, and one that releases parents from any responsibility for a rising problem that has no cut-and-dried answer yet. It is easy to blame vaccines and to feel a false sense of protection when asking doctors to delay giving them or refusing them altogether. These same parents may also believe that the hours children spend in front of a television or computer screen as infants add to the propensity for autism, or they may dismiss that theory altogether because the vaccine answer “makes the most sense.”

Scientists have found no direct link between vaccines, or their timing, and the rise in autism. In research on autism-spectrum disorders, scientists have learned that the brains of affected people are shaped differently from those of their neuro-typical peers. Genetics is believed to be a factor, particularly since some family trees show a pattern of autism or other related issues. Some researchers will concede that there may be “triggers” that usher in the disorder, though these experts are quick to point out that this only happens in people already prone to it. Leading autism research does not place any blame on recommended vaccines after children are born, but does argue that environmental causes, particularly during pregnancy, could contribute.

So if the people at the forefront of autism research place no blame on immunizations, why is the theory so prevalent? Actress and comedian Jenny McCarthy subscribes to the vaccination-induced-autism theory as it relates to her son, and she has been widely publicized for speaking out. Perhaps it is easier for parents, angry and looking for somewhere to lay blame, to read a popular book written by McCarthy than to delve into research by scientists whose names they have never heard. As in other facets of American life, it is simpler to jump on the nearest bandwagon than to take time to hear all sides of an issue.

Organic Foods, Unsafe Car Seats

The vaccination debate is just one example of how parents are quick to succumb to paranoia and slow to research facts. The same tendency appears in parents who buy food with the word “organic” stamped on the side but never actually read the ingredients on the label or investigate how the food was produced. Parents worry about the negative effects of vaccines, processed foods and cyber-creeps stalking their children online. Yet, despite the fact that automobile accidents account for the largest percentage of accidental child deaths, the site SeatCheck.org reports that 70 percent of parents and caregivers improperly restrain children in car seats. Parents invest great passion in warding off perceived threats and spend far less energy on dealing with real dangers. It is easier to make decisions based on ideology than on reality.

This trend of paranoia among parents is dangerous in physical terms but also in less measurable ways. Children who learn to cling to ideas without proof are less likely to seek out true answers academically. It is enough for these children to simply memorize what is placed in front of them, without any real questioning, because they have learned in their pre-K years to accept ideas automatically rather than examining them critically.

Some parents may mistake this trait for tolerance or sensitivity. But unless they demand substantive understanding and seek actual truth, children have no ownership of knowledge. What’s worse, they are indifferent to the distinction between reality and groundless ideas. Educators are tasked with awakening students’ desire to learn the why and how of things, not simply to accept what is put in front of them. Parents who are not vigilant about seeking their own truths at home make the intellectual pursuit of knowledge more of a challenge for K-12 educators – and play into the patterns of irrational thought.

Disengaged Students, Part 15: Careerism vs. Intellectualism in K-12 Education

Parents want what is best for their kids and most will say that they just want them to be “happy.” Children’s education is important to parents, but so is the promise that they will get a job someday. Well-rounded approaches to education are not favored as strongly as focused learning programs that emphasize job skills and applications. Academic engagement, therefore, has been weakened by the belief that children should only go to school to learn marketable skills.

This is nothing new, particularly in American culture, as schools have long been viewed as vehicles for job-readiness. A student reading books from the traditional literary canon, whose authors are often referred to as “dead white guys,” is not truly preparing for a career or a way to make a living. The lessons gleaned from the words of Shakespeare or William Wordsworth do not have a marketable application – unless, of course, their reader ends up a scholar of either author. The importance of those lessons, therefore, is diminished in the eyes of the general public (including some teachers and administrators), who believe that students should instead focus on what will be used in making a living. In other words, if it won’t help your lifetime earning potential, what good is it?

The Emphasis on Economics

The mission statement for the Common Core State Standards, released in 2010, includes this phrase: “The standards are designed to be robust and relevant to the real world, reflecting the knowledge and skills that our young people need for success in college and careers.” There is also a sentence about the standards strengthening Americans’ place in the “global economy.” There is no mention of intellectualism or education for knowledge’s sake. The Common Core Standards target practical reasons for the knowledge attained in K-12 learning, lending support to the idea that everything taught must have a marketable application.

This emphasis on career-driven learning furthers the stereotype that people who seek knowledge for the sake of simply knowing more are “eggheads” or “nerds.” Has the recitation of poetry ever landed a person a promotion? When was the last time that an understanding of the satirical works of Jonathan Swift earned a person that sought-after raise at an annual review? People who waste time with wisdom that they cannot sell are seen as un-American, elitist and abnormal.

This push toward teaching skills as opposed to content has long been a part of American culture, but today it is exacerbated by the Internet. Schools do not have to be the places where students find classic works of literature, or are introduced to scientific theories, because all of that information is readily available with the click of a mouse. Schools, therefore, should present as much knowledge as possible but should focus more narrowly on skills not easily attained through a search engine – or so the common belief goes. While it is true that a student who is savvy at finding and making the most of information will fare well in the current K-12 system, and in the workforce beyond, the reduction of knowledge and understanding to the finding and repetition of facts is anti-intellectual in nature.

Gaining an Advantage through Delayed Education

At an alarming rate, parents are voluntarily “redshirting” children who are old enough for Kindergarten, citing social concerns or, even worse, the desire for an athletic advantage in the years to come. A report from Stanford University and the University of Virginia found that as many as 5.5 percent of children begin Kindergarten late as a result of parent preference. Academic skills that lag behind those of peers have always been a valid reason for parents or educators to hold children back a grade, but should factors beyond actual learning achievement also be considered?

Proponents of parental choice when it comes to Kindergarten redshirting say that while academic merit can be measured, emotional impacts cannot. The separation anxiety that accompanies a child who goes to Kindergarten “too early” can have a negative impact for the rest of the student’s K-12 career, and beyond. These are certainly strong points and, in some cases, valid ones. But when redshirting tactics become popular and are packaged as a choice for all children, academic engagement takes a hit. Schools are seen as arenas of socialization first and foremost, and the idea that K-12 education should be just that – education – is forgotten.

Children are inherently social beings, and humans naturally learn to adapt to their individual situations in practical ways. So why should schools need to teach either skill? Academic disengagement happens when students do not value the learning that is offered to them. When adults tell children that knowledge is worthwhile only when it helps the learner enter a higher income bracket or gain promotions, children begin to undervalue the other key aspects of their education.

So which is the more responsible approach? Encouraging the pursuit of all knowledge – or building a brighter economic future for our kids and the nation as a whole?

Disengaged Students, Part 14: Educational Technology – Intellectual or Anti?

Technology penetrates every aspect of society – even our K-12 classrooms. Knowledge today is delivered on tablets, computer screens and in-class projectors. Does all the flash and glamor of the fancy gear take away from the basic pursuit of knowledge, though?

It Starts in Infancy

Early childhood educational technology targets children from infancy and makes it easier for parents to feel good about using media in the early childhood years. Television programs and videos claim to offer the correct answer to the parent’s question “What should my baby be learning?” Since such programming is developed by experts who certainly know more than the average parent about child development, these marketing ploys are accepted. Programs for infants are promoted as safe in small doses, as long as parents watch them with their little ones and participate too. Instead of reading books aloud, parents put children on their laps and spend half an hour clapping along to classical music and gazing at bright, swirling colors on a screen.

This contrived form of “bonding” replaces tangible activities like rolling around on the floor, naming objects in the home or letting a baby turn the pages of a sturdy board book. The American Academy of Pediatrics warns that children under the age of 2 should have no screen time at all, but parents adjust the recommendation to fit their own family unit and routine, telling themselves that the AAP warnings are for “other families” who use television or other media as a babysitter, not families like their own that use it as a form of early education.

Once the two-year mark is passed, it seems that children face a no-holds-barred attitude when it comes to television watching. A University of Michigan study found that television viewing among young children is at an eight-year high. Children between the ages of 2 and 5 watch an average of 32 hours of television every week between regular programs, videos, and programming available through gaming consoles. It is not the actual television shows that are harmful; in fact, a study in the Journal of the American Medical Association found that some educational television viewed between the ages of 3 and 5 improves reading skills. It is the overuse of television and technology, and the underuse of basic learning activities like reading a book or playing with a ball, that lends itself to academic disengagement in the school years.

How Technology Warps the Learning Process

Even more active technological engagement, like using a computer or tablet for toddler learning activities, can foster academic disengagement by making the learning process entirely too easy. If a two-year-old child learns that the answer is always the touch of a screen away, how can the same child be expected to search for answers or show his work in his K-12 career? What parents today view as learning improvements are actually modern conveniences that devalue the pursuit of knowledge.

Though the eagerness to let technology replace traditional early childhood learning methods presents large-scale problems, the intent of the parents using that technology is often benign. Why not give children a head start on learning ABCs, colors and numbers that are easily taught through repetitious technology applications? Parents are not deliberately leading their pre-K offspring down the road of academic disengagement or anti-intellectualism for life, but when they allow technology to define early childhood learning, they sow the seeds of both problems. Questions that cannot be answered within a simple application format become too difficult, or too bothersome, for children to try to sort out later on.

Educators have not yet come to grips with the issue of parental dependence on technology. The first children who have had access to mobile applications from infancy are just beginning their K-12 careers and will likely see some of that technology made available in their classrooms. How will these children react when they are given a book to read, or when they receive a returned, marked-up math worksheet that requires editing by hand? Will these children scoff at the idea of non-digital requests, or handle them graciously as part of the learning process?

As with any technological progress in classrooms, mobile technology certainly has its positive place, but educators (and parents before them) should also be asking what is being replaced – and how much of K-12 learning should be delegated to technology. Dependency on technology, particularly in relation to educational goals, is planted by parents (often unknowingly) and contributes to academic disengagement by making digitally enhanced learning too convenient and traditional learning pursuits too “boring.”

 

Disengaged Students, Part 13: Athletes as Heroes

The worship of celebrities and athletes in contemporary culture perpetuates the devaluation of intellectual thought. Particularly when it comes to impressionable children, the way certain Americans (and not others, often equally worthy) are placed on pedestals steals some validity from intellectual pursuits.

The Legend that Michael Built

Those learning to become teachers today may not quite remember the phrase “Be Like Mike,” which had a catchy accompanying song and was all the rage during the height of the Chicago Bulls’ three-peat championship runs in the 1990s. You couldn’t turn on a television in the mid-90s without catching a Gatorade, Chevrolet or Fruit of the Loom commercial starring Michael Jordan. Chicago’s United Center is often called “the house that Michael built,” and the National Basketball Association’s total television revenue grew by $877 million over the course of Jordan’s career. Jordan was a goldmine for the products he endorsed in part because of his prowess on the court, but also because people really liked him.

The story of Jordan getting cut from his high school basketball team and then rising to become the best basketball player of all time, with six championship rings to prove it, resonated with ordinary people who had faced their own adversity. He was a symbol of how the great American characteristics of hard work, determination and rugged individualism could combine to create a superstar. That his talent on the court was incredible goes without saying. But the deep adoration he enjoyed from fans around the world ran deeper than the points he tallied on the scoreboard or the number of times in a season he earned the elusive “triple double.” People did not just want to BE like Mike. People liked Mike.

In a 2013 column that asks the question “Do I Still Want to Be Like Mike?” writer Kelly Scaletta takes a retrospective look at the absurdity of an entire generation putting Jordan on a pedestal. Gambling addiction, paternity lawsuits and what was, at the time, the most expensive divorce in history (complete with millions in hush money to a mistress) all point to a man who may not quite have been a model of integrity for his own kids, let alone everyone else’s. Even during his Hall of Fame speech, Jordan showed a petty, narcissistic side by devoting much of his podium time to criticizing his naysayers and airing old grudges. Scaletta admits that Jordan was and is a great basketball player, but points out that his “character flaws are many.” He sums up the Jordan idolization, indicative of a much larger cultural norm, by saying:

Jordan is fascinating in that he’s a perfect example of what the problem has become with the way we view the American athlete. We demand them to be more than what they are, then we despise them for being what they aren’t. If they win, we forgive them for not being what they never weren’t.

Athletes Behaving Badly

Jordan is not the only professional athlete who has been adopted as a role model because of his professional prowess in spite of glaring character flaws. Look at the narcissism of current NBA star Dwight Howard, who reportedly has fathered six children by several different mothers. Howard’s outspoken, non-team-player attitude is at least more straightforward than the carefully polished image fostered during Jordan’s golden years. The news that New England Patriots player Aaron Hernandez was arrested for a shooting death, and suspected in several others, sent shockwaves through the sports world and beyond. As more details emerged on the violent past of the former Florida Gator, who played alongside devout Christian and sometimes-missionary Tim Tebow, other players, coaches and friends of Hernandez suddenly remembered his sketchy and questionable past actions. Maybe he should not have been given a pass in life simply because he was an effective blocker. Maybe his athletic ability should not have overshadowed the basic tenets of his character.

And who could forget the sex scandal surrounding golf megastar Tiger Woods? A fixture of the golf world from his teenage years, Woods passed up the traditional adolescent rites of passage that accompany high school graduations, college party weekends and post-college apartment living for the privileges his athletic ability afforded. After his marriage to model Elin Nordegren in 2004, Woods’ image as a clean-cut young man was transformed into that of an ideal family man. Though Woods never asked to be the poster child, or man, for young golfers, young black athletes or young husbands, he became a role model in all three arenas. When his shiny gold star was scratched, he inadvertently disappointed millions and then had to apologize for it. Woods was forced to say “I’m sorry” for not being what he never wasn’t.

Stars like Woods or Jordan or Hernandez are not to blame for the American cult of unthinking hero-worship. Sure, they could act a little nicer, and not cheat on their wives, and not commit violent acts, but the core problem does not lie in their actions – it lies in Americans’ blind faith where athleticism is concerned. The same is true of celebrities, who, it can be argued, have even less admirable talents than athletes. When was the last time you heard a child name an astronaut or a leading genetic researcher as his or her role model? Where is the love for philosophers, and authors, and Nobel Prize winners? The greatest problem is not the athletes themselves, but everyone else who worships them as more than what they truly are: flawed human beings.

There is something to be said, of course, for realistic role models who are viewed as humans and not gods. But if we, as a culture, expect our children to develop intellectualism and rationalism, then we need to value critical thinking and learning over physical ability or attractiveness.

Which is more admired: a kid who earns a college scholarship because of what he knows, or one who earns it to bring the school a football championship? We cannot change the values of Americans overnight and have no control over what they view at home or in the media. Educators have the special responsibility of making an impact on students’ lives, and their desire for intellectual pursuits, in a very limited span of time. It’s an uphill battle but one that is better fought with recognition of the current state of stars as role models and its impact on K-12 academic disengagement.

 

Disengaged Students, Part 12: A Call to Revive Middlebrow Culture

In her book The Age of American Unreason, Susan Jacoby discusses “middlebrow” culture, the product of the early- and mid-20th-century push for middle-class people to attain higher levels of sophistication through easily accessible literature, art and music. As the name implies, middlebrow leanings are an intermediary between highbrow and lowbrow cultures.

Middlebrow pursuits, such as purchasing expensive encyclopedias on installment plans or hanging reproductions of famous paintings by artists like Van Gogh, were scoffed at by people who believed themselves to be of a higher class. Even Jacoby pokes some fun at her own modest upbringing, in which her parents bought into some aspects of middlebrow culture in an effort to raise the sophistication level of her family. She remembers seeing reproductions of famous works of art on her walls and listening to classical music on public radio broadcasts. At the time, she did not realize that her family could be the target of snickering for their attempts to attain higher levels of culture through mass production and consumption.

Where middlebrow culture went right, however, is in its intent. People without much money or access to higher levels of education took it upon themselves to broaden their world views with whatever resources they could afford. Parents in post-World War II America wanted their children to aspire to greater heights, and not just in the economic sense. They wanted their children to know more about the world and have high levels of intellect. For these parents, a heightened level of critical thinking meant an opportunity to improve one’s social, economic and intellectual standing. In other words, the development of the intellect came first and the desired consequences of that development came in second place. There were no shortcuts to ascending classes; knowledge was the key to making the jump.

Where Have Middlebrow Tendencies Gone?

Today’s parents could take a cue from middlebrow aspirations, and so could the surrounding society. People who rise quickly to wealth are idolized, whether they are athletes or entrepreneurs like Facebook creator Mark Zuckerberg. Yes, Zuckerberg is an innovator who is highly regarded for his intelligence, but the emphasis on his monetary worth overshadows his actual accomplishments. Instead of examining how Zuckerberg used his intellect to develop a huge concept cleverly, our culture uses the Facebook story to promote the simplistic theory that all it takes is “one big idea” to achieve extreme wealth and prestige. That theory postulates that success is based on luck, fate, and being in the “right place at the right time.” This does not create a thirst for knowledge but a desire for dollars.

Society’s obsession with wealth as a determiner of success has a direct impact on K-12 education. So-called “practicality” in education simply encourages students to ask, “How much can I make with this knowledge?” The problem with the “practical” view of schools is that it focuses more on economic impact and less on knowledge and the love of intellectual pursuits. If one generation of students is allowed to graduate high school with only “real world” learning, the intellectual future of the entire nation is in peril. Thankfully, even with increasing pressure from parents and communities, educators still value at least some indirectly practical knowledge and so K-12 students still have some exposure.

Our students could benefit from a return to this seeking out of classical knowledge, rather than simply using a search engine to find the easiest answer. Without the effort required to seek out information, much of it is devalued – something that will affect not only this generation of students, but the next ones too.

Disengaged Students, Part 11: The Irrationality of Modern Politics

The anti-intellectualist tendencies in K-12 education in America have not arisen in a vacuum. Schools have always been profoundly affected by the fears and political divisions of the broader culture. This was true during the 1950s and early 1960s, when elementary students regularly cowered underneath their desks during mandated bomb drills. It was true when metal detectors first started appearing following high-profile incidents of school violence in the 1980s. It is true today as students are more connected than ever through use of mobile technology in the classroom, blurring the boundaries between school hours and leisure time. Schools cannot teach in isolation; they must recognize the impact of outside forces on the learning experience in order to have any hope of reaching students.

But a vital question remains: How much should educators alter their teaching to adjust to cultural forces, and how much should they hold fast to their preferred teaching methods in spite of those cultural forces, in this time when cultural perceptions are changing and worldviews evolving with unprecedented rapidity? Is academic disengagement actually fueled by the influence of modern society?

Tea Party Politics and the Self-Made Man Fallacy

Certain popular political movements profoundly influence the worldview and character of today’s students. Take the founding of the contemporary Tea Party movement around the time of the 2008 presidential election. At its core, the Tea Party calls for less “government interference” in the form of taxes, health care reform or other initiatives meant to serve the “greater good.” At the risk of oversimplifying, Tea Party members believe in self-made Americans who make their own money, keep their own money and should not be required to share any of that money with their fellow countrymen.

The problem with Tea Party philosophy, on an intellectual level, is that no American is fully self-made. The successful businessman who built an empire after growing up in conditions of poverty did that with at least some help from other Americans. Even if he never took advantage of any federal grant money for college, or any tax breaks when he was building his small business (unlikely in both cases, but certainly possible), that businessman enjoys his success in part because of the unique freedoms which America offers to its citizens. If this same hypothetical person and his business were set in another region of the world, say Ethiopia or Turkey, would he be as successful?

Without the privileges of American society, all earned by the sacrifices of other people, would that businessman have a shingle to hang at all?

Conservative Politics Versus Scientific Theory

It is not just government control that the Tea Party opposes.  Their base appears to believe that proven scientific facts are just another government-created scheme against religion and individual rights. Rick Perry, the governor of Texas, has repeatedly spoken out against the scientific theory of evolution and has also dismissed global warming facts as “fraudulent.” Mitt Romney expressed concern about manmade climate change when he was the governor of Massachusetts, but when he made his presidential run he rejected the notion that human actions could be behind the rise in global temperatures.

This change seems to have been made in order to appeal to Tea Party and other conservative constituents. A push by Newt Gingrich and fellow Republicans succeeded in eliminating the congressional Office of Technology Assessment in 1995, and censorship of government-led scientific research followed during the George W. Bush administration (Mazo, 2011). Research Fellow for the International Institute for Strategic Studies Jeffrey Mazo sums up Republican-led rejection of scientific theory with this faulty syllogism:

If the scientists are correct, it means certain actions should be taken. Those actions should not be taken; therefore, the scientists are not correct.

Those who completely reject scientific theory on philosophical or religious grounds act as though they do not need to have a different, better explanation for the observed and proven facts. They simply believe what they believe and refuse to ask certain questions. This closed-off acceptance of non-truths damages the entire American mindset and is a danger to the intellectual growth of children. Even secular public schools feel the pressure of parents who have ethical objections to what is taught in accepted textbooks.

How Politics Influence K-12 Learners

When children come into the classroom with a warning from home regarding the inaccuracy of the lesson at hand, it undercuts the rest of what is taught too. How can students be academically engaged with material that they believe is at odds with their own upbringing?

Of course, the Tea Party is not the only group to blame for academic disengagement and cultural anti-intellectualism. It is merely one influential source that manages to penetrate the walls of K-12 classrooms, encouraging students to question the validity of facts. If people are free to openly doubt accepted theory as adults, then children should have that same right. Or so goes the belief.

Openly and strongly expressed political convictions, liberal or conservative, are easy to notice and criticize, but what about more subtle aspects of American thought and behavior? Kids’ attitudes often reflect the small, mundane details of their everyday lives. A political protest is a dramatic sight, and certainly it could stand out in the memory of a child, potentially influencing their belief system. But the behaviors and attitudes children see in their homes and learn about in their own communities day after day, the assumptions that are taken for granted rather than dramatized, exert an influence that is all the more potent because it so often goes unexamined.

 

Disengaged Students, Part 10: Paranoia and the Pressure to be Anti-Intellectual

In this 20-part series, I explore the root causes and effects of academic disengagement in K-12 learners and examine the factors driving American society ever closer to being a nation that lacks intellectualism, or the pursuit of knowledge for knowledge’s sake.

It is not only traditional authoritative bodies, like government and religion, which have promoted anti-intellectual thought. Widespread beliefs fueled by the people can be just as dangerous to intellectual progress. Consider the paranoia that erupts when a particular group is associated with a crime, or with anything that threatens the American way of life.

Anti-Muslim thought ran rampant in the days following the 9/11 attacks in 2001, much as anti-Japanese sentiment was common, and even accepted, following the now-infamous Pearl Harbor attack of 1941. In both cases there is an argument to be made for the influence of media coverage on public perception, but outright discrimination was not sanctioned by official, authoritative bodies as a whole. People who chose to blame a larger demographic for the actions of a small group of offenders did so on their own, or as a result of peer pressure.

Paranoia and Irrational Thought

Much like the anti-intellectual activities of government and religion, witch hunts carried out by the common man have been around for centuries. Pearl Harbor and 9/11 took place in the 20th and 21st centuries, but the public reaction bears a striking resemblance to the anti-Irish sentiment that ran high in 19th-century America. Those outside Irish culture often assumed, based solely on ethnicity, that Irish immigrants were alcoholics only smart enough to land low-paying manual jobs.

Since many Irish Americans were also Catholic, initiatives like the Know Nothing Movement of the 1850s were created to keep all Catholics from holding public office. Given that some 75 million Americans now identify as Catholic, these discriminatory movements seem barbaric – even silly. It is important to note, however, that as recently as John F. Kennedy’s election to the presidency in 1960, resentment against Catholics in America was rampant, with mistrustful non-Catholics portraying Catholics as blind followers of the Pope.

This religious prejudice cut both ways. It was common practice for even barely devout Catholics of the 1940s, 1950s and beyond to shun establishments that did not carry the church’s stamp of approval. The YMCA and the Salvation Army, both considered fairly neutral religious organizations by the contemporary public, were off-limits during the middle of the 20th century for people who answered to the Pope, because they were not expressly Catholic. The church certainly influenced this movement, but Catholics who associated with the YMCA or the Salvation Army risked social suicide more than church expulsion. Peer pressure intensified religious division then, and it still does today.

Conversion to Mainstream Beliefs

Consider the recent controversy surrounding DOMA, the Defense of Marriage Act. The challenge to DOMA reached the Supreme Court on the shoulders of outspoken octogenarian Edith Windsor, who put a human face on a theoretical debate over the legal validity of same-sex marriages and the rights associated with that distinction. DOMA’s Supreme Court appearance, which began with a purely legal argument based on Windsor having to pay over $350,000 in estate taxes after the death of her partner, Thea Spyer, in 2009, inspired people to air their own beliefs about the sanctity of marriage and about who should be allowed to enjoy its privileges.

There were politicians and religious leaders who spoke up on both sides of the debate, but perhaps the most vocal and heated discourse came from the common people themselves, particularly through social media circles. For weeks before the SCOTUS ruling, people posted stories, photos, facts and other memes meant to influence their friends one way or another.

It mattered not that the case did not reference the Biblical definition of marriage, or any other religious documents; people hotly debated whether Jesus (who never married, according to his followers) would approve and whether an end to DOMA would spell the end of days for America. It mattered not that nothing in the lawsuit asked for churches to be forced to perform same-sex marriage rites or even to recognize the unions in their congregations. It mattered not that the lawsuit did not ask for the legal rights of heterosexual married couples to change even slightly. The common man turned the pending ruling into a religious and moral debate, whether for or against DOMA.

When the Supreme Court struck down the 1996 act in June of 2013, the Internet exploded with the pronouncements of ecstatic and disappointed people. Even after an official decision had been made at the federal level, arguments raged about what the ruling truly meant for the moral future of Americans. Both sides drew comparisons to the landmark Roe v. Wade decision without much regard for the fact that, though Roe was considered a “victory” by liberals, abortion rights remain on the perennial state legislature chopping block 40 years after the supposed triumph. The inability of either side to see the basic, rational grounds for overturning the law demonstrated a primitive side of American anti-intellectualism, one often guided by emotion at the cost of facts.

Social Acceptance and Anti-Intellectualism

Pressure to keep up with the Joneses, or at least to be considered “normal” by peers, is ubiquitous. People want to be accepted by others in their immediate surroundings – neighborhoods, workplaces, churches and family units. Sometimes it is simply easier to go with the popular thought than to speak out against it, even if a person has inner doubts. In this way, anti-intellectualism is often a result of the social pressures of everyday life. It is a byproduct of taking the easy way out, desperately seeking the acceptance of similar people.

The K-12 education system is often a victim of the common man’s anti-intellectualism. Throughout the U.S., educational standards vary by region and reflect the majority belief systems of particular geographic areas. This is evident in religious squabbles over displaying the Ten Commandments on school property, and in parental input on the number of hours children should spend at school in rural areas where they are needed for family income.

What people in a community believe has an impact on what is taught in its schools, and this can be detrimental to students. The power to determine what is valuable in educational settings should not be handed over to parents, or business owners, or even politicians, but the idea that the majority knows best overshadows many K-12 educational pursuits. When beliefs, religious or otherwise, interfere with educational pursuits, they should be re-examined for validity and for whether they impede the learning process. Fear is a powerful thing – so powerful that it gets in the way of knowledge, time and time again. So how can it be overcome to bring the next generation of K-12 learners back to intellectual pursuits?

Disengaged Students, Part 9: How Religion Can Discourage Rational Thought

In this 20-part series, I explore the root causes and effects of academic disengagement in K-12 learners and examine the factors driving American society ever closer to being a nation that lacks intellectualism, or the pursuit of knowledge for knowledge’s sake.

Americans with differing religious convictions disagree on the severity of the threat that religious control poses to intellectualism. For atheists, any person who believes in a non-tangible “god” and takes orders on how to live a life of faith from another mortal is simply being bamboozled. For those who incorporate faith into a larger life, and abide by the widely accepted virtues of kindness, charity and concern for their fellow human beings, religion poses no threat and is not overtly labeled as “dangerous” to free thinking. But which is more intellectual – a carefully chosen belief system of “faith without seeing,” or one that is grounded in observing physical proof?

Is it all a Cult?

To explore those answers more fully, let’s start with extreme instances of religious contributions to anti-intellectualism. Chief among them is the outright manipulation of people through fear or promise of reward, most often labeled a “cult” in American culture. Some experts claim that those who use these brainwashing techniques do not simply cast a wide net but target particularly vulnerable people. Malaysian professor and cult expert Dr. Ong Seng Huat has called these so-called cult religious leaders “charismatic” and notes that their victims are often people dealing with failure or depression.

In other words, people with mental instability are often the very same people who are quick to cling to a pre-determined path when it comes to spiritual matters. Author Peggy Riley calls the impulse of charismatic leaders to build utopian empires, however selfish, a “uniquely American” one. While high profile cults in the U.S. like the Branch Davidians of Waco standoff notoriety and the Church of Bible Understanding prove that there is a market for religious anti-intellectualism, extreme groups claiming divinity date much further back than American settlement.

In ancient Greek and Roman mythology, believers in the gods took strange and often dangerous actions to prove their faithfulness. Followers of the Roman god Mithras were put through life-threatening initiation ceremonies, and women who adhered to the Bacchic cult proved their devotion by running wild in the countryside. The act of “hero worship” celebrated humans who had proven their worthiness.  A part of the body of the deceased hero, like the head, was often put on display in a common area, and believers felt that it led to greater prosperity, safety and fertility. To show appreciation to the hero-god, followers would sacrifice an animal (often a ram) and allow the blood to flow to the remaining piece of the hero’s body. It was believed that through contact with the blood, these heroes were momentarily revived and could pass down advice and decrees to the faithful.

The ancient Greek and Roman world’s sacrificial rituals are surprisingly similar to those of Old Testament Jews, who placed their own sins on the backs of animals that were then led to slaughter – often called “sacrificial lambs.” This act of worship was later interpreted by Christians as symbolic of the transgressions that Jesus took upon his own body in death. The text of the Christian Old Testament, along with other supporting historical documents, records repeated offenses by the followers of Jehovah, who often stepped outside the confines of their religion to seek out other gods. According to these writings, great turmoil and misfortune fell upon those who chose a life outside the belief system set forth by the accepted god of the Jewish people. Even today, teachings about the dangers of idol worship, or putting anything before the Heavenly Father, are used as cautionary tales in even the most mainstream Christian churches.

Notable members of The Church of Jesus Christ of Latter-Day Saints, generally considered a non-cult religion in America, have spoken out about the discord between religion and rational thought. The Book of Mormon consistently criticizes people who seek answers outside the church, accusing such people of being prideful or elitist. Such people are seen as being tempted by Satan or adversarial forces, not simply seeking answers to normal questions of human nature. Therein lies the intellectual problem with religion in America, and not just with Christianity. If people are condemned for asking questions, how can they exercise their rationalism?

Is Atheism Intellectual?

Atheist thinkers like Richard Dawkins claim that a belief in any higher, unseen being is akin to faith in Santa Claus or the Easter Bunny. Theories of creationism are no longer acceptable in public schools, which teach evolutionary science instead – much to the anger of parents and religious groups who feel Darwinian theory is far from proven. People are quick to categorize themselves and others based on opposing theories of world origins, or theism versus atheism, or even one religious group versus another.

Susan Jacoby addresses this uniquely American approach to religion, and the ways in which it feeds anti-intellectualism, in her book The Age of American Unreason. She contends that the ability to create any religion with no outside government interference has led to numerous unsubstantiated sects whose followers display unquestioning enthusiasm and commitment. The government has no right to decide which spiritual philosophies are better than others, so long as their followers do not actually violate the law of the land, so religious beliefs run the gamut. Of course, Americans are free to see through any crackpot beliefs and point out their invalidity, but there are always at least a few who are duped. In this way, Jacoby says that intellectualism is not actually broadened by more religious choices but stunted.

Religion can be Well-Reasoned

Contemporary religion, therefore, is not inherently anti-intellectual. It is a choice, a lifestyle that has its place in millions of lives. A person who is informed about other belief systems, or has considered evidence from other perspectives, and who chooses to live in a particular faith is not a blind follower. That person is enlightened, and an argument could be made that he or she is more of an intellectual than another who denies all religious beliefs without investigating any.

It is when religion purports to trump new findings proven through scientific discovery – when it places itself as the end-all and be-all of human thought, despite evidence to the contrary – that it becomes a dangerous force against intellectual progress.

 

Disengaged Students, Part 8: Anti-Intellectualism in Government, Then and Now

In this 20-part series, I explore the root causes and effects of academic disengagement in K-12 learners and examine the factors driving American society ever closer to being a nation that lacks intellectualism, or the pursuit of knowledge for knowledge’s sake.

If the main purpose of intellectualism is to reveal truths, it is no wonder that governments throughout history have opposed free thinkers. Power attained through deceitful or heavy-handed tactics is always in jeopardy of exposure. In contemporary culture, conservative politicians are often viewed as being at odds with progressive ideas or rational thought. The truth is that all politicians at the extreme ends of the spectrum – and the ones in between – can feel threatened by an enlightened constituency. This is certainly the case if a politician has something to hide.

Ancient Anti-Intellectualism in Government

Think back to the days of the Egyptian Pharaoh in the time of Moses, in the mid-1400s B.C. There is debate as to the identity of this Pharaoh, who has been variously identified as Tuthmosis III, Amenhotep II or Apophis. There is more widespread agreement about the historicity of his decree that all Hebrew boys should be thrown into the river to drown. The Quran, Hebrew Bible and Christian Old Testament all document the mistreatment of the Hebrews by Pharaoh because of his own fear that the Jewish people would rise up and outnumber his army.

His plan demonstrated strategic genius. Rather than trying to kill off a strong group capable of self-defense, he targeted an easier demographic. This Pharaoh was planning for the long term; he could protect his own political agenda by wiping out the future generation of Hebrews. His motivation was certainly grounded in the fear of a physical uprising, but he was also waging an intangible war. The Jews were not only dangerous because of their growing population; they frightened the Egyptian ruler (or rulers) with their difference in beliefs and their claim to have the ear of the one true God. Rather than working toward a long-term plan for harmony between Jews and other Egyptians, it was easier to use brute force. Anti-intellectualism at its finest.

Contemporary Anti-Intellectualism in Government

More recently, government-fueled anti-intellectualism took hold in the USSR in the mid-20th century. In 1948, Lysenkoism, an agricultural doctrine developed by Trofim Lysenko, was adopted to dictate the practices of farms all over the country. This politically driven approach to science rejected Mendelian genetics in favor of the claim that acquired characteristics could be inherited, promising control over scientific outcomes. Meant to further Communist theory, Lysenkoism produced dismal results when it came to actual yields. The doctrine was officially abandoned, and denounced as pseudoscience, by 1964. Before it disappeared as a tenet of Soviet government and farming, however, many scientists who spoke out against Lysenkoism lost their livelihoods and even their lives. Those who did not agree with the narrow view of the government were punished, a process that discouraged intellectualism in this area and beyond.

These examples of persecution are just the sorts of things that Americans vehemently oppose, at least in theory. Government-issued science and genocide born of paranoia – things the founders of this nation took great strides to avoid – are seen as features of other cultures, other ways of life. Horror stories of young women facing government-sanctioned genital mutilation in African nations like Ethiopia are viewed as distant problems of the Third World. It comes as no surprise to Americans that anti-intellectualism flourishes in places that lack the resources, education and freedom of choice available within U.S. borders.

Is American Government Anti-Intellectual?

Perhaps it is the subtlety of the American brand of anti-intellectualism that is so dangerous. Blind belief in prescriptions that the people elected to the highest offices condense into eight-second sound bites is worrisome. Religious freedom is a foundational principle of American government and culture – yet lawsuits are still regularly filed because public schools punish students for declining to recite the Pledge of Allegiance or stand for the Star-Spangled Banner. Perhaps the lip service Americans pay to liberty of thought serves to disguise the anti-rationalism underneath.

It is true that in the capitalist U.S. a mandate like Lysenkoism would never stand, particularly since agriculture represents $100 billion in U.S. exports alone each year. It is also hard to imagine a U.S. government official declaring that all baby boys of a particular ethnic group be thrown into the nearest Great Lake. The American government would not be accepted if it acted as a vehicle of anti-intellectualism in outright ways. Free thought and the liberty to believe pretty much anything, no matter how virtuous or ridiculous, are the marks of a true American.

But can a democratic country that allows the voice of the majority to make decisions really achieve heightened states of intellectualism for the individual? Beginning with the public educational system, this is a question that Americans must answer if they are to avoid complete loss of rational thought in the future. While a level playing field, mandated by the government though not always carried out in practice, serves idealistic purposes, does it hurt intellectual progress of citizens? Through its democratic idealism, is America’s government actually the most anti-intellectual of them all?

 

Disengaged Students, Part 7: Too Much Information Access?

In this 20-part series, I explore the root causes and effects of academic disengagement in K-12 learners and examine the factors driving American society ever closer to being a nation that lacks intellectualism, or the pursuit of knowledge for knowledge’s sake.

It’s no secret that we are living in an information age, one that has lifted data barriers across the world and opened up access to knowledge as never before in the history of humankind. On the surface, this access appears to be a democratization of knowledge – a way for more people to learn about more things in the shortest amount of time. In reality, though, the Internet and all its interconnected technologies have given rise to less effort put toward the pursuit of knowledge, and more energy focused on simply finding the quickest, easiest answer.

Is Shared Knowledge Best?

This growing challenge to intellectualism in contemporary America is grounded in rapidly expanding access to information coupled with a complete lack of hierarchy based on expertise. Take sites like Wikipedia. Such Internet sensations are victories for crowd-sourced knowledge, which hypothetically offers more than one side to every argument, but they have bolstered the assumption that all knowledge is equal.

Wikipedia is known for allowing anyone, regardless of credentials, to post on its pages for the greater good of shared knowledge. Other sites are less forthcoming about the credentials of their contributors. Businesses clamoring to improve their search engine rankings commission writing that is disguised as expert information but is actually designed to pull consumers to their sites when a certain word or phrase is typed into a search bar. The writers are more likely to have expertise in sales copy than in the products and services they tout. Customer review sites give peers an idea of what to expect from a particular product or company. In an ideal world these would offer balanced feedback, but in fact they tend to weigh heavily on the negative side; it is human nature to warn others of danger, not to assure them that the path is safe.

More Info, Less Learning

While attempting to place more power at the fingertips of the people, the digital age is actually distorting the public’s sense of reality, blurring the lines between fact and fiction for many willfully ignorant participants. Just as the removal of limits on religious belief spawned many different denominations, some of which led followers completely astray, a limitless online community promotes misinformation on a regular basis. Even the information that is correct comes fraught with anti-intellectual challenges. Information once confined to textbooks, library visits or expensive encyclopedia sets is now just a click or a brush of a touchscreen away. A child who is given everything from birth and never has to work for any of his possessions will inherently devalue those items. In the same way, generations growing up with effortless Internet access devalue knowledge.

While no one would argue against the convenience and knowledge that the Internet has provided on a global scale, ongoing use of its information predecessors is necessary in order to preserve intellectualism. Sources must carry some weight of credibility if the youngest learners are to differentiate between well-researched, well-proven facts and the passionate ravings of people with no expertise or training on a particular subject.

If American children are to learn to think for themselves, they need more information than a search engine can supply, and they need tools with which to sort and evaluate the information they find. But what will make them want to take the long route to knowledge when there are so many convenient shortcuts? That is the question educators and parents must broach if there is to be any semblance of intellectualism when this generation graduates and starts contributing to American society.