Our Cognitive Toolbox for Working with Reality

by James Wallace Harris

All too often we think we know but we don’t. Why do so many people argue with 100% certainty against others who feel equally convinced? Wisdom often tells us that the more we know, the more we realize we don’t know. Does that mean the person who claims to know nothing knows the most? Why is this reality so hard to understand? Even eyewitnesses are often fooled. And why is it so important to know thyself?

Reality is complex, but is it unknowable? Humans believe they are the crown of creation because all other animals are ignorant of their own existence. Is our sentience really a quantum leap over all other life forms on this planet? If we compare ourselves to an amoeba, an ant, or a cat, we can see that awareness of reality has slowly grown more complex, with each of those animals perceiving a larger portion of reality. Does that mean we see everything in reality, or are we just as blind to a much larger reality?

I believe we’ve evolved a number of cognitive tools to analyze reality, but it’s important to know the effectiveness of each.

First-Hand Experience. Common thought claims we have five senses for perceiving reality, but we actually have many more. People often believe seeing and hearing things for themselves is a primary source of knowledge. However, our senses can deceive us. Take, for example, the police officer in Texas who shot a man in his own apartment because she mistook it for hers and assumed he was a burglar. Just pay attention to how often eyewitness accounts fail. Or better yet, recall all the times your senses have fooled you.

Instinct and Intuition. Our genes and unconscious mind direct us to act without thinking. Many people prefer to go by gut reaction rather than think things through. But how often does gut reaction tell us to kill or take what we want?

Language. Breaking reality down into pieces and giving each part a name goes a long way toward gaining useful insight. But language is imprecise, and the parts of reality are many. People who know the different names for trees have a greater understanding than the person who only knows the word tree. Language has evolved tremendously, giving us one of our best tools. Pay attention to how words help you perceive how reality works, and observe how people with lesser or better language skills fare compared to you.

Word of Mouth. We learn from other people’s observations. When we were hunters and gatherers, hearing scouts describe where animals could be hunted was vital. On the other hand, if a seafarer told you about mermaids, you ended up believing in an unreal being. Word of mouth is very unreliable. Remember the kindergarten game of Telephone? Word of mouth evolved into journalism, and we know how reliable that can be. Word of mouth has always had a fake news problem. Gossip, innuendo, and slander are also its descendants.

Counting and Measuring. Simple arithmetic became a tool that let us invent, build, grow crops, trade, and develop an economy. Counting and measuring evolved into mathematics.

Mysticism. Mystics are people who claim to acquire knowledge from a higher source. They became shamans and seers who influenced other people. They also speculated about how reality worked, inventing higher beings. Even today many people still give credence to mystical insight. However, mystical insight has produced an infinite variety of conflicting information, so we have to assume it’s all suspect. Mysticism aims to be a first-person experience of the divine.

Religion. Religion is codified mystical insight that is retaught as the truth. Religion allowed us to create very complex social structures. However, its truth is suspect. If there are a thousand gods, most followers are atheists to 999 of them. Religion succeeds in creating artificial realities that may or may not interface well with actual reality. Religion spreads best through word of mouth.

Laws. Laws are an external tool to encourage consistent thinking. Religious laws attempt to force mystical insights onto a population. Secular laws attempt to get people to work together.

History. If you study the Old Testament you’ll see it’s more a history of a people than spiritual instruction. We have always tried to remember the past to explain how we got here. Early histories were no better than word-of-mouth stories that could be highly inaccurate. And each succeeding generation of historians alters the histories. A good example is the New Testament. Whoever Jesus was, and whatever he taught, has been constantly changed by each new writer of the New Testament. It appears the historical Jesus advocated communal living and sharing that today would be called communistic. The historical Jesus was concerned with creating heaven on Earth. It was later writers who gave him superpowers and turned him into God. Studying the history of Christianity is an excellent way to understand how history constantly mutates. History is a worthy way of understanding reality, but it has to be tempered by comparing multiple histories.

Philosophy. Where religion taught that knowledge came from God or other spiritual authorities, philosophy teaches us we can figure things out for ourselves. Using rhetoric, logic, and mathematics, men and women observe reality and deduce what’s going on. This was a great paradigm shift away from religion. However, like the game Mastermind, it leads to a lot of false assumptions. Elaborate castles of logic can build imposing concepts that often turn out to be illusions of great knowledge. Philosophy is a major tool for understanding reality, but it also has major faults.

Ethics. Ethics, like law, attempts to come to a consensus on what’s right and wrong. Ethics is based on philosophy, although in recent years some ethicists have looked for a scientific foundation.

Science. Science combines mathematics, statistics, observation, testing, and philosophy into a systematic way to evaluate reality. Science assumes that if tested observations and measurements prove consistent for scientists from any nation or culture, then they might be true. Science never assumes it has found the absolute truth, just the current best guess based on all the existing data. Science is statistical. Some science is so refined that it works incredibly well with reality. Space probes visiting distant worlds validate hundreds of years of scientific endeavor.

Scholarship. We have made education a major portion of our lives. We spend our entire lives trying to figure things out. We study, we think, we make assumptions. Like philosophy, scholarship often builds vast models of speculation. Scholarship tends to endorse whichever results win out among competing schools of thought. However, scholarly theories can be deceptive and even dangerous.

The problem is we use all these tools to explain our version of reality. Unfortunately, most are unreliable or clash with other people’s versions of reality. Science has proven to be the most consistent at explaining reality, but science doesn’t cover everything. Take right and wrong, for example. These two concepts are ancient, probably coming out of mysticism or an instinctive desire for justice. Both religion and philosophy have tried to perfect them, but our reality is completely indifferent to morality or ethics. We have invented many concepts that just don’t exist in reality.

This causes problems. Several million people might believe with absolute certainty in a particular concept and then try to impose that view on millions of others who are just as certain such a concept is invalid.

We live in a polarized society because we all embrace different ancient beliefs, most of which we can’t explain the origins of. We just accept them as true. Most people believe in God because it was something they learned as little kids. They won’t let the idea of God go no matter how much other cognitive tools disprove God’s existence.

Donald Trump seems to base most of his knowledge on first-hand experience and word-of-mouth information. Twitter is the perfect tool for word of mouth. Trump is neither religious, philosophical, nor scientific. But this isn’t an uncommon way of dealing with reality. Few people are philosophical or scientific. Too many people only want to trust first-hand experience and instinct, but we know how unreliable those cognitive tools are. People who rely heavily on first-hand experience and word of mouth tend to disbelieve science.

Various disciplines try to teach a kind of self-programming that jettisons cognitive bullshit. Zen Buddhism is one. Meditation can be used to seek mystical insight or to observe the workings of our own being.

The reason I wrote this essay was to help me think more clearly. I’ve been reading books on Greek philosophy and early Christian history. They are teaching me what people 2,000-2,500 years ago thought. I can see those ancient people struggled to make sense of reality without science. I can also see the same struggles in people today. We just don’t think clearly. We’re too influenced by low-level cognitive tools that deceive us. We base our existence on illusions created by those most primal cognitive tools.

I keep hoping the human race will get its act together and create a sane society that coexists with reality rather than with insane illusions and delusions. I realize that until everyone masters their various cognitive tools and learns the limits of each, we can’t start working on that sane society. We can’t start learning what’s real until we learn how to perceive what’s not real.

JWH


A Tale of Two Screen Generations

by James Wallace Harris, Sunday, October 6, 2019

I believe growing up with the television screen made me different from my parents and grandparents. I wonder if kids growing up with smartphone screens will be even more different?

The education you get before starting school is the bedrock of your soul. For most of human history, kids grew up listening to family stories while acquiring their beliefs in religion, economics, and politics. Books, magazines, and newspapers didn’t affect those early years, but when radio came along, a new source of influence competed to program our early childhood. This escalated with television and accelerated even faster with computers, networks, tablets, and smartphones.

In those early years before we learn to read we acquire all kinds of concepts that become the cognitive bricks to our psychological foundation. For example, I didn’t acquire religion during those years, but a belief in science fiction. Aliens replaced gods and angels, heavens replaced heaven, and space exploration replaced theology. And because kids are learning to read at an earlier age today, more concepts are compressed into those formative years. I assume kids today are smarter than we were in the 1950s.

Isn’t this why traditional religious beliefs and family history are less important to people growing up today? Sociologists have long talked about peer pressure influencing teens, but didn’t television shape the toddlers of my generation? Doesn’t everyone agree that social media pressure is shaping early childhood today?

A more descriptive name for Baby Boomers is The Television Generation. We got our name because so many of us showed up all at once after WWII. But more importantly, we were also the first generation to grow up with the television screen. We were raised with three new network eyes on the world. We’re now seeing a generation growing up with mobile devices such as smartphones and tablets, and these kids have countless extra inputs.

I was born in 1951 and it seemed perfectly natural to suckle at the glass teat. Even now I have a hard time comprehending how my parents’ generation grew up without it. And I can’t conceive of what it’s like growing up today playing with mobile devices in the crib. Mobile devices are so much more intelligent than televisions, especially television programming in the 1950s.

Before radio, children acquired a limited mythology from their parents, from large extended families that crossed generations, and from the church. Whatever creation story you were told, you accepted. There wasn’t a lot of skepticism back then. Starting with radio, it was easy for kids to encounter competing creation myths at an earlier age. But it was television that made a quantum leap in providing alternative explanations of reality.

My earliest extensive memories begin around age four. I don’t remember what my parents told me, or what I heard in church. I do remember the television shows I watched. I remember exactly where I came from: Romper Room, Captain Kangaroo, The Mickey Mouse Club, Howdy Doody, Lassie, Topper, Love That Bob, Gunsmoke, The Twilight Zone. Television ignited my imagination. I remember being four and trying to communicate the ideas I got from television to my parents, but they seemed clueless. It’s like we spoke different languages and lived on different planets. They’d tell me about growing up on farms, or the Depression, and I just couldn’t imagine what they were talking about. I eventually learned about their upbringing from television.

Once I started school I bonded with other kids over the television shows we loved. Television provided a shared language and mythology. However, growing up in the 1950s and 1960s was definitely different from growing up today. We had three television networks, two Top 40 radio stations, and limited access to a small number of popular movies. Among my generation, everyone pretty much watched and listened to the same shows and music. Sure, we arranged our top ten favorites a little differently, but everyone pretty much knew what everyone else liked.

For kids growing up today, the TV screen brings hundreds of cable channels and a variety of streaming services with thousands of choices, and Spotify lets people listen to tens of millions of songs. Every week countless new movies show up. But more than that, mobile devices offer what feels like an infinity of rabbit holes to fall into. I can understand why social media is so popular: it allows people to share their discoveries and make common connections. And I can see why movie franchises are so popular: they’re another way to bond over a limited selection. We really don’t want more shows, we want more shows we all love the same.

I’m writing this over six decades after I grew up. I wonder what people growing up today will say about their early education sixty years from now. In my generation, it was easy to share because we pretty much consumed the same content. Now kids need powerful computers to find friends who like the same stuff they do.

I believe the appeal of the church today is not theology but communion. Not the communion of wine and wafers, but being with other people sharing a common experience. However, I do believe television in my generation undermined the hold the church had on programming our young minds.

Bible stories no longer provided our ontology. The TV screen widened our epistemology. Mobile devices are the fentanyl of screens. I imagine in another generation or two, cyborg-like devices will inject data into kiddies at an even faster rate. However, I believe there’s a limit to what our brains can handle. I’m not sure if smartphones and tablets aren’t exceeding that limit now. But that might be old fogie thinking, and we’ll have future technology that will match our wildest science fiction.

Yet, I also see signs of a backlash movement. Why are record players and LPs making a comeback? Why are there so many Top Ten lists on the web? Aren’t those signs that people want a smaller selection of inputs, ones that have a commonality with other people? Sure, everyone wants to be famous on YouTube, but 75 million kids can’t all have 75 million followers. What we want are five friends that love the same five shows and songs.

When I was growing up we often watched TV with other people. Our parents, our siblings, our friends, our neighbors. When I was little, I’d have friends over and we’d watch Saturday morning TV under tents built of blankets. As teenagers, we’d get high and watch TV together. At college, we’d watch TV in the student union together. Watching TV on a smartphone or tablet is as solitary as masturbation.

Since around 2000 I’ve stopped keeping up with hit songs and albums. I no longer know what new shows begin in the fall. When I was a kid, my parents used me as a walking TV guide. When I see the magazines at the grocery store checkout line, I don’t know the famous faces on their covers. Movie stars have to be in their fifties before I can remember their names. There’s a limit to how much pop culture I can absorb. I feel pop music peaked in 1965, although I struggled to keep up with it through the 1980s.

I have to wonder if kids growing up playing with smartphones can handle more data than my generation. Can they drink from the fire hose of the internet longer? I can only chug so much data before I start spewing. Is that my age showing, or does it reveal limitations shaped by my training watching television in the 1950s? Are those babies growing up playing with smartphones becoming like that little robot Number Five in the film Short Circuit, who kept demanding, “More input!”

Is growing up with a mobile device screen wiring kids differently from how we were wired by our television screens? Does Greta Thunberg represent a new stage of consciousness? I hope so. The Television Generation threw a fit in the 1960s. I feel the Smartphone Generation is about to throw a fit in the 2020s. Good for them. Don’t assume you know more than they do – you don’t!

JWH

p.s. That’s me above with my mother and sister when I was four, and my cyclopic guru.

Unraveling a Loose Thread of History Found in a 1956 Issue of Galaxy Science Fiction

by James Wallace Harris, Monday, September 16, 2019

This morning I was flipping through some old issues of Galaxy Science Fiction I had bought on eBay and ran across this ad in the October 1956 issue:

[Geniac ad from the October 1956 issue of Galaxy Science Fiction]

At first, I flipped right by it. Then in the next issue I picked up, the December 1956 issue, I found this ad:

[Geniac ad from the December 1956 issue of Galaxy Science Fiction]

This one promised a whole lot more. Could this be for real? Computes, plays games, composes music? I don’t ever remember reading about home computers existing this early. I thought computer kits were something from the 1970s. This December ad promised a new improved 1957 model, and for only $19.95. In 1956, $19.95 was serious money for a kid, probably close to $200 in today’s money. And was this a genuine computer, or was it some kind of trick, like those X-Ray glasses advertised in the back of comic books?

First stop: Wikipedia.

Geniac was an educational toy billed as a "computer" designed and marketed by Edmund Berkeley, with Oliver Garfield from 1955 to 1958, but with Garfield continuing without Berkeley through the 1960s. The name stood for "Genius Almost-automatic Computer" but suggests a portmanteau of genius and ENIAC (the first fully electronic general-purpose computer).

Operation
Basically a rotary switch construction set, the Geniac contained six perforated masonite disks, into the back of which brass jumpers could be inserted. The jumpers made electrical connections between slotted brass bolt heads sitting out from the similarly perforated masonite back panel. To the bolts were attached wires behind the panel. The circuit comprised a battery, such wires from it to, and between, switch positions, wires from the switches to indicator flashlight bulbs set along the panel's middle, and return wires to the battery to complete the circuit.

With this basic setup, Geniac could use combinational logic only, its outputs depending entirely on inputs manually set. It had no active elements at all – no relays, tubes, or transistors – to allow a machine state to automatically influence subsequent states. Thus, Geniac didn't have memory and couldn't solve problems using sequential logic. All sequencing was performed manually by the operator, sometimes following fairly complicated printed directions (turn this wheel in this direction if this light lights, etc.)

The main instruction book, as well as a supplementary book of wiring diagrams, gave jumper positions and wiring diagrams for building a number of "machines," which could realize fairly complicated Boolean equations. A copy of Claude Shannon's groundbreaking thesis in the subject, A Symbolic Analysis of Relay and Switching Circuits, was also included.
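To make “realize fairly complicated Boolean equations” concrete, here is a minimal sketch, simulated in Python rather than wires and bulbs, of the kind of function a Geniac setup could realize: a lamp that lights when at least two of three switches are on. The majority-vote scenario is my own illustration, not a project taken from the kit’s manual.

```python
# A sketch of Geniac-style combinational logic: the lamp is a pure
# function of the current switch settings, with no memory or
# sequential behavior, just as the description above says.

def majority_lamp(a: bool, b: bool, c: bool) -> bool:
    """Light the lamp when at least two of three switches are on."""
    return (a and b) or (a and c) or (b and c)

# "Running" the machine is just re-reading the lamp for the current
# switch settings; enumerate the full truth table.
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            state = "on" if majority_lamp(a, b, c) else "off"
            print(f"switches {(a, b, c)} -> lamp {state}")
```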

Okay, so it was real! But in 1956? In the mid-fifties, commercial computers were just beginning to be rolled out to businesses. In 1957 American audiences got a humorous look at computers in the film Desk Set with Spencer Tracy and Katharine Hepburn. Rumors of computers produced a fear that librarians would lose their jobs, but ultimately humans prevailed. I expect most Americans in 1957 had never seen a computer and only knew about them from funny cartoons in magazines and newspapers. Geniac came out before Sputnik, which ignited a fear that American youths weren’t being educated in science. Did kids that early in the 1950s already want to know about computers?

Here is a History of Computer timeline that shows the Geniac for 1955. And here’s an article about the history of computers that played NIM games, which includes the Geniac.

The main designer of Geniac appears to be Edmund Berkeley. He wrote an early book about computers in 1949, Giant Brains, or Machines That Think. Berkeley is also the subject of Edmund Berkeley and the Social Responsibility of Computer Professionals by Bernadette Longo, who writes about his influence through Geniac. I’m awfully tempted to buy the Kindle edition. He also designed what some people call the first personal computer, Simon. Simon appeared as 13 how-to articles that began running in Radio-Electronics magazine in October 1950. (All 13 parts can be read online here.) It would have cost around $600 to build and had very limited features, with only 2 bits of memory. Berkeley wrote the article “Simple Simon” for the November 1950 issue of Scientific American.

Electronics was a big tech hobby back then and had been since the early days of radio in the 1910s. A careful look at the Geniac ad, though, showed it wasn’t an electronics kit, but merely electrical. It might contain 400 parts, but they were wires, light bulbs, batteries, nuts, and little contacts. It seemed designed to set up simple logic programs. How much could a kid do with one? YouTube to the rescue:

And this film, which features a later model from the 1960s called a Brainiac:

This brings up even more questions. Did kids really play with them? Were they inspired to study computers and become computer programmers and engineers? Were there any famous computer pioneers who started with a Geniac or Brainiac? Could Steve Wozniak or Bill Gates have played with one? Of course, those two might have been too young for this era.

The kit seemed aimed at kids, but it would have required a great deal of work and patience to produce any results. Actually putting one together and doing any of the example projects would have been very educational.

David Vanderschel describes his Geniac computer from 1956. He says an IBM 1620 was the first real computer he encountered, in 1962. The IBM 1620 was also the first computer I programmed, in 1971, at computer school, using FORTRAN.

Hackaday had a post last month about the Geniac claiming that Mike Gardi credits his professional success in software development to educational logic games like the Geniac. Gardi created a replica of a Geniac and has links to the original documentation. This 1955 manual had instructions for a couple dozen projects. Gardi said:

Technically GENIAC was a collection of configurable N-pole by N-throw rotary switches, which could be set up to cascaded and thus perform logical functions. As a result GENIAC could use combinational logic only, its outputs depending entirely on inputs manually set. However, projects outlined in the manual, which started with basic logic circuits, ultimately progressed to such things as a NIM machine and TIC-TAC-TOE machine.
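The NIM machine is a nice illustration of how far combinational logic alone can go: the winning move in Nim is a pure function of the current pile sizes (their XOR, the “nim-sum”), so a memoryless machine can “play” as long as the operator supplies the current position by hand. Here is a minimal sketch in Python of that principle; it shows the math, not the kit’s actual wiring.

```python
# Nim as combinational logic: the output (a winning move) depends
# entirely on the inputs (the pile sizes), with no stored state.

def nim_move(piles):
    """Return (pile_index, new_size) for a winning move, or None if
    the position is already lost (nim-sum of zero)."""
    nim_sum = 0
    for p in piles:
        nim_sum ^= p                  # accumulate XOR of pile sizes
    if nim_sum == 0:
        return None                   # every move leads to a losing position
    for i, p in enumerate(piles):
        target = p ^ nim_sum          # the size this pile should become
        if target < p:                # legal only if it shrinks the pile
            return (i, target)

print(nim_move([3, 4, 5]))            # (0, 1): cut the first pile to 1
```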

I did find a Geniac on eBay that has a $99.99 buy it now price. There’s a Brainiac for sale for $349! That’s more than I’d want to spend. The Brainiac is in great shape though. It’s probably the one from the film above.

The more I Googled, the more intrigued I became about the impact of the Geniac computer. Is this how historians get sucked into writing books? I checked a couple of books I own on the history of personal computers, but neither mentions Geniac or Edmund Berkeley. If you search Google for the first personal computer you usually get the MITS Altair 8800. Maybe that’s not true. Maybe I could write a whole history book about home computers before 1975.


Update:

I went to my public library and looked through the books about the history of computing. I found no mentions of Geniac or Edmund Berkeley. I then checked The Reader’s Guide to Periodical Literature for the years 1950-1960. I found no references to Geniac and only a handful of articles by Berkeley. His articles did sound interesting:

  • “Robots for Fun” Life, 173-74+, March 19, 1956
  • “Relations Between Symbolic Logic and Large-Scale Calculating Machines” Science, 395-399, October 6, 1950
  • “Simple Simon” Scientific American, 40-43, November 1950
  • “Tomorrow’s Thinking Machines” Science Digest, 52-57, January 1950
  • “2150 A.D. Preview of the Robotic Age” New York Times, 19, November 19, 1950
  • “Robot Psychoanalyst” Newsweek, 58, December 12, 1949
  • “Algebra and States and Events” Science Monthly, 332-342, April 1954
  • “We Are Safer Than We Think” New York Times, 11, July 29, 1951

An amusing thing happened at the library. I kept asking the librarians where the Reader’s Guide to Periodical Literature was located. They didn’t know. Finally, they asked a very old librarian, and she found it for me. She then came back with the younger librarians; they wanted to see it too. I told them that when I was young, every kid was taught to begin their library research with that classic index.

JWH


What Would Have Made Me Want To Study as a Schoolkid?

by James Wallace Harris, Friday, August 23, 2019

I considered my K-12 education a 13-year prison sentence. I did my mediocre best, getting mostly Cs and Bs, with rare As and Ds. My grades didn’t reflect my ability but showed what I was actually interested in. I had a lot of great teachers who tried hard to get me to learn, but I didn’t cooperate. I wish to apologize to all of them now, especially my 12th-grade math teacher. I just didn’t want to pay attention, study, or do homework. Life was full of fun diversions and I found no incentive to make the most of my school years.

I regret that now, though it’s really pointless to worry about. Still, it is an interesting problem to think about solving. How do you get kids to want to study? A certain percentage of children respond well to traditional classroom learning, but most don’t. When I’m shopping in used bookstores I look at K-12 textbooks and I’m horrified by how much crap they want to stuff into a young person’s head.

Part of the problem is society wants kids to acquire proficiency in a specific set of subjects before they’re 18. Then it ups the ante by a couple of orders of magnitude for higher education. Before you can start life you have to be programmed with 400,000 facts. We’re told we need that many factoids to succeed in life, but I doubt many believe it. I always considered it cruel and unusual punishment. I never knew what crime I committed to deserve such torture.

And it’s not like I didn’t enjoy learning as a child. I was a bookworm from the 4th grade on, reading several hundred books while serving my K-12 time. I just didn’t want to read the books teachers wanted me to read.

I don’t know if I was a typical child, but I’d guess most kids didn’t like the system either. I’ve often thought about what I would do if I could have designed my own pedagogy. It’s a fun thing to fantasize about. Try it and post a comment. I have come to some conclusions, for me only, not a general system.

  1. The most important thing I should have been taught as a kid is about the world of work and how I’d spend forty years doing something that I could either like or dislike. I needed to learn as early as possible that if I didn’t find the right vocation I’d spend those years in quiet desperation at best, and crushing resentment at worst.
  2. I needed to have been shown by experience that there are many kinds of tasks and work environments. After high school, it took me several jobs to realize I preferred working inside rather than outside. I eventually learned I’d rather work with machines than people, but I liked an environment with well-educated people, and tasks that produced something useful to humanity rather than for the bottom line. And I didn’t need to be the boss. I’m pretty sure I could have learned all of that in grade school.
  3. I learned too late in life that I loved science and technology. Again, I can imagine ways to get kids to learn subjects they like while they are still in grade school. It might require spending some classroom time in real work environments.
  4. What I sorely missed was a real incentive to study. I was told an education led to a good job, but I never knew what a good job meant. I think study incentives need to be more immediate. The goal of being freed from classes would have been the incentive that worked for me. In other words, tell me the week’s goal. If I finished by Thursday, I could have Friday off. If I finished in four weeks of a six-week period, I could have two weeks off. If I finished the year in March, I could have a long summer. Or even, if I finished at 14, I could bum around for a few years before college. That would have inspired me to study harder. (I know that K-12 schools also serve as babysitters, so being freed from classes might mean more library days, or sports, or clubs, or other school activities. Although I wanted to be out on the streets or at home.)
  5. For such a finish-early system to work we’d need to carefully define and quantify what needs to be learned. Right now schools are one-size-fits-all. Not every kid wants to learn everything every other kid learns. Society needs to decide what subjects form a basic education, and what should be electives. We should find creative ways to test everything. Educators have gone nuts with cultural literacy.
  6. Society is discovering all kinds of learning and teaching methods. They didn’t have personal computers when I was little, but I think if they did, I would have learned best in the classroom while taking quizzes at night on the computer for homework. If testing had been more like computer games and trivia contests, it would have been fun. Competing for high scores would have pushed me; grades never did in the least. If every subject had a rating like in chess, that would have motivated me (see the sketch after this list).
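For fun, here is a minimal sketch of how a chess-style rating could work per subject, treating each quiz question as a “game” against an opponent whose strength is the question’s difficulty. The starting rating and K-factor are standard chess conventions borrowed as illustrative assumptions, not any real grading scheme.

```python
# A chess-style (Elo) rating per school subject. The 1200 starting
# rating and K-factor of 32 are illustrative assumptions.

def expected(student: float, question: float) -> float:
    """Probability the student 'beats' a question of this difficulty."""
    return 1 / (1 + 10 ** ((question - student) / 400))

def update(student: float, question: float, correct: bool, k: float = 32) -> float:
    """Rating rises for beating hard questions, falls for missing easy ones."""
    score = 1.0 if correct else 0.0
    return student + k * (score - expected(student, question))

math_rating = 1200.0
for difficulty, correct in [(1100, True), (1300, True), (1400, False)]:
    math_rating = update(math_rating, difficulty, correct)
    print(round(math_rating))
```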

I’m curious if anything could have motivated me to study as a kid. It’s too bad we don’t have time machines. It would be a fun challenge to go back in time and see if I could motivate my younger self.

Uh, maybe that’s an idea for a science fiction novel.

JWH


I Haven’t Studied Biology in a Classroom Since 1967

by James Wallace Harris, Friday, January 5, 2019

How old is your knowledge? That question can be taken in two ways. The years since the last time you studied a subject, which for me and biology is 52. Or, the age of the subject itself. For example, Euclidean geometry is two thousand years old. And dating the ages for either isn’t precise. I’m sure when I studied biology in the tenth grade (1966/67) my textbooks were not up-to-date, and far from chronicling the current discoveries in biology. Thus, my simple-minded memories of cell structure might be about two hundred years old.

In the first third of life, we go to school and college to prepare ourselves to be functional adults for our middle third of life, but how much do we need to know for our last third of life? What is a useful education for our retirement years? I certainly could sneak by without knowing any more biology, but should I?

I’m reading The Tangled Tree: A Radical New History of Life by David Quammen for a book club. Reading it makes me feel ashamed of how little I know about biology while blowing my mind with new information. It makes me wonder just how current my knowledge should be in various important subjects, subjects that help me understand my place in reality. Just because I might be leaving this reality soon, doesn’t mean I should fall into oblivion knowing so little.

[Figure: universal phylogenetic tree showing relationships between major lineages of the three domains of life]

The Tangled Tree starts out by announcing “recent” discoveries in biology, such as horizontal gene transfer (HGT) and the third domain of life, the archaea, and how they are disrupting our old image of an evolutionary tree structure, thus the title of his book. Both discoveries occurred after my last biology class. I had heard of archaea since then, and seen the graph above. I’ve read about prokaryotes (bacteria) and eukaryotes (plants, animals), but I couldn’t remember those labels. They say to really learn a subject you should be able to teach it, but I could only confuse small children with my vague ideas about biology.

Of course, I’m not totally ignorant of later biological developments. I regularly watch PBS’s Nova and Nature, and over the decades I’ve read books like The Double Helix, The Selfish Gene, and a few popular books about the history of evolutionary theory, but they don’t require the same kind of learning that taking a class does. To really know a subject, even at a fundamental level, requires knowing the words that describe it. As an adult, I’ve read many books about physics and astronomy, so I know some of their vocabularies, but I know very little of the terminology of biology. Quammen describes many fields within biology that are new to me, like molecular phylogenetics. I’m savvy enough to know what molecules and genetics are, and I could guess that ‘phylo’ concerns their taxonomy, but I’m totally clueless about how scientists go about classifying these wee bits of proto-life.

Before jumping into the work of Carl Woese, Quammen succinctly describes how the idea of evolution emerged in the 19th century, with various scientists using the tree metaphor to illustrate life emerging out of an orderly process. And he gives passing references to the scientists who developed taxonomy systems to categorize all living things. This lays the groundwork for understanding why Carl Woese wanted to develop a tree model and taxonomy of bacterial life.

Quammen grabbed my interest by describing how 19th-century scientists first started drawing trees to describe their theories. He even describes a page from Darwin’s notebook, saying his first tree was rather simple. I was shocked when I saw it, though; it was very simple looking, but the basic idea is there. I vaguely remember seeing this before, but to be honest, I’ve never tried to learn all of this information in a way that I’d memorize and use it. I put my faith in science, in evolution, but I know very little of the actual science. What I know probably compares to what the average Christian knows about the history of Christianity.

This got me to thinking. Should I study biology before I die? I doubt I’ll need it after death since I’m an atheist. So, what should my educational aspirations be in my retirement years? I’d like to pass from this world knowing as much about reality as possible. Why leave in ignorance? Why live in ignorance? There’s no meaning to our existence, but why not try to understand our situation to the fullest extent possible?

We’re a bubble of consciousness that has accidentally formed in reality. That’s pretty far out. Most of the matter in this reality is unconscious stuff like subatomic particles, atoms, molecules, and a smidgeon of biological living things. Reading The Tangled Tree makes me want to do more than read over the subject and forget it again. Like Linnaeus, I want to organize what I should know into categories, into a Tree of Knowledge I Should Know. But I realize I am limited by time and energy: the time I have remaining to live and the dwindling personal energy I have each day.

How would I even go about studying the subjects I deem time worthy? I do have access to free university courses. And there are countless online courses, and I already subscribe to The Great Courses on my Amazon Fire TV. I could pick out some standardized tests for my goals, and thus limit the scope of what I want to learn. Or I could start studying and then try to teach what I learn by writing essays for this blog. That sounds more doable.

Other than the history of science fiction, I don’t think there’s a single topic I could teach. I’m not even sure how many other topics I’d like to study, at any level. I do feel a sense of challenge that I should work on biology, at least for a while. Maybe read a few books on the subject this year. Maybe take a Great Course.

That makes me think I could choose a topic each year to study. I can’t promise much, but I think I should try.

Thus I declare:  2019 is the year to learn about biology.

JWH