31 Lessons to Save the World

James Wallace Harris, 3/4/21

Reading 21 Lessons for the 21st Century (2018) by Yuval Noah Harari and Ten Lessons for a Post-Pandemic World (2020) by Fareed Zakaria made it all too obvious that everyone needs to get to work together to save the world. But will we? Harari and Zakaria are two tiptop brains who have been thinking mighty hard on what needs to be done and have come up with a total of 31 useful insights. However, while reading these books I kept wondering if humanity will do what it takes to save itself.

Of course, both books carefully assess the major governments around the world and generalize about the psychological abilities of their citizens. Harari focuses more on people, while Zakaria deals more with governments. Harari is an international philosopher from Israel, while Zakaria is a savvy political commentator on CNN. Harari’s lessons focus on how people think, and his main advice advocates freeing oneself from all the bullshit that confuses our thinking. Because our modern world lays a lot of crap on us, Harari offers a great number of lessons for freeing ourselves. Zakaria asks us to focus on what good government is and how we can build it. Since the United States has been sinking deeper and deeper into bad governmental practices for decades, Zakaria suggests a lot of changes too.

Can individuals and humanity as a whole make all the needed transformations before our problems reach a perfect storm of self-destruction? One of the lessons Harari covers is how people live by the stories they tell themselves. He makes a case that people generally don’t think for themselves, but buy into group thinking. Psychologically, it’s beneficial and easier to accept a story from a group than invent your own. That’s why people embrace religion, nationalism, and political parties – they give meaning to their lives, a satisfying sense of purpose and understanding, and a story to embrace and share.

At first, you’d think Yuval Noah Harari is a liberal, but as he recounts the history of various philosophies, dismissing each, he comes to liberalism and says it’s dead too, and keeps on going. That made me question the stories I picked up from hanging out with the liberals. It made me ask: What story do I live by? Well, here’s my story, abbreviated as much as possible:

I don't use the word universe to mean everything anymore, not since science started speculating about multiverses. I use the word reality. From all my studying of science, there appear to be no limits to what can be discovered by exploring larger and larger realms, or by delving into smaller and smaller pieces. Evidently reality is infinite in all directions, in time, space, and any other possible dimension or existence. Earth is an insignificant portion of reality. But in the domain of human life, this planet is all that matters because it sustains our existence. I am an accidental byproduct of reality churning through all the infinities of infinite possibilities. I am a bubble of consciousness that has a beginning and an end. I coexist on a planet with other similar consciousnesses, as well as a spectrum of other living beings with their own versions of consciousness. Life on planet Earth has the potential to exist here for billions of years, but it appears our species is about to destroy its current level of civilization, if not commit species suicide, or even wipe out all life. We can all continue to live pursuing our own stories, ignoring their cumulative effect on the planet, or we can collectively decide to protect the planet.

You can see why these books appeal to me.

To cooperate means everyone working from the same pages. I’m not sure that’s possible, but these two books describe what some of those pages should look like. As long as we selfishly pursue the individual stories we currently live by, cooperation cannot happen.

I wouldn’t bet on us cooperating because the odds against it are so steep. But I am quite confident that we’re quickly approaching an endpoint to our current civilization; all the trends point that way. If you haven’t read Collapse by Jared Diamond, you might consider doing so. It’s about the civilizations that came before ours, and they all failed. But just pay attention to all the trends you encounter. They all seem to be aiming at a near-future omega-endpoint bullseye.

To solve our problems requires everyone becoming a global citizen. We must all put the security of the Earth before our own goals. That involves learning a new story. But as Harari points out, most people don’t switch stories once they’ve found one that gives their life meaning, even if it has no connection to reality whatsoever.

We live in an era where people are embracing nationalism over globalism. This is Zakaria’s territory. Not only must individuals change, but nations need to change too. Zakaria covers how some nations are succeeding and others are not.

In the story I live by as described above, I know my place and limitations. I’m a single consciousness that will endure for a few more years. Basically, I putter about in my tiny portion of this planet, pursuing things that interest me. I enjoy what I can, and try to limit my suffering as much as possible. I am quite thankful for having this experience of existing in reality. Maybe it is too much to hope that we could collectively control our environment and the fate of our species. Reality is all about creation and destruction, roiling through all the Yin-Yang possibilities. Maybe in some locations in reality the inhabitants do work together to shape their existence, and theoretically this could be such a location, but I doubt it.

I told my friend Linda the other day that saving the world will require everyone reading a certain number of books to understand what needs to be done. I’m not sure how many books would be required, but I’m pretty sure they won’t get the readers they need. That’s why my most popular essay is “50 Reasons Why The Human Race Is Too Stupid To Survive,” which has gotten tens of thousands of hits. And most of the people who leave comments are quite cynical about our odds too. I really need to update that essay with current examples, but I could call this essay reason #51.

JWH

Hopes, Dreams, and Bullshit

by James Wallace Harris, 2/2/21

Rereading the 1984 book Hackers: Heroes of the Computer Revolution by Steven Levy unearthed long suppressed feelings and ambitions that first emerged in my personality back in the 1960s and 1970s. When I first read Hackers in 1985, it rekindled those formative emotions and desires as well. I’ll start my seventies this year, and I have to wonder when the hopes I formed in my teens will finally fade away. When can I just give up and be here now? When do I stop trying to constantly be who I was? Why don’t hopes have expiration dates? Why are these books so exciting after all these years?

I remember triggering these same emotions and ambitions four years ago when I reread The Soul of a New Machine by Tracy Kidder. I tried to write about what I felt then, but those words don’t capture what I’m trying to say now. One thing about growing older, at least for me, is seeking clarity about my time in reality. Before I die, or my mind fades away, I want to eliminate all the bullshit barnacles that encrust my soul.

My current theory is we acquire our personal dreams and desires from pop culture and subcultures. During my lifetime I’ve belonged to many subcultures, but the two I loved most are science fiction and computers. The current forms of both subcultures have long since passed me by, but their initial seduction has left subprograms running within my mind that never stop. Why was I able to deprogram myself of childhood religious programming, but have never been able to escape the cultural programming I acquired from ages 12 to 22?

You’d think we’d forget old beliefs as we acquired new insights. Of course, I’m generalizing, assuming all people are the same. Maybe other people do that, but I don’t. Why can’t we emotionally be like historians, who rewrite history with new discoveries? For example, after rereading Hackers I read A People’s History of Computing in the United States (2018) by Joy Lisi Rankin. Basically, Rankin is saying: hold on there, Steven Levy, your history of computer pioneers from MIT and Silicon Valley leaves out a lot of middle-America computer pioneers. Her book is reshaping the sense of computer history I got from Hackers. Why don’t I do the same thing with my personal history?

This is not the book review I sat down to write. I might try again, but let’s go with the flow. These books hit the bullseye of my old computer ambitions. Over the past year I’ve been watching a lot of YouTube videos about 8-bit computers, especially those from The 8-Bit Guy. David Murray essentially has traveled back in time to work on computers at the point where Hackers ends in 1984. Many other YouTubers have done this too. I’ve wondered if the solution to my problem with all these old hopes and desires is to return to a past point in time and start over. I realize at this moment, that’s exactly what I’ve done with science fiction. I’m reading and collecting what I loved best from 1965-1975. That’s kind of weird when you think about it. But maybe it’s a natural aspect of aging too.

However, I also tell myself I should jettison my past as if it were my first and second rocket stages and seek orbit as what I could be in 2021. But could that be me bullshitting myself that I’m not too old to learn new tricks? Of course, maybe one way not to stir up old emotions and desires is to stop consuming old pop culture. Does my library of old books, magazines, movies, and TV shows keep those old subprograms going? Actually, yes.

I have a friend, Anne, who lives so in the present that she hates the past, and even throws away old photographs and mementos when she finds them. I also live in the present by reading books published in 2020 and magazines that are February 2021 current. If I tossed out my old library and read only new books and magazines I would become a different person. I could become a fast nimble speedboat. But because I loved old pop culture, and can’t let go of old ambitions, magazines, and books, I feel the past I carry around has grown to the size of the Titanic. (I wish I had a photo of a guy in a rowboat towing the Titanic on a rope to put right here.)

The current nonfiction books and science fiction magazines I’m reading are about politics, climate change, and all the other dark clouds on the horizon of this century. (No wonder I want to return to the last century.) If I only read new books and magazines I’d completely reshape my present personality. Reading these three computer histories rekindles the futures I wanted back in the 1970s and 1980s, and they were tremendously more appealing than the futures I envision now. The people profiled in those books had such wonderful dreams about what computers would bring to the 21st century. And their dreams came true beyond anything they imagined or hoped. Yet I wonder: if they could have seen the downside of their creations, would they have done anything differently? And isn’t that what I’m doing now by rereading these old books, second-guessing my past decisions?

One of the reasons I can’t let the past go is it feels unfinished. I didn’t get to consume all the pop culture I wanted back then, satisfy all my wants, or achieve all my ambitions. But having lived in the future, it also feels like we took so many wrong turns. I can’t help but want to go back and finish what I started and even try different paths.

There is a whole lot more I want to say about Hackers, but this essay has already gotten too long for chiseling on this stone. Hopefully to be continued on another rock.

JWH

Our Cognitive Toolbox for Working with Reality

by James Wallace Harris

All too often we think we know, but we don’t. Why do so many people argue with 100% certainty against others who feel equally convinced? Wisdom tells us the more we know, the more we know we don’t know. Does that mean the person who claims to know nothing knows the most? Why is this reality so hard to understand? Even eyewitnesses are often fooled. And why is it so important to know thyself?

Reality is complex, but is it unknowable? Humans believe they are the crown of creation because all other animals are ignorant of their own existence. Is our sentience really a quantum leap over all other life forms on this planet? If we compare ourselves to an amoeba, an ant, or a cat, we can see that awareness of reality has slowly gotten more complex, and each of those animals perceives a larger portion of reality. Does that mean we see everything in reality, or are we just as blind to a much larger reality?

I believe we’ve evolved a number of cognitive tools to analyze reality, but it’s important to know the effectiveness of each.

First-Hand Experience. Common thought claims we have five senses for perceiving reality, but we actually have many more. People often believe seeing and hearing things for themselves is a primary source of knowledge. However, our senses can deceive us. For example, the lady cop in Texas who shot a man because she thought he was a burglar in her apartment, when she was actually in his apartment. Just pay attention to how often eyewitness accounts fail. Or better yet, recall all the times your senses have fooled you.

Instinct and Intuition. Our genes and unconscious mind direct us to act without thinking. Many people prefer to go by gut reaction rather than think things through. But how often does gut reaction tell us to kill or to take what we want?

Language. Breaking reality down into pieces and giving each part a name goes a long way toward gaining useful insight. But language is imprecise, and the parts of reality are many. People who know the different names for trees have a greater understanding than the person who only knows the word tree. Language has evolved tremendously, giving us one of our best tools. Pay attention to how words help you perceive how reality works, and observe how people with lesser or better language skills fare compared to you.

Word of Mouth. We learn from other people’s observations. When we were hunters and gatherers, hearing scouts describe where animals could be hunted was vital. On the other hand, if a seafarer told you about mermaids, you ended up believing in an unreal being. Word of mouth is very unreliable. Remember the kindergarten game of Telephone? Word of mouth evolved into journalism, and we know how reliable that can be. Word of mouth has always had a fake news problem. Gossip, innuendo, and slander are also descendants of word of mouth.

Counting and Measuring. Simple arithmetic became a tool that lets us invent, build, grow crops, trade, and develop an economy. Counting and measuring evolved into mathematics.

Mysticism. Mystics are people who claim to acquire knowledge from a higher source. They became shamans and seers who influenced other people. They also speculated about how reality worked, inventing higher beings. Even today many people still give credence to mystical insight. However, mystical insight has produced an infinite variety of conflicting information. We have to assume it’s all suspect. Mysticism tries to be the first-person experience of the divine.

Religion. Religion is codified mystical insight that is retaught as the truth. Religion allowed us to create very complex social structures. However, its truth is suspect. If there are a thousand gods, most followers are atheists to 999 of them. Religion succeeds in creating artificial realities that may or may not interface well with actual reality. Religion spreads best through word of mouth.

Laws. Laws are an external tool to encourage consistent thinking. Religious laws attempt to force mystical insights onto a population. Secular laws attempt to get people to work together.

History. If you study the Old Testament you’ll see it’s more a history of a people than spiritual instruction. We have always tried to remember the past to explain how we got here. Early histories were no better than word-of-mouth stories that could be highly inaccurate. And each succeeding generation of historians alters the histories. A good example is the New Testament. Whoever Jesus was, and whatever he taught, has been constantly changed by each new writer of the New Testament. It appears the historical Jesus advocated communal living and sharing that today would be called communistic. The historical Jesus was concerned about creating heaven on Earth. It was later writers who gave him superpowers and turned him into God. Studying the history of Christianity is an excellent way to understand how history constantly mutates. History is a worthy way of understanding reality, but it has to be tempered by comparing multiple histories.

Philosophy. Where religion taught that knowledge came from God or other spiritual authorities, philosophy teaches us we can figure things out for ourselves. Using rhetoric, logic, and mathematics, men and women observe reality and deduce what’s going on. This was a great paradigm shift away from religion. However, like the game Mastermind, it leads to a lot of false assumptions. Elaborate castles of logic can build imposing concepts that often turn out to be illusions of great knowledge. Philosophy is a major tool for understanding reality, but it also has major faults.

Ethics. Ethics, like law, attempts to come to a consensus on what’s right and wrong. Ethics is based on philosophy, although in recent years some ethicists have tried to look for a scientific foundation.

Science. Science combines mathematics, statistics, observation, testing, and philosophy into a systematic way to evaluate reality. Science assumes that if tested observations and measurements prove consistent for scientists from any nation or culture, then they might be true. Science never assumes it has found the absolute truth, just the current best guess based on all the existing data. Science is statistical. Some science is so refined that it works incredibly well with reality. Space probes visiting distant worlds validate hundreds of years of scientific endeavor.

Scholarship. We have made education into a major portion of our lives. We spend our entire lives trying to figure things out. We study, we think, we make assumptions. Like philosophy, scholarship often builds vast models of speculation. Scholarship also tends to endorse results that fit competing intellectual trends. However, scholarly theories can be deceptive and even dangerous.

The problem is we use all these tools to explain our version of reality. Unfortunately, most are unreliable, or they clash with other people’s versions of reality. Science has proven to be the most consistent at explaining reality, but science doesn’t cover everything. For example, right and wrong. These two concepts are ancient, probably coming out of mysticism or an instinctive desire for justice. Both religion and philosophy have tried to perfect them, but our reality is completely indifferent to morality or ethics. We have invented many concepts that just don’t exist in reality.

This causes problems. Several million people might believe with absolute certainty in a particular concept and then try to impose that view on millions of others who are just as certain such a concept is invalid.

We live in a polarized society because we all embrace different ancient beliefs, most of which we can’t explain the origins of. We just accept them as true. Most people believe in God because it was something they learned as little kids. They won’t let the idea of God go no matter how thoroughly other cognitive tools undermine God’s existence.

Donald Trump seems to base most of his knowledge on first-hand experience and word-of-mouth information. Twitter is the perfect tool for word of mouth. Trump is neither religious, philosophical, nor scientific. But this isn’t an uncommon way of dealing with reality. Few people are philosophical or scientific. Too many people only want to trust first-hand experience and instinct, but we know how unreliable those cognitive tools are. People who rely heavily on first-person experience and word of mouth tend to disbelieve science.

There have been various disciplines that try to teach self-programming that jettisons cognitive bullshit. Zen Buddhism is one. Meditation can be used to seek mystical insight or to observe the working of our own being.

The reason I wrote this essay was to help me think more clearly. I’ve been reading books on Greek philosophy and early Christian history. They are teaching me what people thought 2,000-2,500 years ago. I can see those ancient people struggled to make sense of reality without science. I can also see the same struggles in people today. We just don’t think clearly. We’re too influenced by low-level cognitive tools that deceive us. We base our existence on illusions created by those most primal cognitive tools.

I keep hoping the human race will get its act together and create a sane society that coexists with reality instead of resting on insane illusions and delusions. I realize that until everyone becomes a master of their various cognitive tools, and learns the limits of each, we can’t start working on that sane society. We can’t start learning what’s real until we learn how to perceive what’s not real.

JWH


A Tale of Two Screen Generations

by James Wallace Harris, Sunday, October 6, 2019

I believe growing up with the television screen made me different from my parents and grandparents. I wonder if kids growing up with smartphone screens will be even more different.

The education you get before starting school is the bedrock of your soul. For most of human history, kids grew up listening to family stories while acquiring their beliefs in religion, economics, and politics. Books, magazines, and newspapers didn’t affect those early years, but when radio came along, a new source of influence competed to program our early childhood. This escalated with television and accelerated even faster with computers, networks, tablets, and smartphones.

In those early years, before we learn to read, we acquire all kinds of concepts that become the cognitive bricks of our psychological foundation. For example, I didn’t acquire religion during those years, but a belief in science fiction. Aliens replaced gods and angels, heavens replaced heaven, and space exploration replaced theology. And because kids are learning to read at an earlier age today, more concepts are compressed into those formative years. I assume kids today are smarter than we were in the 1950s.

Isn’t this why traditional religious beliefs and family history are less important to people growing up today? Sociologists have long talked about peer pressure influencing teens, but didn’t television shape the toddlers of my generation? Doesn’t everyone agree that social media pressure is shaping early childhood today?

A more descriptive name for Baby Boomers is The Television Generation. We got our name because so many of us showed up all at once after WWII. But more importantly, we were also the first generation to grow up with the television screen. We were raised with three new network eyes on the world. We’re now seeing a generation growing up with mobile devices such as smartphones and tablets, and these kids have countless extra inputs.

I was born in 1951 and it seemed perfectly natural to suckle at the glass teat. Even now I have a hard time comprehending how my parents’ generation grew up without it. And I can’t conceive of what it’s like growing up today playing with mobile devices in the crib. Mobile devices are so much more intelligent than televisions, especially television programming in the 1950s.

Before radio, children acquired a limited mythology from their parents, but also from large extended families that crossed generations, and the church. Whatever creation story you were told, you accepted. There wasn’t a lot of skepticism back then. Starting with radio, it was easy for kids to encounter competing creation myths at an earlier age. But it was television that made a quantum leap in providing alternative explanations about reality.

My earliest extensive memories begin around age four. I don’t remember what my parents told me, or what I heard in church. I do remember the television shows I watched. I remember exactly where I came from – Romper Room, Captain Kangaroo, The Mickey Mouse Club, Howdy Doody, Lassie, Topper, Love That Bob, Gunsmoke, The Twilight Zone. Television ignited my imagination. I remember being four and trying to communicate the ideas I got from television to my parents, but they seemed clueless. It’s like we spoke a different language and lived on different planets. They’d tell me about growing up on farms, or the Depression, and I just couldn’t imagine what they were talking about. I eventually learned about their upbringing from television.

Once I started school I bonded with other kids over the television shows we loved. Television provided a shared language and mythology. However, I think growing up in the 1950s and 1960s was definitely different from today. We had three television networks, two Top 40 radio stations, and limited access to a small number of popular movies. Among my generation, everyone pretty much watched and listened to the same shows and music. Sure, we arranged our top ten favorites a little differently, but everyone pretty much knew what everyone else liked.

Growing up today, the TV screen brings kids hundreds of cable channels and a variety of streaming channels with thousands of different choices, and Spotify lets people listen to tens of millions of different songs. Every week countless new movies show up. But more than that, mobile devices let you choose from what feels like an infinity of rabbit holes to fall into. I can understand why social media is so popular: it allows people to share their discoveries and make common connections. And I can see why movie franchises are so popular: they’re another way to bond over a limited selection. We really don’t want more shows, we want more shows we all love the same.

I’m writing this over six decades after I grew up. I wonder what people growing up today will say about their early education sixty years from now? In my generation, it was easy to share because we pretty much shared the same content. Now kids need powerful computers to find friends that like the same stuff they do.

I believe the appeal of the church today is not theology but communion. Not the communion of wine and wafers but being with other people sharing a common experience. However, I do believe television in my generation undermined the hold church had on programming our young minds.

Bible stories no longer provided our ontology. The TV screen widened our epistemology. Mobile devices are the fentanyl of screens. I imagine in another generation or two, cyborg-like devices will inject data into kiddies at an even faster rate. However, I believe there’s a limit to what our brains can handle. I’m not sure smartphones and tablets aren’t already exceeding that limit. But that might be old-fogie thinking, and we’ll have future technology that will match our wildest science fiction.

Yet, I also see signs of a backlash movement. Why are record players and LPs making a comeback? Why are there so many Top Ten lists on the web? Aren’t those signs that people want a smaller selection of inputs, ones that have a commonality with other people? Sure, everyone wants to be famous on YouTube, but 75 million kids can’t all have 75 million followers. What we want are five friends that love the same five shows and songs.

When I was growing up we often watched TV with other people. Our parents, our siblings, our friends, our neighbors. When I was little, I’d have friends over and we’d watch Saturday morning TV under tents built of blankets. As teenagers, we’d get high and watch TV together. At college, we’d watch TV in the student union together. Watching TV on a smartphone or tablet is as solitary as masturbation.

Since around 2000 I’ve stopped keeping up with hit songs and albums. I no longer know what new shows begin in the fall. As a kid, my parents used me as a walking TV guide. When I see the magazines at the grocery store checkout line, I don’t know the famous faces on their covers. Movie stars have to be in their fifties before I can remember their names. There’s a limit to how much pop culture I can absorb. I feel pop music peaked in 1965, although I struggled to keep up with it through the 1980s.

I have to wonder if kids growing up playing with smartphones can handle more data than my generation. Can they drink longer from the fire hose of the internet? I can only chug so much data before I start spewing. Is that my age showing, or does it reveal limitations shaped by my training watching television in the 1950s? Are those babies growing up playing with smartphones becoming like that little robot Number Five in the film Short Circuit, who kept demanding, “More input! More input!”

Is growing up with a mobile device screen wiring kids differently from how we were wired by our television screens? Does Greta Thunberg represent a new stage of consciousness? I hope so. The Television Generation threw a fit in the 1960s. I feel the Smartphone Generation is about to throw a fit in the 2020s. Good for them. Don’t assume you know more than they do – you don’t!

JWH

p.s. That’s me above with my mother and sister when I was four, and my cyclopic guru.

Unraveling a Loose Thread of History Found in a 1956 Issue of Galaxy Science Fiction

by James Wallace Harris, Monday, September 16, 2019

This morning I was flipping through some old issues of Galaxy Science Fiction I had bought on eBay and ran across this ad in the October 1956 issue:

[Geniac ad from the October 1956 issue of Galaxy Science Fiction]

At first, I flipped right by it. Then in the next issue I picked up, the December 1956 issue, I found this ad:

[Geniac ad from the December 1956 issue of Galaxy Science Fiction]

This one promised a whole lot more. Could this be for real? Computes, plays games, composes music? I don’t ever remember reading about home computers existing this early. I thought computer kits were something from the 1970s. This December ad promised a new improved 1957 model, and for only $19.95. In 1956, $19.95 was some serious money for a kid. It would probably be hundreds of dollars in today’s money. And was this a genuine computer, or was it some kind of trick, like those X-Ray glasses advertised in the back of comic books?
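
Out of curiosity, here’s a quick back-of-the-envelope check on that price, a sketch assuming the US consumer price index rose roughly 9.7x between 1956 and 2021 (an approximate multiplier of my own, not an official figure):

```python
# Rough inflation adjustment for the $19.95 Geniac kit.
# Assumption: US CPI rose ~9.7x from 1956 to 2021 (approximate).
price_1956 = 19.95
cpi_multiplier = 9.7
price_2021 = price_1956 * cpi_multiplier
print(f"${price_1956:.2f} in 1956 is roughly ${price_2021:.0f} in 2021 dollars")
# -> $19.95 in 1956 is roughly $194 in 2021 dollars
```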

First stop: Wikipedia.

Geniac was an educational toy billed as a "computer" designed and marketed by Edmund Berkeley, with Oliver Garfield from 1955 to 1958, but with Garfield continuing without Berkeley through the 1960s. The name stood for "Genius Almost-automatic Computer" but suggests a portmanteau of genius and ENIAC (the first fully electronic general-purpose computer).

Operation
Basically a rotary switch construction set, the Geniac contained six perforated masonite disks, into the back of which brass jumpers could be inserted. The jumpers made electrical connections between slotted brass bolt heads sitting out from the similarly perforated masonite back panel. To the bolts were attached wires behind the panel. The circuit comprised a battery, such wires from it to, and between, switch positions, wires from the switches to indicator flashlight bulbs set along the panel's middle, and return wires to the battery to complete the circuit.

With this basic setup, Geniac could use combinational logic only, its outputs depending entirely on inputs manually set. It had no active elements at all – no relays, tubes, or transistors – to allow a machine state to automatically influence subsequent states. Thus, Geniac didn't have memory and couldn't solve problems using sequential logic. All sequencing was performed manually by the operator, sometimes following fairly complicated printed directions (turn this wheel in this direction if this light lights, etc.)

The main instruction book, as well as a supplementary book of wiring diagrams, gave jumper positions and wiring diagrams for building a number of "machines," which could realize fairly complicated Boolean equations. A copy of Claude Shannon's groundbreaking thesis in the subject, A Symbolic Analysis of Relay and Switching Circuits, was also included.
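
Reading that description in modern terms: a Geniac setup computes a pure function from switch positions to lamp states, with no stored state between steps. Here’s a minimal sketch of that idea in Python (the circuit itself is invented for illustration, not one of the manual’s actual projects):

```python
# A Geniac-style "machine" is combinational logic only: lamp outputs
# depend entirely on the current switch positions. With no relays,
# tubes, or transistors, there is no memory, so each evaluation is
# just a function from inputs to outputs.

def lamp_outputs(switch_a: bool, switch_b: bool, switch_c: bool) -> dict:
    """Compute lamp states from three two-position switches."""
    return {
        "lamp_and": switch_a and switch_b,                       # both A and B on
        "lamp_majority": (switch_a + switch_b + switch_c) >= 2,  # majority vote
        "lamp_parity": switch_a ^ switch_b ^ switch_c,           # odd number on
    }

# Any sequencing ("if this lamp lights, turn that wheel") happens
# outside the circuit, performed manually by the operator.
print(lamp_outputs(switch_a=True, switch_b=False, switch_c=True))
# {'lamp_and': False, 'lamp_majority': True, 'lamp_parity': False}
```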

Okay, so it was real! But in 1956? In the mid-fifties, commercial computers were just beginning to be rolled out to businesses. In 1957 American audiences got a humorous look at computers in the film Desk Set with Spencer Tracy and Katharine Hepburn. Rumors of computers produced a fear that librarians would lose their jobs, but ultimately humans prevailed. I expect most Americans in 1957 had never seen a computer and only knew about them from funny cartoons in magazines and newspapers. Geniac came out before Sputnik, which ignited a fear that American youths weren’t being educated in science. Did kids really want to know about computers that early in the 1950s?

Here is a History of Computer timeline that shows the Geniac for 1955. And here’s an article about the history of computers that played NIM games, which includes the Geniac.

[Scientific American cover, November 1950]

The main designer of Geniac appears to be Edmund Berkeley. He wrote an early book about computers in 1949, Giant Brains, or Machines That Think. Berkeley is also the subject of Edmund Berkeley and the Social Responsibility of Computer Professionals by Bernadette Longo. If you follow that link, you’ll see she writes about his influence with Geniac. I’m awfully tempted to buy the Kindle edition. He also designed what some people call the first personal computer, Simon. Simon appeared as 13 how-to articles that began running in Radio-Electronics magazine in October 1950. (All 13 parts can be read online here.) It would have cost around $600 to build and had very limited features, with only 2 bits of memory. Berkeley wrote the article “Simple Simon” for the November 1950 issue of Scientific American.

Electronics was a big tech hobby back then, and had been since the early days of radio in the 1910s. Looking at the Geniac ad carefully, though, showed it wasn’t an electronics kit, but merely an electrical one. It might contain 400 parts, but they were wires, light bulbs, batteries, nuts, and little contacts. It seemed designed to set up simple logic programs. How much could a kid do with one? YouTube to the rescue:

And this film, which features a later model from the 1960s called a Brainiac:

This brings up even more questions. Did kids really play with them? Were they inspired to study computers and become computer programmers and engineers? Were there any famous computer pioneers who started with a Geniac or Brainiac? Could Steve Wozniak or Bill Gates have played with one? Of course, those two might have been too young for this era.

The kit seemed aimed at kids, but it would have required a great deal of work and patience to produce any results. Actually putting one together and doing any of the example projects would have been very educational.

David Vanderschel describes his Geniac computer from 1956. He says an IBM 1620 was the first real computer he encountered in 1962. That was the first computer I programmed on in 1971 at computer school using FORTRAN.

Hackaday had a post last month about the Geniac, claiming that Mike Gardi credits his professional success in software development to educational logic games like the Geniac. Gardi created a replica of a Geniac and has links to the original documentation. The 1955 manual had instructions for a couple dozen projects. Gardi said:

Technically GENIAC was a collection of configurable N-pole by N-throw rotary switches, which could be set up to cascade and thus perform logical functions. As a result GENIAC could use combinational logic only, its outputs depending entirely on inputs manually set. However, projects outlined in the manual, which started with basic logic circuits, ultimately progressed to such things as a NIM machine and TIC-TAC-TOE machine.
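
The NIM machine Gardi mentions is a nice illustration of how far combinational logic can go. The classic winning strategy for NIM (standard game theory here, not the manual’s actual wiring) keeps the XOR of the pile sizes at zero:

```python
from functools import reduce
from operator import xor

def nim_move(piles: list[int]) -> tuple[int, int] | None:
    """Return (pile_index, new_size) for a winning NIM move,
    or None if the position is already losing (XOR is zero)."""
    nim_sum = reduce(xor, piles)
    if nim_sum == 0:
        return None  # no winning move exists; any legal move loses
    for i, pile in enumerate(piles):
        target = pile ^ nim_sum
        if target < pile:  # we can only remove objects from a pile
            return i, target
    return None  # unreachable when nim_sum != 0

print(nim_move([3, 4, 5]))  # (0, 1): shrink pile 0 to 1, making the XOR zero
```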

I did find a Geniac on eBay that has a $99.99 buy it now price. There’s a Brainiac for sale for $349! That’s more than I’d want to spend. The Brainiac is in great shape though. It’s probably the one from the film above.

The more I Googled, the more intrigued I became about the impact of the Geniac computer. Is this how historians get sucked into writing books? I checked a couple of books I own on the history of personal computers, but neither mentions Geniac or Edmund Berkeley. If you search Google for the first personal computer, you usually get the MITS Altair 8800. Maybe that’s not true. Maybe I could write a whole history book about home computers before 1975.

Update:

I went to my public library and looked through the books about the history of computing. I found no mentions of Geniac or Edmund Berkeley. I then checked The Reader’s Guide to Periodical Literature for the years 1950-1960. I found no references to Geniac and only a handful of articles by Berkeley. His articles did sound interesting:

  • “Robots for Fun” Life, 173-74+, March 19, 1956
  • “Relations Between Symbolic Logic and Large-Scale Calculating Machines” Science, 395-399, October 6, 1950
  • “Simple Simon” Scientific American, 40-43, November 1950
  • “Tomorrow’s Thinking Machines” Science Digest, 52-57, January 1950
  • “2150 A.D. Preview of the Robotic Age” New York Times, 19, November 19, 1950
  • “Robot Psychoanalyst” Newsweek, 58, December 12, 1949
  • “Algebra and States and Events” Science Monthly, 332-342, April 1954
  • “We Are Safer Than We Think” New York Times, 11, July 29, 1951

An amusing thing happened at the library. I kept asking the librarians where the Reader’s Guide to Periodical Literature was located. They didn’t know. Finally, they asked a very old librarian, and she found it for me. She then came back with the younger librarians; they wanted to see it too. I had told them that when I was young, every kid was taught to begin their library research with that classic index.

JWH