By: Dr. Paul Fishwick, UT Dallas
When I first learned about computers, I learned FORTRAN programming. It was the programming language that dominated the scene in, and before, the early 70s. In today’s world, we have numerous efforts that go by monikers such as “code.” We are told that “learning how to code” is good for your career and that it is fun. While I favor the word “programming” because “code” is a bit too narrow (think of Morse code or the ENIGMA code), programming is not without its double meanings. When TV producers talk about programming, they are referring to listings and scheduling. Coding or programming? Either will do.
The issue I have with coding is that it really does not represent true computer science. Consider a chemistry or physics lab. In such a lab, you explore the science. In a chemistry lab, there are Bunsen burners, sources of gas and water, lots of differently shaped glass containers, and rubber tubing. When you engage in a chemistry lab, you are doing practical chemistry, but you are not learning the essence of chemistry: how atoms are defined or how reactions occur. This is how you can view coding. Coding is like going to the chemistry lab. You learn practical modes of doing computer science. But if you want to learn real computer science, you must look elsewhere.
This is where “theory” comes into play. Computer science at its most fundamental conception is a theory of information. How is information stored, retrieved, and structured? How does information flow? This theory goes well beyond computers. You may have heard that a food recipe is like an algorithm. That is true. An algorithm is a type of flow of control that can explain everything from how you open and close a door to the flowcharts defining how society regulates industry. Here are some example theoretical concepts:
Set Theory & Logic — representing knowledge about something. Logic is often taught by starting with the Greeks and Aristotle’s syllogism. From the syllogism, we progress to propositional logic. “Fred eats potatoes and Mary drinks soda” in propositional logic would be coded as P ∧ Q, with P = “Fred eats potatoes” and Q = “Mary drinks soda.” From there, we can add inference rules, and we can also group terms together. Predicate (first-order) logic takes things to the next level. Set theory is our way of organizing the world. A lot of diagrams are rooted in logic. For instance, you may know of concept maps, semantic networks, or mind maps. Essentially, these are diagrammatic representations of logic.
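The conjunction above can be sketched in a few lines of Python; the truth values assigned to P and Q are illustrative assumptions.

```python
# Hypothetical truth values for the two atomic propositions.
P = True   # "Fred eats potatoes"
Q = True   # "Mary drinks soda"

# The conjunction P ∧ Q is true only when both conjuncts are true.
conjunction = P and Q
print(conjunction)  # True

# One elementary inference rule (simplification): from P ∧ Q, conclude P.
if conjunction:
    assert P  # a conjunct can always be detached from a true conjunction
```

The point is not the syntax: any language with boolean values can host propositional logic, because the logic exists independently of the machine.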
Automata — Alan Turing offered the idea that “a machine” could be virtual. In doing so, computer scientists over the past 70 years have created classes of machines. A simple one, called a “Finite State Machine,” defines a machine that can be in only one state at a time. For instance, if “ball is going up” is State One and “ball is going down” is State Two, then we know from gravity and friction that, eventually, State One will transition to State Two. Notice how this line of reasoning has nothing intrinsically to do with computer software or hardware; the state machine is not confined to technology. The state is considered an attribute of the ball object. One can teach physics with such observations.
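The two-state ball machine described above can be written as a minimal sketch; the state names and the single transition are assumptions taken from the example, not a general FSM library.

```python
# A minimal two-state machine for the ball example.
class BallStateMachine:
    def __init__(self):
        self.state = "going up"  # State One

    def step(self):
        # Gravity and friction guarantee the only transition: up -> down.
        if self.state == "going up":
            self.state = "going down"  # State Two
        return self.state

ball = BallStateMachine()
print(ball.state)   # going up
print(ball.step())  # going down
```

Nothing here depends on hardware: the same diagram could be drawn on a whiteboard in a physics class, which is exactly the claim being made.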
Language — we usually understand language by writing and speaking one. For example, I write and speak English. Noam Chomsky in the late 50s defined classes of languages, from simple to more complicated and expressive. These language classes turn out to correspond to classes of automata. A key conceptual part of learning languages is to learn programming language concepts. For example, object-oriented (OO) design is a design concept that applies to numerous programming languages. What is more vital: learning Python or learning the principles of languages? You often need to know one or two languages before wading into the sea of concepts, but it is in the concepts that you can connect to worlds outside of the computer: think — apply object-oriented design to Alice in Wonderland (one of the projects I had in last semester’s modeling class). OO is conceptual, and thus mathematical. OO has multiple representations and realizations.
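To make the OO idea concrete, here is a hypothetical sketch of object-oriented modeling applied to a story; the class names, attributes, and behaviors are illustrative assumptions, not taken from the course project mentioned above.

```python
# Illustrative OO model of story characters: classes, instances, inheritance.
class Character:
    def __init__(self, name):
        self.name = name

    def speak(self, line):
        return f"{self.name}: {line}"

class Cat(Character):
    # Inheritance: a Cat is a Character with one extra behavior.
    def vanish(self):
        return f"{self.name} fades away, leaving only a grin."

alice = Character("Alice")
cheshire = Cat("Cheshire Cat")
print(alice.speak("Curiouser and curiouser!"))
print(cheshire.vanish())
```

The concepts on display (classes, instances, inheritance) are the transferable part; Python is merely one of many possible realizations.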
At this point, you might wonder, “Why is theory such a big deal?” It seems that coding is much simpler and more approachable than theory. That depends entirely on how it is taught. It is possible to convey all theoretical concepts in everyday language, spaces, and things. That is why theory is more vital than coding — because by focusing on the concepts and principles (i.e., computer science), you go beyond technology, beyond programming languages, and you can easily cross-connect with many other disciplines. Finite State Machines in business, biology, or physics? Sure — theory is mathematically grounded, and so transferable. Time to crawl out of the metal box.
If mathematics is the “science of patterns” then computer science, which came from applied mathematics, is the “science of information patterns.”
It is time to push the boundaries and leave the chemistry lab — meaning evolving out of the “coding” obsession. Coding is no more real computer science than telescopes are real astronomy, microscopes are real microbiology, or Bunsen burners are real chemistry. As a group, we computer scientists need to take the theory and apply it on a broad scale. There are some efforts to do this, but we need to do a lot more work. Many other disciplines can use what computer science has created in the way of theory. To do this, we have to shift beyond coding and not limit theory to computer science majors.
If you want to learn procedural literacy, algorithmic thinking, or computational thinking, there is a phrase for this: computer science. The field has been evolving since the inception of the mathematical components in the 1930s. Time to put away the Erlenmeyer flasks and get back into mathematics. Time to convey the mathematics as broadly as possible, using multiple disciplines and representations.