“Computational thinking” revisited in the age of generative madness

Jesse Kim
5 min read · Apr 26, 2024

Computational Thinking / © 2024 Jesse Kim

Statistics was definitely the most boring, uninspiring subject in my last two years of high school. Years later, I got into a postgraduate programme in an unrelated discipline. There, my resentment towards statistics temporarily shot up beyond three standard deviations when I was asked to sit and pass a test recapping the essence of high-school statistics as part of the admission formalities.

Some more years passed, and something rather intriguing happened. Statistics became known by a new name: Data Science. All of a sudden, Data Scientist became the sexiest profession of the 21st century. Even to this day, people are rushing to bootcamps, Master’s-level degrees, and YouTube videos to figure out what data science is and how to get on the bandwagon.

Antiquated connotations

Computer Science, the discipline I majored in, has by contrast not benefited from such timely rebranding; it has gone by the same name since the Cold War days. Much like Statistics, “Computer” carries antiquated connotations of some sociopathic, pizza-eating nerd typing away at lines of unintelligible code day and night in a back room.

What’s a computer, really? Is it a pocket calculator? A beige box? A robot with four limbs? Coin-mining semiconductors? A data centre submerged in the North Sea? Human-mimicking intelligence descending from the cloud that sucks up unprecedented amounts of electricity? Perhaps there is a reason why Computer Scientist is not an intuitively understood term for a profession, arguably much less so than Data Scientist, Civil Engineer, or Art Historian. Computer is a word that has been stretched too far to retain any consistent meaning. [There are, of course, other equally abused terms for which it is difficult to answer “What isn’t?” and which require the speaker to explain what they mean every single time: Digital, Strategy, Governance, IT, and AI, just to name a few.]

What did I gain from computer science?

Reflecting on what computer science as an academic discipline taught me, and what competitive edge materialised out of it, makes me think long and hard.

A degree in computer science did not equip me with the hard skills required for real-world system design, provisioning, automation, business intelligence, or — gulp — generative AI. Nor was it supposed to, because a multi-year academic curriculum is not some six-week bootcamp set up in haste to chase whatever the latest fad is. The programming exercises and assignments at a Y2K-era university were collectively a broad yet shallow exploration of historically validated concepts, often in languages students suspected they would never use out in the industry.

The theoretical part of computer science, namely data structures and algorithms, together with all the textbook jargon, is now just a faint memory to me. I have never in my career said to myself or anyone, “This problem can easily be solved with a linked list!” or “If that needs to run in logarithmic time, binary search is the way to go.” Not once. The hands-on programming, data, and consulting skills with which I produce commercial outcomes owe heavily, and incestuously, to fresh learning, re-learning, and soul-searching on the job.
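For readers who never sat through those lectures, here is a minimal, purely illustrative Python sketch of the two bits of jargon above, a linked list and binary search; the names and values are hypothetical, not anything from actual coursework.

```python
# Two textbook staples, purely for illustration.

class Node:
    """A singly linked list node: a value plus a reference to the next node."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    The search interval halves on every step, so the running time is
    O(log n), which is the textbook reason to keep the data in a
    sorted array (with O(1) random access) rather than a linked list.
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

# A three-node linked list, and a search over a sorted array.
head = Node(1, Node(2, Node(3)))
print(binary_search([2, 3, 5, 7, 11, 13], 11))  # prints 4
```

And that, in a dozen lines, is roughly the last time I wrote either of them outside a classroom.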

What’s left is computational thinking.

Does my recollection as a CS major, then, imply that my undergraduate years were a waste of time?

I don’t think so. Computer science actually taught me a great many principles and techniques that I have used and developed, often subconsciously, throughout my career and post-university life. If I were to give these principles and techniques a name, it would be methodical approaches to problem-solving and to realising advancement.

Recently, however, I came across a 2006 article by Professor Jeannette Wing of Carnegie Mellon University, who referred to everything computer science taught me by a shorter name: computational thinking. Okay, computational thinking it is. I particularly like that this three-page piece is highly readable to any audience, academic or not. No, it is definitely nothing like the kind of otherworldly research publication a community AI cheerleader would casually recommend on social media, knowing full well that their audience won’t bother reading it and that they have not digested the content themselves, either. In any case, I quote from the Wing article:

Computational thinking involves solving problems, designing systems, and understanding human behavior, by drawing on the concepts fundamental to computer science. […] Having to solve a particular problem, we might ask: How difficult is it to solve? and What’s the best way to solve it? Computer science rests on solid theoretical underpinnings to answer such questions precisely.

The author then does an amazing job of deriving the crux of computational thinking from all corners of computer science and summarising it succinctly in an array of principles, techniques, priorities, and characteristics. If there were a test designed to recap the essence of computer science, her article would make the ultimate cheat sheet. It is a timeless distillation of how to approach, evaluate, and solve problems.
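To see how precisely computer science can answer Wing’s “How difficult is it to solve?”, consider a back-of-envelope Python sketch (mine, not the article’s) for searching one million sorted items:

```python
import math

# How difficult is it to solve? For searching n sorted items,
# complexity theory gives a precise answer: a brute-force scan
# costs up to n comparisons, while binary search, halving the
# interval each step, costs only about log2(n).
n = 1_000_000
linear_worst = n                        # worst case for a linear scan
binary_worst = math.ceil(math.log2(n))  # worst case for binary search

print(f"linear scan:   up to {linear_worst:,} comparisons")
print(f"binary search: about {binary_worst} comparisons")
```

That gap, a million comparisons versus about twenty, is the kind of exactness the article credits to computer science’s solid theoretical underpinnings.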

More importantly, Professor Wing declares from the get-go that “[computational thinking] represents a universally applicable attitude and skill set everyone, not just computer scientists, would be eager to learn and use.” I may be biased, but I can positively see computational thinking applying equally to chefs running a commercial kitchen, subject-matter experts in medical device sales, chief operating officers, and athletes competing at all levels. Admittedly, every respected scholar is able to position their knowledge and intellectual values as universally applicable truths whose reach extends far beyond their own domain of specialisation; Wing’s computational thinking is a stellar example of that. After all, such is precisely how most philosophy, psychology, and management books are born.

Conceptualising, not programming

Professor Wing’s article may be only three pages long, but its coverage of computational thinking is enormous, ranging from complexity to business continuity to modularisation to aesthetics. As with any intellectually stimulating piece of work, it takes curiosity and basic research to see through the terminology and internalise each resonating concept into an attitude and skill set one can identify with and develop.

One of the key characteristics of computational thinking is that it is “a way that humans, not computers, think,” as Professor Wing put it in 2006. Computational thinking is a product of conceptualising, not programming; a fundamental skill, not a mechanical routine. Such, she stressed early on, is what lies at the core of computer science.

We humans make computers exciting. Equipped with computing devices, we use our cleverness to tackle problems we would not dare take on before the age of computing and build systems with functionality limited only by our imaginations.

Curiously, the above contrasts sharply with the view many humans hold in 2024, that machines generate output from an intelligence of their own: Equipped with many billions of parameters, we use generative AI’s cleverness to tackle problems we would not dare take on before the age of large language models and build systems with functionality limited only by available training data and cloud computing bills.

All in the name

Lastly, circling back to my earlier observation: Computer Science would most definitely benefit from a name change, however subtle it may be. Computational Science already sounds far more modern and on point. Besides, there is no reason why it should remain an area of science taught exclusively in universities and equivalent tertiary institutions. Perhaps myriad new bootcamps and online courses titled Computational Thinking or Computational Methods will spring up as soon as the generative madness dies down.
