Abstraction
If you’re enjoying Brain Lenses, consider subscribing, sharing it with a friend, and/or clicking the little heart to give it an algorithmically significant “like.”
Free essays are published every Tuesday, and paid subscribers receive an additional essay on Thursdays.
Abstract art is often, though not always, non-representational. Meaning it’s a departure from traditional art forms meant to portray things that exist in tangible reality—people, nature, bowls of fruit.
Instead, abstract art is generally meant to portray less palpable ideas and concepts: emotions, vibes, sensibilities and sensations, relationships, even mathematical formulae.
An abstract, used in the context of a research paper or scientific article, is a high-level summary of what was done, what was found, and what it might mean.
This section of a paper doesn’t fixate on the raw data or other nitty-gritty specifics. Instead, it’s a representation derived from all those details, then blended and blurred so that a new, more concise type of analysis is available to those who care about the outcomes, and perhaps the legitimacy of the approaches taken, rather than every last detail about how it was done and the precise shape of each data point captured along the way.
In the world of computer science, abstraction refers to the generalization of detail-rich actions and concepts into more succinct, and often casually powerful, versions of the same.
In practice, this might mean bundling up a collection of tools and processes so that, rather than needing to write code using the base-level 1s and 0s that computers utilize as their logic gate-based language, we can type out instructions using letters and numbers, which are then converted into 1s and 0s. This allows us to focus on higher-level concerns communicated via human-optimized language, while that lower level of operation is “abstracted away” by the development languages and methods we utilize.
Each line of a modern coding language like Python, then, is a stand-in for a bundle of other processes, which are themselves simplified versions of other processes, and that unpacking of overt complexity continues, layer by layer, all the way down to the level of 1s and 0s; at which point we realize that computer science, to some degree, has been an abstraction of certain electrical engineering and mathematical concepts all along.
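That layering can be sketched in Python itself. The following is a toy illustration, not how any real compiler or CPU actually works: bitwise operations stand in for the logic gates at the bottom, a ripple-carry adder stands in for the hardware layer, and a single function call at the top stands in for the “one line of code” a programmer actually writes.

```python
# A toy sketch of abstraction layers: one high-level addition resting on
# bitwise operations that stand in for the logic gates beneath.

def half_adder(a, b):
    """Lowest layer we model: two 1-bit inputs -> (sum bit, carry bit)."""
    return a ^ b, a & b  # XOR gate, AND gate

def full_adder(a, b, carry_in):
    """Chain two half adders so a carry can ripple through."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2  # an OR gate combines the two carries

def add_bits(x, y, width=8):
    """Middle layer: add two small integers one bit at a time, as hardware does."""
    result, carry = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

# Top layer: the single line a programmer actually thinks about.
print(add_bits(19, 23))  # → 42, behaving just like 19 + 23
```

Each function hides the one below it; by the time you reach `add_bits`, the gates have vanished from view, which is exactly the point.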
There’s abstraction within abstraction, then, and as we simplify at each higher level—often called an abstraction layer—we allow people operating within that loftier layer to focus on other things; not necessarily more important things, but things that exist at a different scale of complexity. Because although obvious complexity increases as we go lower, down toward the 1s and 0s, we’re often able to do more complex things utilizing higher-level languages because they make that complexity more usable.
When you no longer need to worry about the specifics of how your app allows users to pay for things, for instance—or going back to abstract art, which painted, real-life objects might cause someone to feel a particular emotion—you can focus, instead, on other, previously inaccessible or difficult-to-ponder elements of whatever it is you’re doing; you can call a code library that does all the money-processing for you, and you can utilize raw color and texture to stimulate those emotions.
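As a hedged sketch of that payment example: the class and method names below are invented for illustration (no real payment library is being depicted), but the shape is representative. The app developer sees one method; everything behind it is someone else’s layer.

```python
# A hypothetical sketch of how a payment library abstracts away detail.
# PaymentClient, charge, and "tok_visa" are all invented names for
# illustration, not a real library's API.

class PaymentClient:
    """Stand-in for a third-party payment library's public surface."""

    def charge(self, amount_cents, currency, token):
        # Behind this one call would sit card networks, encryption,
        # retries, fraud checks, settlement... all abstracted away
        # from the app developer who merely invokes it.
        return {"status": "succeeded", "amount": amount_cents, "currency": currency}

client = PaymentClient()
receipt = client.charge(1999, "usd", token="tok_visa")
print(receipt["status"])  # → succeeded
```

The caller’s entire mental model is “charge this amount”; the lower layers exist, but no longer demand attention.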
What’s fascinating about this concept is that the more you think about it, the more you become aware that there are few aspects of everyday life that are not abstractions of something else; and this is perhaps true of everything if we think in terms of brain signal processing.
But even stopping short of that, most of us exist within ultra-high abstraction layers; for better and for worse.
Civilization, for instance, is a broad term that refers to a collection of complex systems that allow us to do things like work together to make things, build cities, specialize, and manage resources.
Fundamental to many civilizations are the organizational models that allow us to figure out who decides what, which negative and positive incentives will be used to keep people doing things that are beneficial for our collective aims while disincentivizing things that are not, and how our civilization will perpetuate itself—including how we deal with black swan events, like natural disasters, wars, and other potentially civilization-ending variables.
Tucked within the high-level abstraction of civilization, we find other concepts like the rule of law, economics, and ideological systems.
Taking just one of these concepts apart, we find that economic theory is predicated on having, among other things, a means of exchanging value.
Exchanging value can be done in all kinds of ways, but the method most cultures have settled upon in the modern world is the use of a central unit of exchange called currency, which itself is considered valuable because of its centrality and finitude.
Currency is an abstraction of the value it represents, of the systems that allow us to create said currency and to back it with physical or theoretical objects of value, and, to some degree, of the civilizational infrastructure that maintains the metaphorical pipes that keep it flowing, cycling, and ready to be exchanged—including the enforcement of its use over any other possible means of exchange (like cryptocurrency or cowry shells).
These exchanges, too, abstract away a more natural level of resource-management that’s often directly reliant upon social status, mating, and violence—in some cases all three at once.
When we discuss abstraction, too, we’re using language, which, if spoken, is a collection of sounds we’ve agreed to use in a particular way to represent certain language-based concepts, which themselves represent a collection of ideas—and which, when written, involve using glyphs—letters and scripts and other symbols—to encode those chunks of meaning, instead.
These chunks, when combined, represent organized collections of concepts, expressed in a particular fashion, according to linguistic rules, which themselves represent lower-level concepts.
Abstraction, then, blurs our perception of, and shapes our experience of, reality.
Each abstraction layer zooms out our perspective, like a camera moving further and further away from whatever it is we’re viewing.
As we move further back, that thing we’re looking at becomes less clear, less distinct: perhaps still visible, but we can’t make out as many details.
But other, larger structures begin to come into view as we pull the lens back and back and back: things we never noticed before, because we were zoomed too far in.
This is how our perception can both increase in scope and diminish in clarity as we grow, both as individuals and as a species.
Such growth, though valuable, often requires that we abstract away otherwise-visible details in order to catch a glimpse of the bigger picture.
Enjoying Brain Lenses? You might also enjoy my podcast, Let’s Know Things.