Last Updated: February 25, 2016
xaviervia

Teaching is the duty of the wise.

The old teach the young. It has been the norm since the dawn of humanity, and the basic mechanism behind it—people becoming old and full of knowledge and new people being born unlearned—has not changed. Somewhere in the past few centuries, however, a confusion was introduced by the institution of formal education, and we now mistakenly tend to think of teaching as the duty of teachers alone.

From an effectiveness standpoint, this is a mystifying attitude. Teachers who do nothing but teach become really good only at teaching, and not so much at what they are supposed to teach. This is a good enough fit for the early years of education, when motivation and containment matter far more than content, but it is not for the process of learning a serious craft.

Universities are usually staffed with some of the best professionals in each field, which has resulted in an acceptable compromise. This compromise could have gone on forever, had something not happened in recent years that challenged the whole edifice of formal education. That something is the Internet.

When you attend a class, you do it because you want to learn from a teacher who knows more than you. Since someone hires the teacher, you are conceding that the learning institution has the means to identify a knowledgeable person with reasonable accuracy. She must be good enough to actually know more than you, and to continue doing so for a period. For a growing number of disciplines, however—software development as a lead example—this is no longer the case. Not only is a person's competence not objectively knowable, but the field itself is not fully knowable for a long enough time frame. Many fields can and will change several times over the course of a single year, an impossible situation for any professor bound to a syllabus.

When explaining this situation to people, I often hear a case made for a teacher's ability to know permanent things, such as algebra. But the point I'm trying to make is about the utility of the formal education system as preparation for the industry, and in that regard the eternal nature of mathematics is an easily contested argument. Sure, algorithms are forever—still, which algorithms are relevant in the current programming paradigm? Is two-dimensional geometry relevant for user interface design, or has augmented reality taken over? You can't study all of mathematics just in case; it's simply too much. Instead, you have to be able to flow.

Furthermore, formal and foundational knowledge of most disciplines is both easy to access on an on-demand basis (for example, using an encyclopedia) and well suited to studying on your own. This is not the case for most practical endeavours.

Waking up the Master

The challenge is not as serious as it may sound (although it may pose a serious threat to formal education institutions), because the means for learning informally were always available: you can learn from uncelebrated colleagues who have reached some deep understanding of the craft—or, as tradition named them, masters.

Yet you cannot simply ask someone to become a master in order for her to teach you. You need to commit yourself to the process, because mastery is not just a state of wisdom: it's a relationship between the learned and the trainee, a recognition that the now-teacher was once ignorant and clueless, and that the student may someday become a master himself. Because knowledge has evolved beyond the point at which a central institution can impart it efficiently, the calling falls back on the community as a whole, and the old relationship becomes relevant again. In particular, we coders have not yet had the collective epiphany that teaching is everyone's responsibility. But we should. Our culture does not yet reflect the extent to which giving back by sharing is close to being a moral imperative. But it should.

Feed the Beast

Knowledge is free on the Internet. On the Internet, it feels as if knowledge yearned to be free, as if it actively wanted to break every possible limitation—be it language, speed, format, even legislation. But information has a cost, since someone, a real person walking this world, has collected it. Somebody wrote the enlightened articles on A List Apart. Other somebodies wrote those articles on Wikipedia on which so many theses are based, while other somebodies did the experimental research cited in those articles. This chain of human collaboration is not a controlled process and has no central agent, but we can all see its impact on our daily lives, and we all owe our share of gratitude to anonymous, unknown people who contributed to solving vital problems we no longer face. You cannot repay those people with money or mere thanks—most of them have already passed away anyway—but you can still, you must still, give back to the process. You can feed it further, give it what it craves, and that is more knowledge.

Ask yourself: how many people that you know could benefit from the snippets of wisdom you acquired over the years? I'm not talking about common sense or catchphrases either; I mean actual tips, immediately applicable, that make work and life that much easier for you. Can you imagine many other people benefiting from those tips? Have you written them down? Have you published them? Has anyone's day been easier because of some small (or large) discovery you shared?

This is the price of all knowledge, even the free kind. And as you become wiser, you can think of wisdom as coming to you because of an oath you made. This is my oath, in appropriately apocalyptic prose:

"I shall share my knowledge. I shall expand my field of study. Whenever experience enables me, I shall poke the limits of the possible, explore the uncharted territories, and share my discoveries with the next generations, as my elders did with me, and so my debt shall be payed".

This is mastery.