
What degrees of knowledge are there? What defines smart?

Is it test scores? Is it something conferred by association with “top professors”? Is it some sort of peer judgement?

A top professor at the Ivies, Cal, Chicago, Stanford, MIT, etc., gets paid a whole lot more than an adjunct professor at a state college.

The conventional take is that a “top professor” is worth the $$$ because they somehow impart a smarter version of the same material. We’d be hard pressed to define exactly in what way it’s smarter; we just know it when we see it.

The same problem occurs with defining how smart a particular version of a particular LLM might be. There are the test scores, various numbers on various graphs… and then there’s the usage worthiness. Many AI experts now say it’s the latter that really tells the tale on smartness.

One of the wonders of the world of LLMs is the compression of the Internet into a file that would fit on a thumb drive. We don’t really know how that’s being accomplished. And we don’t know how closely this simulacrum reproduces or replaces internet content.
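
For a sense of scale, here is a rough back-of-envelope sketch in Python. Every number in it (parameter count, bytes per weight, corpus size) is an illustrative assumption, not a measurement of any particular model.

```python
# Back-of-envelope sketch: how much an LLM "compresses" its training text.
# All figures below are illustrative assumptions, not measurements.

params = 70e9             # assumed parameter count (a 70B-class model)
bytes_per_param = 2       # 16-bit weights
model_bytes = params * bytes_per_param

training_tokens = 15e12   # assumed training-set size in tokens
bytes_per_token = 4       # rough average for English text
corpus_bytes = training_tokens * bytes_per_token

GB, TB = 1e9, 1e12
print(f"model file:    ~{model_bytes / GB:.0f} GB (fits on a thumb drive)")
print(f"training text: ~{corpus_bytes / TB:.0f} TB")
print(f"ratio:         ~{corpus_bytes / model_bytes:.0f}x smaller")
```

Under those assumptions the model file is a few hundred gigabytes while the text it was trained on runs to tens of terabytes, which is the sense in which the Internet “fits on a thumb drive.”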


Is knowledge or content being lost in the process of compression? Or, as seems most likely, is AI simply a better form of knowledge compression than we previously had access to?

We have used knowledge compression in all forms of learning and education without necessarily calling it that. It is perhaps best explained by the distinction between the CliffsNotes version and the read-the-book version, the latter generally considered the superior level of knowledge.


We might look to the entry-level college course, the 101 version, and compare it to the 200-level and above versions of the same material.

One is very compressed, while the others are elaborated, detailed, comprehensive, and usefully specific.


Then there’s postgraduate work, which for the most part is simply a continuation of “smarter versions” and “more specific versions” of the same material. Or sometimes it’s just more academic process that trains the next generation of professors.

We could also look at K to 12…each grade a step up in the level of knowledge being imparted.

One of the current ways we use LLMs is to take a mass of specialized content and turn it into a summarized CliffsNotes version. LLMs can produce a more or less unlimited amount of deep research, while humans have definite limits on time, energy, attention, and retention. That’s a mismatch. Plus, AI never sleeps, eats, or takes a weekend off.
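
A minimal sketch of that workflow, assuming the official OpenAI Python client and an API key in the environment; the model name and prompt are placeholders, not a recommendation.

```python
# Minimal sketch: condensing a long research dump into a "CliffsNotes" summary.
# Assumes the openai package is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

def summarize(text: str, max_words: int = 300) -> str:
    """Ask the model for a compressed, bite-size version of `text`."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any capable chat model would do
        messages=[
            {"role": "system",
             "content": f"Summarize the user's text in at most {max_words} words, "
                        "keeping only the key claims and findings."},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content

# Usage: feed in a pile of AI-generated "deep research" and get back a digest.
# print(summarize(open("deep_research_report.txt").read()))
```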

So we summarize AI-created content into bite-size chunks… but even then, we soon have a huge pile of summaries that we need to archive or put in some form of library for future reference. Often that library is never accessed again, and it may become obsolete as AI continues to improve and give better answers.
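
That “library of summaries” can be as simple as the sketch below; the folder layout and file naming are assumptions for illustration only.

```python
# Minimal sketch of a summary library: each bite-size summary is filed away
# with a date and topic so it can, in principle, be found again later.
import json
from datetime import date
from pathlib import Path

ARCHIVE = Path("summary_library")
ARCHIVE.mkdir(exist_ok=True)

def archive_summary(topic: str, summary: str) -> Path:
    """Store one summary with enough metadata to locate it later."""
    entry = {"topic": topic, "date": date.today().isoformat(), "summary": summary}
    path = ARCHIVE / f"{date.today().isoformat()}_{topic.replace(' ', '_')}.json"
    path.write_text(json.dumps(entry, indent=2))
    return path

# In practice these files pile up, are rarely reopened, and quietly go stale
# as newer model runs produce better answers, which is exactly the problem.
```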

So how might we characterize this form of knowledge compression that seems so tantalizingly powerful and yet so frustratingly out of reach if our goal is to be fully educated and maximally “smart”? There’s a seeming paradox: more content can be created, which generally means more comprehensive knowledge, but if humans don’t have the bandwidth to consume it efficiently or effectively, one would question its value.

There’s also a scary implication: if AI can produce mounds of content beyond human capacity to absorb, the usefulness of human brains and brainpower may seem beside the point.

Here we’re back to the fundamental questions: what is learning, what is education, what is useful smarts? Our answers to those questions are still fuzzy and vague. We need better, more useful answers in this era of all-encompassing knowledge at our fingertips.