> But I think a critical point here is that just being able to clone willy-nilly is confusing one metalevel with another -- and here is where I disagree with the "Lieberman" and SELF (what I think of as the LISP, or "data structure") schools of prototypes. I feel it is very important to at some point make a commitment to "kind" (I use this word because better terms like "type" and
Kind is as kind does, I would think.
Given that, even Lieberman and Self were still enjoined to make sure that no matter how you fiddle with your objects, they'll still know what to do when they are asked to respond to the subset of their vocabulary that each client throws at them. (I, too, am trying to avoid loaded terms.) As long as what you do to an object at runtime doesn't violate the assumptions of the code that uses it, Smalltalk's traditional implicit dynamic "kinds" will still be intact.
Indeed, it's interesting to try to explain to programmers who are now being weaned on interfaces that Smalltalk programmers have been casually using this notion implicitly for twenty-five years.
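These implicit dynamic "kinds" are easy to see in any dynamically typed language. A minimal sketch (in Python rather than Smalltalk, with hypothetical class names): two unrelated classes that never declare a common interface, yet both satisfy a client whose entire "protocol" is the one message it actually sends.

```python
# Two unrelated classes that happen to share a fragment of vocabulary.
# Neither declares an interface; each simply responds to the messages
# its clients actually send -- an implicit, dynamic "kind".

class Duck:
    def speak(self):
        return "quack"

class Robot:
    def speak(self):
        return "beep"

def converse(speaker):
    # This client's whole "protocol" is the single message 'speak'.
    # Any object that responds to it is, for this client's purposes,
    # of the right kind.
    return speaker.speak()

print(converse(Duck()))   # quack
print(converse(Robot()))  # beep
```

As long as runtime surgery on an object leaves `speak` answering sensibly, no client is any the wiser -- which is exactly the sense in which the traditional "kinds" stay intact.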
The notion of somehow formalizing protocols has hung as an under-ripe piece of low-hanging fruit for fifteen years (I first saw it in Dan Ingalls's Green Book chapter, sometime during the mid-Reagan era.)
The original impetus was to use types to generate better code (as was the custom during that period). Indeed our group went down that road with Typed Smalltalk.
I think one reason this one is ripening now is that people are finally becoming convinced that black-box components are better for building flexible code than inheritance, and are looking for better language support, mechanisms, and tools for it. Over half of the Gang-of-Four design patterns involve some sort of delegation or forwarding, so it's natural to want to be able to express, for example, that for a given class (or instance, for that matter) all messages in a given set (Protocol, etc.) will be forwarded or delegated to a particular component. Therein can lie a tale.
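A Smalltalker would reach for a #doesNotUnderstand: override to express that kind of wholesale forwarding; a rough Python analogue (the class names and the forwarded-selector set are hypothetical) uses `__getattr__`, which is likewise invoked only when ordinary lookup fails:

```python
# Sketch: forward every message in a declared set of selectors to a
# component, roughly what Smalltalk does with a #doesNotUnderstand:
# override. Class names and the FORWARDED set are hypothetical.

class Engine:
    def start(self):
        return "engine started"
    def stop(self):
        return "engine stopped"

class Car:
    # The set of selectors this class delegates wholesale.
    FORWARDED = {"start", "stop"}

    def __init__(self):
        self._engine = Engine()  # the component we forward to

    def __getattr__(self, name):
        # Called only when normal lookup fails -- Python's nearest
        # relative of doesNotUnderstand:.
        if name in self.FORWARDED:
            return getattr(self._engine, name)
        raise AttributeError(name)

car = Car()
print(car.start())  # delegated to the Engine component
```

Declaring the protocol as data (the `FORWARDED` set) is the point: the forwarding relationship becomes something the class states once, rather than something smeared across a dozen hand-written stub methods.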
"class" have already been colonized with too-specific meanings in the
computer
world). So, while I think that you should be able to do absolutely
anything
with and to your programming system, I also think there should be
"guardians at
the gates" when you go from one metalevel to another. E.g. it shouldn't be
as
easy to change the inheritance chain as it is to send a safe message to
another
object.
The issue isn't so much whether it's easy as whether the meta-architecture, or language objects, tell a good story and provide the right hooks to let you accomplish what you want in a way that is clear to readers, and not merely diabolically clever. Good language objects, like any other objects, emerge from experience with trying to solve a range of real problems, and from diligent cultivation and refactoring. This means you need a coherent architecture at the metalevel, cast in terms of concepts that make sense to people who need to tailor the language to solve their problems, and not just a wide-open door.
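One way to picture a "guardian at the gates" is a hook that leaves base-level message sends completely free, but routes any crossing into the metalevel through an explicit, named operation. A sketch in Python, with entirely hypothetical class names (changing `__class__` stands in for fiddling with the inheritance chain):

```python
# Sketch of a "guardian at the gates": ordinary message sends are
# unrestricted, but crossing into the metalevel (here, reassigning
# __class__) must go through an explicit, named operation.
# All class names are hypothetical.

class Guarded:
    def __setattr__(self, name, value):
        if name == "__class__":
            raise TypeError(
                "change of class must go through become_kind_of()")
        object.__setattr__(self, name, value)

    def become_kind_of(self, new_class):
        # The sanctioned gate: a deliberate, greppable metalevel act.
        object.__setattr__(self, "__class__", new_class)

class A(Guarded):
    def kind(self):
        return "A"

class B(Guarded):
    def kind(self):
        return "B"

obj = A()
obj.become_kind_of(B)
print(obj.kind())  # the object now answers as a B
```

Nothing is forbidden -- the metalevel door is still there -- but walking through it is a visible act rather than something that happens as casually as a safe message send.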
> get rid of data structures is because an assignment statement is a different metalevel than a function call, and you just should not be able to do either
They really are on different "levels", only <level> isn't quite the right way to think of it. Rather, independent design dimensions emerge as you factor the language objects. Having <state> be a distinct metalevel concern is valuable because lots of issues arise that have to do with state independent of what is being modelled by an object: memory management, persistence, marshalling, to name but three.
These are the kinds of issues I think Alan was getting at when he talked about tangling one metalevel with another earlier. You can't do these with one flat, single inheritance/delegation chain very easily. It's the architecture behind these, the one that was built in C++ in Self, under-the-hood, where some of the really interesting stuff happens. The pieces you build the chain of still need to be first-class themselves too.
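Marshalling is the easiest of the three to make concrete. A sketch (in Python, with a hypothetical `Point` class) of treating state as its own concern: the marshaller works purely off an object's slots, knowing nothing about what the object models, so the same code serves any class.

```python
# Sketch: treating <state> as a separate metalevel concern. A generic
# marshaller walks an object's slots, not its domain concepts, so one
# piece of code serves Points, Accounts, anything. Names hypothetical.

import json

def marshal(obj):
    # State-level operation: slot names and values only.
    return json.dumps({"class": type(obj).__name__, "slots": vars(obj)})

def unmarshal(text, classes):
    data = json.loads(text)
    obj = object.__new__(classes[data["class"]])  # bypass __init__
    obj.__dict__.update(data["slots"])
    return obj

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

wire = marshal(Point(3, 4))
p = unmarshal(wire, {"Point": Point})
print(p.x, p.y)  # state restored with no Point-specific code
```

Because the code never mentions a domain concept, the same state-level machinery could just as well back persistence or memory accounting -- which is why it wants to live at its own metalevel rather than being tangled into every class.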
> Smalltalk-76 we perhaps reacted too much in the opposite direction, and didn't put a more appropriate meta-level in the language for defining syntax -- but the earlier way we worked simply led to a tower of Babel.
When the language itself is built out of objects, who cares how many semicolons can dance on the head of a pin? We know how to build different, dynamic views of things that are built out of objects.
> you mentioned this book in your oopsla keynote, but i haven't been able to get my hands on it. to be honest, i haven't tried that hard to get it because you remarked that it was probably only comprehensible to someone with an extensive Lisp background (which ain't me) and it really needed to
As a Smalltalker who learned Common Lisp expressly to get his hands on the early MOP and review the book, the Lisp just wasn't a huge obstacle given where I'd already been. I hope the part of Alan's recommendation that sticks with people is that it's a fascinating book, and not that the land of parentheses will eat you alive.
> be rewritten for Object Land at large (for which, i believe, you offered a second Smalltalk balloon like the one you gave Dan). so, since *you* read it and seemed to get a lot out of it, why don't you clue the rest of us in on some of its more salient points and how you think we might be able to translate them in squeak (or the next blue thang).
The sound bite goes something like:
When the language itself is built out of objects, which can be extended, specialized, customized, and manipulated on the fly like any other objects, it can become the vehicle for its own evolution.
A language built out of objects can adapt (within certain confines), without your having to spend five years before a standards committee to get it to change.
Instead, when you need new mechanisms and features, you build them. The "Art" of the MOP is coming up with a compelling, powerful, disciplined set of objects that make sense to people who want to change the language, in the same way that Collections and Dictionaries make sense to people who want to juggle piles of other objects. Otherwise, you're throwing people blindfolded into the boiler room with no idea which levers will blow up the ship (as Alan alluded to with the ST76 experience).
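Python's metaclasses are a small, living example of "when you need new mechanisms, you build them." A sketch (the registry mechanism and all names are hypothetical): a metaclass quietly adds an automatic class registry to the language, no standards committee required.

```python
# Sketch: building a new language mechanism out of metaobjects.
# A hypothetical metaclass gives us automatic class registration --
# a "feature" added by writing objects, not by amending the language.

REGISTRY = {}

class Registered(type):
    def __new__(mcls, name, bases, ns):
        cls = super().__new__(mcls, name, bases, ns)
        REGISTRY[name] = cls   # the new mechanism, in one line
        return cls

class Shape(metaclass=Registered):
    pass

class Circle(Shape):   # subclasses inherit the metaclass, so they
    pass               # register themselves too

print(sorted(REGISTRY))  # ['Circle', 'Shape']
```

The disciplined part is exactly what the paragraph above asks for: `Registered` has to make as much sense to a language-tailoring programmer as a Dictionary does to an ordinary one, or it's just a lever in the boiler room.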
The CLOS MOP was designed after Smalltalk's, and, given that it was fitted to CLOS and Lisp, has an oddly complementary object model, in which generic functions (selectors) rather than receiver objects control the lookup and dispatch process, and all the arguments to a generic function call can be involved in the dispatch process (that is, it has multimethods).
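The flavor of dispatching on all arguments, rather than on a single receiver, can be sketched in a few lines of Python (a deliberately minimal, hypothetical dispatch table; real CLOS also handles inheritance among argument classes, method combination, and more):

```python
# Sketch: CLOS-style multimethods, where the generic function -- not a
# receiver -- controls lookup, and the classes of *all* arguments
# participate in dispatch. Names and the table are hypothetical, and
# exact-type matching only (no argument-class inheritance).

class Asteroid: pass
class Ship: pass

_methods = {}

def defmethod(*types):
    def install(fn):
        _methods[types] = fn
        return fn
    return install

def collide(a, b):
    # The generic function owns the lookup; both argument
    # classes are keys into the method table.
    return _methods[(type(a), type(b))](a, b)

@defmethod(Asteroid, Ship)
def _(a, s):
    return "ship takes damage"

@defmethod(Asteroid, Asteroid)
def _(a, b):
    return "rocks bounce"

print(collide(Asteroid(), Ship()))      # ship takes damage
print(collide(Asteroid(), Asteroid()))  # rocks bounce
```

Note how the receiver has vanished as a privileged party: `collide` belongs to no class, which is precisely the inversion that makes the CLOS model feel complementary to Smalltalk's.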
I suspect Alan is more enamored with the notion of a simple, extensible, reflective, accessible kernel of language objects that have compelling, useful architectural identities than with multimethods and generic functions per se (though you can build these in ST80, and if you get the objects right, you can start to make it look pretty. This is one of the issues we've been looking into.)
There is, too, the beauty of weaving the reflective braid; of constructing the hall-of-mirrors illusion, and bootstrapping it out of bits, and establishing an interesting relationship between the under-the-hood objects and the language objects, and...
The MOP tale, of course, has a long history, in which ST80 and Lisp both play prominent roles. The Smalltalk philosophy of making as much as possible be as open and first-class as possible was one of the primary influences on the MOPers. The philosophy above is one I first saw in the ST74/6/80 languages, and it can be traced back to McCarthy's little blue book, as Alan noted in Georgia.
--BF (who has been doing the MOP and Reflection stuff with STxx for a Real Long Time(TM)). http://laputa.isdn.uiuc.edu
squeak-dev@lists.squeakfoundation.org