Any HW hackers here? Has anyone computed what a Dynabook would cost?
First, I'm interested in understanding why true Dynabook hardware has not been available. Is it just too costly? I think we can trace the reasons why the software side of a Dynabook has not been fulfilled (although, I trust that Viewpoints NSF grant will close this gap). I thought that something more powerful and a bit bigger than the Nokia N800 but smaller than today's typical small notebook would be available by now. But I can't find one. I gotta believe that companies like Toshiba have investigated this thoroughly. And, if so, they've come to the conclusion that there is no business. Or if not, maybe by their bureaucratic blindness they have totally missed this market.
Second, I'd like to understand what the cost of a Dynabook would actually be in today's dollars.
I think a true Dynabook is long overdue. Today's hardware doesn't suffice. Either it's too underpowered, or too bloated. Too small, or too heavy. (and all too expensive!)
Any thoughts about the specs of the Dynabook and why we are still waiting? (hmm... is the XO close?... wonder if parts will be available)
(I still have my Sony Magic Cap. It was a bit too underpowered and a bit too thick. And alas, no Smalltalk.)
Hi Brad--
I think a true Dynabook is long overdue. Today's hardware doesn't suffice. Either it's too underpowered, or too bloated. Too small, or too heavy. (and all too expensive!)
What do you have in mind? How fast, how big, how heavy, how much? Personally, I think something even as mundane as a PowerBook is adequate; certainly more appealing than the old Dynabook mockups.
-C
Craig Latta wrote:
Hi Brad--
I think a true Dynabook is long overdue. Today's hardware doesn't suffice. Either it's too underpowered, or too bloated. Too small, or too heavy. (and all too expensive!)
What do you have in mind? How fast, how big, how heavy, how much?
Personally, I think something even as mundane as a PowerBook is adequate; certainly more appealing than the old Dynabook mockups.
My question was open-ended to allow others to describe what they feel would represent the best in dynabook hw. Do you feel the PowerBook is only adequate, a good model, or the best out there right now?
My question was open-ended to allow others to describe what they feel would represent the best in dynabook hw.
Ah, but you said the current stuff is unacceptable in various ways, which to me says that you have some magnitudes in mind that *would* be acceptable. I always like to know where the finish line is when someone says we haven't reached it. (Usually it turns out that there isn't one; people are just never satisfied. :)
Do you feel the PowerBook is only adequate, a good model, or the best out there right now?
Yes, I'd call it only adequate, but that's an important milestone after all the painfully inadequate stuff we tortured ourselves with in the past. :) I'd also say it's the best thing going right now. The improvements I'd make are enough battery life to operate full-out for a person's entire waking day, a modular processor connection design that allows use in other smaller and larger devices, and a multi-touch display.
But this is above and beyond the "Dynabook" hardware vision, which I think has effectively been met already. What's missing is the software, and a culture that aspires to doing more than reading email.
-C
Craig Latta wrote:
My question was open-ended to allow others to describe what they feel would represent the best in dynabook hw.
Ah, but you said the current stuff is unacceptable in various ways,
which to me says that you have some magnitudes in mind that *would* be acceptable. I always like to know where the finish line is when someone says we haven't reached it. (Usually it turns out that there isn't one; people are just never satisfied. :)
I should be more accurate here. What I mean is that I haven't seen an acceptable Dynabook. But, I do believe the hardware parts are available today to construct such a beast. (plus, I'm never satisfied...)
If you witness the battle between, say, the OLPC and the Intel Classmate, I think the problem is that the vendors just want to push out yet another Windows laptop. Even Apple had to fight to convince people that laptops could come in colors other than black.
I wonder if we would otherwise still be lugging about telephone-book-sized black laptops, because that represents the less risky path.
On May 22, 2007, at 11:10 AM, Brad Fuller wrote:
Craig Latta wrote:
My question was open-ended to allow others to describe what they feel would represent the best in dynabook hw.
Ah, but you said the current stuff is unacceptable in various ways, which to me says that you have some magnitudes in mind that *would* be acceptable. I always like to know where the finish line is when someone says we haven't reached it. (Usually it turns out that there isn't one; people are just never satisfied. :)
I should be more accurate here. What I mean is that I haven't seen an acceptable Dynabook. But, I do believe the hardware parts are available today to construct such a beast. (plus, I'm never satisfied...)
--
John M. McIntosh <johnmci@smalltalkconsulting.com>
Corporate Smalltalk Consulting Ltd.  http://www.smalltalkconsulting.com
On Tuesday 22 May 2007 11:42 am, Brad Fuller wrote:
Any thoughts about the specs of the Dynabook and why we are still waiting? (hmm... is the XO close?... wonder if parts will be available)
AFAIK, the driving factors for personal computing have always been battery life, weight, networking and tight integration between hardware and operating software (i.e. no superfluous components), in that order. It is frustrating to see computing devices sold by megahertz, multi-core, RAM/HDD capacity, camera megapixels and so on.
I think the word "book" in the name biases us to think of a screen built into the machine. If we drop this assumption, then a Dynabook could just be a small computer embedded in a foldable panel that opens out to an 84-key keyboard, with a small 2" preview OLED screen at the top and a resistive touchpad at the bottom. A micro-projector would cast a screen up to 17". USB slots along the edges take in flash memory cards for user data. When a card is plugged in, the machine starts up automatically and personalizes itself based on files on the card. When the card is ejected, the system shuts down.
Dreaming :-) .. Subbu
I'm thinking not in terms of what it is, but rather how and in what context it would be used:
* To be used ubiquitously in any context, it needs to not only be small and have good battery life, but it needs to be cheap and "losable." I think Alan gives an example of taking it to the beach or a raft in the pool. (This also implies replicated external storage.)
* I don't want to just execute prescribed tasks with it, I want to explore and problem-solve (e.g., in the http://nakedobjects.org sense). This may be getting beyond the scope of an electronic book, but I think this is consistent with the general thrust of the dynabook and dynamic languages community. (You could maybe argue that real books with pages are more exploratory/problem-solving than scrolls.) In any case, my feeling (which I'm a relatively recent convert to) is that the best way to do this is with direct manipulation (in both the language sense like self, and the UI sense like the iPhone).
* The things I want to explore and manipulate include all media, for which I want to both get existing media/communications (networked) and capture my own (camera and microphone, possibly in stereo or higher degrees for 3D scanning).
I don't think the hardware -- or the software -- is quite there yet to accomplish all this, but it's getting close. To the degree that one believes that the dynabook hasn't really happened yet, I wonder if it is because we have not yet satisfyingly achieved all the above simultaneously.
-H
subbukk wrote:
On Tuesday 22 May 2007 11:42 am, Brad Fuller wrote:
Any thoughts about the specs of the Dynabook and why we are still waiting? (hmm... is the XO close?... wonder if parts will be available)
AFAIK, the driving factors for personal computing have always been battery life, weight, networking and tight integration between hardware and operating software (i.e. no superfluous components), in that order. It is frustrating to see computing devices sold by megahertz, multi-core, RAM/HDD capacity, camera megapixels and so on.
I think the word "book" in the name biases us to think of a screen built into the machine. If we drop this assumption, then a Dynabook could just be a small computer embedded in a foldable panel that opens out to an 84-key keyboard, with a small 2" preview OLED screen at the top and a resistive touchpad at the bottom. A micro-projector would cast a screen up to 17". USB slots along the edges take in flash memory cards for user data. When a card is plugged in, the machine starts up automatically and personalizes itself based on files on the card. When the card is ejected, the system shuts down.
Dreaming :-) .. Subbu
Howard Stearns wrote:
I'm thinking not in terms of what it is, but rather how and in what context it would be used:
- To be used ubiquitously in any context, it needs to not only be
small and have good battery life, but it needs to be cheap and "losable." I think Alan gives an example of taking it to the beach or a raft in the pool. (This also implies replicated external storage.)
- I don't want to just execute prescribed tasks with it, I want to
explore and problem-solve (e.g., in the http://nakedobjects.org sense). This may be getting beyond the scope of an electronic book, but I think this is consistent with the general thrust of the dynabook and dynamic languages community.
This is consistent with what I understand the Dynabook to be: a dynamic medium that molds to one's needs; a unique and personal extension of one's work and play. And this system should be able to be updated with object enhancements authored by others. For instance, if a new video playback medium is invented, the Dynabook should be able to update and adapt. Squeak isn't quite there yet today. It has problems just moving objects from one version to another (it can't even play back all video and audio formats). But it's a solvable problem: maybe one area to look at is not the object, but the message. Maybe the objects are different on each personal machine, in that they match needs locally (the user's modifications and the HW), and the message is what is unique across Dynabooks.
I also see the Dynabook interoperating with other Dynabooks and other external objects. Not that a user pulls up a web browser and surfs, but rather accesses external objects as if they were locally resident. Today, Rich Internet Applications (RIA) are a buzzword (applications that access the network for their own needs w/o a browser) -- but this is something Squeak has had fundamentally from early on, and Croquet has developed further with islands. Again, not completely usable in Squeak today, but solvable. Web data should be manipulable like any other object inside the Dynabook. Security is a concern that needs more research.
The idea of a projector is interesting in that it removes screen real estate from the product. But it means the product is less personal and can't be used outdoors. I like the idea of the XO's display and technology. Maybe it'd be a good candidate for the Dynabook display.
I don't think the hardware -- or the software -- is quite there yet to accomplish all this, but it's getting close. To the degree that one believes that the dynabook hasn't really happened yet, I wonder if it is because we have not yet satisfyingly achieved all the above simultaneously.
I don't know. Not that the Dynabook will ever be set in stone, but I think a version of the Dynabook can be pretty much thought out and planned today. I think the hardware is there, or extremely close (1.8" HDs are reaching 100-120GB). I think the software is almost there (of course, "almost" is relative!). What I see is that we are past the concept and idea phase and the rest of the work is mostly sweat - with the occasional, and needed, brilliant light bulbs along the way to encourage new development ideas. Alan, Dan, Yoshiki, Andreas and Ian's paper is a good example of this thought process. Now, I think it is a matter of scale.
I want one now, though ;-)
On Tue, May 22, 2007 at 09:44:37AM -0500, Howard Stearns wrote:
I'm thinking not in terms of what it is, but rather how and in what context it would be used:
- To be used ubiquitously in any context, it needs to not only be small and
have good battery life, but it needs to be cheap and "losable." I think Alan gives an example of taking it to the beach or a raft in the pool. (This also implies replicated external storage.)
XO has all these features.
- I don't want to just execute prescribed tasks with it, I want to explore
and problem-solve (e.g., in the http://nakedobjects.org sense). This may be getting beyond the scope of an electronic book, but I think this is consistent with the general thrust of the dynabook and dynamic languages community. (You could maybe argue that real books with pages are more exploratory/problem-solving than scrolls.) In any case, my feeling (which I'm a relatively recent convert to) is that the best way to do this is with direct manipulation (in both the language sense like self, and the UI sense like the iPhone).
This is the area where the Dynabook is most lacking. Pepsi, Slate, Magritte, Seaside, Enlightenment[1], Sugar[2], eToys, Scratch, Tweak, and Croquet are all partial solutions to the problem, but they are all quite separate so far. This list is clearly Squeak-biased, as I don't keep up with other projects. I don't understand what you mean by "the UI sense", so I cannot comment.
- The things I want to explore and manipulate include all media, for which
I want to both get existing media/communications (networked) and capture my own (camera and microphone, possibly in stereo or higher degrees for 3D scanning).
I don't think the hardware -- or the software -- is quite there yet to accomplish all this, but it's getting close. To the degree that one believes that the dynabook hasn't really happened yet, I wonder if it is because we have not yet satisfyingly achieved all the above simultaneously.
I think the hardware is ready, and OLPC does a great job of uniting the hardware with secure AND extensible software. XO is definitely the closest thing to the Dynabook that has yet been created.
[1]: The Enlightenment Foundation Libraries are, in my opinion, the only non-Smalltalk graphics framework that can rival Morphic and Tweak in the area of statically composable models, views, and behavior. However, being C, it cannot really do post-load composability; it allows composition only up until load time.
[2]: Sugar, the XO default shell.
Matthew Fulmer wrote:
On Tue, May 22, 2007 at 09:44:37AM -0500, Howard Stearns wrote:
I'm thinking not in terms of what it is, but rather how and in what context it would be used:
- To be used ubiquitously in any context, it needs to not only be small and
have good battery life, but it needs to be cheap and "losable." I think Alan gives an example of taking it to the beach or a raft in the pool. (This also implies replicated external storage.)
XO has all these features.
It does, but I think a Dynabook (or what we now contemplate a Dynabook to be) would require a bit more processor and storage power (and a better design of the input methods - be it a QWERTY keyboard, chord keyboard, and/or touch-screen). The Geode is a low-power processor, and that's great and where I think we want to head. But I think a better balance could be struck with another processor to address the needs of a broader user base - balancing software needs with power consumption.
I think also that memory is too low and storage should probably be in the 50GB range. I think also that some compromise in video/audio codec delivery should be made so that most of the multimedia recording/playback processing can be assisted by HW (motion compensation, BitBlt, even H.264, etc.).
The Viewpoints proposal "Steps Toward The Reinvention of Programming" (maybe I'll call the authors "The Gang of 5") does not go into detail on what their "metal" consists of. I'd like to find out more about what they are thinking. Maybe they'll start with a PowerBook and use the XO in parallel. Maybe they have some ideas of building a new platform in parallel with the software development.
Along those lines, is there any contemporary non-von Neumann processor architecture that would be better suited to Smalltalk? Perhaps to reduce power and to streamline OO architectures. I remember reading Dan and Alan mentioning the handy use of microcodable processors at Xerox to help their work. I also recall a processor that was built for OO but I can't find my notes. Anyone?
=== General XO specs:
CPU: AMD Geode 400 MHz x86
Memory: 128 MB 133 MHz DRAM
BIOS: LinuxBIOS stored on 512 KB flash ROM
Storage: 512 MB SLC NAND flash memory
Video: 693 x 520 pixel color display, capable of switching to a 1200 x 900 sunlight-readable monochrome mode
Network: internal 802.11 wi-fi with mesh networking capability
Keyboard: varies depending on target language; includes two 5-key cursor-control pads
Mouse: a touchpad pointing device
Interfaces: 4 USB ports
Power: input jack for DC from 10 up to 25 volts; a human-powered generator, probably foot powered, will be bundled with the unit; the internal rechargeable battery has 5 NiMH cells
Sound: built-in stereo speakers and microphone
On Tue, May 22, 2007 at 03:29:52PM -0700, Brad Fuller wrote:
Along those lines, is there any contemporary non-von Neumann processor architecture that would be better suited to Smalltalk? Perhaps to reduce power and to streamline OO architectures. I remember reading Dan and Alan mentioning the handy use of microcodable processors at Xerox to help their work. I also recall a processor that was built for OO but I can't find my notes. Anyone?
Jecel is the local expert in processors. You should check out his site: http://www.merlintec.com:8080/hardware
He and I will probably be working collaboratively on implementing one of those designs (we are both starting a Master's degree at the same time, and we are planning to work together on the same project)
Nothing official yet, but it is something to look forward to.
On 22-May-07, at 8:26 PM, Matthew Fulmer wrote:
Jecel is the local expert in processors. You should check out his site: http://www.merlintec.com:8080/hardware
I still think a very simple RISC architecture with a substantial above-the-bus chunk of memory that can be used for 'microcode' or data store, no traditional (and expensive) cache, transputer-like communication channels to other cores and probably no special floating point hardware would be nice. If you can get to a state where dozens/hundreds of cores can be sensibly used then one or two can spend their time as floating point units and if needed many more can join in. Likewise for video stream processing.
The really hard part is getting people to actually think about multiprocessing solutions to problems. The software world is far too comfortable with single-thread thinking and the cosy fantasy version of Moore's Law.
tim
--
tim Rowledge; tim@rowledge.org; http://www.rowledge.org/tim
Strange Opcodes: ZZZZZZZZZZZZ: enter sleep mode
On Wednesday 23 May 2007 10:46 am, tim Rowledge wrote:
The really hard part is getting people to actually think about multiprocessing solutions to problems....
On the contrary, it is much simpler to write and reason about programs if multiprocessing capability is a given. Dijkstra's do-od structure was inherently parallel. But building machines to 'execute' such programs was hard, so system designers invented languages that forced programmers to code for efficiency rather than simplicity. This trend was beautifully captured by Gerald Weinberg in his story of Levine the Genius Tailor:
http://www.zafar.se/bkz/Articles/GeniusLanguageDesigner
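For readers unfamiliar with do-od: it repeatedly executes any guarded command whose guard currently holds, in no fixed order, and stops when all guards are false. Here is a minimal sketch in Python (not Squeak; the `do_od` helper and state dictionary are invented names for illustration):

```python
import random

def do_od(guarded_commands, state):
    """Dijkstra's do-od loop: while any guard holds, pick one enabled
    guarded command (here, at random) and execute its command; stop
    when every guard is false. The nondeterministic choice is what
    leaves room for a parallel implementation."""
    while True:
        enabled = [cmd for guard, cmd in guarded_commands if guard(state)]
        if not enabled:
            return state
        random.choice(enabled)(state)

# Classic example:  do x > y -> x := x - y  []  y > x -> y := y - x  od
state = {'x': 36, 'y': 24}
do_od([(lambda s: s['x'] > s['y'], lambda s: s.__setitem__('x', s['x'] - s['y'])),
       (lambda s: s['y'] > s['x'], lambda s: s.__setitem__('y', s['y'] - s['x']))],
      state)
assert state['x'] == state['y'] == 12   # gcd(36, 24)
```

The point of the notation is that the programmer states only the guards and the invariant; which enabled command runs, or whether several run on separate processors, is left to the machine.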
Regards .. Subbu
subbukk subbukk@gmail.com writes:
On the contrary, it is much simpler to write and reason about programs if multiprocessing capability is a given. Dijkstra's do-od structure was inherently parallel. But building machines to 'execute' such programs was hard, so system designers invented languages that forced programmers to code for efficiency rather than simplicity. This trend was beautifully captured by Gerald Weinberg in his story of Levine the Genius Tailor:
That's a cool story. It sounds just like language design!
I am not so sure, however, that we have found even great ways to *think* about parallel programs. Dijkstra's discussion of do-od is really cool. However, here are two challenges it does not address:
1. Even when you are thinking about formal proof, like Dijkstra was, parallelism in large OO programs means you have to reason about aliasing, i.e. you have to do data flow analysis. This is hard, probably too hard if the language is unconstrained.
2. Most programs are not formally proven. Instead, their correctness relies on careful reasoning, on good processes, and on testing. Parallelism undermines the testing part.
The jury is out. One promising avenue, though, is the message passing used in E and in Actors-based systems. You can try this style in Squeak by using SharedQueues. Instead of having locks to protect shared data structures, you eliminate sharing and instead assign each structure to a single thread. Any other thread that wants to access the structure must do so by sending a message to the appropriate thread. In Squeak, you can implement the message sending using SharedQueues.
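A minimal sketch of that ownership style, in Python rather than Squeak (so take the queue in place of a Squeak SharedQueue; `MessageOwner` and its message names are invented for illustration):

```python
import queue
import threading

class MessageOwner:
    """One thread owns a dict; no other thread touches it directly.
    Other threads 'send messages' by putting (op, args, reply_queue)
    tuples on the owner's inbox, so the structure needs no locks."""

    def __init__(self):
        self.inbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        store = {}                     # private to the owner thread
        while True:
            op, args, reply = self.inbox.get()
            if op == 'put':
                key, value = args
                store[key] = value
                reply.put(None)
            elif op == 'get':
                reply.put(store.get(args))
            elif op == 'stop':
                reply.put(None)
                return

    def send(self, op, args=None):
        reply = queue.Queue()          # per-request reply channel
        self.inbox.put((op, args, reply))
        return reply.get()             # block until the owner answers

owner = MessageOwner()
owner.send('put', ('answer', 42))
assert owner.send('get', 'answer') == 42
owner.send('stop')
```

Because only the owner thread ever reads or writes the dictionary, all the usual aliasing and race questions collapse into reasoning about the order of messages on one queue.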
Lex
On Sat, 26 May 2007 13:02:05 -0700, Lex Spoon lex@lexspoon.org wrote:
I am not so sure, however, that we have found even great ways to *think* about parallel programs.
Actually, I think programming in parallel could be easier than programming serially, given sufficient help from the tool.
When I was fooling around with language designs, one thing that occurred to me was that, when we program serially, we imply that we =care= what order things are executed in, when very often we don't really. Order has an implied significance which can actually be deceptive. Worse, it can have a significance that is hidden.
For example, we could have some initialize code:
aVar := someClass new.
anotherVar := someOtherClass new.
Does anotherVar need aVar to be initialized before it can be initialized? Good programming practice sez "it shouldn't", but play along. The point is, there's no way to say "I don't care what order things occur in, as long as THIS is done before THAT begins."
You could program in two dimensions, almost like music, where everything on a particular line had to be done sequentially, while things that were vertically arranged (rather than being chords, as in music) could be done in any order. Synchronization points could be set up vertically (almost like measure bars).
------serial execution----->
|           ]
|           ]
parallel    ]
|      synch point
|           ]
|           ]
I think it would actually clarify things.
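The "measure bar" idea maps directly onto a barrier primitive. A minimal sketch in Python (the voice/barrier framing and all names here are mine, for illustration only): each "voice" runs its own line sequentially, the phases on either side of the bar may interleave freely, and no voice crosses the bar until all have reached it.

```python
import threading

results = []
lock = threading.Lock()

def voice(name, barrier):
    with lock:
        results.append(name + ':part1')   # phase 1: order across voices is free
    barrier.wait()                        # the synch point ("measure bar")
    with lock:
        results.append(name + ':part2')   # phase 2: again, order is free

barrier = threading.Barrier(2)
threads = [threading.Thread(target=voice, args=(n, barrier)) for n in ('a', 'b')]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every part1 entry precedes every part2 entry, but within each phase
# the order is unspecified -- exactly the "two-dimensional" reading.
assert all(r.endswith('part1') for r in results[:2])
assert all(r.endswith('part2') for r in results[2:])
```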
Blake wrote:
When I was fooling around with language designs, one thing that occurred to me was that, when we program serially, we imply that we =care= what order things are executed in, when very often we don't really. Order has an implied significance which can actually be deceptive. Worse, it can have a significance that is hidden.
For example, we could have some initialize code:
aVar := someClass new.
anotherVar := someOtherClass new.
Does anotherVar need aVar to be initialized before it can be initialized? Good programming practice sez "it shouldn't", but play along. The point is, there's no way to say "I don't care what order things occur in, as long as THIS is done before THAT begins."
Occam allows you to do that with its SEQ and PAR constructs.
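For those who haven't seen occam: SEQ runs its children one after another, PAR runs them concurrently and waits for all, and the two nest freely. That nesting can be mimicked with two small combinators. This is a hedged sketch in Python, not occam itself; `SEQ` and `PAR` here are invented helpers:

```python
import threading

def SEQ(*actions):
    """Run the actions one after another, like occam's SEQ."""
    def run():
        for a in actions:
            a()
    return run

def PAR(*actions):
    """Run the actions concurrently and wait for all, like occam's PAR."""
    def run():
        ts = [threading.Thread(target=a) for a in actions]
        for t in ts:
            t.start()
        for t in ts:
            t.join()
    return run

log = []
program = SEQ(
    lambda: log.append('init'),          # must happen first
    PAR(lambda: log.append('left'),      # these two may run in either order
        lambda: log.append('right')),
    lambda: log.append('done'))          # must happen last
program()

assert log[0] == 'init' and log[-1] == 'done'
assert set(log[1:3]) == {'left', 'right'}
```

The program text now states exactly the ordering constraints that matter ("init before the pair, the pair before done") and nothing more.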
On Sunday 27 May 2007 1:32 am, Lex Spoon wrote:
- Even when you are thinking about formal proof, like Dijkstra was, parallelism in large OO programs means you have to reason about aliasing, i.e. you have to do data flow analysis. This is hard, probably too hard if the language is unconstrained.
What is hard about programming is coming up with the right invariants. What happens between the starting invariant and ending invariant is determined by the semantics. E.g. x, y := y, x is easier to reason about than bringing in a temp to swap x and y.
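As a tiny illustration of why simultaneous assignment is easier to reason about, sketched in Python (whose tuple assignment behaves like the x, y := y, x above):

```python
# Simultaneous assignment: both right-hand sides are evaluated in the
# starting state, so {x = A, y = B} becomes {x = B, y = A} directly.
x, y = 3, 5
x, y = y, x
assert (x, y) == (5, 3)

# The temp-variable version forces the reader to step through an
# intermediate state (t = A, x = B, y = B) in which neither the old
# nor the new invariant holds.
x, y = 3, 5
t = x
x = y
y = t
assert (x, y) == (5, 3)
```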
- Most programs are not formally proven.
True. Proving large programs automatically is theoretically infeasible for most languages in use today. But many algorithms do go through formal proofs before adoption - e.g. floating point math, semaphores, and regular expressions are all based on proven algorithms.
Instead, their correctness relies on careful reasoning, on good processes, and on testing. Parallelism undermines the testing part.
Testing invariants is not affected by serial/parallel execution, but debugging or tracing parallel programs is not really for the faint-hearted :-).
BTW, this thread is veering off-topic. We should get back to the cost of the Dynabook before Brad Fuller starts reaching for a 2x4 :-).
Regards .. Subbu
subbukk subbukk@gmail.com writes:
On Sunday 27 May 2007 1:32 am, Lex Spoon wrote:
- Even when you are thinking about formal proof, like Dijkstra was, parallelism in large OO programs means you have to reason about aliasing, i.e. you have to do data flow analysis. This is hard, probably too hard if the language is unconstrained.
What is hard about programming is coming up with the right invariants. What happens between the starting invariant and ending invariant is determined by the semantics. E.g. x, y := y, x is easier to reason about than bringing in a temp to swap x and y.
An interesting example. This one is about parallelism, but not non-determinism. A generalization would be "clocked" systems, which indeed are really cool.
- Most programs are not formally proven.
True. Proving large programs automatically is theoretically infeasible for most languages in use today. But many algorithms do go through formal proofs before adoption - e.g. floating point math, semaphores, and regular expressions are all based on proven algorithms.
Yes, algorithms are more amenable to proof than implementation. If you implement a proven algorithm, and something goes wrong, then it's a normal old bug. If you do like Java and implement an unproven type system, then you can get really nasty surprises. The last I heard, it is still not even known if Java's generics *can* be fully implemented to spec. That is not a nice place to be as a programmer!
The language question you raise is also interesting. Sometimes implementation languages are pointlessly difficult to prove things about. Other times, though, there is a real tradeoff between implementation speed and provability. A good example is inheritance plus method override. These are really great for adding functionality to existing programs, but are awkward for proof tools because the meaning of a method call depends on which overridden versions of the method you have loaded.
It would be really neat to have a subset of Squeak that was designed to be amenable to proof, and then to teach one of the existing proof systems about this subset. If you include blocks, but reject inheritance, then you could come up with something close to the lambda-calculus-like languages that the existing proof tools are so good at. You would not like programming this way, compared to using full Squeak, but for core things like Semaphore and SharedQueue it would seem useful.
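To illustrate the kind of property such a proof-friendly subset would target for something like SharedQueue, here is a sketch (in Python, not Squeak) that states the FIFO/accounting invariant as runtime assertions; a proof tool would aim to discharge these statically. The class and its Smalltalk-flavored method names (nextPut/next) are invented for illustration:

```python
import threading
from collections import deque

class CheckedQueue:
    """A SharedQueue-like structure whose core invariant -- you can
    never have taken out more items than were put in, and items come
    out in FIFO order -- is asserted at runtime. A proof-oriented
    language subset would aim to establish this once, statically."""

    def __init__(self):
        self._items = deque()
        self._cond = threading.Condition()
        self._enqueued = 0
        self._dequeued = 0

    def nextPut(self, item):
        with self._cond:
            self._items.append(item)
            self._enqueued += 1
            self._cond.notify()

    def next(self):
        with self._cond:
            while not self._items:
                self._cond.wait()      # block, as SharedQueue>>next does
            item = self._items.popleft()
            self._dequeued += 1
            # Invariant: dequeues never outrun enqueues.
            assert self._dequeued <= self._enqueued
            return item

q = CheckedQueue()
q.nextPut(1); q.nextPut(2)
assert q.next() == 1 and q.next() == 2   # FIFO order preserved
```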
Lex
On Tuesday 29 May 2007 12:40 pm, Lex Spoon wrote:
Yes, algorithms are more amenable to proof than implementation. If you implement a proven algorithm, and something goes wrong, then it's a normal old bug.
Proofs and implementations represent two ends of a large spectrum. Algorithms fall somewhere in between (but closer to proofs). I think Alan Kay pointed out a few mails back that "HW is just SW crystallized early". A mathematician works near the proof end, while an engineer has to tackle the implementation side. For engineers, any system that takes us "close enough" is good enough. But algorithm to implementation is a big leap of faith today :-(. Where the translation covers just a couple of orders of magnitude (e.g. math, graphics, etc.), we can achieve an accuracy that is sufficient to commit to HW. But where modularity is introduced at compile time and vanishes at runtime, we are forced to analyze code that runs to millions of machine instructions. Bugs will be the norm rather than the exception :-).
Projects like Exupery are good because they help us short-circuit many steps between algorithms and machine instructions. The first Smalltalk machine description didn't come with invariants, and the bugs were discovered 'at execution time' (cf. Bits of History). It would be interesting to see if the Dynabook could be described in Smalltalk along with all its invariants.
Regards .. Subbu
Hi Subbu --
See what you think of the "preposterous proposal" (as one reviewer termed it).
http://www.vpri.org/pdf/NSF_prop_RN-2006-002.pdf
One of the ideas here is that you have to work the whole spectrum, but try to find qualitative boundaries that can be maintained and will help all.
Cheers,
Alan
At 07:46 AM 5/29/2007, subbukk wrote:
On Tuesday 29 May 2007 12:40 pm, Lex Spoon wrote:
Yes, algorithms are more amenable to proof than implementation. If you implement a proven algorithm, and something goes wrong, then it's a normal old bug.
Proofs and implementations represent two ends of a large spectrum. Algorithms fall somewhere in between (but closer to proofs). I think Alan Kay pointed out a few mails back that "HW is just SW crystallized early". A mathematician works near the proof end, while an engineer has to tackle the implementation side. For engineers, any system that takes us "close enough" is good enough. But algorithm to implementation is a big leap of faith today :-(. Where the translation covers just a couple of orders of magnitude (e.g. math, graphics, etc.), we can achieve an accuracy that is sufficient to commit to HW. But where modularity is introduced at compile time and vanishes at runtime, we are forced to analyze code that runs to millions of machine instructions. Bugs will be the norm rather than the exception :-).
Projects like Exupery are good because they help us short-circuit many steps between algorithms and machine instructions. The first Smalltalk machine description didn't come with invariants, and the bugs were discovered 'at execution time' (cf. Bits of History). It would be interesting to see if the Dynabook could be described in Smalltalk along with all its invariants.
Regards .. Subbu
On Tuesday 29 May 2007 8:58 pm, Alan Kay wrote:
Hi Subbu --
See what you think of the "preposterous proposal" (as one reviewer termed it).
Some of the ideas presented (e.g. islands, pseudo-time) would have been considered 'bold' or 'risky' in the nineties, but not so today. Kairos works so well in the real world (e.g. for children, artists, farmers, etc.), so there is no reason computing should stick to chronos time.
Building a 20KLOC monolith would definitely be a challenge. The Unix kernel was only around 6K lines of C when it started. TinyCC (circa 2004), a boot loader that compiles the Linux kernel on the fly and runs it, is about 7,500 lines of C. The issue here is not one of space or speed constraints but one of verifying the correctness of such a monolith within engineering tolerances.
Regards .. Subbu
Subbukk-
Thanks for sharing this! I had been ignorant of the Kairos / Khronos distinction. Quite relevant for Croquet!
-H
subbukk wrote:
...Kairos works so well in the real world (e.g. for children, artists, farmers, etc.), so there is no reason computing should stick to chronos time. ...
On Tuesday 29 May 2007 11:18 pm, subbukk wrote:
Building a 20KLOC monolith would definitely be a challenge. The Unix kernel was only around 6K lines of C when it started. TinyCC (circa 2004), used in a bootloader that compiles the Linux kernel on the fly and runs it, is about 7500 lines of C. The issue here is not one of space or speed constraints but of verifying the correctness of such a monolith within engineering tolerances.
Let me clarify what I meant, since the way I phrased the sentence was ambiguous. I am not claiming that Dynabook software can be built in less than 20KLOC. If our basic unit of assembly is a line of code, then putting together 20K of them reliably, in a way that millions could use, is indeed a challenge. Just as Smalltalk/Lisp unburdened programmers from tracking memory allocs/frees, we need better abstractions and separation of concerns. Instead of one 20KLOC system, the problem could be factored into two systems of 5-7K each to make it tractable.
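As a back-of-the-envelope illustration of why factoring helps (my own sketch, not from the thread, and the quadratic cost model is an assumption for illustration only): if verification effort grows roughly with the square of the amount of code that can interact, two isolated 7K-line systems are far cheaper to verify than one 20K-line monolith.

```python
def verification_effort(kloc):
    """Toy model: effort ~ kloc^2. Assumed for illustration only --
    the real growth rate is unknown, but it is plausibly superlinear."""
    return kloc ** 2

# One 20 KLOC monolith vs. two ~7 KLOC systems with a clean interface.
monolith = verification_effort(20)
factored = verification_effort(7) + verification_effort(7)

print(monolith, factored)  # 400 vs 98 -- roughly a 4x reduction in this toy model
```

The exact exponent doesn't matter much; any superlinear cost makes splitting into smaller, isolated systems pay off.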
Regards .. Subbu
From: Alan Kay <alan.kay@squeakland.org>
Reply-To: The general-purpose Squeak developers list <squeak-dev@lists.squeakfoundation.org>
To: The general-purpose Squeak developers list <squeak-dev@lists.squeakfoundation.org>
Subject: Re: Dynabook hw cost
Date: Tue, 29 May 2007 08:28:23 -0700
Hi Subbu --
See what you think of the "preposterous proposal" (as one reviewer termed it).
http://www.vpri.org/pdf/NSF_prop_RN-2006-002.pdf
One of the ideas here is that you have to work the whole spectrum, but try to find qualitative boundaries that can be maintained and will help all.
Cheers,
Alan
Well personally I am very hopeful about this proposal. I have been thinking for some time now that it is time to look at a new style of OS, and this sounds very good indeed. Do you have funding yet? When can we expect to see something we can play with? :)
Tim Rowledge wrote on Tue, 22 May 2007 22:16:13 -0700
On 22-May-07, at 8:26 PM, Matthew Fulmer wrote:
Jecel is the local expert in processors. You should check out his site: http://www.merlintec.com:8080/hardware
Thanks for the "local expert" comment :-)
I am catching up with the past week's email and came across this interesting discussion. There is a particularly relevant link nearly hidden at the bottom of the page you indicated which Brad might find interesting - the list of Smalltalk computers.
I still think a very simple RISC architecture with a substantial above-the-bus chunk of memory that can be used for 'microcode' or data store,
Your wish is my command - check out my RISC42 core design. Its 16 bit instruction set was heavily influenced by my effort to come up with a nice microcode for a Squeak processor.
no traditional (and expensive) cache,
Caches have their uses as well, especially if they are tweaked to deal with objects and PICs. I see no reason not to have both caches and visible local memory, so all my recent designs have had this style.
transputer-like communication channels to other cores
That might be a bit simplistic, and even the last Transputer from Inmos (T9000) replaced that with hardware routing. Perhaps something more like the old J-Machine from MIT (a Smalltalk computer with 1024 processors)?
and probably no special floating point hardware would be nice.
How about bitblt in hardware and stuff like that?
If you can get to a state where dozens/hundreds of cores can be sensibly used then one or two can spend their time as floating point units and if needed many more can join in. Likewise for video stream processing.
Indeed, which is the spirit of Chuck Moore's current design (24 Forth processors): http://www.intellasys.net/
And this is also why I decided to register the "Plurion" trademark (multiple RISC42 cores). For my thesis, however, I will be trying to take advantage of the fact that the hardware is implemented using a FPGA to replace one or two processors with hardware implementation of key objects if the adaptive compilation system (Self/Strongtalk style) decides these are extreme "hot spots" in the current execution.
The project Matthew mentioned is the one named "RNA" on the above page. That one doesn't have a memory/processor separation at all.
The really hard part is getting people to actually think about multi- processing solutions to problems. The software world is far too comfortable with single-thread thinking and the cosy fantasy version of Moore's Law.
That died with the Pentium 4. Nearly half of the recent talks at http://ee380.stanford.edu/ start out with some graph that proves just that. The software people are slowly starting to figure this out.
But back to this thread's subject: the XO is not a particularly good indication of how much a dynabook could cost. It is a great effort to reduce cost but has far too much PC legacy overhead. Certainly some laptop that costs around $500 retail could approach $175 when bought directly off the assembly line in huge quantities and with absolutely no taxes. Not that this would make it as nice a computer for children as the XO, of course! But I feel we should aim to have a Dynabook for $30 under these conditions (some $80 retail) by the end of this decade.
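The price arithmetic above can be checked quickly (the $500/$175 and $30/$80 figures are from the post; the ~2.9x markup factor is my inference from them):

```python
# Retail-to-assembly-line markup implied by the $500 -> $175 example.
retail_price = 500
assembly_line_price = 175
markup = retail_price / assembly_line_price   # about 2.86x

# Applying the same markup to a $30 assembly-line Dynabook
# lands close to the ~$80 retail figure quoted above.
dynabook_assembly = 30
dynabook_retail = dynabook_assembly * markup

print(round(markup, 2), round(dynabook_retail))  # 2.86, 86
```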
-- Jecel
Brad,
The Viewpoints "Steps Toward The Reinvention of Programming" (maybe I'll call the authors "The Gang of 5") does not go into detail on what their "metal" consists of. I'd like to find out more of what they are thinking. Maybe they'll start with a powerbook and use the XO in parallel. Maybe they have some ideas of building a new platform in parallel with the software development.
I should mention that there are more people working on the project (in addition to the authors who are committing to the project in different ways). The Gang of "5" wouldn't really match what is going on^^;
General XO specs:
There was a change in the spec a while ago. The latest one is available at http://wiki.laptop.org/go/Hardware_specification#Beta_Test_3_Systems_.28BTes... but the following is the excerpt:
CPU AMD Geode 400 MHz x86
AMD Geode LX 433 (should allow 500, I think) Mhz.
Memory 128 Mb 133 MHz DRAM
256 MB.
Storage 512 Mb SLC NAND Flash memory
1GB.
Video 693 x 520 pixel color display capable of switching to a 1200 by 900 sunlight-readable monochrome mode
Did you divide the numbers by three to get the color resolution? It doesn't quite work that way^^; Take a look at the great OLPC screen simulator by Bert (http://freudenbergs.de/bert/etoys/OLPC-XO-Display.pr)
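One way to make sense of the 693x520 figure (this interpretation is my inference, not stated in the thread): the 1200x900 grid has one color subpixel per pixel, so in color mode the *total* pixel count drops by roughly 3x, which means each axis shrinks by about sqrt(3), not by 3.

```python
import math

# 1200x900 sunlight-readable monochrome mode
mono_w, mono_h = 1200, 900

# Dividing the total pixel count by 3 means dividing each axis by sqrt(3).
ratio = math.sqrt(3)
color_w = mono_w / ratio
color_h = mono_h / ratio

print(round(color_w), round(color_h))  # 693 520 -- matching the quoted spec
```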
Mouse a touchpad pointing device
You might add that the touch pad has two "layers".
Interfaces 4 USB ports
3 are available to the outside.
-- Yoshiki
For the fun of being a devil's advocate, I'll suggest here that current (or near-future) technology has already leapfrogged the physical definition of the need for a Dynabook.
It's really all about convenience to the human person for the task at hand, both physical convenience and mental convenience. So, these two observations suggest to me what we're really headed for.
1. Clearly, mentally, we now desire to search, view, organize, and manipulate much more information than can be stored in a Dynabook using any current or anticipated portable storage technology, and to access that volume of data more quickly than it can be transferred to any portable Dynabook. So, in many ways, interconnected, online, hosted data and processes are more interesting and convenient in time and space than most things a Dynabook can manipulate in isolation.
2. As shown by the fact that monocles turned into glasses and, in turn, into contact lenses, pocket watches into wrist watches, desk phones into portable phones into cell phones into bluetooth ear pieces, and pens into touch screens into motion sensors (manipulating your own digits while they're close to you) ... it's more convenient for a device to hold on to you than for you to manually hold on to it.
Therefore, I suspect that future screens will be those devices that point low-power lasers into your eye and track head and eye motion, giving you a virtually infinite view-screen in size, depth, and detail ... in 3D ... wearable ... hands-free. Maybe not socially acceptable now, but needs override social tastes over time. The audio will include a little voice, like your conscience, overlaid on whatever you're listening to. The book part won't be something you hold, but worn devices which input your voice where that's convenient, your hand/finger gestures where that's convenient, and your subvocalization where that's convenient, and hide themselves in physically social situations.
Now if only I could invent a device that could make me look interested and attentive when I'm not while someone is speaking directly to me ... while recording what was being said for later reference and prompting me to say the appropriate "uh-huh" at the appropriate places.... I could make a fortune selling that one :^) (to married men).
The syllable "book" suggests a codex, which has been a convenient, familiar, and dense medium for transporting, preserving, organizing, and sharing information. But new mediums tend to adapt to the resources available, so the codex form (morph) itself may be prosaic now.
Also, "dyna" means power in Greek so a "Dynabook" is a "power-book" by definition. ;-)
Cheers, Darius
OK, before you ask me "What little voice?"...
It's the little voice you hear saying "What little voice?"
Can you hear it now?
Cheers, Darius
Also, manufacturing such a device would use a lot less natural resources than a Dynabook would for both the display and the battery.
And it's harder to "lose" when you're wearing it. And harder for personal data to be stolen from your person or learned by "shoulder surfers" but can be easily shared virtually with anyone near or far. Like Murray suggests in "Hamlet on the Holodeck - the Future of Narrative in Cyberspace" we'll be living "in" our data soon, either constructing organized knowledge, or playing at destroying organized knowledge.
Personally constructed data/processes can also be more redundant, accessible, shareable and take advantage of the economy of scale and the economy of repurposing unused resources when hosted, so long as you trust your hosts.
Cheers, Darius
"Darius Clarke" socinian@gmail.com writes:
- Clearly, mentally, we now desire to search, view, organize, and
manipulate much more information than can be stored in a Dynabook using any current or anticipated portable storage technology, and to access that volume of data more quickly than it can be transferred to any portable Dynabook. So, in many ways, interconnected, online, hosted data and processes are more interesting and convenient in time and space than most things a Dynabook can manipulate in isolation.
Networking is important, but for a different reason. The raw storage capacity of portables is astronomical. It is routine nowadays to store full-length movies on a laptop.
Networking remains important, though, for communication. It's very powerful to be able to check Wikipedia whenever a question comes to mind. It's really useful to be able to download new software on demand as the need arises. And gee, a lot of what people do with their fancy personal computers anyway is sit around and chat in groups and post on message boards.
Lex